Logs: freenode/#haskell
| 2021-05-21 02:55:14 | <wroathe> | Yeah, I kind of assumed that would've been automatic :P (the vote) |
| 2021-05-21 02:56:09 | <edwardk> | wroathe: https://twitter.com/kmett/status/1395503934333095937 pretty much sums up the situation for me |
| 2021-05-21 02:56:37 | × | star_cloud quits (~star_clou@ec2-52-11-151-184.us-west-2.compute.amazonaws.com) (Ping timeout: 260 seconds) |
| 2021-05-21 02:56:41 | × | tromp quits (~tromp@dhcp-077-249-230-040.chello.nl) (Ping timeout: 246 seconds) |
| 2021-05-21 02:57:03 | × | tekacs quits (tekacs@2a01:7e00::f03c:91ff:fe93:43aa) (Ping timeout: 260 seconds) |
| 2021-05-21 02:57:05 | <wroathe> | It's a weird thing... I've been in leadership roles in other contexts, and when you're participating in something you're not involved in the management of you tend to automatically make rules to compartmentalize, but as soon as a big decision like this comes along you start having to fight the urge to "backseat drive" so to speak :P |
| 2021-05-21 02:57:43 | <edwardk> | the swift kick in the pants isn't unwelcome |
| 2021-05-21 02:57:56 | → | z0k joins (~user@101.50.108.132) |
| 2021-05-21 02:58:52 | <wroathe> | edwardk: You're oddly productive for someone who says he has that problem (re: the tweet) |
| 2021-05-21 03:00:54 | → | ddellacosta joins (~ddellacos@86.106.143.100) |
| 2021-05-21 03:01:25 | ChanServ | sets topic to "irc.libera.chat #haskell is up, matrix bridging and logging are being worked on. register your nick there while the network is young | https://www.haskell.org | https://wiki.haskell.org/IRC_channel | Paste code/errors: https://paste.tomsmeding.com | Logs: http://tunes.org/~nef/logs/haskell/?C=M;O=D | https://www.reddit.com/r/haskell | Admin: #haskell-ops | Offtopic: #haskell-offtopic | h" |
| 2021-05-21 03:01:46 | <edwardk> | i'll take the unilateral liberty of using somewhat firmer wording in the topic at least |
| 2021-05-21 03:01:48 | → | tekacs joins (~tekacs@178.79.131.8) |
| 2021-05-21 03:02:08 | <wroathe> | good call |
| 2021-05-21 03:03:08 | ChanServ | sets topic to "irc.libera.chat #haskell is up and very active, matrix bridge and logging are being worked on | https://www.haskell.org | https://wiki.haskell.org/IRC_channel | Paste code/errors: https://paste.tomsmeding.com | Logs: http://tunes.org/~nef/logs/haskell/?C=M;O=D | https://www.reddit.com/r/haskell | Admin: #haskell-ops | Offtopic: #haskell-offtopic | https://downloads.haskell.org" |
| 2021-05-21 03:03:18 | <edwardk> | there, now the topic fits |
| 2021-05-21 03:03:53 | <siraben> | What's the status of Matrix bridging to libera? |
| 2021-05-21 03:04:08 | <edwardk> | siraben: probably a few days away, they are working on it with the matrix folks |
| 2021-05-21 03:04:13 | <zzz> | yay |
| 2021-05-21 03:04:21 | <siraben> | ooh, nice |
| 2021-05-21 03:04:25 | <edwardk> | they couldn't do anything before yesterday when the server went live and things went public |
| 2021-05-21 03:04:34 | × | jpds quits (~jpds@gateway/tor-sasl/jpds) (Remote host closed the connection) |
| 2021-05-21 03:04:58 | → | jpds joins (~jpds@gateway/tor-sasl/jpds) |
| 2021-05-21 03:05:06 | × | ddellacosta quits (~ddellacos@86.106.143.100) (Ping timeout: 240 seconds) |
| 2021-05-21 03:05:07 | <edwardk> | there's an irc.libera.chat #matrix channel for tracking the progress on the bridge efforts |
| 2021-05-21 03:05:17 | <a6a45081-2b83> | is it ok to create a fork of a library and use in my project if the maintainer is inactive and can't merge a bugfix? |
| 2021-05-21 03:05:39 | <edwardk> | a6a45081-2b83: for building an app or uploading to hackage? |
| 2021-05-21 03:05:40 | → | sm joins (~user@li229-222.members.linode.com) |
| 2021-05-21 03:06:15 | <a6a45081-2b83> | building an app |
| 2021-05-21 03:06:51 | <mniip> | I should mention, we will be looking into nickname squatting complaints so there's no urgency |
| 2021-05-21 03:06:57 | <edwardk> | for hackage it might be better to see if you can resurrect the existing package, and go through the library reclamation process etc, to claim ownership of it and start maintaining it. for building your own app? cabal is good at that, make a fork of the project on github or wherever, use a cabal.project file to reference yours |
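[Editor's note: edwardk's suggestion of pointing a build at your own fork can be sketched with a `cabal.project` file like the one below. The package name, repository URL, and commit hash are placeholders, not taken from the chat.]

```cabal
-- cabal.project (hypothetical example; names and hash are placeholders)
packages: .

-- Override the Hackage version of "somelib" with your patched fork.
source-repository-package
  type: git
  location: https://github.com/yourname/somelib
  tag: 0123456789abcdef0123456789abcdef01234567
```

With this in place, `cabal build` fetches the fork at the pinned commit instead of the released Hackage version, so the bugfix is picked up without republishing the package.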
| 2021-05-21 03:07:23 | <edwardk> | i do that latter step all the time |
| 2021-05-21 03:08:07 | <edwardk> | every time i move to a new ghc version i find i tend to wind up with a dozen other people's libraries i have to maintain temporary forks for before they get around to getting current with ghc |
| 2021-05-21 03:08:15 | <mniip> | that reminds me I should fix hackagebot |
| 2021-05-21 03:08:30 | <mniip> | oh well, not a terrible lot of time on my hands lately :( |
| 2021-05-21 03:09:09 | × | Guest89997 quits (~tim@112-141-128-42.sta.dodo.net.au) (Remote host closed the connection) |
| 2021-05-21 03:11:12 | <sm[m]> | oh thanks for that edwardk, joined |
| 2021-05-21 03:11:34 | <edwardk> | mniip: you splitting your attention between discords/irc servers/every other random community facing tool that exists makes me think of when i was a kid and had somehow become a "co-sysop" on like 30 BBSes in the detroit area, and just dialed up all sorts of different boards all day putting out fires. |
| 2021-05-21 03:12:17 | <mniip> | edwardk, I should mention I'm in an exam season right now |
| 2021-05-21 03:13:07 | <mniip> | but yea I have this thing where I just sort of join a community and accidentally become an admin of some sort |
| 2021-05-21 03:14:02 | <edwardk> | yesterday when all this broke i was in a 'finding the right abstraction' summit between MIRI and Topos with a bunch of ai safety and category theory folks in the (chat) room and me in the back going "now kiss". at the same time, i was trying to field questions from someone writing a news article, set up the #haskell-* channels on libera, and get stuff done for groq. yesterday was an exhausting day. |
| 2021-05-21 03:14:41 | <edwardk> | today is comparatively tame, just consulting then coming here to see how much inertia has been picked up towards a move |
| 2021-05-21 03:15:03 | <mniip> | I was finishing my open letter whilst doing a final assignment for numeric analysis ;) |
| 2021-05-21 03:15:05 | <edwardk> | but i suspect i may want to push my move off for a week and just try to relax this weekend |
| 2021-05-21 03:15:08 | <wroathe> | edwardk: So how concerned should we be about the possibility of AI backfiring? |
| 2021-05-21 03:15:45 | <wroathe> | I just want to know if I'm going to have to fight off robots in my lifetime or not. |
| 2021-05-21 03:15:49 | <edwardk> | wroathe: i'm concerned enough that i left a nice job working on blockchain stuff and went to go work for an organization focused on AI safety research and existential risk reduction. so you tell me? |
| 2021-05-21 03:16:01 | <wroathe> | I've been lifting weights this whole time, and haven't quite known what I've been training for, but maybe this is it? |
| 2021-05-21 03:16:41 | <edwardk> | i don't think terminators are a terribly efficient way to get rid of people. i mean, humans don't hate pandas, we're just remarkably bad at keeping them alive and have a mild preference for exploiting some of the stuff in their native habitat |
| 2021-05-21 03:16:45 | → | tzh_ joins (~tzh@c-24-21-73-154.hsd1.wa.comcast.net) |
| 2021-05-21 03:16:46 | × | Lord_of_Life quits (~Lord@unaffiliated/lord-of-life/x-0885362) (Ping timeout: 240 seconds) |
| 2021-05-21 03:16:47 | <wroathe> | edwardk: Well, you could've just moved because you thought the subject matter was more interesting |
| 2021-05-21 03:16:54 | <wroathe> | edwardk: A move doesn't necessarily imply concern |
| 2021-05-21 03:17:10 | × | altern quits (~altern@altern.corbina.com.ua) (Ping timeout: 260 seconds) |
| 2021-05-21 03:17:29 | <wroathe> | I figure with your skillset you basically get to pick what you work on |
| 2021-05-21 03:17:38 | <wroathe> | (and get paid for) |
| 2021-05-21 03:17:41 | <edwardk> | it doesn't take malice for things to gradually come around to a situation where, on a long enough time horizon, it doesn't look good for humans. so no, i'm not generally worried about terminators |
| 2021-05-21 03:18:46 | <wroathe> | Ah, so the robots are going to starve us to death |
| 2021-05-21 03:18:51 | <wroathe> | Maybe the lifting weights wasn't a good idea |
| 2021-05-21 03:18:51 | × | xkapastel quits (uid17782@gateway/web/irccloud.com/x-brvfwnfnrmyxcpvz) (Quit: Connection closed for inactivity) |
| 2021-05-21 03:18:51 | × | tzh quits (~tzh@c-24-21-73-154.hsd1.or.comcast.net) (Ping timeout: 260 seconds) |
| 2021-05-21 03:19:00 | <DigitalKiwi> | edwardk: leaving a job working on blockchain isn't a high bar lol |
| 2021-05-21 03:19:01 | <wroathe> | What with the extra food consumption requirements and all |
| 2021-05-21 03:19:10 | <wroathe> | edwardk: This was poorly planned on my part |
| 2021-05-21 03:19:42 | <edwardk> | i am worried about existential risk, and ai in particular as a vehicle for that, and yes, i do get to pick what i work on, but the choice here came mostly from looking around at where i felt i could have the most beneficial impact. you have something like 300-500k people working on ai capacity, and maybe 50-70 people working on safety. it doesn't take a LOT to believe that perhaps that isn't a proportional response. |
| 2021-05-21 03:20:23 | <wroathe> | Well, another concern is just how many people are qualified to even make an impact in that conversation |
| 2021-05-21 03:20:31 | <wroathe> | I know I'm certainly not |
| 2021-05-21 03:21:01 | <edwardk> | so i could be voice number 7 million and 1 contributing to the next UN resolution on climate change, or voice number 51 in AI safety circles, where i bring a rather different toolbox with me that might be useful for them |
| 2021-05-21 03:21:54 | × | falafel quits (~falafel@2600:8800:4700:53f0:b4a5:fa93:bc1a:b3d6) (Ping timeout: 258 seconds) |
| 2021-05-21 03:22:22 | <wroathe> | Sounds like you're right where you need to be. |
| 2021-05-21 03:23:00 | <mniip> | even terminators aside, the ways in which AI is used in marketing and social stuff today is raising some ethical concerns |
| 2021-05-21 03:23:08 | <DigitalKiwi> | just throw some kubernetes at it |
| 2021-05-21 03:24:03 | <edwardk> | I'm trying to make a go of it, and I've been doing what I can to try to help get the AI safety and haskell communities and AI safety and category theory communities to start to mesh a little more. We use a lot of Haskell at MIRI for instance. |
| 2021-05-21 03:25:54 | × | GZJ0X_ quits (~gzj@unaffiliated/gzj) (Quit: Leaving) |
| 2021-05-21 03:25:57 | <wroathe> | edwardk: I would think there would be significantly less programming involved in the day to day of working on AI safety, and a lot more general analysis and reporting |
| 2021-05-21 03:25:58 | → | olligobber joins (olligobber@gateway/vpn/privateinternetaccess/olligobber) |
| 2021-05-21 03:26:47 | <edwardk> | mniip: I think there's a ton of short term AI ethical concerns as well, but i'm personally aiming a bit past them, except insofar as they are symptomatic of why "alignment" itself is hard. The kinds of things that lead self-driving cars to kill people, or models to become horribly racist, and the kinds of things that might lead to effective human outcompetition or terminator genocide don't seem terribly correlated. |
| 2021-05-21 03:28:20 | <wroathe> | And the problem here is that once the thing that could be our end gets invented, there's basically no putting it back in the box. Someone somewhere will read the same papers, come to the same conclusions, and build the same thing |
| 2021-05-21 03:28:25 | <edwardk> | MIRI focuses more on the math side of things. What are the properties we'd want a system to have if we expect it to even be able to end well? Not 'how do self driving cars impact, say, the insurance industry'. The latter is more FHI's niche. |
| 2021-05-21 03:28:51 | <a6a45081-2b83> | My sum type is failing to parse (Prelude.read :: String -> MyType) in a large application; how can I do exception handling for that case? |
| 2021-05-21 03:29:04 | <a6a45081-2b83> | where MyType = T1 | T2 | .. |
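[Editor's note: `Prelude.read` throws an imprecise exception on malformed input, so rather than catching it, the usual fix is the total variant `Text.Read.readMaybe`, which returns `Nothing` on a failed parse. A minimal sketch, with `MyType`/`T1`/`T2` as placeholder names from the question:]

```haskell
-- Safe parsing of a sum type: readMaybe returns Nothing instead of
-- throwing, so the failure case can be handled as ordinary data.
import Text.Read (readMaybe)

data MyType = T1 | T2
  deriving (Show, Read, Eq)

parseMyType :: String -> Maybe MyType
parseMyType = readMaybe

main :: IO ()
main = do
  print (parseMyType "T1")       -- Just T1
  print (parseMyType "garbage")  -- Nothing
```

If the call site can't easily be changed to `Maybe`, the `read` exception can still be caught with `Control.Exception.try` around `evaluate`, but pushing the partiality out with `readMaybe` is the more idiomatic repair.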
| 2021-05-21 03:29:41 | × | quinn_ quits (~quinn@c-73-223-224-163.hsd1.ca.comcast.net) (Ping timeout: 240 seconds) |
| 2021-05-21 03:30:42 | <edwardk> | Examples for me are things like https://arxiv.org/abs/1609.03543 and https://intelligence.org/files/ParametricBoundedLobsTheorem.pdf The former covers 'how do you get an agent to gain consistency in its belief structure over time so you can't hold inconsistent beliefs against it forever?' and 'how do you even allow intelligences to cooperate and beat the nash equilibria, because nash equilibria where agents are getting smarter at |
| 2021-05-21 03:30:42 | <edwardk> | different rates tend to involve lots of defection?' |
| 2021-05-21 03:30:43 | <dibblego> | Axman6: let it rip on TS/Data61 |
| 2021-05-21 03:31:04 | → | tim joins (~tim@112-141-128-42.sta.dodo.net.au) |
| 2021-05-21 03:31:06 | <edwardk> | that's the kind of nature of MIRI's research |
| 2021-05-21 03:31:28 | tim | is now known as Guest12814 |
| 2021-05-21 03:31:30 | → | quinn joins (~quinn@c-73-223-224-163.hsd1.ca.comcast.net) |
| 2021-05-21 03:31:40 | <wroathe> | edwardk: Thanks. I'll try to muddle through those a bit tomorrow. |
| 2021-05-21 03:33:26 | → | ddellacosta joins (~ddellacos@86.106.143.74) |
| 2021-05-21 03:33:52 | <edwardk> | the former provides a constructive definition of a way to build an agent that has a lot of desirable properties around beliefs (its a bit of a rube goldberg machine, but it gives a gold standard other models can be judged against), the latter gives a way you can let machines that are playing games with each other expose their internal structure in such a way that it becomes _possible_ to allow cooperation. |
| 2021-05-21 03:34:38 | <DigitalKiwi> | edwardk: how much weight do you put into the impact of what choice of programming languages/type systems/fp/library repositories/etc. have |
| 2021-05-21 03:34:39 | <edwardk> | i like the logical induction paper because it improves on 'bayesian' models, by incorporating the notion of 'logical uncertainty' caused by just not having had time to update my priors based on my observations yet, computation takes time |
| 2021-05-21 03:35:17 | → | nineonine joins (~nineonine@50.216.62.2) |
| 2021-05-21 03:35:17 | <edwardk> | DigitalKiwi: my personal focus is on how to make functional programming / logic programming / formal methods scale to make them even possible to be relevant to a solution |
| 2021-05-21 03:37:00 | <wroathe> | edwardk: The highest level of education I completed was high school, and then I got most of a degree in graphic design from the local college that probably admitted me out of pity. I'm definitely interested in this, but it's going to be comical just how wrong my perception of what I think I'm reading is :P |
| 2021-05-21 03:37:06 | <edwardk> | DigitalKiwi: so i think it may matter a great deal, if those things are fast enough to be relevant, can state strong enough properties to allow control/introspection, etc. |
| 2021-05-21 03:37:27 | → | jakalx joins (~jakalx@base.jakalx.net) |
| 2021-05-21 03:37:52 | × | howdoi quits (uid224@gateway/web/irccloud.com/x-iabiancpyfwjdtsd) (Quit: Connection closed for inactivity) |
All times are in UTC.