Some otherwise very well-informed comrades of mine have invited me to an AI-related group chat where they discuss all matters LLM, but one thing that really dominates the discussions is doomerism about how we're getting Skynet any second now, or how LLMs are the first step to the machine that turns our blood into paperclips.
I've studied computer science, but I'm by no means an AI expert or anything. I just have a hard time seeing how development goes from these glorified autocorrect functions to AGI or "real" AI or whatever.
When I bring up problems generated by "AI" in the here and now that worry me (environmental impacts, loss of jobs, the obvious bubble, etc.), they agree with me, but then they often pivot to AI safety issues, and they mention Yudkowsky a lot. I read some of his Harry Potter fanfic a few years ago and listened to a TrueAnon episode about some futurist doomer cult or something in which his name came up; that's basically all I know about him.
Can someone explain what this dude’s deal is?


If you want detail beyond the responses here, the book More Everything Forever is a good resource.
Also, since the book doesn't point this out, and because misery loves company, I'm gonna add that Eliezer's movement is a cult of personality that attracts a lot of impressionable young people, including ex-Christians/Evangelicals, and draws them into a libertine environment of drug-fueled sex parties that have, unsurprisingly, generated a lot of allegations of assault. One of the prime movers in that particular sphere is a sex worker whose breakout into internet stardom was a photo album wherein she pretended to get SA'd by garden gnomes.