lol, the Atlantic gave the Yudkowsky/Soares book review to Adam Becker

The Useful Idiots of AI Doomsaying

archive https://archive.is/eREjF
I am always baffled that people get taken in by Yudkowsky. The first time I ever saw him speak or read his writing, I was forcefully reminded of a classmate who would wax lyrical about their ability in all subjects; somehow this never reflected in their actual marks.
Similarly, Yudkowsky has no intellectual achievements. He doesn't publish anything under peer review (which can be way too difficult if you aren't slavishly following academic trends, but it's still possible). All he has is fanfic, the ludicrously long "sequences", and a "decision theory" that just says "this theory recommends the best decision" in a beautifully closed circle.
I think that is why he trains his followers to believe that science is corrupt and incompetent, that its institutions and customs are irrelevant, and that a gifted child can easily outdo the whole scientific community. He is on board with the eugenics, so the EAs and LessWrongers with educations and achievements can tell themselves "at least he is good for the cause." And his playmates have a fetish that needs filling.
And yet people are willing to call him an "AI theorist" or an "AI researcher". That's like calling me the theologian who finally cracked the Problem of Evil with my Hellraiser fics.
And yet people are willing to call him an “AI theorist” or an “AI researcher”.
It's technically correct to call him both - he "theorises" about a pseudoscientific idea that conflates the living, breathing human mind with spicy autocomplete, and "researches" how plagiarism-fueled lying machines can magically become omniscient, omnipotent acausal robot gods and turn all of humanity into paperclips.
That is pretty cool; less cool is that there is no option to read them without voiding my rights.
By checking this box, you consent to the processing of your personal data in the United States and other jurisdictions in connection with our provision of AO3 and its related services to you. You acknowledge that the data privacy laws of such jurisdictions may differ from those provided in your jurisdiction. For more information about how your personal data will be processed, please refer to our Privacy Policy.
Also, I don't think this is legally binding; acknowledging that they are different doesn't mean much. But IANAL.
Blake "Evil Solver" Stacey
Bruv no way lmaou ty for this
A lot of these people are really into science fiction, especially tech science fiction, and a lot of them are tech workers and tech enthusiasts. For a lot of these people, thinking you know better is the default, and if you remain in that bubble you do not get proven otherwise.
Sexists and racists have the same problem, which explains the overlap.
It was nice to see the call out that doomers and accelerationists are two sides of the same coin and both serving the interests of the LLM makers.
Side note (maybe I should make a top-level post), but even within the lesswronger/doomer community there are a lot of people ranging from unimpressed to outright critical of Eliezer's book. Apparently it has a lot of rambling, somewhat off-topic, and/or condescending parables. And key pieces of the AI-doom argument are never really explained or even elaborated on.
Yeah, consensus is a helluva drug. Good to know the world outside of the sneerosphere is catching on to critihype.