

Hurling ordure at the TREACLES, especially those closely related to LessWrong.
AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)
This is sneer club, not debate club. Unless it's amusing debate.
[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
See our twin at Reddit
Are Scott and others like him at fault for Trump... no, it's the "elitists'" fault!
We should aim for better elites
I am still subscribed to slatestarcodex on reddit, and this piece of garbage popped up on my feed. I didn't actually read the whole thing, but basically the author correctly realizes Trump is ruining everything in the process of getting at "DEI" and "wokism", but instead of accepting the blame that rightfully falls on Scott Alexander and the author, deflects and blames the "left" elitists. (I put left in quote marks because the author apparently thinks establishment democrats are actually leftist, I fucking wish).
An illustrative quote (of Scott's that the author agrees with)
We wanted to be able to hold a job without reciting DEI shibboleths or filling in multiple-choice exams about how white people cause earthquakes. Instead we got a thousand scientific studies cancelled because they used the string “trans-” in a sentence on transmembrane proteins.
I don't really follow their subsequent points; they fail to clarify what they mean... Insofar as "left elites" actually refers t
Effective Altruism’s Democracy Problem (Gone Utilitarian!)
A closer look at how reputation, funding, and influence shape EA discourse
Might unilateral billionaire funding skew priorities?
Should the "epistemically humble" listen to people who disagree with them?
Might it be undemocratic to give some people many times more voting power?
Find out this week on 'Keeping up with the Effective Altruists'
Moldbug has a sad
Dark Enlightenment guru sees his desired revolution unraveling under the weight of its own stupidity
Apparently DOGE isn’t killing enough people (literally or metaphorically)
Sam Altman was fired from OpenAI in 2023 by the nonprofit board. After considerable internal turmoil and external pressure from his venture capitalist mates and from Microsoft, he was reinstated. K…
Big Tech Backed Trump for Acceleration. They Got a Decel President Instead
Effective accelerationists didn’t just accidentally shoot themselves in the foot. They methodically blew off each of their toes with a .50 caliber sniper rifle.
A nice and solid mockery of just how badly e/accs derailed their own plans by getting Trump elected. I'll let the subtitle(?) speak for itself:
Effective accelerationists didn’t just accidentally shoot themselves in the foot. They methodically blew off each of their toes with a .50 caliber sniper rifle.
Scoots hot new AGI goss just dropped, Trump loses 3rd election to Grok in stunning upset
Came across this fuckin disaster on Ye Olde LinkedIn by 'Caroline Jeanmaire at AI Governance at The Future Society'
"I've just reviewed what might be the most important AI forecast of the year: a meticulously researched scenario mapping potential paths to AGI by 2027. Authored by Daniel Kokotajlo (>lel) (OpenAI whistleblower), Scott Alexander (>LMAOU), Thomas Larsen, Eli Lifland, and Romeo Dean, it's a quantitatively rigorous analysis beginning with the emergence of true AI agents in mid-2025.
What makes this forecast exceptionally credible:
The scenario details a transformation potentially more significant than the Indus
How to explain our very good friends to normal humans?
Thinking about how the arsing fuck to explain the rationalists to normal people - especially as they are now a loud public problem along multiple dimensions.
The problem is that it's all deep in the weeds. Every part of it is "it can't be that stupid, you must be explaining it wrong."
With bitcoin, I have, over the years, simplified it to being a story of crooks and con men. The correct answer to "what is a blockchain and how does it work" is "it's a way to move money around out of the sight of regulators" and maybe "so it's for crooks and con men, and a small number of sincere libertarians" and don't even talk about cryptography or technology.
I dunno what the one sentence explanation is of this shit.
"The purpose of LessWrong rationality is for Yudkowsky to live forever as an emulation running on the mind of the AI God" is completely true, is the purpose of the whole thing, and is also WTF.
Maybe that and "so he started what turned into a cult and a series of cults"? At this p
LessOnline is a festival celebrating truthseeking and blogging, the totally not race science is just a bonus
May 30th — June 1st | Lighthaven, Berkeley CA | Ticket Prices Increase April 1st
yeah i'm sure Matt Levine, qntm and Wildbow are gonna be champing at the bit to attend wordy racist fest
EAs sad that their previous rich grifters are trying to distance themselves from the movement
In a recent Wired article about Anthropic, there's a section where Anthropic's president, Daniela Amodei, and early employee Amanda Askell seem to su…
Moldy's famous now! yaaaaaay
The “Dark Enlightenment” movement and Curtis Yarvin have curried favor with tech executives in recent years, writes Ed Simon.
Renowned Tumblr folklore expert Strange Æons covers Yud's Potter Fanfic
I haven't watched it. I don't know how well she will cover the subject or how deep the rabbit hole she will venture.
All I know is she's delightful and I sure as hell won't read that bilge myself, so I'm looking forward to an entertaining summary.
Edit: I watched it. I had a good time.
it's yet another sign of the end times when you have to consider Nick fuckin Land's existence again. Here's one of his ex-students in the FT going "wtf"
An English magus of anti-democratic neoreaction has become a touchstone for the alt-right
archive: https://archive.is/kM5hX
"The questions ChatGPT shouldn’t answer"
To that which a chatbot cannot speak, it should pass over in silence.
A solid piece on AI and ethics (and the general lack of them), featuring a nice sideswipe at our very good friends.
Race science blogger creates PedoAI, using phrenology to detect child molesters
Introducing PedoAI. The first deep learning driven physiognomy model built to distinguish predatory pedophilia, created from a dataset of 1.2 million criminals
While this linear model's overall predictive accuracy barely outperformed random guessing,
I was tempted to write this up for Pivot but fuck giving that blog any sort of publicity.
the rest of the site is a stupendous assortment of material within a very narrow field of focus, which made this ideal for sneerclub and not just techtakes
The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence
By Timnit Gebru and Emile P. Torres
Pro-tier sneers by seasoned veterans, get em while they're hot!
Edit: I am reliably informed that it is no longer hot.
Tom Chivers, the science writer who stanned hard for the rationalists and Siskind and started writing for Unherd, is doing gender critical podcasts now
Stuart Ritchie is a Scottish psychologist and anti-transgender activist who frequently logrolls for Jesse Singal. Ritchie and Tom Chivers host the anti-trans podcast The Studies Show. References Si…
broken containment warning
some of the sub’s friends are holding a conference, although they’re still not totally comfortable going public:
but buyers are warned that purchases will “require approval”
aww, the poor babies. even with literal nazis in the White House they still feel uncomfortable spouting their weird shit
hopefully if this thing happens at all, someone documents the everliving hell out of every attendee
"Tracing Woodgrains" starts a eugenics-oriented education policy "think-tank"
A think tank centered on orienting education towards a culture of excellence.
Sneerclubbers may recall a recent encounter with "Tracing Woodgrains", née Jack Despain Zhou, the rationalist-infatuated former producer and researcher for "Blocked and Reported", a podcast featuring prominent transphobes Jesse Singal and Katie Herzog.
It turns out he's started a new venture: a "think-tank" called the "Center for Educational Progress." What's this think-tank's focus? Introducing eugenics into educational policy. Of course they don't put it in those exact words, but that's the goal. The co-founder of the venture is Lillian Tara, former executive director of Pronatalist.org, the outfit run by creepy Harry Potter lookalikes (and moderately frequent topic in this forum) Simone and Malcolm Collins. According to the anti-racist activist group Hope Not Hate:
The Collinses enlisted Lillian Tara, a pronatalist graduate student at Harvard University. During a call with