
  • I’ve finally got around to replying to this but it’s been burning a hole in my subconscious

    I think that’s a naive interpretation of the interests in play here.

    Altman aptly demonstrated that a yes/no on regulations isn’t the money’s goal here: the goal is to control how things get regulated. But at the same time, Democrats are hardly “eager to regulate” simpliciter, and the TESCREALs/Silicon Valley can hardly be said to have felt the hammer come down in the past. It may be part of the rhetoric of some players (e.g. Peter Thiel) that the Republicans (both pre- and post-Trump) are their real friends insofar as the Republicans are eager to just throw out corporate regulations entirely, but that’s a different issue: it’s no longer a question of whether you can buy influence, it’s a matter of who in the government you choose to buy influence with, or better yet which government you try to put in power.

    It should be noted at this point that mentioning Thiel is hardly out of court, even if he’s not in the LessWrong stream: he shares goals and spaces with big elements of the general TESCREAL stream. He’s put money into Moldbug’s neo-reaction, which is ultimately what puts Nick Land sufficiently on the radar to find his way into Marc Andreessen’s ludicrous manifesto.

    And why should the TESCREALs fear being painted as a satanic cult in the first place? Has that been a problem for anybody but queer people and schoolteachers up to this point? It seems unlikely to me that anyone involved in OpenAI or Anthropic is going to just stop spending their absolute oceans of capital for fear that LibsOfTikTok is going to throw the spotlight on them. And why would Raichik do that in the first place? The witch hunters aren’t looking for actual witches, they’re looking for political targets, and I don’t see what’s in it for them in going after some of the wealthiest people on the West Coast, except in the most abstract “West Coast elites” fashion, which as we all know is just another way of targeting liberals and queers.

  • Not everything is about your toy train world geopolitical supremacy and there are people far too rich and powerful to give a shit about it who benefit from your believing that it is

    Believe me, I’m not even American: it’s just you, me, and everyone getting the shaft on all five other continents as well

  • The State Department? That unimpeachable organ of American governance?! Why, I don’t even know whether to trust them not to collude with shadowy corporations or not to be duped!

  • It’s really gotta be emphasised that these guys didn’t come out of internet atheism, and frankly I would really like to know where that idea came from. It’s a completely different thing which, arguably, predates internet atheism (if we read “internet atheism” as beginning in the early 2000s - but we could obviously push that date back much earlier). These guys are more or less out of Silicon Valley; Émile P. Torres has coined the term “TESCREAL” (sometimes modified to “TREACLES”) for - and I had to google this even though I know all the names independently - “Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism”.

    It’s a confluence of futurism cults which primarily emerged online (even on the early internet), but also in airport books by e.g. Ray Kurzweil in the 90s, and has gradually made its way into the wider culture, with EA and longtermism now the most successful outgrowths of its spores in the academy.

    Whereas internet atheism kind of bottoms out in 1990s polemics against religion - nominally Christianity, but ultimately fuelled by the end of the Cold War and the West’s hunger for a new enemy (hey look over there, it’s some brown people with a weird religion) - the TREACLES “cluster of ideologies” (I prefer “genealogy”, because this is ultimately about a political genealogy) has deep roots in the weirdest end of libertarian economics/philosophy and rabid anti-communism, and therefore in the Cold War (and even pre-Cold War) need for a capitalist political religion. OK, the last part is my opinion, but (a) I think it stands up, and (b) it explains the clearly deeply felt need for a techno-religion which justifies the most insane shit as long as there’s money in it.

  • It appears that many of you have been hiding full-blown, hard-out sneers from SneerClub, and I am baffled as to why

  • Come on, you’re talking about America, when did mainstream popular appeal ever limit anyone with money?

  • I have good news for you: the ChatGPT racists got there because the idea isn’t even original to either of them

  • I had a long reply which I think made some errors of interpretation as to what you’re saying. I find this “cancels” language confusing, but I don’t have the energy to do any more in-depth clarification on this thing!

  • Ooooh, I get it for Yudkowsky now - I thought you were targeting something else in his comment. On Davis I remain a bit confused, because previously you seemed to be saying that his epistemic luck was in having come up with the term - but that cannot be an example of epistemic luck, because there is nothing (relevantly) epistemic in coming up with a term

  • If you read English as though it were French, it would phonetically come out something like “pruhccupied” without it, or even more phonetically “prëccupied” (using, funnily enough, the same dots, but as in Albanian orthography, where they happen to capture the sound quite well). Does this only raise further questions? Well, yes.

  • I suppose I must be confused - your saying that the piece was interesting was just because it made you think about the phrase “Gettier attack”?

  • This “Gettier” attack seems to me to have no more interesting content than a “stopped clock”. To use an extremely similar, extremely common phrase, the New York Times would have been “right for the wrong reasons” to call Scott Alexander a racist. And this would be conceptually identical to pointing out that, I dunno, crazed conspiracy theorists suggested before he was caught that Jeffrey Epstein was part of an extensive paedophile network.

    But we see this happen all the time; in fact, it’s such a key building block of our daily experience that we have at least two clichés devoted to capturing it.

    Perhaps it would be interesting if we were to pick out authentic Gettier cases which are also accusations of some kind, but it seems likely that in any case (i.e. all cases) where an accusation is levelled with complex evidence, the character of justification fails to be the very kind which would generate a Gettier case. Gettier cases cease to function like Gettier cases when there is a swathe of evidence to be assessed, because already our sense of justification is partial and difficult to target with the precision characteristic of unexpected failure - such cases turn out to be just “stopped clocks”. The sense of counter-intuitivity here seems mostly to be generated by the convoluted grammar of your summarising assessment, but this is just an example of bare recursivity, since you’re applying the language of the post to the post itself.

  • It doesn’t do a bad job of cashing out a fairly strong corollary of utilitarianism, one generally taken to be characteristic of any utilitarian theory worth its salt: since each of us is only one person, and the utilitarian calculus calls for us to maximise happiness (or similar), then insofar as each of us bears moral weight equal to only one (presumably equal-sized) fraction of that whole, our obligations to others (insofar as the happiness of others obliges us) swamp our own personal preferences. Furthermore, insofar as suffering is very bad (one needn’t even be a negative utilitarian to grant this), the alleviation of suffering is a particularly powerful such obligation once our responsibilities to each individual sufferer are counted up.

    This is generally taken to be so characteristic of utilitarianism that objections to it frequently cite this “demandingness” as a consequence too implausible for any moral theory worth having.

    So in isolation it makes some sense as shorthand for a profound consequence of the theory, one which utilitarians themselves frequently stand up as a major advantage of their position, even as opponents of utilitarianism stand it up as being “too good” to be a practical theory of action.

    In reality it’s a poor description of utilitarian beliefs, as you say, because the theory is not the person, and utilitarians are, on average, slightly more petty and dishonest than the average person who just gives away something to Oxfam here and there.