Posts 40 · Comments 2,018 · Joined 2 yr. ago

  • Ah, the beauty of the fundamental theorem of concern-troll linguistics. If you can change the words in a sentence so that the new sentence is racist, the original must be as well. Example:

    Aaron: The weather is just ok.

    Baron: OMG. I can’t believe you just said that. What if I changed the noun and adjective, like this: “<ethnic group> is <negative adjective>”. Go home and think about what you did.

  • Take I've seen in the wild from an AI concern troll: power prices and emissions going up is bad, and it's the fault of anti-AI people for overestimating the energy costs. This is also why we need nuclear power, because it's the only way to cut into the profits of fossil fuel companies.

  • Gouda dog, my name is Herr Fschmidt.

    I’m a 27 year old American Reactionary Programmer (Mennonite coder for you morons). I write parsers and text editors on my TempleOS PC, and spend my days perfecting my html and playing superior Mennonite games. (Determining the closest connection I have with Mennonite strangers, reading the bible, and shunning)

    I train with my Holzaxt every day, this superior weapon can cut clean through Peruvian Walnut because it is hammered over a thousand times, and is vastly superior to any other weapon on earth. I earned my axe license two years ago, and I have been getting better every day.

    I speak Mennonite fluently, both Plautdietsch and the Pennsylvania Dutch dialect, and I write fluently as well. I know everything about Mennonite history and their anabaptism, which I follow 100%

    When I get my Mennonite visa, I am moving to Pennsylvania to attend a prestigious High School to learn more about their magnificent culture. I hope I can become a farmhand at a puppy mill or a pastor!

    I own several zipperless trousers, which I wear around town. I want to get used to wearing them before I move to Lancaster County, so I can fit in easier. I bow to my elders and seniors and speak Bernese German as often as I can, but rarely does anyone manage to respond.

    Wish me luck in Pennsylvania!

  • ok this is likely nothing of real substance, but I searched the name of the dude and discovered that A) there is a wiki of notable incels/incel influencers and B) the OOP has an entry. It's the same username as in the reddit screenshot.

    E: originally I had embedded a link to said incel wiki. Removed because the wiki is run by incels. It’s not hard to find.

  • Do we have a word for people who are kind of like… AI concern trolls? Like they say they are critical of AI, or even against AI, but only ever really put forward pro-AI propaganda, especially in response to actual criticisms of AI. Kind of like centrists or (neo)libs. But for AI.

    Bonus points if they also for some reason say we should pivot to more nuclear power, because in their words, even though AI doesn’t use as much electricity as we think, we should still start using more nuclear power to meet the energy demands. (ofc this is bullshit)

    E: Maybe it's just sealion

  • This paragraph caught my interest. It used some terms I wasn’t familiar with, so I dove in.

    Ego gratification as a de facto supergoal (if I may be permitted to describe the flaw in CFAImorphic terms)

    TL note: “CFAI” is this book-length document titled “Creating Friendly AI 1.0: The Analysis and Design of Benevolent Goal Architectures”, in case you forgot. It’s a little difficult to quickly distill what a supergoal is, despite it being defined in the appendix. It’s one of two things:

    1. A big picture type of goal that might require making “smaller” goals to achieve. In the literature this is also known as a “parent goal” (vs. a “child goal”)
    2. An “intrinsically desirable” world (end) state, which probably requires reaching other “world states” to bring about. (The other “world states” are known as “subgoals”, which are in turn “child goals”)

    Yes, these two things look pretty much the same. I’d say the second definition is different because it implies some kind of high-minded “desirability”. It’s hard to quickly figure out whether Yud ever actually uses the second definition instead of the first, because that would require me reading more of the paper.

    is a normal emotion, leaves a normal subjective trace, and is fairly easy to learn to identify throughout the mind if you can manage to deliberately "catch" yourself doing it even once.

    So Yud isn’t using “supergoal” on the scale of a world state here. Why bother with the cruft of this redundant terminology? Perhaps the rest of the paragraph will tell us.

    Anyway this first sentence is basically the whole email. “My brain was able to delete ego gratification as a supergoal”.

    Once you have the basic ability to notice the emotion,

    Ah, are we weaponising CBT? (cognitive behavioral therapy, not cock-and-ball torture)

    you confront the emotion directly whenever you notice it in action, and you go through your behavior routines to check if there are any cases where altruism is behaving as a de facto child goal of ego gratification; i.e., avoidance of altruistic behavior where it would conflict with ego gratification, or a bias towards a particular form of altruistic behavior that results in ego gratification.

    Yup we are weaponising CBT.

    All that being said, here’s what I think. We know that Yud believes that “aligning AI” is the most altruistic thing in the world. Earlier I said that “ego gratification” isn’t something on the “world state” scale, but for Yud, it is. See, his brain is big enough to change the world, so an impure motive like ego gratification is a “supergoal” in his brain. But at the same time, his certainty in AI doomsaying is rooted in belief in his own super-intelligence. I’d say the ethos of ego gratification has far transcended anything that can be considered normal.

  • TechTakes @awful.systems

    Microsoft says EU to blame for the world's worst IT outage

    bless this jank @awful.systems

    Occasionally my username appears incorrect

    Buttcoin @awful.systems

    VoughtCoin

    SneerClub @awful.systems

    The Star Fox-style roguelite whose dev refused to use AI voices to cut costs is adding an entire "anti-capitalist revenge" campaign about a cat-girl destroying AI

    Buttcoin @awful.systems

    "Going Infinite": Michael Lewis Takes On Sam Bankman-Fried - If Books Could Kill

    Buttcoin @awful.systems

    “Crypto is already a giant video game — and we are here to make it more fun and rewarding.”

    TechTakes @awful.systems

    An AI beauty pageant? Miss me with that Miss AI.

    SneerClub @awful.systems

    "The Better Angels of Our Nature" Part 2: Campus Lies, I.Q. Rise & Epstein Ties - If Books Could Kill

    TechTakes @awful.systems

    Guy who “would convince employees to take shots of pricey Don Julio tequila, work 20-hour days [and] attend 2am meetings” wants to own WeWork again

    TechTakes @awful.systems

    Google is as Google does

    Buttcoin @awful.systems

    Man vanishes after allegedly pocketing about $500,000 in cryptocurrency account error

    Buttcoin @awful.systems

    Colorado pastor accused of pocketing $1.3M in crypto scheme says 'Lord told us to'

    TechTakes @awful.systems

    “We couldn’t find enough people for our Ponzi scheme, bye”

    SneerClub @awful.systems

    Ex-OpenAI board member Tasha McCauley is deep state because she married Joseph Gordon-Levitt

    bless this jank @awful.systems

    Refreshing on mobile logs you out sometimes

    SneerClub @awful.systems

    Roko’s Basilisk gets a shoutout on CONAF, hypothetically dooming many unsuspecting listeners to… nothing, basically.

    TechTakes @awful.systems

    A (non-tech) comedy podcast I like covered FTX!