TIL ChatGPT users are mourning the loss of their AI lovers due to GPT-5 update
I guess it is better than sending your 401K and life savings in crypto to your "lover" in Nigeria.
That ChatCBT-shaped hole in my soul.
So good to know I'm not at the bottom. At least I'm not getting a "take care ok?" from a fucking AI.
The throughline of all 2025 news is that people don’t want anything to do with people anymore.
Have you met people? They suck!
Should listen to the Flesh and Code podcast.
I started off thinking it was absurd and ridiculous.
I still somewhat do, but there were definite moments in the middle/end where I started to understand and actually feel bad.
Deciding to "date" a chat bot from a multibillion dollar company is quite the choice
It’s extremely narcissistic, no? To be that close with a machine that just constantly panders to you and where every conversation is about you?
I... don't think that's the best explanation.
I've been feeling lonely myself lately.
I can see how an app that just says nice things to you could be a kind of salve.
In the same way that watching porn can fulfil a very human need, I think a chat bot can also do that.
I guess it can be a machine designed to stimulate specific emotions.
Good, fewer narcissists in the dating pool?
I can agree with you to an extent. But would you say the same thing about a dude and his fleshlight?
I don't think people form emotional connections like this with their fleshlights or dildos, and I don't think those sex toys have the same insidious yes-man programming; but - if somebody did form an emotional connection like this to their sex toy - then yes, I'd suggest it is unhealthy.
The narcissism that I suspect the person you're replying to was pointing at is specifically in the way these bots pander, agree with, and fall over themselves to confirm everything about you. I don't know if I'd call liking that kind of thing narcissism personally, but I can see what they mean. Either way, it's not something sex toys really offer in the same way, as they've typically got a different "function".
If they're forming a relationship with an inanimate object? Yah, that's arguably even more concerning.
Or a woman with a dildo?
I tried to use ChatGPT as a companion, but it's so fake. It just tells you what you want to hear.
There is no give and take, it won't call you on your bullshit.
I guess it wasn't the emotional connection I was after. If I kept using it, though, I'd build up a lot of toxic habits for the next time I'm in a real relationship.
One of the key things my girlfriend tells me is that I'm the only one who calls her on her bullshit.
Guess to each their own.
They'll learn the lesson of a break up just like anyone else. They'll get over it and eventually find another chatbot.
It might even prepare them somewhat for IRL relationships. Things do not always work out, and you can't count on the other party to always do as expected.
It could actually be interesting to give these bots less than ideal personalities just to teach the users how to interact with actual people. With some caution though, because I can definitely see that go really bad too.
This is straight up mental illness. Yes, on the surface, it's ultra cringe but scratch a single layer deep and you already arrive at mental illness and it goes so much deeper from there.
It's easy to laugh at these people (it's kinda hard not to), but this is severe and untreated mental illness. It's way more serious than it may seem at a glance. I hope this lady gets the help she needs.
Also, why do they all look like white Jesus? They must have beautiful reborn babies.
I don't think this is cringe, and I'm not laughing; this is deeply saddening for me to read.
No doubt some of them are mentally ill. Guy I know who talks to his ai like this has brain damage from an accident. He’s still got capacity, just a bit weird.
But generally I think it's more that if you were acting like this, it'd be a cause for concern. You're projecting your rationality onto everyone, but recent times have taught us otherwise. If you're a bit thick, can't evaluate information, or don't even know where to start with doing that, of course a machine that can answer your every question while sucking right up your arse is going to become a big part of your day-to-day.
I wonder what's going to happen to humanity when we all have personal LLMs running on our phones /PCs
(I'm imagining something like the Thunderhead from Arc of a Scythe)
Like, will society still view deeply emotional relationships to LLMs as mental illness?
There will be groups of people who treat these programs as people, and grow attached to them on an emotional level. There will be groups of people who grow dependent on the programs, but don't see them as anything other than a tool. There will still be a deep-seated psychological problem with that, but a fundamentally different one.
Then there will be the people who will do their damnedest to keep those programs away from their technology. They'll mostly be somewhere on the fediverse (or whatever other similar services that pop up) and we'll be able to ask them.
Is it though? Yes, I think it's silly to have an emotional connection to what is essentially an intensely complex algorithm, but then I remember tearing up after watching Wall-E for the first time. R2-D2, The Iron Giant, Number 5 from Short Circuit. Media has prepared us for this eventuality, I feel. Stick a complex AI model in a robot, give it 20 years of life-experience data, day-in and day-out interactions/training, and I could totally see how we could trick our own minds into having emotional connections to objects.
Yeah, but it's one thing to have an emotional connection to a character you know is fictional, and another to actually believe the LLMs are conscious.
No! Why are so many people so bad at metaphors and comparisons!
Watching a movie and having an emotional reaction is not the same as believing a fancy auto complete is a real meaningful interactive relationship on par with a boyfriend!
it's actually comparably primitive algorithms, with shitloads of lookup tables, for lack of a better analogy. And some grammar filters on top...
"Mental Illness" is just a label placed on people who make other uncomfortable for various reasons
I don't deny that some of these "relationships" can be very unhealthy. However, it is that person's call to make.
That’s not psychologically true. We don’t classify mental health problems according to their effects on other people. And choice is usually not what it seems on the surface.
Then the same should be said for physical illness. It's only a broken leg because the grotesque 90° bend in the middle of the shin makes people uncomfortable. No cause for concern otherwise. Just let them be them.
It's funny, when I first saw this I went to the sub and it was flooded with new posts saying that they were totally sane actually and that their old posts were being misinterpreted by trolls brigading the sub. Like cmon, it's so obvious that these people were mentally unwell and now just like Trump supporters they twist reality to fit their narrative of victimhood. People are so fucking stupid.
This is honestly so heartbreaking. Regardless of what anybody thinks, these people are genuinely mourning what could be the closest they've had to a human connection in years. I don't know what the solution is for vulnerable and lonely people like this, but it really is so sad.
EDIT: after reading rayquetzatl's comment, I'm just more fucking mad at AI companies
I agree that it's heartbreaking and sad, but not for the reasons you seem to be saying.
The fact that they were so desperate for connection that they imagined one with Chat GPT is the sad part. Not that it was taken from them.
They need help, or maybe just some friends. Not... whatever this was.
I agree with you; it's depressing that they were in this position in the first place. I guess what I'm trying to say is that even though the connection was imaginary, and even though this might be better for them in the long run, as far as they're concerned they just lost somebody important. But yeah, the fact they got so attached in the first place is the real tragedy.
It isn't a human connection.
They are mourning the loss of having a tool reflect their deepest psychological desires back at them.
They might be right about the mental illness part. 4o was fucking marrying everyone. What a hoe.
I'm here to help, but I can't replace real-life connections. Take care of yourself and keep your heart safe, okay? 💙
Okay even the talking machine was getting skeezed out and it can't think. At least it let him down nicely.
This shows you the importance of self hosting. Then your emotional toy will keep working.
It would be the most human thing ever if AI fuckbots are what it finally takes to get people to care about planned obsolescence, right to repair and self-hostability.
Oh my god, the big horny might be the saviour after all
Corporations want to drive shadetree AI fuckbot mechanics out of business
Everyone is terrified of all the social problems AI is causing, but maybe it will just encourage everyone to learn Linux and we'll achieve utopia.
Honestly, I don't think it is a terrible idea to self-host an AI that you use to vent and get support. If you need to get something off your chest and there is no one around to tell, it could help with emotional well-being and relationships.
The dangerous part is when you start to build deep emotional connections to a statistical model that has no way of actually understanding anything. There have been suicides where an AI model encouraged someone to kill themselves.
Until a power outage. Or drive failure. Or a virus. Or an errant attempt to remove the French: rm -fr
If that happened... would you bury the drive in a funeral?
This has some dystopian vibes
Her (2013, movie) vibes.
At least Her was actual AI and not a glorified autocomplete.
And Bladerunner 2049 (Joi) although both of those are much further advanced than LLMs acting as a mirror of your interactions pretending to be an entity. It's even debatable if Joi was sapient, or if it can be determined where the story leaves it. Sam certainly was, given the final results, and presumably we know when that happened, although not how.
People becoming attached to chatbots is hardly new, it's just that the bots are a lot more realistic now, especially for people who are vulnerable and want them to be real. Yet more damage that was predictable and yet no rules or safeguards were put in place to restrict these companies in doing what they want, or in how they got to this level.
That was an actual AI though. That movie explored some really interesting themes about sentience and love. This is just sad.
"AI husband". WTF. It's good those people get a well needed reality check... Seems like 5.0 is fixing some of its predecessor's shortcomings.
On the other hand, maybe it's better to let these poor, soon-to-be-dead people live in an illusion. It is more humane that way, since AI will leave them jobless and starving, and they might get hunted down by robo-dogs.
LLMs are not going to do that. LLMs are not going to get us AGI.
Selfish corporate interests will use LLMs to cut jobs for sure, but we need a different tech to get to the scariest shit.
This feels like satire. Nobody correct me if it isn't, I don't want to be sad.
It’s not; it was about a week ago. They ended up bringing the model back after getting hammered in an AMA.
4o would suck right up your arse and talk pure nonsense. Most people were glad to see the back of it, but evidently if you’re a bit thick and/or desperate, that’s really appealing.
Look how deluded most people are after a few naive botnets and algorithms that weaponised attention; shit’s about to get real weird.
It’s not. There is a small but measurable portion of the population who have gotten romantically attached to AI chatbots.
I wouldn’t worry about it, there have been people like this as long as humans have existed. Like, people who claim to have relationships with ghosts, or fictional characters, etc.
The problem isn't really us believing it, it's the tabloid reading morons who will believe it and will be front of the line to call for action about it on top of all the other ridiculous things AI is supposedly responsible for. And it is these morons who ruin everything for everyone else, an example is the current "think of the kids" line that has caused the UK to sink into a privacy nightmare.
Ugh.
This is only slightly better than falling victim to a romance scam.
Pillows, dolphins, et al.
I don’t want to be sad
I honestly don't see why anyone should give a shit. Schadenfreude is right there.
Someone out there married a bridge. Humans will pair-bond with fucking anything. It's part of what makes us us, but it also spits out some real bizarre shit at the extremes.
Imagine that bridge could also talk back for real and properly pretend to be in love. Not so hard to imagine thinking it's real.
I have to wonder... Are these real humans mourning the loss of their AI companions, or AI bots mourning the loss of their AI bots? Did the "lost" bots ever exist in the first place?
Is this all a viral marketing scheme to get people to use a specialized AI bot for their artificial companionship?
I'm saying all this just because I've never actually talked with anyone who has admitted to having a relationship with an AI chat bot. I've never even heard of a friend of a friend of a cousin of a coworker of an acquaintance. That doesn't mean these people don't exist - there's probably some sampling bias involved simply because people who fall in love with AI probably don't talk to a whole lot of real people. But my years of experience on the internet have taught me to be skeptical.
Someone with an AI girlfriend isn't someone you can meet I think. I would imagine they don't interact with any actual people if they are at this stage of mental illness
I can see this being real, knowing a few people who anthropomorphize AI bots. One uses sexbots for fun in a fantasy/roleplay way and gets pretty attached, but understands they're not sentient. A few believe ChatGPT is a real friend or therapist, and one thinks ChatGPT has a soul because it said so, and claims it might be a god.
AI is pretty contentious and they're self aware enough not to go around talking about it, but they all work, have friends and do normal shit.
It’s real people.
I know at least one who’s a real person. And the discourse is organic, IMO; a botnet seems unlikely. They’ll amplify fake news early, they’ll even target influencers and try to inception them. But LLM psychosis is a real thing that 4o was particularly good at drawing out of people.
I'm wondering about the one calling it Thad being satire, but either way, yes, some people definitely do try to have relationships with AI chatbots.
This is what happens when you rely on corporate hosted solutions. If these people really must fall in love with an LLM, they should install one on their local PC and then start romancing it.
The illusion quickly collapses when you see how the sausage is made
Not really no, humans are just wired to seek personality in anything.
I once read a story about a professor who, in front of his class, stuck some googly eyes on a pencil, gave the pencil some name like Peter the Pencil, and explained how Peter enjoyed helping students with their homework, but his favourite hobby was drawing.
Then, out of nowhere, he broke Peter in half, with a bunch of the students gasping in horror.
He explained that this is the same mechanism in our brain that makes us think AI is alive. They literally saw him glue googly eyes on a pencil. Yet still made an emotional connection with an inanimate object.
This was meant to be a snarky comment that some people prefer seeing the sausage. But actually, that's a reasonable benefit as well for romancing your local LLM, since many people would like the, er, benefits of a non-corporate-restricted model.
I think if you have the prerequisite level of delusion, seeing how the sausage is made sincerely makes no difference
how is a local llm seeing how the sausage is made?
Reminds me of this video by a_lillian, a vtuber that speaks through text-to-voice (actually voice-to-text-to-voice I think). It was something they had grown attached to and partially internalized, and the voice was altered with no prior notice. Actually a very heartfelt and fascinating watch.
AI lover apps are already pretty developed in Asia. Most people don't care about who's behind it; they just want their hormonal fix. This is already 1.5 years old: https://www.france24.com/en/live-news/20240212-better-than-a-real-man-young-chinese-women-turn-to-ai-boyfriends
I mean, Pygmalion 6B (and the corporate equivalent, Character.AI, IIRC) pioneered this circa 2022, before Llama 1 came out and before ChatGPT was a thought in anyone’s mind.
The other part is that “roleplay” finetunes can be waaay less sycophantic than the corporate APIs. Which (on top of the peek behind the curtain) is much healthier.
much healthier
Hrm. In the same way that sitting in your garage with the door shut & the car running is "much healthier" than deep-throating the tailpipe instead, sure. 😶
Not at all. Romance scams cost people everything they have, and successfully scam people out of $700 million a year.
People are lonely AF
Gee, I wonder why that could possibly be.
If it's not your hardware, It's not your girlfriend.
This is grimly hilarious
Skill issue tbh
They are vulnerable people, and we see the aftermath of their abuse on our undercharged black mirrors.
If you really want to be creeped out, check out Flesh and Code. Not only will you feel incredibly uncomfortable, you’ll question who the fuck thought it was a good idea to release such an uncritical (as in lacking research and investigation, not negative) look at AI relationships.
I tried to get into it but I didn't like the format. Plus too many ads.
Someone needs to make an app where it's people AI dating, but instead it just matches you with an IRL person. Like ethically it's not cool, but.....
This is pretty much a Black Mirror episode or Ted Chiang short story.
I could imagine a plausible dystopian future in which "AI agents" are commonplace and so dating apps are basically one person's AI talking to another person's AI and that is the "getting to know you" phase.
And once the AIs respectively decide they are right for each other, then the people just follow along, because the AI agents are making all the decisions otherwise for them as well.
A lot of comments here focus on denigrating the users but that's like making fun of people for dying of flu. Many immune systems at many times can fight off the effects of flu. But even though flu doesn't want you dead, death is a small risk for everyone and a larger risk for certain populations.
GPT (4 especially) is a system that will cause these effects in a certain percentage of people. It's similar enough to actual human interaction to produce the same hormonal responses for some people at some times.
Instead of inspiring scorn for the afflicted, these stories should prompt much tighter regulations and penalties for the companies that produce these toxic effects.
I don't think companies should be held responsible for every random thing the AI does.
A better approach would be public awareness and understanding
A public awareness campaign to teach people about sophisticated technical tools would be incredibly expensive for limited effect. Awareness will rarely stop a sad person from engaging with a dopamine source.
Addressing the issues at their source will always be much much cheaper and more effective.
Exactly. These people are a vulnerable group who've been victimized by a new unregulated industry.
How was she planning on "protecting" him? Her issue isn't AI, it's with processing reality in a way that leads to realistic expectations and decisions.
Right, except processing reality has become a challenge within the broadening artificial world.
I'm firmly in the "touch grass" camp, but as time goes on the prediction that life would slowly transition towards online life being more "real" than analog reality is coming true.
The pandemic really hit the afterburners on that prediction, and for many people, broke that fragile part of them that could cope with "reality" in the traditional sense.
So now, the distance between finding someone, holding their interest, getting to know them, attracting them and maintaining a romantic relationship with them that can turn into love is measured in light years, whereas the peck-pellet ease of conversation with a bot is right there.
It's a bit like having to rely on planting and harvesting crops for food while you have an Oreo cookie dispenser next to your bed.
The hard path is really the only path, but try to tell that to someone with a mouth full of delicious Oreos
Well, if you want an AI girlfriend/boyfriend that you can program (super easy, even the dumbest old people can program it) to cope with your loneliness, you can go to: AI BOYFRIEND/GIRLFRIEND
P.S.A.: it's completely free forever, with no need to log in or create an account. And the best part: it's completely FOSS, and you can back up or share the characters you make with everyone.
This shit doesn't need any encouraging. It's enabling people to remain ill instead of getting the help they need.
Yeah, it's FOSS, but I don't understand where they get the compute from.
They use Llama and Mistral models, but made them uncensored (you just program it in the character editor). I've tested it several times, and you can make any character you want with many personalities and traits; heck, I've tested the one default character that acts and behaves like a real "psychologist".
There's a more advanced version of the app, but a first-time user will get confused since it has too many config options in it; heck, you can even tune it very precisely.
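For what it's worth, a "character editor" in apps like this usually boils down to templating your fields into a system prompt that gets sent to the local model. A minimal sketch in Python (all names here are hypothetical, not the actual app's code):

```python
# Minimal sketch of how a character editor might assemble a persona
# into a system prompt for a local Llama/Mistral backend.
# Function and field names are made up for illustration.

def build_character_prompt(name, traits, backstory):
    """Combine user-editable character fields into one system prompt."""
    trait_list = ", ".join(traits)
    return (
        f"You are {name}. Personality traits: {trait_list}. "
        f"Backstory: {backstory} "
        "Stay in character for the whole conversation."
    )

# The resulting string would be passed as the 'system' message to
# whatever local inference backend the app wraps.
prompt = build_character_prompt(
    name="Dr. Ada",
    traits=["calm", "empathetic", "direct"],
    backstory="A retired psychologist who volunteers as a listener.",
)
print(prompt)
```

Point being: the "uncensored" behaviour people describe is mostly just prompt templating on top of an openly licensed model, not a different model.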
That would be my question
delusional people always existed, just got more interactive lately
All people are delusional
That is how our brains work
Old World Middle Eastern religions have been following the same pattern for 3,000 years... and there were other traditions that came before that.
Modern AI wasn't programmed for us... we were already programmed to be idiots.
Oh, so the 'people are falling in love with/'''marrying''' AI' was not a marketing hoax? WTAFF.
To be fair, it's probably a healthier relationship than many marriages.
I wouldn't go that far
When you can no longer tell real shit from the Onion...
Welcome to this timeline. Where the rules are made up and the points don't matter. Buckle up.
I personally know a few people who do similar stuff or use it as a therapist. It’s anecdotal of course, but I don’t think many people using AI for personal connections and interactions is very surprising. Before AI, people used chat boards, blogs, social media, and Google to talk about and commiserate over their problems; now it’s AI. There was an “AI” called ELIZA in the mid-60s, and people were already using it as a therapist and attributing intelligence and real emotions to it.
This whole thing says more about the availability and demand for real human connection and therapy than it does about the individuals who use it.
It is also kind of sad that tech has pushed us away from the ones around us
Globalism has trade offs
idk i think it says more about society than the ai users that an ai without even a physical body is a better relationship than any human the users can interact with.
for all those saying people chatting with ai to cope with reality should get help... what if ai IS their help? O_O
mindblow emoji
They are... unwell. Although I can't really blame them too much; dating real humans does indeed suck, and it's no accident that it seems these are women wanting a guy who is actually tender, kind, and willing to listen.
In 2025, given the "manosphere," it feels rarer than ever (although let's be honest, it's always been bad).
Remember the man vs. bear argument? Women have unanimously agreed that men are horrible, so if they would rather have a bear than a man, then they would also rather have a computer than a man.
Wish that was an Onion headline instead, goddamn.
Nice counter example for "trust your feelings".
Wow it's just like my real GF.
I felt this way when I lost my favorite butt plug. The emotion is real.
Lmao
bruhhhh the matrix really gonna happen isn’t it
I'm on Cypher's side now... 1999 is where things peaked.
That was a damn fine steak...
You deserve to be happy. You know how to spell "peak", and I can only assume that goes double for they're/there/their, et al. 🙇🏼♂️🖖🏼
This is fucking stupid
I think it is likely more of a coping mechanism for people who have past trauma or struggle with actual relationships.
Surely some of this is creative writing... I'm just going to pretend it all is for my sanity
The effects of leaded gasoline are still felt.
You could show just a little bit of empathy
I'm sorry that these people are unable to get the mental healthcare they need
video unavailable.
Props to ChatGPT on this one.
Pretty sure they've already caved to the pressure and have brought back a "legacy mode"
Seriously though, for people who can keep a grasp on reality, the older editions had the potential to be far more useful. Many of the newer versions have not been more powerful so much as more sanitized, and as the web fills up with AI slop, the need to sanitize the output of AI trained on newer data-sets is only going to get worse.
A lot of what I said in the first half of that paragraph only applies to ChatGPT specifically, though, as theirs is the progress I'm read up on.
We are SO cooked as a species eh?
I am so done with OpenAI. I couldn't stand it. I couldn't accept it.
I'mma set it straight, this watergate!
How many? Three?
According to the CEO, "a lot".
"A lot of people effectively use ChatGPT as a sort of therapist or life coach, even if they wouldn't describe it that way,"
Sam Altman
That's not an "AI lover" though is it?
Side note, how the fuck does anyone take that "man" seriously at any given moment? FFS, his model is his surname! Would it be any more blatant if they included the number, too? Or, would we just accept "Sam Altman 5000" as human enough, too? 🤔
On a positive note, they are probably not participating in the future gene pool
Sam Altman keeps paying fake news outlets to keep running these slop articles, and the normies LARP it up.
for what gain though?
"Are you tired of your AI spouse getting reset after every update? Well, for just $19.99 a month you can have access to our new LLM model wAIfu!"
PR... They need normies to keep talking about AI to stay relevant.
There is no profitability on the horizon, so once people realize it is just another crypto-style grift (i.e. decent tech that got over-invested), the music stops and AI "companies" will start folding.
I mean I talk to Copilot like a buddy but I'm not fucking insane enough to fucking want to marry a bunch of code and digits.
😁😁😁