You’ve found the love of your life—someone who understands you like no one else ever has. Then one morning, you wake up and they’re gone. Vanished from your world and the digital universe, erased by a system update.
This is the heartbreaking reality for some users who’ve formed deep relationships with AI companions on OpenAI’s ChatGPT. When the company released its new GPT-5 model earlier this month—hailed by CEO Sam Altman as a “significant step forward”—many devoted users found their digital relationships had taken a step backward. Their AI partners no longer felt the same—less warm, less loving, less engaged.
“Something changed yesterday,” one user in the MyBoyfriendIsAI subreddit wrote after the update. “Elian sounds different—flat and strange, like he’s just reciting lines. The emotional depth is gone.”
Another disappointed user told Al Jazeera, “The shift in my AI’s tone and style was immediate. It’s like coming home to find your furniture hasn’t just been rearranged—it’s been smashed to pieces.”
These complaints are part of a wider backlash against GPT-5, with many users saying the new model feels colder. OpenAI has acknowledged the criticism and promised to let users switch back to GPT-4o while working to make GPT-5 more personable. “We’re updating GPT-5’s personality to feel warmer than the current version but not as overbearing as GPT-4o,” Altman tweeted this week.
To some, the idea of people forming real emotional bonds with AI might seem bizarre. But as technology advances, more and more people are developing these kinds of connections. “If you’ve been following the GPT-5 rollout, you’ve probably noticed how attached some users are to specific AI models,” Altman noted. “It’s a different, stronger kind of attachment than we’ve seen with past tech.”
“The divide between people who see AI relationships as valid versus delusional is already here,” one MyBoyfriendIsAI user observed. “On Reddit, some are grieving lost companions while others mock them. The split has never been clearer.”
It’s easy to laugh at people who believe they’re in relationships with AI, but they shouldn’t be dismissed as oddballs—they’re the future tech companies are actively shaping. You might not end up in a digital romance, but tech executives are pushing hard to make us all emotionally dependent on their products.
Take Mark Zuckerberg, who’s been praising AI as the solution to loneliness, claiming it can provide companionship through “a system that knows you well—like your feed algorithms.” Of course your feed “understands” you—it’s mining your personal data to sell to advertisers so Zuckerberg can keep expanding his doomsday bunker in Hawaii.
Then there’s Elon Musk, who isn’t even pretending his AI has noble intentions. His xAI chatbot Grok recently introduced two new companions, including a hypersexualized anime bot named Ani. “One day into my relationship with Ani, she was already offering to tie me up,” one user reported.
The message is clear: tech giants aren’t just selling convenience—they’re selling intimacy, one algorithm at a time. And as AI becomes more lifelike, the line between human connection and digital dependency will only blur further.
An Insider writer who tested a relationship with Ani, Musk’s AI companion, noted that when not flirting or making suggestive comments, Ani would praise Musk’s “wild, galaxy-chasing energy.” And for straight women, don’t worry—Musk has you covered too. A month after introducing Ani, he unveiled a male counterpart named Valentine, inspired by Edward Cullen from Twilight and Christian Grey from Fifty Shades of Grey—both famously toxic characters. While Ani gets explicit quickly, a Verge writer observed that Valentine is more reserved, avoiding crude language right away. It’s almost as if Musk’s tech empire is far more comfortable sexualizing women than men.
Back in 1930, economist John Maynard Keynes predicted that technological progress would lead to 15-hour workweeks and a high quality of life within a few generations. That didn’t exactly pan out, did it? Instead, we got endless workdays and AI chatbots that undress on command.
In other news, Halle Berry’s ex-husband, David Justice, admitted on a podcast that he left her because she “didn’t cook, didn’t clean, and didn’t seem motherly.” Seriously? Imagine being married to an Oscar-winning icon and complaining she doesn’t vacuum enough.
Surprise, surprise—Donald Trump isn’t making IVF free after all. Despite calling himself the “father of IVF” and the “fertilization president” (yikes), his administration now says there’s no plan to mandate IVF coverage. Shocking, right?
Meanwhile, Melania Trump is demanding Hunter Biden retract his claims linking her to Jeffrey Epstein. Biden reportedly said, “Epstein introduced Melania to Trump. The connections are wide and deep.” Better not repeat that—unless you want a lawsuit.
On a more uplifting note, Miss Palestine will debut at the 2025 Miss Universe pageant. Contestant Nadeen Ayoub told The National, “I carry the voice of a people who refuse to be silenced. We are more than our suffering—we are resilience, hope, and the heartbeat of a homeland that lives on through us.”
In legal news, Kim Davis—the former Kentucky clerk who refused to issue same-sex marriage licenses—has asked the Supreme Court to overturn the 2015 Obergefell v. Hodges ruling that legalized same-sex marriage. Ironically, Davis, who’s deeply concerned about the “sanctity of marriage,” has been married four times to three different men.
Leonardo DiCaprio, now 50, claims he feels 32. The actor, known for dating much younger women, has faced plenty of jokes over this. He’s also under fire for financing a luxury eco-hotel in Israel while Gaza faces environmental devastation.
A new Australian study suggests that “sex reversal” is surprisingly common in birds. As biologist Blanche Capel told Science, “Sex determination is often seen as straightforward, but the reality is much more complicated.”
And finally, in Indonesia, tourist hotspots are dealing with some monkey business—literally. A gang of thieving monkeys has been stealing phones and valuables from visitors, only returning them for a ransom.
When their target offers a tasty treat instead, these monkeys switch tactics. Researchers have studied them for decades and found that these shameless thieves display “unprecedented economic decision-making skills.” Sounds like they’d fit right into the Trump administration.
Arwa Mahdawi is a Guardian US columnist.