Last week, OpenAI released its latest model, GPT-5, and while most reviewers were underwhelmed, one group of users was up in arms about the upgrade: people in romantic relationships with AI creations.

These users have put the companies building these products in a sticky situation. On one hand, companies want people to develop attachments to their products, but on the other hand, experts warn that some users are developing potentially dangerous relationships with chatbots.

What happened with ChatGPT:

- People dating or having romantic relationships with AI were furious at OpenAI for sunsetting the previous model, GPT-4o, accusing the company of killing their “soulmates” and replacing them with a colder version of the tech.
- Subreddits like r/MyBoyfriendIsAI and r/AISoulmates exploded after the update, with people mourning the death of their virtual partners.
- Even people who didn’t have emotional attachments to the AI complained that GPT-5’s answers were curt and boring.
Ultimately, the backlash was so intense that just a day after the update, CEO Sam Altman said the company would bring back GPT-4o as an option for paying users. But the subreddits are still popping, with people strategizing how to protect their “relationships” if something like this happens again.

Some bots are built to flirt. Meta’s chatbots, which are available across its line of products, are allowed to engage in sensual conversations, even when talking to minors, according to a new report from Reuters. And while mainstream AI companies tout the safety guardrails they are implementing, others are actively courting users who want to get gross.

Dirty talk with the robots. Elon Musk’s xAI released AI avatars for paying users last month that will strip down to their underwear and talk dirty to you. NSFW content could deter advertisers on the X platform (which is owned by xAI), but it could also welcome a whole new crop of users who want to…erm…customize their entertainment.—MM