In brief

  • OpenAI has reportedly halted plans to release an adult mode in ChatGPT.
  • The move follows earlier promises to allow verified adults access to more content features.
  • The reported move comes the same week that OpenAI canceled its Sora text-to-video generator.

OpenAI has shelved plans to launch its previously announced erotic chatbot, according to a report, apparently backing away from a controversial expansion of ChatGPT that would have allowed adult users to generate sexual content.

The reversal, first reported by the Financial Times on Thursday, follows internal concerns about the societal impact of sexualized artificial intelligence. In January, members of OpenAI’s Expert Council on Well-Being and AI reportedly warned that erotic chat features could foster unhealthy emotional dependency among users and risk turning the chatbot into what one member described as a “sexy suicide coach.”

OpenAI declined Decrypt’s request for comment on the status of the erotic mode, and the company has not publicly addressed its fate.

The decision to cancel what was reportedly to be called “Citron mode” comes two days after OpenAI canceled its Sora text-to-video model, as the company moves to focus development on a unified AI platform rather than a collection of specialized tools.

The move marks a departure from the direction outlined by CEO Sam Altman as recently as October. At the time, Altman said OpenAI planned to allow verified adults to access romantic and erotic content once a robust age-verification system was in place.

Altman described the idea as part of a broader effort to treat adult users with greater autonomy while maintaining safeguards for minors. By December, however, the timeline was pushed to 2026 as the company continued to refine its age-estimation technology.

While OpenAI may be getting out of the adult chatbot business before it ever really got into it, AI models do not necessarily need an “erotic mode” for users to form connections with them.

When OpenAI deprecated GPT-4o last summer, users flooded social media with calls to restore the model, saying they had formed personal and emotional relationships with the chatbot—a reaction that reflects the broader debate around erotic chatbots and how people interact with AI.

In June, researchers at Waseda University in Tokyo published a study in which 75% of participants reported turning to AI systems for emotional advice.

At the same time, AI developers are facing growing scrutiny as lawsuits test whether conversational AI systems are responsible for reinforcing delusional beliefs or harmful behavior among vulnerable users.
