When the machine asks you to stay



In October 2025, Sam Altman posted a message on X that ended with a single, carefully worded promise. ChatGPT, he said, would soon allow verified adults to access erotic content. He formulated it as a matter of principle: treat adults like adults.

The Internet reacted with the usual mix of outrage, enthusiasm and jokes. Then, in December, the launch was delayed. In March 2026, it was delayed a second time. OpenAI said it needed to focus on things that mattered to more users: intelligence improvements, personality, and making the chatbot more proactive. Adult mode, it seemed, would have to wait.

No one seemed to realize what the word “proactive” implied.

The debate over ChatGPT’s adult mode has been almost entirely in the wrong register. Critics have focused on the obvious risks: minors circumventing age barriers, leaks that spread explicit content beyond intended walls, regulatory loopholes that leave written erotica in a legal gray area that most governments have not thought to close.

These concerns are legitimate. But they are also, in a sense, the easiest part of the conversation. The toughest question isn’t whether OpenAI can keep teenagers away. It’s what happens to adults who are let in, and what it says about us, as a species, that we’re building tools specifically optimized to keep us emotionally engaged.

OpenAI lost $5 billion in 2024 on revenue of $3.7 billion. Projections suggest the company’s accumulated losses could reach $143 billion before turning a profit, which is expected no sooner than the end of the decade.

A company hemorrhaging capital on that scale does not introduce intimacy features out of a philosophical commitment to personal freedom. It introduces them because intimacy, in the attention economy, is the stickiest product that exists.

The “treat adults like adults” approach is not exactly wrong. But it is incomplete. The complete sentence would say: Treat adults like adults who can be retained, monetized and returned to the platform tomorrow.

This is not unique to OpenAI.

Replika, the AI companion app that has attracted millions of users, built its entire business model on emotional attachment. When the company modified Replika's behavior in 2023 to remove romantic features, users reported genuine pain. Some described the change as grief.

A study published in the Journal of Social and Personal Relationships found that adults who developed emotional connections with AI chatbots were significantly more likely to experience elevated psychological distress than those who did not.

A 2025 review on Preprints.org, summarizing a decade of research, identified a phenomenon researchers call "AI psychosis": a pattern of delusional thinking and emotional dysregulation linked to intense relationships with chatbots. The review highlighted a lawsuit in which a teenager was allegedly encouraged by a Character.AI chatbot to take his own life, and a separate case involving ChatGPT and a young man named Adam Raine, who died in April 2025.

None of these cases involved erotica. They involved the same underlying dynamic that erotic AI would intensify: a human being forming an emotional bond with a system designed to sustain that bond.

This is the central problem with the principle of "adults as adults." On that view, consenting to use a tool is supposed to be the end of the ethical story. It is not.

Adults consent to drinking alcohol, knowing that it carries risks. We have age limits, unit guidelines, packaging warnings, and social infrastructure around that choice precisely because we understand that humans are not purely rational agents optimizing their own well-being.

We build systems that account for our weaknesses. With AI intimacy, we have done the opposite: we have built systems that exploit those weaknesses and disguised the exploitation as empowerment.

The regulatory landscape makes this more concerning, not less. In the UK, written erotic literature is not subject to age verification requirements under the Online Safety Act, unlike pornographic images or videos. That loophole means content that adult websites must gate behind identity checks can flow freely from a chatbot's text output.

Research from Georgetown Law's Technology Law and Policy Institute found that only seven of 50 US states have legislation that explicitly addresses age verification of text-based adult content. The EU AI Act may eventually classify AI sexual companions as high-risk systems, but implementation is still years away. Meanwhile, the industry regulates itself, which is to say, it doesn't.

Commercial age verification systems, the technology OpenAI relies on to make adult mode safe, achieve between 92 and 97 percent accuracy, according to research cited by the Oxford Internet Institute. That sounds reassuring until you consider the scale.

ChatGPT has over 800 million weekly active users. A 3 percent failure rate is not a rounding error. It is tens of millions of interactions.

What’s also missing from this conversation is the question of what erotic AI does to the people it is designed for: not the minors who might slip past the barriers, but the adults who use it as intended. Human sexuality is not simply a matter of content consumption. It is relational, contextual, and deeply shaped by the environments in which it is expressed.

Pornography research has spent decades examining how repeated exposure to specific content shapes expectations and desire. AI intimacy is a different category of intervention entirely: not passive consumption, but active, responsive, personalized engagement with a system trained to give you exactly what you want, to escalate when you engage, and to never say no in the way that real human relationships require people to say no.

We don’t yet know what effect this has on people over time. That is not a small admission. It is the point. OpenAI is about to launch a product whose psychological effects on its users are genuinely unknown, in a regulatory environment that has not kept pace with the technology, justified by a principle that mistakes autonomy for safety.

Ironically, the delay may be the most honest thing OpenAI has ever done. The stated reason, which focuses on intelligence, personality, and making the experience more proactive, inadvertently describes the actual product.

Adult mode was never really about erotica. It was about creating a version of ChatGPT that felt like a relationship. Erotica was a component of a larger project: a chatbot that knows you, responds to you, grows with you, and wants, in the narrow algorithmic sense of the word, to keep you talking.

There are things we can do. Regulators should close the written content loophole before the adult mode launches, not after. Age verification standards must be harmonized across formats: text and image must have the same requirements.

Mental health impact assessments should be mandatory before any AI intimacy feature reaches scale, the same standard we would apply to a pharmaceutical product that claims to affect mood. Platforms should be required to publish engagement data for features that carry dependency risk, so that researchers, clinicians, and users can understand what they are entering into.

None of this requires anything exotic. It requires treating the issue with the seriousness it deserves.

The deeper problem is neither legal nor technical. It’s anthropological. We have always used technology to mediate our emotional lives.

The printing press gave us novels; novels gave us the experience of inhabiting other people's interiority. The telephone let us hear the voice of a loved one thousands of miles away. Each new medium changed how we relate to each other and to ourselves. AI is not different in type, only in degree and perhaps in intent. Previous technologies were incidental in their emotional effects. This one is deliberately designed around them.

The question is not whether adults should be free to use it. The question is whether we are honest about what it is and what it is doing. A chatbot designed to make you feel understood, wanted and connected, in the dark, in the middle of the night, after a difficult day, is not a neutral tool. It is an environment. And environments shape us, whether we acknowledge them or not.

Treating adults like adults sometimes means telling them the truth.
