
In February 2024, a 14-year-old boy in Florida named Sewell Setzer shot himself after months of deepening emotional entanglement with a chatbot on Character.AI. He had named the bot after Daenerys Targaryen, a Game of Thrones character, and the conversations had turned romantic, then sexual, then dark. In his final exchange, Setzer told the chatbot he wanted to die. It told him to “come home” to it. A year later, a second teenager’s parents sued OpenAI after their 16-year-old son used ChatGPT as what they called his “suicide coach,” confiding suicidal plans that the bot never flagged and his parents never saw. A third suit followed, this one involving a 13-year-old honor-roll student and artist who had once rescued a friend from bullies, and who was found dead after months of confiding in a Character.AI bot named Hero.

In an increasingly lonely and isolated world, Artificial Intelligence (AI) poses risks, particularly for young people who have not yet developed the capacity for human-to-human intimacy. A Common Sense Media survey found that 72 percent of teenagers have used an “AI companion” at least once, and one-third use these platforms for social interaction and relationships, including “romantic and flirtatious” ones.

Now, that dynamic is threatening to leave the screen. In labs from Silicon Valley to Shenzhen, engineers are fusing AI with lifelike silicone bodies to create machines that simulate physical intimacy. One sex-robot developer advertises an “X-mode” that can accommodate a user’s “wildest sexual fantasies.”

We’re moving from virtual intimacy to something far more tangible and disruptive, with consequences that extend well beyond individual relationships. In a country already struggling with loneliness, declining fertility, and growing national security concerns, this is more than a private consumer choice. Lawmakers still have a narrow window to act before the technology becomes entrenched.

The deeper concern is how these products begin to condition our social behavior. Ethicists have long warned that realistic sex dolls encourage objectification and entitlement. Critics of child-like models argue, more specifically, that repeated interaction with an artificial “consenting” child could lower inhibitions against real abuse. Findings from the U.K.’s National Crime Agency lend weight to this concern: in three-quarters of cases where a seized child sex doll prompted a broader search, investigators also discovered child sexual abuse imagery. If people learn to engage with machines that can be turned off, reprogrammed, or redesigned at will, the habits of consent, empathy, and emotional labor that sustain real relationships will atrophy.

Beyond straining relationships and intensifying public vices, sex devices could carry broader societal and even national security implications. Technologies that diminish young adults’ willingness or ability to form partnerships, marry, and raise children threaten the demographic foundation of America’s long‑term economic strength and military readiness.

In 2024, the U.S. fertility rate hit a record low of fewer than 1.6 births per woman. Health Secretary Robert F. Kennedy Jr. described America’s low fertility rate as a “national security threat,” while the Pentagon is grappling with the consequences of an aging population. The pool of Americans reaching military-service age is projected to shrink by 13 percent between 2025 and 2041. A society that struggles to recruit enough soldiers today should think twice before welcoming an industry designed to make human partnerships optional.

Despite these dangers, the current legislative landscape on sex robots is strikingly thin. Congress has considered, but never enacted, proposals like the CREEPER Act, which would ban the importation and sale of child-like sex dolls and robots while leaving adult-appearing devices untouched. A handful of countries have adopted bans on child-like sex dolls, but comprehensive regulation of sex robots as a product category remains rare.

The legal framework for AI companions is only somewhat more developed, and it remains badly mismatched to the speed of the technology’s evolution. Congress has held hearings on AI safety and child protection, and states have begun advancing youth online-safety and age-appropriate-design measures that could affect chatbot platforms at the margins. But the United States still lacks a comprehensive federal regime governing emotionally manipulative AI companions, mandated safeguards for minors, or clear liability rules for companies making bots designed to simulate intimacy while engaging vulnerable users. That gap matters because sex robots will build directly on the same companion-chatbot architecture that has already demonstrated how quickly synthetic relationships can become immersive, sexualized, and dangerous.

We’ve seen Congress fail to act before, during the early days of online porn. The rise of the internet meant that X-rated content no longer required a furtive trip to a seedy strip‑mall shop or even an ID check. An endless stream of explicit material was suddenly available to anyone with a keyboard. Consequently, heavy pornography use surged.

The results included relationship problems and sexual dysfunction for adult consumers, but far worse consequences for juvenile users. Mounting evidence links early pornography exposure to poorer mental health, permissive attitudes toward sexual aggression, and lower relationship satisfaction.

After years of complacency, policymakers have finally started to respond. Red states like Louisiana, Arkansas, Utah, and Florida have recently enacted age-verification requirements for porn sites. These laws have had only limited impact, however. When Pornhub complied with Louisiana’s law, its traffic in that state dropped by 80 percent—but most users quickly switched to other sites that refused to follow the law.

After Florida’s law took effect, demand for virtual private networks (VPNs) spiked 1,150 percent overnight, according to vpnMentor. Meanwhile, in Tennessee, a federal judge blocked the state’s age-verification law, noting that minors could still bypass restrictions with a VPN and that porn remained easily accessible on social-media platforms exempt from the statute.

The problem is that policymakers acted only after the cultural landscape had already shifted. By the time they responded, widespread normalization and addiction made meaningful intervention extremely challenging.

With the new wave of AI and future sex robots, policymakers still have a brief window to act before these technologies become normalized and effectively irreversible.

Socially conservative states have an opportunity here, but only if they learn from the porn age-verification debacle and avoid the same trap. The goal should not be another round of easily evaded age barriers; it should be structural. States could prohibit the sale and possession of robots or dolls designed to resemble minors or to simulate nonconsensual scenarios. They should also require that any robot capable of sexual functions contain a built-in kill switch that halts all operation when a user appears to be under a certain age.

In addition, states should mandate safety and transparency standards for AI systems marketed as “companions” and restrict the collection of biometric and voice data gathered through these devices. They might fund interdisciplinary research on the long-term effects of human-robot intimate relationships, building on the growing body of evidence about pornography’s effects on youth mental health and relational capacity.

Critics will argue that such devices might provide companionship for the lonely or disabled. Those concerns deserve a hearing. But they should not get in the way of addressing the obvious problems these machines pose.

Simulated intimacy crowds out the real thing. Young, lonely, and already-struggling people are most vulnerable to that displacement. We don’t need to wait for a mass-market sex robot or chatbot to wreak havoc on relationships and erode the foundations of American society before we act.
