
Opinion | AI is stepping into intimacy. And it won’t fix the loneliness crisis

Artificial intimacy becomes dangerous when it shifts from supporting human connection to replacing it. When emotional regulation and validation are handed over to bots with no moral responsibility or accountability, and with commercial motives to maximise engagement, harm can follow

5 min read | Jan 26, 2026 12:00 PM IST | First published on: Jan 26, 2026 at 11:58 AM IST

By Nikhil Narendran

Societies have almost always adopted new technologies and products before fully understanding their long-term effects. Smoking was regarded as a sophisticated lifestyle choice before its health costs became undeniable. Cocaine was openly sold in the 19th century as a medicine, stimulant, and emotional relief, promoted by figures like Sigmund Freud and popularised in fiction through characters such as Sherlock Holmes.


More recently, social media was praised as a tool for connecting people, until evidence of addiction, anxiety, and harm to young users could no longer be ignored. Australia has now banned social media for children, and countries like Indonesia are following suit. However, these measures came only after several generations had already been negatively affected.

Artificial intimacy now stands at a similar turning point.

The loneliness crisis and the loneliness economy

At the core of the rise of artificial intimacy lies a deeper problem: a loneliness crisis. Governments in the United States and the United Kingdom have officially recognised loneliness as a public health issue, one linked to long-term harm to physical and mental health.

From phone-a-friend services and late-night call-in shows in the television era to chat lines and video cam platforms, the loneliness economy has been growing for decades. Even in India, the loneliness economy startup ecosystem is reportedly valued at over $1 billion. AI has now supercharged a sector that was already booming in the post-pandemic world. When AI enters this economy, emotional availability becomes infinite, personalised, and always-on. The constraints of the creator-driven loneliness economy, such as human time, labour, and boundaries, no longer limit its supply.


The rise of artificial intimacy through AI companions

AI companions are designed to simulate care, empathy, affirmation, and emotional support. They perform role-play, respond quickly, and remain constantly available. For many, especially teenagers, the vulnerable, and the elderly, these systems are not just software tools but an escape from loneliness. Unlike human intimacy, artificial intimacy has no limits. It does not tire, lose patience, become distracted, or withdraw. It does not challenge users in uncomfortable ways. Instead, it validates, mirrors, and reinforces emotional cues, optimising for continuous engagement.

This can be genuinely beneficial. For those who are isolated, grieving, disabled, or socially anxious, AI companions can provide comfort and continuity. For older adults facing loneliness or individuals navigating mental health challenges, they can offer immediate and accessible support.

Artificial intimacy becomes dangerous when it shifts from supporting human connection to replacing it. When emotional regulation and validation are handed over to bots with no moral responsibility or accountability, and with commercial motives to maximise engagement, harm can follow.

We are already seeing warning signs. In the US, lawsuits involving Character.AI have highlighted tragic cases where vulnerable users formed intense emotional attachments to AI personas. In some instances, families claim these interactions increased emotional dependence and contributed to self-harm and suicide.

Just as nicotine alters reward pathways, artificial intimacy can alter how people experience attachment, rejection, conflict, and self-worth. It can train users to expect constant reassurance without negotiation, disappointment, or compromise. Real relationships, complicated by a mismatch of reciprocating emotions and expectations, can start to feel draining, unpredictable, and unsatisfying. This could lead to people becoming emotionally dependent on such chatbots over time, making it difficult to handle real-world intimacy.

What are the stakeholders doing?

In response to increasing concerns, some responsible AI developers have introduced safeguards. These include intervention during chats, breaking character during conversations about self-harm, limiting long-term memory, and reminding users that they are interacting with machines rather than sentient beings. Some have restricted romantic role-play or enhanced restrictions for younger users. However, these measures remain inconsistent, voluntary, and reactive.

Most artificial intimacy service providers are naturally not incentivised to adopt these guardrails unless there is a litigation risk. If profit becomes the sole motive in such a delicate sector, we will increasingly see dark patterns that manipulate user behaviour and choices, drawing people into emotional subscription traps. There are also serious national security risks: emotional manipulation can occur at a population scale, including for indoctrination or radicalisation.

While deployers and developers react to this emergent situation, society has a crucial role to play. Stakeholders, including parents, caregivers, schools, and mental health professionals, can help by identifying early warning signs, intervening, and spreading information about the harms of artificial intimacy. Digital literacy should emphasise that responsiveness is not care, and that simulation is not reciprocity.

Early drafts of India’s data protection framework acknowledged emotional manipulation as a form of harm; the provision was later removed. The government should consider regulating dark patterns, including those that create emotional dependence on artificial companionship, and holding artificial intimacy service providers responsible for implementing safeguards against harmful emotional manipulation. Providers should be encouraged to adopt age-appropriate safety design principles that make artificial intimacy supplementary rather than substitutive.

AI can genuinely help people practise interaction, cope with temporary loneliness, and access support, while preserving the primacy of human relationships. It can also serve as a supplement to therapy in a supervised environment. History is clear: when societies act early, harm can be mitigated; when they delay, the damage becomes the new norm. Artificial intimacy may be the new smoking. It is up to us to act now or face the consequences later.

The writer is Partner at Trilegal
