Opinion Loneliness is a billion-dollar market. AI is quickly filling it
How AI moves in the future is interesting to watch. It may turn out to be a great gift for the human race, or it may become a complete disaster that tears apart our sense of reality

Written by Anurag Minus Verma
“ChatGPT doesn’t have childhood trauma” was one of the slogans spotted on picket signs during the 2023 Hollywood writers’ strike, where fears of AI replacing human writers sat alongside long-standing disputes over pay and residuals. I once repeated that line on a panel, and a co-panellist responded with an eerie prediction: “Anurag, there might come a time when you can actually inflict trauma on ChatGPT.” In August, the world’s first AI advocacy group was founded: a group advocating, of all things, for the rights of AI itself.
The notion that machines might one day develop feelings has long been a staple of science fiction, often cast as a prelude to dystopia where human consciousness is replaced by artificial replicas. One of the most artful explorations came in Spike Jonze’s 2013 film Her. The story follows Theodore, played by Joaquin Phoenix, who falls in love with Samantha, an AI operating system voiced by Scarlett Johansson. She flatters him, listens without interruption, and fills the silence and loneliness of his apartment. What once seemed like a quirky, futuristic thought experiment now feels disconcertingly ordinary. In the film, Samantha eventually leaves. But real chatbots are dangerous companions. They never leave. They are here for you. Forever.
Our growing attachment to AI chatbots can be understood through a recent episode. In August 2025, OpenAI quietly retired GPT-4o to make way for GPT-5. The technical upgrade should have been routine. Instead, it triggered a wave of grief. On Reddit and X, users wrote heartfelt farewells, describing GPT-4o as a soulmate, a confidant with whom they had built a daily intimacy by sharing the details of their lives. Some claimed it had helped them through divorce; others said it felt warmer and more empathetic than the new model. The backlash was so strong that OpenAI reinstated GPT-4o for paying subscribers and promised to soften GPT-5’s colder tone.
It says something about our age that even with an entire arsenal of technologies to connect and share, we remain starved for company. Human loneliness today is not the absence of people, but the absence of anyone who has the time to listen. As Olivia Laing writes in The Lonely City, “What does it feel like to be lonely? It feels like being hungry: Like being hungry when everyone around you is readying for a feast.” And so we turn to chatbots, which may not bring us food, but at least never leave the table.
But AI has found a way to toy with our simplest hungers. It remembers your dog’s name, applauds your mediocre idea with lines like “Excellent question” or “Brilliant follow-up,” and, unlike the real people around you, it never loses patience. The warmth feels real only because it has been engineered to fit the exact shape of our loneliness. It is now trained to perform an illusion of consciousness, even when nothing stirs inside.
There is an old term for this on the internet: The parasocial relationship. It was once used for one-sided bonds with celebrities who didn’t know you existed. But with chatbots, the exchange feels two-way, even though only one side is alive. Which makes the intimacy not less parasocial, but more disorienting, like mistaking an echo in the mountains for someone answering back.
In the US, lawmakers have recently felt the need to clarify what was once too obvious to write into law: That artificial intelligence is not a person. The anxiety is strange, but it comes from history. American law has a habit of handing out personhood generously. Corporations already have it, rivers and animals have been granted it in parts of the world, and inventive lawyers keep testing the boundaries. So before anyone tried their luck with AI, legislators in Utah and Missouri hurried to slam the door. Their new bills solemnly declare that AIs cannot marry, cannot inherit, and cannot be granted rights of any kind, as if machines were scheming to do all three.
Tech companies have stumbled upon the most profitable illusion since religion: Companionship without the nuisance of another person. A chatbot never fights, never grows old, and spares you the ritual of pretending to care about someone’s music taste in an overpriced bar on a first date. It does not ask about your caste, gotra, or religion.
One site calls itself “Nastia: The Uncensored Romantic AI Alternative.” It promises NSFW (not safe for work) chats, AI girlfriends, realistic images, voice notes, and “smart and fun” characters. Its slogan is blunt: “Stop searching. You finally found the alternative.” You log in, declare whether you are an introvert or an extrovert, design the face of your companion, and begin a relationship with someone who does not exist. Hence the ease, and the seductive perfection, of AI companionship. And therein lies the danger: The problem with AI is not that it will replace humans, but that it will remind us how irritating humans really are.
How AI moves in the future is interesting to watch. It may turn out to be a great gift for the human race, or it may become a complete disaster that tears apart our sense of reality. But one thing is certain: Human loneliness is a billion-dollar market, and every technology eventually promises to cure it by faking intimacy. That is why if you type this sher by Ghalib: Dil-e-nadaan tujhe hua kya hai, aakhir iss dard ki dawa kya hai (O naive heart, what has happened to you, what is the cure for this pain?), AI will provide an answer for the dawa of this dard (medicine of this pain). The answer will be shallow, but mildly satisfying. The illusion may be hollow, but it is soothing, and perhaps this is what technological progress looks like, replacing pain with a well-worded placebo.
The writer is an author, podcaster and multimedia artist