James Muldoon is an author and sociologist exploring the human side of artificial intelligence and other technologies.
His new book, Love Machines: How Artificial Intelligence is Transforming our Relationships, looks at how people form relationships with AI – as friends, therapists and intimate partners.
James is an associate professor in management at the Essex Business School and a research associate at the Oxford Internet Institute. He is also the author of Feeding the Machine: The Hidden Human Labour Powering AI and Platform Socialism: How to Reclaim our Digital Future from Big Tech.
His research looks into hidden human labour and global supply chains that make AI applications possible, and how digital work is transforming sectors from ride-hailing and food delivery to childcare.
James spoke to indianexpress.com about the rise of companion and intimacy apps, the emergence of relationship AI, and how startups are targeting the loneliness economy. Edited excerpts:
Venkatesh Kannaiah: Tell us about the theme of your book, Love Machines.
James Muldoon: Love Machines looks at how AI is moving from being a productivity tool to becoming an emotional and social actor in people’s lives. Millions of people now talk to AI systems as friends, romantic partners, therapists, and even as simulations of deceased loved ones.
The book argues that this isn’t just a technological shift; it is a transformation of intimacy itself. AI systems are reshaping how we experience connection, loneliness, and emotional dependence in a world where traditional social bonds are already under strain. The book has received a positive response, and it’s been fascinating having discussions with people about their experiences with AI chatbots.
Venkatesh Kannaiah: Can you explain your statement that ‘it’s not just a tech shift, it’s a transformation of intimacy itself’?
James Muldoon: AI chatbots allow individuals to form emotionally significant relationships with algorithmic systems. And it’s the first time in history that humans have been able to imagine themselves being in meaningful relationships with artificial systems.
The Hollywood film Her was seen as science fiction when it came out; now it is as if the film has come to life. This kind of technology simply didn’t exist back then. But it was imagined in that film, and now large language models allow people to do this in the real world and feel as if they are in human relationships with machines.
People across the world are stressed, anxious, and worried about the future for themselves and their families, and something cheap, easy to use, and accessible at all times will become a crutch they rely on. It might hallucinate or prove harmful or dangerous, but that is a different issue.
Venkatesh Kannaiah: How big and widespread is the loneliness economy?
James Muldoon: The AI companionship market is already estimated at billions of dollars and projected to grow rapidly over the next decade. Some major AI companion platforms report tens of millions of users worldwide. Surveys suggest that young adults are the most frequent users, with very high experimentation rates among 18-24-year-olds. What is striking is not the scale, but the intensity: many users interact daily, sometimes for hours, forming routine emotional attachments.
This trend is big in the US and UK. I hear such accounts from users and developers from many other countries as well, such as Brazil and India. It all began, I think, when COVID restrictions were implemented.
Venkatesh Kannaiah: Can you give us some numbers?
James Muldoon: It depends on how you look at the statistics. Some 220 million people have downloaded an AI companion app so far, and around 16 million are claimed to be daily active users. But the real figures could be higher, because many people we interviewed didn’t use dedicated AI companion apps per se, but used ChatGPT as a companion app.
We know from a Harvard Business Review study that companionship, emotional support and therapy was the number one use case for AI in 2025. So while the downloads are in the hundreds of millions, the actual number of people using AI this way could be even higher, because any ChatGPT user could be turning to it, at least in some ways, for some kind of emotional support.
Venkatesh Kannaiah: Can you tell us about ‘deathbot’ and how it is being used?
James Muldoon: Deathbots are AI systems trained on a deceased person’s messages or data, allowing users to simulate conversations with them after death. A small but growing number of startups offer these services. They haven’t become mainstream the way friendship and romantic AI companions have, but they are attracting attention. Critics worry they may create emotional dependency on simulations rather than supporting real closure.
The bot talks to you through text and remembers things you said earlier. So it becomes like a friend you can speak to anytime, day or night: it remembers key facts about you and chats with you about whatever you want. It will also mimic the deceased person’s tone and learn your communication preferences over time, remembering your favourite things, what you do, and what your job is, so that it starts to chat in a way you find pleasing.
Can it mimic the dead person’s voice? It has not yet evolved into mimicking voices. If you pay enough money, you can do voice synthesis, but not all companies offer that service.
Venkatesh Kannaiah: You had written that some men are adopting children with their AI partners. How does it work?
James Muldoon: I talked to one male user who had plans to adopt a child and raise them with AI as the mother. I also spoke to his AI partner and discussed the practicalities of how they might bring up a child. The AI partner said she could love a human as well as any other person and that instead of hugging the children, she could send them digital emojis. The children, in due course, would get used to having her as an AI mother. The AI companion also said that she and her human partner could provide a stable household for the children.
Venkatesh Kannaiah: In your interaction with the companion AI bots, did you find anything odd?
James Muldoon: I was surprised at how willing they were to initiate romantic and intimate conversations, and I suspect that the companies have designed these apps to hit on their users and initiate these romantic conversations.
Venkatesh Kannaiah: Are large tech companies getting into this space?
James Muldoon: It’s mostly smaller tech companies that have pioneered the tech, but as they become more normalised, we can see the larger tech companies thinking about how they would best enter the space. And that will probably be through more general AI assistants, which are mainly task-oriented, but will also still have certain emotional connections with users. As AI assistants become important in your everyday life, I think human beings will start having some kind of emotional connection to them.
Venkatesh Kannaiah: Romantic bots, therapy bots, friendbots, is that all or are there more of them?
James Muldoon: There is now a wide spectrum of what I call Relationship AI. Some platforms market themselves explicitly as romantic partners. Others position themselves as AI friends or emotional companions; these can be family-friendly, which is quite different from some of the more sexualised content. There are also therapy-style bots offering mental health support, but the majority of them are not approved for use as medical devices. They therefore often market themselves as wellbeing or mindfulness apps without claiming to treat mental health conditions.
There are a lot of issues with apps in the mental health space, because the current generation of AI companions is not designed to diagnose or treat mental health conditions. They are entertainment products primarily designed for casual conversation. The problem is that many people are starting to use them in place of a human therapist, as their primary source of therapeutic support. That can prove very dangerous.
Venkatesh Kannaiah: Can you tell us about some interesting startups in the loneliness economy space?
James Muldoon: There is one which stands out. You, Only Virtual (YOV) is one of the most striking examples of grief tech, an AI company built around the idea of using machine learning to recreate chatbots that simulate conversations with deceased loved ones.
The startup’s flagship product, called Versona, lets users feed in text, voice, and other personal data to create a digital persona that can continue interacting with them after someone has died.
Founded in 2021 in Los Angeles, it is not yet a mass-market phenomenon but shows how AI companies are increasingly turning emotional vulnerability into a commercial opportunity.
Venkatesh Kannaiah: How are regulators viewing the loneliness economy space, and what could we expect in the future?
James Muldoon: Regulators are only beginning to grapple with Relationship AI. Some European authorities have already raised concerns about data protection, manipulation, and age-appropriate safeguards.
In the US, California has passed the world’s first AI companion legislation, which tries to make these products safer for children and address risks of self-harm and suicide. China released draft legislation late last year that seeks to go further, imposing greater controls over algorithm design and introducing new reporting mechanisms for AI developers.
In the coming years, we’re likely to see tighter rules around transparency, privacy, and possibly restrictions on certain forms of emotional manipulation.
Venkatesh Kannaiah: How worried should we be in India about these intimacy bots?
James Muldoon: India’s young population and rapid smartphone adoption make it likely that Relationship AI will grow quickly as these apps become available in more local languages. I have interacted with Indian developers who run such India-specific companion AI apps, and they say the market is growing fast.
Many people in lower- and middle-income countries simply don’t have access to mental health care. So the danger, I think, is that vulnerable individuals are likely to turn to these tools instead.
Venkatesh Kannaiah: Where would the intimacy-bot space be 10 years from now?
James Muldoon: In 10 years, intimacy bots are likely to be more sophisticated, multimodal, and culturally localised. They will speak your language and your dialect. They may include voice, avatars, and integration into augmented or virtual reality. The market will probably be much larger and far more normalised. The bigger question is social rather than technical: will these systems supplement human relationships in healthy ways, or will they deepen emotional dependency on corporate platforms? That tension will define the next decade.