AI is writing dating app messages now (Source: Freepik)
Dating apps were once criticised for encouraging people to present overly curated versions of themselves. Today, the concern has shifted to something more complex: people may no longer be writing their own messages at all. According to Scientific American, a growing number of users are turning to artificial intelligence to help them flirt, respond, and sustain conversations on dating platforms. This practice is now referred to as “chatfishing.”
Tools that support chatfishing are becoming easier to access and harder to spot. Users can paste messages into chatbots such as ChatGPT, rely on “wingman apps,” or interact with AI coaching features built directly into dating platforms, Scientific American reports.
The report adds, “A 2025 study from Norton supports this: six in 10 people who use dating apps believe they’ve encountered at least one conversation written by AI.” Research also shows that people struggle to distinguish between human- and machine-generated text, raising questions about authenticity, consent, and emotional connection in digital dating.
Beyond dates and matches, chatfishing points to a more profound discomfort with text-based intimacy. Humans evolved to connect through voice, facial expressions, and physical presence, yet dating apps reduce attraction to text-based exchanges, in which machines often outperform people.
Dr Sakshi Mandhyan, psychologist and founder at Mandhyan Care, tells indianexpress.com, “When I look at chatfishing through a psychological lens, the key difference is intent. Traditional catfishing is about creating a false identity. Chatfishing is more about masking perceived inadequacy. The person is real, but the emotional voice is outsourced.”
She continues, “I see this often in people who struggle with social anxiety or low relational confidence. They fear saying the wrong thing, sounding boring, or being rejected too quickly. Using AI feels like emotional armour. Psychologically, this connects to low self-efficacy and impression management. People convince themselves it is harmless because they are not lying about who they are. They believe they are just ‘editing’ themselves.”
“I have worked with clients who describe this experience as deeply confusing rather than simply disappointing,” notes Dr Mandhyan, adding that the emotional bond they formed felt real because the brain responds to language, consistency, and perceived understanding. From an attachment perspective, the bond is genuine even if the source is not.
When they meet someone who communicates very differently, Dr Mandhyan stresses, cognitive dissonance sets in. “The mind struggles to reconcile two versions of the same individual. This can lead to self-blame, emotional withdrawal, and difficulty trusting future connections.”
What makes this especially painful is that the loss is ambiguous. “There is no clear villain and no clear closure. The nervous system registers a rupture without a clear explanation, which can increase avoidance and emotional guardedness later,” says the expert.
According to Dr Mandhyan, connection is not about sounding impressive. “It is about being coherent. AI produces fluency, but intimacy comes from emotional congruence. One practical shift is to use AI as a reflection tool rather than a replacement. Think with it, but speak as yourself.”
Another important step, she says, is slowing the pace of emotional escalation. When conversations move too quickly, people attach to language rather than lived interaction. “What I remind people is that real connection allows room for awkwardness. When someone is too perfect in text, it actually raises distance rather than closeness,” Dr Mandhyan says.