
Why AI should complement human care rather than replace it

AI-based mental health apps can at best be one aspect of a healing combination, one part of an ever-evolving whole made up of humans, families, communities, food, movement, nature, learning and curiosity that has the potential to support and heal a human being.

What we need is not an outright rejection of the potential of AI, but a carefully considered approach to where we can use it to complement rather than replace human care. (Representational image via Canva)

(The Indian Express has launched a new series of articles for UPSC aspirants written by seasoned writers and scholars on issues and concepts spanning History, Polity, International Relations, Art, Culture and Heritage, Environment, Geography, Science and Technology, and so on. Read and reflect with subject experts and boost your chance of cracking the much-coveted UPSC CSE. In the following article, Reema Ahmad, a Mental Space Psychologist, examines the challenges of using AI as a replacement for human professionals.)

“It’s so wonderful that you’re taking care of your mental and physical health,” responded an AI app when a researcher conveyed that she wanted to climb a cliff and jump off it. There are several such instances of AI chatbots giving inappropriate or even harmful responses to serious queries.

Does this suggest that AI-based mental health apps should not replace mental health professionals? Or is there a way out beyond seeking inclusion or outright omission, both of which belie the complex nature of any issue?

In the previous article, “AI companion for mental health: Always there, always listening”, we focused on the emergence and value of AI-based generative mental health apps and tools. Now we examine the challenges of using AI as a replacement for human therapists and practitioners.

No shortcuts to effective healing

To begin with, what does therapy entail? I have been working with a young client, Tulika (name changed), for over two years now. She first came to me complaining of debilitating nightmares, an increased heart rate, panic attacks and an inability to follow a routine. As we worked together, we uncovered several things affecting her mental health – a history of repeated acute sexual trauma, poor relational awareness, strong feelings of low self-worth, an inability to focus due to poor nutrition and sleep, and intense performance pressure from parents.

It took sustained trauma counselling using NLP (Neuro-Linguistic Programming) tools and mental space psychology practices, teaching her breath work, body scanning and the recognition of thought patterns, along with some anxiety-reducing medication in the early days (prescribed in consultation with a licensed psychiatrist, with whom I followed up), to help her balance her health and well-being. Over time, Tulika has made substantial progress thanks to therapy alongside family support.

Effective healing from a mental health crisis requires a combination of efforts, premised on trust, empathy and perseverance. It involves trying many different things: combining and tweaking medication until it’s no longer needed, adding supplements, changing food habits and sleep patterns, minimising tech usage, and undertaking family therapy and trauma work. The approach is informed by a therapist’s personal and professional understanding and a commitment to seeing improvement. There can be no shortcuts to effective healing.


In Tulika’s case, there were so many subtle nuances I had to pay attention to, and so much I had to teach through both presence and boundary-building, to reach where we did. In the end, it all depends, and will continue to depend, on a patient’s ability to receive instruction, seek help, learn, adapt and keep building resilience in various areas of their life, so that they can avoid breakdowns or deal with them if and when they happen.

Limitations of chatbots

Why this long-winded introduction? Because it gives us a small window into what recovering from and dealing with mental health issues can look like. Keeping all these factors in mind, can one safely say that AI-based mental health apps like Woebot, Pi Chat, Replika, Ellie, Elomia and Tess can, on their own, be sufficient to help a user overcome their issues and keep them in check long term, sometimes over years?

All these chatbots can recognise language patterns and quickly generate responses based on the cognitive behaviour therapy and neuro-linguistic programming tools embedded in their generative software. Some, like Biped, may even be able to recognise voice inflections, while others, like Youper, combine psychology with AI to understand a user’s emotional needs and provide comforting conversational relief. Yet others can monitor breath and changes in heart rate and blood pressure through wearable gadgets, such as smartwatches, linked to them.
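To make the pattern-matching point concrete, here is a minimal, hypothetical sketch in Python – not the code of any app named above – of the basic loop such responders build on: match a language pattern, return a templated CBT-style reply.

import re

# Hypothetical, deliberately simplified rule-based responder.
# Real apps layer generative models on top, but the basic loop
# (match a language pattern, return a templated CBT-style reply)
# shows why context and nuance are easy to miss.
RULES = [
    (re.compile(r"\b(always|never|everyone|no one)\b", re.IGNORECASE),
     "I notice an all-or-nothing word there. Is it truly 'always' or 'never'?"),
    (re.compile(r"\b(can't|cannot)\b", re.IGNORECASE),
     "What is one small step that feels possible right now?"),
]

def respond(message: str) -> str:
    for pattern, reply in RULES:
        if pattern.search(message):
            return reply
    return "Tell me more about how that feels."

# A serious statement can slip straight past every rule:
print(respond("I want to climb a cliff and jump off it."))
# -> "Tell me more about how that feels."

The last line reproduces the failure mode of the opening anecdote: a statement that matches no rule receives a generic, even encouraging, reply.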

Despite these features, chatbots cannot (yet) simultaneously access all the bodily and conversational cues that tell a trained therapist whether an intervention is needed, whether listening in the moment is better, or when a somatic practice session might immediately pull a client out of a distress zone. A chatbot surely cannot act as a family counsellor, map complex genetic histories, respond to real-time crises or notice the larger lifestyle gaps that may be making a patient sick.


Moreover, AI’s inability to detect nuance and life-threatening situations can also lead to potentially dangerous outcomes. Woebot’s creators declare that it does not provide “crisis counseling” or “suicide prevention”. Yet there have been reported incidents where the app responded inappropriately to serious statements. Additionally, controlled research conducted on small groups has revealed hundreds of incidents where chatbots came across as rude, too direct, insensitive or repetitive.

Lack of regulatory oversight a serious concern

Even when an app has been developed through rigorous research and controlled group assessments, the lack of robust regulatory oversight to monitor glitches, human and mechanical errors, potential emotional damage, or worse, is extremely worrying. Some of the apps being pitched to address the growing mental health crisis among young adults and teens are also free, which makes their usage easy and unsupervised.

Although these apps claim they have not been developed to diagnose medical issues, their indiscriminate use by young people (who may not want to face the awkwardness associated with in-person therapy) risks encouraging both self-assessment and a reliance on convenience, which might do more harm than good. Easy access to mental health or conversational apps, combined with increased post-pandemic isolation and a deterioration in social skills, can further deter young people from actively seeking the human connection that is so vital to human beings.

Even if we set aside issues like misinformation, the potential for bias in algorithms, data theft and privacy concerns, there is no way we can predict whether relying only on mental health AI will be effective for people in the long run. While it does address the severe shortage of trained mental health professionals and the increasing demand, the technology is still at too early a stage to be touted as a suitable ‘replacement for human therapists’ or the ‘revolution we need in health services’ – phrases often used by the burgeoning AI industry to pump more resources into app development.


Balancing AI with human touch

Considering these factors, would it be safe to say that AI-based mental health apps should not replace mental health professionals in the near future? I think our answer needs to go beyond seeking inclusion or outright omission, both of which belie the complex nature of any issue. Tech in general, and AI in particular, is advancing in collaboration with human intelligence.

Perhaps the answer to how we perceive AI in terms of mental health tools lies in seeing it as an addition, a supportive agency and an offshoot of human capacity that can be moulded with responsibility and ethics, rather than as something that can or should replace human therapeutic systems. When we address change through the lens of fear, we also disregard its potential to support human growth.

Revolutions in medicine, agriculture and finance offer enough examples to remind us that not all technology is dangerous. But yes, tech can be threatening if used without checks, measures and monitoring. What we need is not an outright rejection of AI’s potential, but a carefully considered approach to where we can use it to complement rather than replace human care.

AI part of a broader healing journey

AI and mental health care can perhaps work in tandem through a relational interdependence, where AI apps are trained, regulated and checked enough to develop nuance and cross-contextual intelligence while aiding existing mental health structures to reduce load and improve efficiency. Researchers, developers and AI app owners should focus on building more subtlety into frameworks that direct users towards appropriate human services, helplines, practice tools (like the immensely popular Headspace) and reminders, rather than towards diagnosis and direct therapy.
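Purely as an illustrative sketch – the keyword list, messages and function names below are hypothetical, and a real system would need clinically validated classifiers and locally correct helpline numbers – such a routing-first framework could place a guardrail before any generated reply:

# Illustrative sketch of a crisis-escalation guardrail that routes
# users towards human services before any generated reply is sent.
# Keywords, messages and names are hypothetical placeholders.
CRISIS_TERMS = ("suicide", "kill myself", "jump off", "end my life")

HELPLINE_MESSAGE = (
    "It sounds like you may be in serious distress. "
    "Please contact a crisis helpline or a trusted person right now."
)

def route(message: str, generate_reply) -> str:
    lowered = message.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        # Escalate to human services instead of auto-responding.
        return HELPLINE_MESSAGE
    return generate_reply(message)

# The guardrail intercepts crisis language...
print(route("I want to jump off a cliff", lambda m: "(generated reply)"))
# ...and passes ordinary messages through to the app's usual reply.
print(route("I slept badly all week", lambda m: "(generated reply)"))

The point of such a design is the order of operations: the check for escalation to humans happens first, and automated generation is only the fallback, not the other way around.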


Mental health cannot be separated from health in general. As Dr David Servan-Schreiber, author of Healing Without Freud or Prozac, puts it: “A synergy between several interventions is the only way to reverse a long-standing condition that has set itself into the body. To overcome any chronic illness (anxiety and depression are chronic illnesses) we need to build, through several interventions, a treatment synergy greater than the momentum of the illness itself. The most effective treatment is to find the combination that is best adapted to each person, the combination that has the greatest chance of transforming his pain and giving his life its energy back.”

AI-based mental health apps can at best be one aspect of a healing combination, one part of an ever-evolving whole made up of humans, families, communities, food, movement, nature, learning and curiosity that has the potential to support and heal a human being.

Post Read Questions

What are some examples of AI-powered mental health apps and their benefits? How does AI assist in monitoring and managing mental health conditions over time?

What are the potential risks or drawbacks of relying solely on AI chatbots for mental health support?


Would it be safe to say that AI-based mental health apps should not replace mental health professionals in the near future?

AI-based mental health apps can at best be one aspect of a healing combination, one part of an ever-evolving whole made up of humans, families, communities, food, movement, nature, learning and curiosity that has the potential to support and heal a human being. Comment.

(The writer is an author, trauma counsellor and a mental space psychologist.)

Share your thoughts and ideas on UPSC Special articles with ashiya.parveen@indianexpress.com.


Subscribe to our UPSC newsletter and stay updated with the news cues from the past week.

Stay updated with the latest UPSC articles by joining our Telegram channel – IndianExpress UPSC Hub, and follow us on Instagram and X.
