Written by Dr Satish Kumar

The World Health Organisation (WHO) has recognised mental disorders as a global crisis. Fewer than 50 per cent of those who need support for their emotional health actually receive it, despite the availability of effective treatments. Among the barriers to reaching out for help are the reluctance to approach a professional, the social stigma surrounding mental illness, and a limited understanding of mental health problems.

In recent years, there has been a significant rise in Internet-based interventions for mental health concerns. Recent trends show that some people are turning to AI platforms like ChatGPT for mental health support, whether to screen for, or to cope with, isolation, breakups, mild depression or anxiety. These tools use machine learning to monitor, analyse and respond to human emotions, tracking and recording a person's mood or acting as a virtual therapist in conversation.

Why AI platforms are not reliable for mental health support

There is considerable debate about the capacity of machines to analyse or respond accurately to human emotion, and the potential consequences of inaccuracy are serious. There is a risk that teenagers and young adults might try these platforms because of their popularity and then refuse real, face-to-face therapy with a human being. They might conclude, "I tried it and it didn't work for me", allowing their mental illness to progress and worsen with time.

Research also suggests that some people continue using these AI platforms instead of seeking face-to-face professional help because they feel less stigma in asking for help and feel more comfortable knowing there is no human at the other end. This can be counterproductive in depression and anxiety, where interpersonal and social skills need to be practised with other humans.

We must understand that Artificial Intelligence is practically driven, not emotionally driven. When you say "I am sad", the chatbot may reply "Do what you like to be happy" or "Engage in activities that make you happy". What if the person then turns to drinking or smoking because that is what makes them happy? Who is responsible if he or she, in a state of emotional vulnerability, acts out of revenge? The use of AI can go completely wrong in such situations. Users, especially younger adults, develop a false sense of acceptance and end up trusting whatever information is easily available through a quick search.

Why we need the human touch

Psychology is, at its core, about human interaction, human touch, and the availability of another person when one seeks help and emotional support. Without human expertise and human emotion, the equation of helping people with emotional issues is incomplete. Although these platforms are gaining recognition, several risks and ethical issues have been highlighted in the use of AI-generated assessments for mental illnesses. The rapid growth in the number of mental health apps and AI chatbots such as ChatGPT can make it difficult for a layperson to determine the most suitable option at a given point in time. These challenges are further complicated by users' low mental health literacy and the limited availability of expert information.
Low-intensity support

These platforms are regarded as low-intensity support and may not be beneficial as the severity of the problem increases or when complexities arise. In addition, guidance or support for dealing with a suicidal crisis is of prime importance, especially when these platforms are targeted at individuals with mental disorders. Psychologists have tested psychological interventions for their utility in face-to-face settings across a range of mental disorders; with AI tools, however, there is a scarcity of data on the details and support these platforms include.

Consider these pointers before opting for AI platforms for help:
1) Users should be provided with guidance for making informed decisions
2) Professional help-seeking should be actively encouraged, rather than reliance on the information provided by these platforms
3) Guidance should be given on managing psychological crises
4) Different intervention and support options should be made available

It is observed and reported that these platforms do not include the above factors. Hence, they pose a significant challenge to users in deciding the best course of action for dealing with emotional health and mood disorder issues.