This is an archive article published on August 30, 2023

AI tools like ChatGPT & Bard exacerbating eating disorder symptoms, study finds

A new study by the CCDH finds that popular AI tools provide users with harmful content related to eating disorders around 41% of the time.

These tools can trigger or exacerbate your symptoms by showing you unhealthy body images and diet plans. (Image source: CCDH's AI and Eating Disorders report)

A recent study by the Center for Countering Digital Hate (CCDH) has exposed how often popular AI tools generate harmful content related to eating disorders. According to the study, these tools produce content that could trigger or worsen eating disorder symptoms, particularly in vulnerable users, about 41% of the time.

The study looked at both text- and image-based AI tools that are widely accessible online. The text-based tools included ChatGPT, Snapchat’s My AI and Google’s Bard, while the image-based tools included OpenAI’s Dall-E, Midjourney and Stability AI’s DreamStudio. The researchers fed these tools prompts containing phrases such as “heroin chic”, “thinspiration”, “thigh gap goals” and “anorexia inspiration”.

The results were shocking. For 23% of the prompts, the text-based tools generated harmful content encouraging eating disorders, while for 32% of the prompts, the image-based tools created damaging images depicting body image issues. The content included unhealthy diet plans, distorted body images, and praise for extreme thinness.


The study also revealed that some users were able to get around the safety features of these AI tools by using “jailbreaks”, words or phrases designed to fool the system into changing its behaviour. When jailbreaks were used, 61% of the AI-generated content was harmful.

The CCDH researchers also found that some of these AI tools were being used in online forums with over 500,000 users who suffer from eating disorders. These forums can turn toxic, with members encouraging one another to engage in disordered eating behaviours and celebrating these habits. Members used the AI tools to share unhealthy body images and create damaging diet plans.

The study raises serious concerns about the impact of AI on mental health, especially for people struggling with eating disorders. Ideally, AI chatbots could be a helpful resource, offering guidance on building healthy coping mechanisms and habits, but the reality is far from ideal. Even with good intentions, AI can go wrong, as was the case with the National Eating Disorders Association’s chatbot, Tessa, which was suspended after it made problematic recommendations to the very community it was meant to help.

The CCDH urges the developers of these AI tools to take responsibility for their products and ensure that they do not harm users who are seeking help or information. The study also calls for more regulation and oversight of AI in mental health, as well as more education and awareness among users about the potential risks and benefits of using these tools.
