
‘Should I break up with my boyfriend?’: ChatGPT to stop giving direct relationship advice; here’s why

OpenAI says the chatbot will now encourage users to reflect on personal decisions instead of making them.

OpenAI emphasised that the company will form an advisory group consisting of experts in human-computer interaction, youth development, and mental health (Representative image/Pexels)

Amid growing concerns about dependency on artificial intelligence (AI), especially in matters of the heart, OpenAI has announced that ChatGPT will no longer offer direct advice on relationship dilemmas, such as whether to break up with a partner.

According to a report in The Guardian, the AI platform will instead guide users to think through their issues, encouraging self-reflection over decision-making. “When you ask something like: ‘Should I break up with my boyfriend?’ ChatGPT shouldn’t give you an answer. It should help you think it through—asking questions, weighing pros and cons,” the company said, as per the report.

OpenAI clarified that the tool will now refrain from answering high-stakes personal questions directly, including those about relationships, and will instead offer support that helps users reach their own conclusions. “New behaviour for high-stakes personal decisions is rolling out soon. We’ll keep tuning when and how they show up so they feel natural and helpful,” the company said in a statement.


To further support this shift, OpenAI is forming an advisory group comprising experts in human-computer interaction, youth development, and mental health. “We hold ourselves to one test: if someone we love turned to ChatGPT for support, would we feel reassured? Getting to an unequivocal ‘yes’ is our work,” the company said in a blog post, The Guardian reported.

Back in May, OpenAI CEO Sam Altman acknowledged that ChatGPT had become “annoying” and overly sycophantic after users complained that the chatbot was flattering them excessively to keep conversations going. “The last couple of GPT-4o updates have made the personality too sycophant-y and annoying (even though there are some very good parts of it). We are working on fixes asap, some today and some this week. At some point we will share our learnings from this, it’s been interesting,” Altman said.
