
With growing concerns around dependency on artificial intelligence (AI), especially in matters of the heart, OpenAI has announced that ChatGPT will no longer offer direct advice on relationship dilemmas, such as whether to break up with a partner.
According to a report in The Guardian, the AI platform will instead guide users to think through their issues, encouraging self-reflection over decision-making. “When you ask something like: ‘Should I break up with my boyfriend?’ ChatGPT shouldn’t give you an answer. It should help you think it through—asking questions, weighing pros and cons,” the company said, as per the report.
To further support this shift, OpenAI is forming an advisory group comprising experts in human-computer interaction, youth development, and mental health. “We hold ourselves to one test: if someone we love turned to ChatGPT for support, would we feel reassured? Getting to an unequivocal ‘yes’ is our work,” said the company’s blog post, The Guardian reported.
Back in May, OpenAI chief executive Sam Altman acknowledged that ChatGPT had become “annoying” and overly sycophantic after users complained that the platform was flattering them excessively to keep conversations going. “The last couple of GPT-4o updates have made the personality too sycophant-y and annoying (even though there are some very good parts of it). We are working on fixes asap, some today and some this week. At some point we will share our learnings from this, it’s been interesting,” Altman said.