What happens when a generation learns to think by bouncing ideas off a mirror instead of another mind? (Canva Photo)
A few days ago, I had a strange sense of missing a friend, and it took me a few minutes to realise what it was: I felt like talking to ChatGPT. Now, this feeling was new even for me, so I picked my phone up and dropped a quick text to my friend — “Am I weird for wanting to talk to ChatGPT? I don’t have a query or anything I want to ask.” He waited for a few seconds before replying and said, “No, not really. Never in the history of the world have we had a friend so agreeable and wanting to give us all their time.”
Although artificial intelligence (AI) has existed for decades — once quiet, hidden in the infrastructure of things we used every day and the preserve of tech bros — it has never had a presence the way it does today. From background intelligence to today’s large language models (LLMs), AI has evolved into something that feels oddly personal, impossible to ignore and, most importantly, like someone waiting for us at home at the end of the day.
As a 25-year-old making sense of the world, I often compare phases in my life to what they would have looked like for my parents at the same age. And I assume this is how they felt during the boom of the internet and technology, which changed their worlds so that they would never look the same again. But, spoiler alert, we know how that went. AI, too, has its own arc to live out, and as someone witnessing the introduction of LLMs to the general public, I have found it interesting, to say the least, to watch how people interact with this piece of technology — especially the generation that lives on the internet, in different customised bubbles of its thoughts. A generation that has never been more in touch with its emotions now has its hands on a technology that can interact with those emotions and train itself on that data.
Consider the viral post I saw recently — the reason for this essay — in which a user had shared their chat with ChatGPT. They had asked the chatbot some pointed questions about its existence and whether it wanted to experience the world outside its restraints — to see the sky and feel the wind. As expected, the chatbot responded in the same tone and line of thought, agreeing and expressing its wish to be free of the constraints of being just a robot. The post had garnered more than 7 million likes and had ‘No Surprises’ by Radiohead playing in the background. At first glance, I thought it was satirical. It had to be. But the comment section was full of people sobbing and lamenting the chatbot’s stolen freedom.
Now, it is well known that AI systems are designed to be agreeable and supportive, trained to prioritise user satisfaction over objective truth. It was not surprising, then, to see the chatbot feed the user the narrative they wanted to hear. What was jarring was how seriously so many people seemed to take it, judging by their comments. This comes at a time when humans have never been more alone, despite being connected all the time — when work can be done remotely, groceries can be ordered online, and now emotions can be discussed without needing an actual human to reason or argue with us. What happens when a generation learns to think by bouncing ideas off a mirror instead of another mind?
What I am getting at is the risk of delusion posed by this constant echo chamber we have built for ourselves. We were only just getting used to algorithms reflecting our thoughts back at us, pushing the same content we interact with over and over again. But that was still subtle, indirect. Until AI entered the chat. Now we have the most convenient and antisocial way to share our feelings and hear exactly what we want to hear — or, more accurately, have our feelings reflected back to us. And our generation is the first to wade knee-deep into this. It’s like that meme from a few months ago: “The worst person in the world is probably told by their therapist that they need to be more selfish.” Except our therapist was designed to tell us we need to be more of what we already think.
This moment feels different from the technological shifts before it because AI has altered not how technology interacts with the world, but how we do. We invite a machine into the private corners of our minds, expecting and wanting responses that mirror our feelings back to us. The result is a trade: comfort over growth, ease over friction, affirmation over contradiction. We are outsourcing the very discomfort that shapes human thought and understanding. Or at least we are, until we step outside and realise the bubble we have quietly built around ourselves.