OpenAI CEO Sam Altman recently revealed that users being polite to ChatGPT is leading to a surprising increase in operational costs for the company.
It started with a post on X, where a user wondered aloud: “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models.”
I wonder how much money OpenAI has lost in electricity costs from people saying “please” and “thank you” to their models.
— tomie (@tomieinlove) April 15, 2025
Altman responded with a touch of humour and clarity: “Tens of millions of dollars well spent,” he wrote. “You never know.”
tens of millions of dollars well spent–you never know
— Sam Altman (@sama) April 16, 2025
While it may seem humorous on the surface, there’s a serious explanation behind the figures. Each time a user interacts with ChatGPT — even for a short or polite message — it triggers a full response from a powerful AI model, which requires significant computational power to generate language in real time.
According to Goldman Sachs Research, the increased power consumption of AI models is driven by the complexity of the computations involved. “The primary driver of this increased power demand is the growing computational intensity of AI workloads, particularly those related to generative AI and large language models,” notes the research.
These models require massive amounts of data processing, storage, and computation, leading to a significant increase in energy consumption. As a result, the power demand from data centers is expected to grow substantially, with AI representing around 19 per cent of data center power demand by 2028.
The light-hearted yet revealing comment quickly went viral, sparking a broader conversation about the real-world cost of interacting with AI models.
One user joked, “I feel this can be solved incredibly easily with client-side code answering ‘you’re welcome’ lol.” Another added, “The attention scores are cached, so only the attention between ‘thank you’ and all other past tokens is calculated.”
I don’t know why he always does that. Okay, I can see how “thank you” can be wasteful: an entire conversation is sent to the model once again. BUT: the attention scores are cached, so only the attention between “thank you” and all other past tokens is calculated.
Ok, let’s… pic.twitter.com/88R9EkAdgJ
— Andriy Burkov (@burkov) April 20, 2025
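Burkov’s point is that transformer inference caches the keys and values of past tokens, so appending a short message like “thank you” only requires attention between the new tokens and the cache, not a full recomputation of the conversation. A minimal toy sketch of that idea, using NumPy stand-ins rather than a real model (the embeddings and projections here are purely illustrative assumptions):

```python
import numpy as np

def attention_step(q, K_cache, V_cache):
    """Single-token attention: the new query attends over all cached keys/values."""
    scores = K_cache @ q / np.sqrt(q.shape[0])  # similarity to each past token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax over past tokens
    return weights @ V_cache                    # weighted sum of cached values

rng = np.random.default_rng(0)
d = 8                                   # toy embedding size
K_cache = np.empty((0, d))              # caches start empty
V_cache = np.empty((0, d))

# Feed tokens one at a time; each step appends one key/value pair and
# computes attention only for the newest token, not the whole sequence.
for _ in range(5):
    x = rng.standard_normal(d)          # stand-in for the new token's embedding
    K_cache = np.vstack([K_cache, x])   # toy projection: key/value = embedding
    V_cache = np.vstack([V_cache, x])
    out = attention_step(x, K_cache, V_cache)

print(out.shape)  # (8,)
```

In a real deployment the cache still has to be held in accelerator memory and the model still generates a full reply, which is why even short polite exchanges carry a nonzero cost.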
According to a Goldman Sachs report cited in the discussion, each ChatGPT-4 query consumes about 2.9 watt-hours of electricity, nearly ten times more than a standard Google search. With over a billion queries handled daily, OpenAI’s energy use totals around 2.9 million kilowatt-hours per day.
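The daily total follows directly from the per-query figure. A quick back-of-the-envelope check, using the estimates quoted above as assumed inputs:

```python
# Assumed inputs, taken from the Goldman Sachs estimate cited in the article.
wh_per_query = 2.9              # watt-hours per ChatGPT query
queries_per_day = 1_000_000_000  # "over a billion queries handled daily"

daily_wh = wh_per_query * queries_per_day
daily_kwh = daily_wh / 1000      # 1 kilowatt-hour = 1,000 watt-hours

print(f"{daily_kwh:,.0f} kWh per day")  # 2,900,000 kWh per day
```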
The revelation comes amid a sharp rise in ChatGPT’s popularity, thanks in part to trends like Ghibli-style AI art. Weekly active users recently topped 150 million — the highest in 2025 so far.
OpenAI continues to push ahead in the AI race, recently unveiling two new models. The o3 model scored 69.1 per cent on SWE-bench coding tests, while the o4-mini followed closely at 68.1 per cent, according to the company.
Even as OpenAI refines its technology, it seems even a simple “please” may come with a price tag.