
Why ‘please’ and ‘thank you’ to ChatGPT costs ‘tens of millions of dollars’

Sam Altman's revelation comes amid a sharp rise in ChatGPT’s popularity, thanks in part to trends like Ghibli-style AI art.


OpenAI CEO Sam Altman recently revealed that users being polite to ChatGPT is leading to a surprising increase in operational costs for the company.

It started with a post on X, where a user wondered aloud how much money OpenAI has lost in electricity costs from people saying “please” and “thank you” to its models.

Altman responded with a touch of humour and clarity: “Tens of millions of dollars well spent,” he wrote. “You never know.”

While it may seem humorous on the surface, there’s a serious explanation behind the figures. Each time a user interacts with ChatGPT — even for a short or polite message — it triggers a full response from a powerful AI model, which requires significant computational power to generate language in real time.

According to Goldman Sachs Research, the increased power consumption of AI models is driven by the complexity of the computations involved. “The primary driver of this increased power demand is the growing computational intensity of AI workloads, particularly those related to generative AI and large language models,” notes the research.


These models require massive amounts of data processing, storage, and computation, leading to a significant increase in energy consumption. As a result, the power demand from data centers is expected to grow substantially, with AI representing around 19 per cent of data center power demand by 2028.

The light-hearted yet revealing comment quickly went viral, sparking a broader conversation about the real-world cost of interacting with AI models.

One user joked, “I feel this can be solved incredibly easily with client-side code answering ‘you’re welcome’ lol.” Another added, “The attention scores are cached, so only the attention between ‘thank you’ and all other past tokens is calculated.”
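That second comment refers to key-value caching, a standard trick in transformer inference. The toy Python sketch below is only an illustration of that general idea, not OpenAI's implementation: the keys and values for earlier tokens are kept in a cache, so a short follow-up such as “thank you” only needs attention computed between the new tokens and the stored history, rather than reprocessing the whole conversation.

```python
# Minimal sketch of key-value caching during incremental decoding.
# Dimensions, the attend() helper, and the random "embeddings" are all
# illustrative assumptions, not details of any production model.
import numpy as np

d = 8  # toy embedding size; real models use thousands of dimensions


def attend(new_q, new_k, new_v, cache):
    """Append the new token's key/value to the cache, then attend from the
    new query over the full (cached + new) sequence."""
    cache["k"] = new_k if cache["k"] is None else np.vstack([cache["k"], new_k])
    cache["v"] = new_v if cache["v"] is None else np.vstack([cache["v"], new_v])
    scores = new_q @ cache["k"].T / np.sqrt(d)   # shape (1, seq_len): only the new token's row
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ cache["v"]


cache = {"k": None, "v": None}
rng = np.random.default_rng(0)
for token in ["please", "summarise", "this", "thank", "you"]:
    q, k, v = rng.normal(size=(3, 1, d))          # stand-ins for projected token embeddings
    out = attend(q, k, v, cache)

print("cached keys:", cache["k"].shape)           # (5, 8): one cached entry per past token
```

Even with caching, though, every polite follow-up still triggers a full forward pass to generate the model's reply, which is where the electricity cost the thread is discussing comes from.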

According to a Goldman Sachs report cited in the discussion, each ChatGPT-4 query consumes about 2.9 watt-hours of electricity, nearly ten times more than a standard Google search. With over a billion queries handled daily, OpenAI’s energy use totals around 2.9 million kilowatt-hours per day.
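The daily figure follows directly from the two numbers in the report; a quick back-of-the-envelope check (using the article's cited values, not independently verified ones) looks like this:

```python
# Back-of-the-envelope check of the cited figures.
energy_per_query_wh = 2.9          # watt-hours per query, per the cited report
queries_per_day = 1_000_000_000    # "over a billion queries handled daily"

daily_wh = energy_per_query_wh * queries_per_day
daily_kwh = daily_wh / 1000        # 1 kWh = 1,000 Wh
print(f"{daily_kwh:,.0f} kWh per day")   # 2,900,000 kWh, i.e. about 2.9 million kWh/day
```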

The revelation comes amid a sharp rise in ChatGPT’s popularity, thanks in part to trends like Ghibli-style AI art. Weekly active users recently topped 150 million — the highest in 2025 so far.


OpenAI continues to push ahead in the AI race, recently unveiling two new models. The o3 model scored 69.1 per cent on SWE-bench coding tests, while the o4-mini followed closely at 68.1 per cent, according to the company.

Even as OpenAI refines its technology, it seems even a simple “please” may come with a price tag.
