
Why the rise of DeepSeek’s new model tanked power stocks

In recent years, stocks of these companies have skyrocketed as investors expected the proliferation of AI to boost demand for enormous amounts of electricity. However, the launch of the R1 model has spooked investors

The DeepSeek app is seen in this illustration taken on Tuesday. (Photo: Reuters)

As Chinese start-up DeepSeek’s artificial intelligence (AI) assistant shot to the top of Apple’s App Store download charts, stocks of US power, utility and natural gas companies witnessed a record one-day drop on Monday.

For instance, Vistra — a Fortune 500 integrated retail electricity and power generation company — closed nearly 30% lower, losing its gains for 2025. Constellation Energy, Talen Energy and GE Vernova also dropped by more than 20%, according to a report by NBC.

In recent years, stocks of these companies have skyrocketed as investors expected the proliferation of AI to boost demand for enormous amounts of electricity. However, DeepSeek claimed that its latest R1 model, which powers its assistant, uses a fraction of the computing power required by other models, such as Google’s Gemini. This has raised questions about the projected surge in US electricity demand and tech spending.

Why AI consumes so much energy

Foundational models (machine learning models trained to perform a range of tasks) such as the R1 are trained using thousands of Graphics Processing Units (GPUs). These GPUs run in large data centres, which are specialised buildings full of computers equipped with those chips. These data centres consume enormous amounts of energy to function.

For instance, training a large language model (a model that can understand and generate human language by processing vast amounts of text data) such as OpenAI’s GPT-3 requires nearly 1,300 megawatt-hours (MWh) of electricity. To put that in context, streaming an hour of Netflix uses around 0.0008 MWh. “That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3,” according to a report by The Verge.
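The Verge’s comparison can be checked with simple arithmetic; both inputs are rough published estimates, not precise measurements:

```python
# Back-of-the-envelope check of The Verge's comparison (both figures are estimates).
gpt3_training_mwh = 1300    # reported electricity to train GPT-3, in MWh
netflix_hour_mwh = 0.0008   # rough electricity per hour of Netflix streaming, in MWh

hours_equivalent = gpt3_training_mwh / netflix_hour_mwh
print(f"{hours_equivalent:,.0f} hours")  # 1,625,000 hours
```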

Besides, answering queries with these models also consumes a lot of energy. Studies have shown that a simple AI query, like those posed to OpenAI’s chatbot ChatGPT, could use between 10 and 33 times more energy than a regular Google search. Image-based AI queries could use even more.
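To see what that multiplier means in absolute terms, a quick sketch helps. The roughly 0.3 watt-hours per Google search used below is a commonly cited estimate, assumed here for illustration; it is not a figure from the studies the article mentions:

```python
# Illustrative per-query energy range, assuming ~0.3 Wh per Google search
# (a commonly cited estimate; treat all numbers as rough).
google_search_wh = 0.3
low = 10 * google_search_wh   # lower bound: 10x a Google search
high = 33 * google_search_wh  # upper bound: 33x a Google search
print(f"{low:.1f} to {high:.1f} Wh per AI query")  # 3.0 to 9.9 Wh per AI query
```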

Notably, most AI-related data centres are built at locations where non-renewable energy sources are available. Sasha Luccioni, climate lead at the AI company Hugging Face, told Vox that such sites are chosen for their “natural gas-generated energy or coal-generated energy, where you flip a switch and the energy is there. It is harder to do that with solar or wind, because there are often weather factors and things like that. And so what we have seen is that the big data centres are built in places where the grid is relatively carbon intensive.”


How the energy demand is expected to rise

Amid the AI boom, data centres are proliferating. Estimates suggest there are currently around 9,000 to 11,000 data centres across the globe, according to a report by Yale Environment 360. These data centres account for between 1% and 1.3% of global electricity demand. By contrast, despite the large number of electric vehicles on the road, their share of global electricity consumption is just about 0.5%, according to a report by the International Energy Agency (IEA).

The consumption of energy by data centres is expected to increase substantially in the near future. The IEA projects that data centres’ electricity consumption in 2026 will be double that of 2022, at about 1,000 terawatt-hours (TWh), roughly equivalent to Japan’s current total consumption, the Yale E360 report said.

Some estimates put consumption even higher. AI companies typically do not disclose their power consumption figures, so researchers have used various methods to estimate them.

A 2023 study, ‘The growing energy footprint of artificial intelligence’, published in the journal Joule, found that by 2027 the AI sector could consume between 85 and 134 terawatt-hours (TWh) every year, around the same as the annual electricity demand of the Netherlands.
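The Netherlands comparison can be sanity-checked against the study’s range; the roughly 110 TWh figure for Dutch annual electricity consumption below is an assumption for illustration, not a number from the article:

```python
# Rough check of the Joule study's comparison (all figures approximate).
ai_low_twh, ai_high_twh = 85, 134  # projected annual AI consumption by 2027, in TWh
netherlands_twh = 110              # approx. annual Dutch electricity demand (assumed)

midpoint = (ai_low_twh + ai_high_twh) / 2
print(midpoint)  # 109.5, close to the assumed Dutch figure
```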


Why the launch of the R1 model tanked power stocks

Given the forecast for a rise in energy demands, investors have been bullish on stocks of power companies in recent years. For example, before Monday’s selloff, Constellation, Vistra and GE Vernova had surged to the top of the S&P 500.

However, investors were spooked after DeepSeek launched its R1 model. The start-up claimed the model was built using just 2,000 GPUs, while companies like OpenAI use as many as 16,000 or more GPUs to train their models.

This means that the AI sector might not need as much energy as analysts and tech companies had previously expected. Travis Miller, a strategist covering energy and utilities for financial services firm Morningstar, wrote, “R1 illustrates the threat that computing efficiency gains pose to power generators… We still believe data centres, reshoring, and the electrification theme will remain a tailwind… [But] market expectations went too far.”
