ChatGPT Energy Consumption: Separating Fact from Fiction
ChatGPT, the popular AI chatbot platform from OpenAI, may not be as power-hungry as previously assumed. However, its energy consumption largely depends on how it is being used and the AI models that are answering queries, according to a recent study.
The Controversial 3 Watt-Hours Estimate
A commonly cited statistic is that ChatGPT requires around 3 watt-hours of energy to answer a single question, or 10 times as much as a Google search. However, this estimate has been disputed by Epoch AI, a nonprofit AI research institute. Using OpenAI's latest default model, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours – less than many household appliances.
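To put the gap between the two estimates in perspective, here's a quick back-of-the-envelope script. The 3 Wh and 0.3 Wh figures come from the article; the daily query count and the light-bulb comparison are illustrative assumptions, not figures from Epoch's analysis.

```python
# Back-of-the-envelope comparison of the two per-query estimates.
# Only the Wh-per-query figures come from the article; the query
# volume and bulb comparison are illustrative assumptions.
OLD_ESTIMATE_WH = 3.0    # widely cited (and disputed) per-query figure
EPOCH_ESTIMATE_WH = 0.3  # Epoch AI's GPT-4o-based estimate

QUERIES_PER_DAY = 15     # hypothetical heavy-user volume

for label, wh_per_query in [("old estimate", OLD_ESTIMATE_WH),
                            ("Epoch estimate", EPOCH_ESTIMATE_WH)]:
    daily_wh = wh_per_query * QUERIES_PER_DAY
    yearly_kwh = daily_wh * 365 / 1000
    print(f"{label}: {daily_wh:.1f} Wh/day, ~{yearly_kwh:.1f} kWh/year")

# For scale: a 10 W LED bulb running for one hour uses 10 Wh,
# roughly 33 queries at the Epoch estimate.
```

At the Epoch figure, even heavy daily use works out to under 2 kWh per year, which is why You compares it favorably to appliances, heating, and driving.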
A Closer Look at the Energy Usage
"The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car," Joshua You, a data analyst at Epoch, told TechCrunch. You also pointed out that the author of the report that arrived at the 3 watt-hours estimate assumed OpenAI used older, less-efficient chips to run its models.
A Rebuttal to a Common Misconception
"I’ve seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn’t really accurately describe the energy that was going to AI today," You said. "Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high."
Limitations of the Analysis
To be clear, Epoch's 0.3 watt-hours figure is itself an approximation; OpenAI hasn't published the details needed to make a precise calculation. The analysis also doesn't account for the additional energy consumed by ChatGPT features like image generation or input processing.
Expectations for the Future
You said, however, that he does expect baseline ChatGPT power consumption to rise. "The AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely – handling much more tasks, and more complex tasks, than how people use ChatGPT today," You said.
The Growing Energy Demands of AI
While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive enormous, power-hungry infrastructure expansion. In the next two years, AI data centers may need nearly all of California’s 2022 power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
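To get a feel for what an 8 GW training run implies in energy terms, the sketch below converts sustained power draw into total consumption. Only the gigawatt figures come from the Rand report; the 90-day run length is an assumption made purely for illustration.

```python
# Converting the Rand report's power figures into energy terms.
# Only the GW numbers come from the report; the run length is an
# assumed value for illustration.
DATA_CENTER_DEMAND_GW = 68  # ~ all of California's 2022 power capacity
FRONTIER_TRAINING_GW = 8    # ~ the output of eight nuclear reactors

TRAINING_RUN_DAYS = 90      # hypothetical run length, not from the report

run_energy_gwh = FRONTIER_TRAINING_GW * TRAINING_RUN_DAYS * 24
print(f"Projected AI data center demand: {DATA_CENTER_DEMAND_GW} GW")
print(f"A {TRAINING_RUN_DAYS}-day run at {FRONTIER_TRAINING_GW} GW "
      f"would consume ~{run_energy_gwh / 1000:.1f} TWh")  # ~17.3 TWh
```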
The Challenge of Reasoning Models
OpenAI’s attention – along with the rest of the AI industry’s – is also shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that sucks up more computing – and thus power.
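One way to see why "thinking" is costly: if energy scales roughly with the number of tokens a model generates, hidden reasoning tokens multiply the per-query cost. The sketch below makes that linear assumption explicit; the per-token rate is back-derived from Epoch's 0.3 Wh estimate and an assumed reply length, so the token counts and the linearity are illustrative, not measured.

```python
# A simplified linear model: per-query energy grows with total tokens
# generated, including hidden "reasoning" tokens. The per-token rate is
# back-derived from Epoch's 0.3 Wh figure and an assumed 500-token reply;
# neither the token counts nor the linearity come from the article.
EPOCH_QUERY_WH = 0.3
ASSUMED_REPLY_TOKENS = 500
WH_PER_TOKEN = EPOCH_QUERY_WH / ASSUMED_REPLY_TOKENS  # 0.0006 Wh/token

def query_energy_wh(visible_tokens: int, reasoning_tokens: int = 0) -> float:
    """Estimate energy for one query, counting hidden thinking tokens."""
    return (visible_tokens + reasoning_tokens) * WH_PER_TOKEN

print(query_energy_wh(500))        # instant-style reply: 0.3 Wh
print(query_energy_wh(500, 5000))  # long hidden reasoning chain: 3.3 Wh
```

Under this toy model, a reasoning model that generates ten times as many total tokens lands right back at the disputed 3 Wh mark, which is why efficiency gains and "thinking" costs pull in opposite directions.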
Efficiency Gains and Energy Consumption
OpenAI has begun to release more power-efficient reasoning models like o3-mini. However, it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models’ "thinking" process and growing AI usage around the world.
Mitigating Your AI Energy Footprint
Those concerned about their AI energy footprint can use ChatGPT sparingly, You suggested, or opt for models that minimize the computing required. "You could try using smaller AI models like [OpenAI's] GPT-4o-mini," You said, "and sparingly use them in a way that requires processing or generating a ton of data."
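In code, that advice amounts to choosing a smaller model and limiting how much it generates. Below is a minimal sketch using the OpenAI Python SDK; the model name comes from the article, while the prompt and token cap are illustrative, and any energy savings are an inference from the piece rather than something the SDK reports.

```python
# Minimal sketch of You's advice: use a smaller model and cap output
# length so each call does less compute. Requires OPENAI_API_KEY set
# in the environment; the prompt and token cap are illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the smaller model You mentions
    max_tokens=150,       # cap the reply: fewer tokens, less compute
    messages=[{"role": "user",
               "content": "In two sentences, what is a watt-hour?"}],
)
print(response.choices[0].message.content)
```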