With newer AI capabilities, power demands will skyrocket. Can companies overlook the surging power consumption and continue on their path to innovation?
According to the Guardian, ChatGPT consumes about two watt-hours to answer a single question, roughly the electricity an incandescent bulb uses in two minutes.
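The bulb comparison is simple arithmetic: watt-hours equal wattage times hours of runtime. A quick sketch of the math (the 60 W bulb rating is an assumption on my part, not part of the Guardian figure):

```python
# Back-of-envelope check of the bulb comparison.
bulb_watts = 60                # typical incandescent bulb rating (assumed)
minutes_on = 2                 # runtime from the comparison
watt_hours = bulb_watts * (minutes_on / 60)
print(watt_hours)              # 2.0 Wh, matching the per-query figure
```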
OpenAI’s newer and far more capable model, GPT-5, may consume up to 20 times as much energy.
The model is a significant leap over its predecessors. It can reason through complex, multi-step problems, answer PhD-level questions, and even help develop websites.
These capabilities are a remarkable upgrade over ChatGPT’s, but they come at a substantial cost.
Several research studies are underway to quantify how much energy AI data centers consume.
That growth is set to transform how the world consumes power. As data centers multiply, they could add 3-4% to overall power demand. Until recently, electricity demand had been largely stagnant, thanks to efficiency gains such as LED lighting. But with the rise of AI data centers, US electricity demand is projected to grow by 2.4%, with 0.9 percentage points of that attributable to AI data centers.
These are expert-backed numbers.
But OpenAI has published no report on power and energy usage since the launch of GPT-3. The company did cite specific figures, such as “0.34 watt-hours and 0.000085 gallons of water per query,” but these weren’t tied to any particular model, and the numbers came with little supporting documentation.
Overall, that consumption equates to the electricity used by 1.5 million US homes.
GPT-5 is far more complex in its capabilities than GPT-4, so given its longer computation time and multimodal reasoning, it should require more power.
That is the logical speculation, and it is the premise of a study by researchers at the University of Rhode Island. They found that GPT-5 can use up to 40 watt-hours of energy on a medium-length response of about 1,000 tokens.
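To see why a per-response figure like 40 watt-hours matters, it helps to scale it up. A minimal sketch, where the daily query volume is a purely hypothetical input (the article reports no such number):

```python
# Illustrative scaling of the 40 Wh-per-response estimate.
# The query volume below is a hypothetical assumption, not a reported figure.
wh_per_response = 40                        # University of Rhode Island estimate
hypothetical_queries_per_day = 100_000_000  # assumed for illustration only
kwh_per_day = wh_per_response * hypothetical_queries_per_day / 1000
print(f"{kwh_per_day:,.0f} kWh/day")        # 4,000,000 kWh/day under these assumptions
```

Even with a modest assumed volume, per-query costs compound into utility-scale daily loads, which is why benchmarking matters.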
For now, experts hope to benchmark AI resource use, because beyond AI’s unknown potential hides another question: what are its environmental costs?
To what extent can tech companies keep developing ever more capable models while driving up overall power demand?

