ChatGPT at 3 Wh. In October 2023, a study by Alex de Vries estimated that a ChatGPT query had an energy cost of 3 Wh. His estimate drew on comments from Google, whose executives indicated that ChatGPT's consumption was "probably" 10 times that of a search. Google itself had revealed that in 2009 each search consumed 0.3 Wh, hence the final figure. De Vries, incidentally, had previously published another study warning about bitcoin mining's worrying energy consumption. De Vries, it should be noted, took as a reference that the average query involved some 4,000 input tokens and 2,000 output tokens, which would correspond to quite long questions and answers, when it is normal for them to be shorter.
Efficiency gains everywhere. Since then many things have changed, both for Google, whose results and infrastructure are very different from those of 15 years ago, and for ChatGPT, which has also become more efficient. Google searches are very likely more efficient today, and so, surely, are those of ChatGPT and other chatbots.
A new estimate. Epoch AI is a non-profit organization responsible, among other things, for creating the FrontierMath benchmark. This test assesses the mathematical reasoning capability of AI models and has become one of the most interesting metrics because of its difficulty. Its researchers have now published a study in which they estimate precisely the energy consumption of a ChatGPT query.
ChatGPT consumes ten times less than previously thought. According to their conclusions, ChatGPT queries served by the GPT-4o LLM consume about 0.3 watt-hours, ten times less than previously assumed. That 0.3 Wh figure "is in fact relatively pessimistic, and it is possible that many or most requests are even cheaper."
How they did the calculation. Epoch based its estimate on known data. They note that, according to OpenAI, a token equals approximately 0.75 words, and that generating a token costs roughly 2 FLOPs per model parameter. Taking into account the compute capacity of NVIDIA's H100 GPUs (989 TFLOPS in TF32, 67 TFLOPS in FP32 operations) and their consumption (1,500 W, although they assume the GPUs actually draw about 70% of that peak power), the result is the aforementioned figure.
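The arithmetic above can be sketched as a back-of-envelope script. The parameter count, output-token count, and GPU utilization below are illustrative assumptions, not figures from the study (the study does not disclose GPT-4o's size); only the per-token FLOP rule and the H100 numbers come from the text.

```python
# Back-of-envelope sketch of this style of estimate.
# ASSUMED (not from the study): PARAMS, OUTPUT_TOKENS, UTILIZATION.

H100_PEAK_FLOPS = 989e12      # TF32 tensor peak cited in the text (FLOP/s)
H100_POWER_W = 1500 * 0.70    # text: 1,500 W, drawing ~70% of peak power
UTILIZATION = 0.10            # assumed fraction of peak FLOPs actually achieved
PARAMS = 100e9                # assumed number of active model parameters
OUTPUT_TOKENS = 500           # assumed tokens generated per query

flops_per_query = 2 * PARAMS * OUTPUT_TOKENS          # ~2 FLOPs per param per token
seconds_per_query = flops_per_query / (H100_PEAK_FLOPS * UTILIZATION)
energy_wh = seconds_per_query * H100_POWER_W / 3600   # joules -> watt-hours

print(f"~{energy_wh:.2f} Wh per query")               # → ~0.29 Wh per query
```

With these (assumed) inputs the sketch lands near the study's 0.3 Wh figure, which is the point of the exercise: the result is dominated by the FLOPs-per-token rule and the GPU's power-per-FLOP.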
Everything has improved. As we noted, Epoch AI emphasizes that the difference between this estimate and the previous one comes from improvements on several fronts, in both models and hardware, whose gains multiply together. In addition, the previous estimate made "an excessively pessimistic count of the necessary tokens."
How much is 0.3 Wh? It is less than the amount of electricity an LED bulb or a laptop consumes "in a few minutes." According to the US Energy Information Administration, the average American home consumes 10,500 kWh per year, that is, about 28,800 Wh a day. Even intensive use of ChatGPT would barely affect that consumption.
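To make "barely affect" concrete, here is the comparison worked out, assuming for illustration a heavy user making 100 queries a day (that usage figure is an assumption, not from the article):

```python
# Rough comparison of heavy ChatGPT use vs. average US household consumption.
# ASSUMED: QUERIES_PER_DAY; the other inputs come from the text.
WH_PER_QUERY = 0.3
QUERIES_PER_DAY = 100                 # assumed heavy usage
HOME_WH_PER_DAY = 10_500_000 / 365    # 10,500 kWh/year -> ~28,800 Wh/day

chatgpt_wh = WH_PER_QUERY * QUERIES_PER_DAY
share = chatgpt_wh / HOME_WH_PER_DAY
print(f"{chatgpt_wh:.0f} Wh/day, {share:.2%} of household use")
# → 30 Wh/day, 0.10% of household use
```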
Reasoning consumes more. Although they take GPT-4o as the reference, they make clear that using reasoning models such as o1 or o3-mini entails higher energy consumption, but for the moment those models are less popular.
And training the models, too. The researchers have also highlighted the energy cost of training models such as GPT-4o, which by their estimates required a sustained 20 to 25 MW of power over a three-month period. That is equivalent to the consumption of some 20,000 average US homes.
The overall costs are worrying. Although the data in this study show that using ChatGPT does not consume as much energy as previously estimated, the problem may lie elsewhere. The overall energy costs of AI are colossal and set to climb much higher in the short term: hence the Big Tech fever for investing tens of billions of dollars in building data centers. And they do it because all those data centers will have enormous energy needs. And remember: let's not forget the cooling.
Image | Xataka with Freepik Pikaso