The technology conversation runs on trends, and right now nothing is trendier than artificial intelligence. Every country that wants a seat at the table is developing its own models and tools, and it is striking how geopolitics permeates everything: the US seeking sovereignty while China wants to monetize right away.
But just as interesting as the capabilities of any given model are two tightly linked topics: the data centers that supply the enormous amount of compute needed to train AI and, of course, the question of where all that absurd amount of energy comes from.
And out of that conversation a fascinating term has been born: the ‘bragawatt’.
‘Bragawatts’, or the art of AI bragging
When companies like OpenAI or Google announce new AI-focused data centers, they usually attach a bombastic figure for how much energy the facility will consume. OpenAI recently announced a new campus in Michigan that, together with six other recently revealed sites, will need more than 8 GW to operate.
They also talk about money: the plan launched in January of this year involves 500 billion dollars and 10 GW of planned capacity.
According to the company, it is “the infrastructure necessary to advance AI and reindustrialize the country.” The Financial Times has done the math: with the Michigan project, the company has 46 GW of planned computing capacity. As with deals like Microsoft’s 75-billion-dollar purchase of Activision Blizzard, context is needed, because numbers this enormous are hard to picture.
If 1 GW is enough to power 800,000 homes in the United States (air conditioning included, at any time of year), these OpenAI data centers would draw as much energy as more than 44 million homes, by the Financial Times’ count: almost three times all the homes in California.
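As a back-of-envelope check (a sketch only: the homes-per-gigawatt conversion is a rough rule of thumb, and the FT’s 44-million figure implies a slightly lower average draw per home), the scale math looks like this:

```python
# Back-of-envelope scale check for the 46 GW attributed to OpenAI.
# The homes-per-GW conversion is a rough rule of thumb, not an exact figure.
announced_gw = 46            # total capacity per the Financial Times' tally
homes_per_gw = 800_000       # common US rule of thumb cited above

equivalent_homes = announced_gw * homes_per_gw
print(f"~{equivalent_homes / 1e6:.0f} million homes")  # ~37 million

# The FT's own comparison (44+ million homes) implies a slightly lower
# average draw per home: 46 GW / 44M homes ~= 1.05 kW instead of 1.25 kW.
implied_kw = announced_gw * 1e6 / 44e6
print(f"~{implied_kw:.2f} kW per home implied by the 44M figure")
```

Either way, the order of magnitude holds: tens of millions of households.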
The ease with which companies toss out these power figures has led some to coin the term ‘bragawatt’, a sarcastic blend of ‘brag’ and ‘watt’, the unit of power. The word resists a neat translation, but it boils down to boasting: some companies publicly inflate the energy capacity planned for their infrastructure.
There are several reasons for doing this, but as with any announcement from publicly traded companies, the goal is to attract the attention of the press, the technology sector and, above all, investors.
In financial circles it is noted that these bombastic figures are not always met, but beyond the marketing bravado there is substance to all this. OpenAI has asked the US government to secure 100 GW of new power capacity per year to fuel the country’s AI build-out, and NVIDIA has explained quite well why estimating the demand of these centers is a problem.
In a recent report, the company made a very interesting observation:
Unlike a traditional data center, which runs thousands of unrelated tasks, an AI “factory” operates as a single system. When training a large language model, or LLM, thousands of GPUs perform intensive calculation cycles followed by periods of data exchange. Everything happens in perfect synchrony, generating an energy profile characterized by massive, rapid load swings.
The electrical draw of a rack can go from an “idle” state of around 30% utilization to 100% and back again in a matter of milliseconds. This forces engineers to oversize components for the maximum current, not the average, which increases cost and space requirements.
When these oscillations add up across an entire data room (which can mean hundreds of megawatts rising and falling in seconds), they pose a significant threat to the stability of the electrical grid, making grid interconnection a key bottleneck for the expansion of AI.
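To make that profile concrete, here is a minimal toy sketch in Python (all numbers are invented for illustration, not taken from NVIDIA’s report) of why synchronized compute and exchange phases force provisioning for peak rather than average power:

```python
# Toy model of the load profile NVIDIA describes: GPUs alternate in
# lockstep between compute bursts and data-exchange lulls, so an entire
# data room swings together. All figures are illustrative assumptions.

RACK_PEAK_KW = 120        # hypothetical rack draw at 100% utilization
IDLE_FRACTION = 0.30      # the ~30% "idle" floor mentioned in the report
N_RACKS = 2_000           # hypothetical AI data room

def rack_power_kw(t_ms: float, period_ms: float = 200.0) -> float:
    """Square-wave approximation: peak draw during the compute phase,
    ~30% of peak during the synchronized exchange phase."""
    in_compute = (t_ms % period_ms) < period_ms / 2
    return RACK_PEAK_KW if in_compute else RACK_PEAK_KW * IDLE_FRACTION

# Sample one second of room-level load in 1 ms steps.
room_mw = [N_RACKS * rack_power_kw(t) / 1e3 for t in range(1000)]

peak_mw, avg_mw = max(room_mw), sum(room_mw) / len(room_mw)
print(f"peak: {peak_mw:.0f} MW, average: {avg_mw:.0f} MW")
# Output: peak: 240 MW, average: 156 MW. Switchgear, cooling and the grid
# interconnect must be sized for the 240 MW peak, not the average, and the
# whole swing happens in milliseconds.
```

A real training run is messier than a square wave, but the takeaway NVIDIA draws is the same: hundreds of megawatts rising and falling in seconds.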
So, beyond the aforementioned boasting, there is real substance behind the enormous figures companies give, and what NVIDIA says is backed by the facts: the big US technology companies are locking up major nuclear power production facilities or signing contracts with oil and gas companies.
Coal is re-emerging in the middle of decarbonization to feed these power-hungry data centers, and the focus on LLMs is pushing large oil companies to backtrack on their plans to adopt renewables. AI needs energy that can respond quickly to those demand peaks, and renewables do not seem to be the way to go for now.
Since we are dealing with grandiose figures: it is estimated that, between now and 2029, the world will spend about 3 trillion dollars on data centers. For more context, that is roughly what France’s entire economy was worth in 2024. Whether or not this is a bubble is another topic, but there are those who find these fanfares very hard to believe, and also those who point out that AI will have more impact than any technology so far, the internet included, so we may well need all that energy.
Only time will tell.
Image | İsmail Enes Ayhan
In Xataka | While Silicon Valley seeks electricity, China subsidizes it: this is how it wants to win the AI war

