AI has no future without nuclear energy, now that even Nvidia has begun to pray to Bill Gates' reactors

Data centers will account for 10% of the growth in electricity demand through 2030, according to the International Energy Agency (IEA). The rise of artificial intelligence (AI) we are living through has triggered the proliferation of these facilities in the US, China, Japan, Singapore, India, Germany, the Netherlands and Ireland, among other nations. And for the moment there is no sign that this trend will run out of steam in the medium term. A data center dedicated to large-scale AI can exceed 150 MW, and these are precisely the facilities that are proliferating fastest. In fact, in 2024 their global consumption amounted to about 415 TWh, a figure that represents around 1.5% of global electricity consumption.

To meet this challenge and guarantee data centers the energy they require, more and more companies are turning to nuclear power. The latest to do so is Nvidia: the company led by Jensen Huang has participated in a 650-million-dollar financing round to support the projects of TerraPower, the nuclear energy company founded by Bill Gates in 2006. With this decision Nvidia joins the strategy of using small modular reactors (known as SMRs) to deliver to data centers the electricity they need. And, incidentally, it plants one more foot in a sector with undeniable growth potential.

TerraPower is already building the first Natrium nuclear reactor

The nuclear fission reactor this company has designed is modular and compact, cooled by sodium, and uses a molten salt storage system. Because of these characteristics it is a fourth-generation machine that, according to those responsible, will be able to generate electricity at half the cost of a conventional nuclear fission reactor.
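Those two figures can be sanity-checked against each other with a quick back-of-the-envelope calculation (the 415 TWh and 1.5% values come from the text above; the implied global total is my own arithmetic, not an IEA figure):

```python
# If data centers used ~415 TWh in 2024 and that was ~1.5% of global
# electricity consumption, the implied global total should be ~27,700 TWh,
# which is consistent with commonly cited estimates of just under 30,000 TWh.
data_center_twh = 415
share = 0.015
implied_global_twh = data_center_twh / share
print(round(implied_global_twh))  # -> 27667
```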
In any case, the interesting thing is that TerraPower's first Natrium reactor is being built in a mining town in Wyoming (USA) and, according to Bill Gates, will be completed in 2030.

Nvidia has participated in a 650-million-dollar financing round to support TerraPower's projects

It sounds good, but we must not overlook that this is a new-generation design, so a priori the five-year timeline TerraPower is working with seems too optimistic. However, this reactor has an important asset in its favor: on paper, its commissioning should be faster and cheaper than that of conventional reactors. In addition, a Spanish public company is participating in the construction of this machine. It is called Ensa (Equipos Nucleares, S.A.), it is based in Cantabria, and it has more than five decades of experience in designing and manufacturing large components for the nuclear industry. There is no doubt that TerraPower's decision to partner with it is a boost that will reinforce its international image. And, perhaps, open the door to other latest-generation nuclear energy projects. "This is the first reactor of these characteristics to be manufactured following the highest standards of safety and quality in accordance with the most demanding nuclear regulations," an Ensa spokesperson has declared. Interestingly, this Spanish company will manufacture the Natrium reactor's lid. One last interesting note: Ensa is also currently involved in the construction of ITER (International Thermonuclear Experimental Reactor), the experimental nuclear fusion reactor that an international consortium led by Europe is erecting in the French town of Cadarache.

Image | TerraPower
More information | The Register
In Xataka | "We are already on the last step": how Spain got hold of a key to making nuclear fusion a reality

The cheap phone was already badly wounded in Europe. The energy label may deliver the coup de grâce

Today, June 20, 2025, marks a before and after for the phones sold in Spain. From this date, every manufacturer that wants to distribute new products in Europe will have to add the energy label to the box. It is a European-level effort to guarantee compliance with the Ecodesign Regulation, affecting mobile phones, cordless landline phones and tablets. In addition to forcing manufacturers to introduce this label, there are also changes to the years of support they will have to give every new phone. These changes to design, software and product lifespan look good. But the question is who is going to foot the bill.

What changes. From now on, manufacturers have to include the new energy label with every phone they want to put on sale. It collects data on the device's energy efficiency, drop resistance, battery life per charge cycle, supported battery cycles, water and dust resistance ratings, and repairability index. Beyond informing about ecodesign, manufacturers will be obliged to make sweeping changes throughout the chain: from how the phone is manufactured to what happens to it during the warranty period.

The updates. Smartphone manufacturers will have to guarantee five years of updates (updates, plain and simple, with no mandatory version jumps) for each and every one of their phones. It is something that big manufacturers such as Samsung, Google, Honor, Xiaomi and Motorola were already doing with some of their most recent phones.
The key lies precisely in that "some." Each system update, however small, entails:

- A team of software engineers actively dedicated to the project
- Quality-control processes (internal tests, Google certifications to meet the requirements of each update)
- Carrier certifications when the phone is distributed through a third party
- Keeping the OTA infrastructure alive

These are costs that, in the case of low- and mid-range devices, used to disappear after the two years they were typically updated. Doubling the support period will mean an additional expense for companies on devices with tight margins: a device costing less than 100 euros receiving updates for five years does not look like great business. And this is just one slice of the cake.

The new manufacturing standard. The EU is also focusing on how the device is built. Manufacturers will be obliged to make phones more resistant to drops and scratches, with splash resistance, and with batteries that retain a minimum of 80% of their total capacity after 800 charge cycles. Similarly, spare parts must be guaranteed for at least seven years after the product stops being sold, and each and every component needed to repair the phone must be deliverable within five to ten days.

In summary. Consumers stand to win on two fronts: we will have better phones on the market, and we will be able to see at a glance the quality of each one's ecodesign. The problem is precisely that. Manufacturers will have to provide five years of support; manufacture, distribute and guarantee spare parts for seven; introduce better batteries; build phones more resistant to drops and scratches; and ensure compliance with splash-resistance ratings.

Who pays for it. In Europe we are buying increasingly expensive phones. So much so that there are more users buying phones over 800 euros than users buying phones under 200 euros.
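The battery requirement can be turned into a rough worked example (the 800-cycle and 80% figures come from the text; the one-full-charge-per-day pattern and the uniform per-cycle fade model are my own simplifying assumptions):

```python
# The regulation's floor: at least 80% capacity left after 800 charge cycles.
cycles = 800
min_remaining = 0.80

# With one full charge per day, 800 cycles are reached in just over two years,
# so the floor effectively covers roughly the first half of the 5-year window.
years_to_800_cycles = cycles / 365

# If capacity faded by the same factor every cycle, the worst allowed average
# fade per cycle would be 1 - 0.80**(1/800).
max_fade_per_cycle = 1 - min_remaining ** (1 / cycles)

print(f"{years_to_800_cycles:.1f} years")            # -> 2.2 years
print(f"{max_fade_per_cycle * 100:.3f}% per cycle")  # -> 0.028% per cycle
```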
In fact, one of the main reasons smartphone shipments in Europe fell in the first quarter of 2025 has precisely to do with the drop in demand for entry-level phones: the lowest volume in this segment in the last decade. This segment is precisely the one that will be most affected by the regulatory change, since mid-range devices (the 399-799 euro band, according to consultancies) and above already met a good part of the European Union's demands. The big question is whether manufacturers will try to absorb these additional costs through the margins of their more profitable ranges, or whether budget ranges will face slight price increases to keep fighting for positive returns.

Image | Xataka
In Xataka | Apple puts longevity before repairs. And it has justified it with a 24-page document

The grid did not collapse for lack of energy, but for lack of control. Renewables are still being connected as if they were passive

Almost two months after the blackout that disconnected Spain and Portugal, the government has published a technical report that focuses on a very specific problem: the lack of voltage control at critical moments, especially in renewable parks.

Could it have been avoided? As the engineer and energy expert Xavier Cugat has pointed out on social media, the debate over voltage control in renewable facilities is not just technical: there are technologies that already enable it. One of them is SMA's 'Q at Night' system, which allows solar plants to provide reactive power even during the night. The idea is clear: if a solar plant can keep supporting the grid even without sun, part of the stability problem can be mitigated. This does not directly solve the lack of inertia, but it complements voltage support and improves the system's resilience.

Reactive power. The principle is simple but effective. SMA photovoltaic inverters equipped with the Q at Night function remain connected to the grid even when they are not generating active power (that is, when there is no sun). This allows them to inject or absorb reactive power as needed, helping keep voltage within acceptable margins.

In case of high penetration. This kind of support is key to avoiding voltage instability, particularly in a grid with a low presence of conventional plants. Although it does not contribute inertia, it allows plants to support the voltage balance and stay connected through critical events instead of disconnecting preventively.

So what about inertia? This is where clarification is needed. The government's report has made it clear that the collapse was not a consequence of a frequency drop, but of a cascade of overvoltages. Even in a scenario with more inertia, the overvoltages would still have occurred, according to the report. Therefore, the lack of inertia was not the direct cause of the collapse; it was the collapse that caused the frequency to fall.
During the blackout, different plants disconnected preventively upon detecting overvoltages. The problem is that, according to the report, several of these disconnections occurred before the maximum limits allowed by the regulations were even reached. In other words: they did not respond properly to grid conditions.

A system not adapted to its own transition. The problem seems structural: the electricity grid has not evolved at the same pace as the massive renewable deployment. With 82% clean generation and the lowest number of synchronous plants in operation all year, the system faced an explosive cocktail: a lot of distributed generation, little centralized control and little response capacity against critical events. In just 12 seconds, the entire Iberian system was disconnected from the rest of Europe.

A transition without a safety net. The blackout was a symptom, not an anomaly. Spain leads the renewable transition, but without a prepared grid, each advance becomes a vulnerability. Voltage control, incident response and the ability to maintain stability without large spinning machines are the great challenges of the new energy paradigm.

Image | Pexels
In Xataka | 49 days after the blackout, the government has published the official report. Against all predictions, it points to a culprit

No one has managed to reach the Earth's mantle. China has built a ship to do it and, along the way, extract energy

Jules Verne's old aspiration in 'Journey to the Center of the Earth' is still out of our reach, but explorations beneath the seabed are getting deeper and deeper. You only have to look at the ship China has just put into service.

Meng Xiang. "Dream" in Chinese. A colossal ocean drilling vessel, designed and built entirely in the Asian country, intended to pierce the ocean floor down to a record depth of 11,000 meters. The objective: penetrate the Earth's crust and reach the mantle, a geological frontier that until now has only been studied indirectly, creating new science while exploring new energy sources.

Quite a ship. At 179.8 meters in length and 42,600 tons of displacement, the Meng Xiang is China's new largest scientific research vessel, consolidating the country's position as a maritime superpower. Although it will focus on the South China Sea until 2035, the Meng Xiang could operate in any ocean in the world, withstanding super typhoons and the most extreme maritime conditions.

What makes it unique. The true crown jewel is its drilling system: the world's first hydraulic drilling tower capable of lifting up to 907 tons, with a double purpose: performing oil and gas exploration drilling and, at the same time, taking geological core samples for scientific research. Its ability to drill 11 kilometers down will allow it, for the first time, to obtain direct samples of this transition zone. The goal recalls the United States' historic "Project Mohole" of the 1960s which, although it laid the foundations for ocean drilling, never achieved its final objective. The new Chinese ship has the technology to get there.

The unexplored frontier. Since the Croatian seismologist Andrija Mohorovičić discovered it in 1909, the "Moho discontinuity" has been one of the frontiers most coveted by geology. It is the boundary where the lighter Earth's crust gives way to the much denser rocks of the mantle.
Until now, our knowledge of this crucial layer has come from seismic data and the analysis of minerals expelled by volcanoes. China intends to kill two birds with one stone with an unprecedented scientific mission that will, in turn, expand its extractive capacity.

What can it find down there? In addition to oil, gas hydrates: a vast source of potential energy trapped in the seabed at great depths and low temperatures. Mastering their extraction could redefine the global energy map, in which China wants to be in the lead.

Image | Xinhua
In Xataka | In China there are skyscraper-sized ships sailing thousands of kilometers from the sea. All thanks to their cranes

Energy and space: data centers have both problems. China has solved them by sinking the centers into the sea

China has opened in Shanghai the first commercial underwater data center powered entirely by offshore wind energy. It is an important evolutionary leap after two years of experience with its pilot installation in Hainan.

Why it matters. Digital infrastructure faces two crises worldwide: the excessive energy consumption of data centers, and the shortage of urban land on which to expand them. This underwater installation solves both problems at a stroke, because it reduces energy expenditure by around 40% while freeing up space on the mainland.

The context. China has been testing the commercial viability of underwater centers in Hainan since December 2022, where an installation has operated 30 meters deep without registering a single server breakdown in two and a half years. Microsoft experimented with Project Natick in Scotland starting in 2015, but it was Hainan that marked the world's first real commercial deployment. Shanghai now represents the "version 2.0" of this technology.

In figures:

- The investment reaches 1.6 billion yuan (222.7 million dollars) to create an underwater cluster of 24 megawatts.
- The natural seawater cooling system reduces cooling's share of total consumption from 40-50% to less than 10%.
- More than 90% of the energy will come from offshore wind farms.

What has happened. Yesterday, Tuesday, June 10, a tripartite agreement was signed between the Shanghai authorities and the company Hicloud Technology. The first phase, of 2.3 MW, will begin operating in September as a national model project. The second phase will scale up to 24 MW with a power usage effectiveness (PUE) below 1.15.

And now what. The installation anchors an industrial ecosystem that will support AI, 5G, the industrial Internet of Things and e-commerce platforms outside China. The country thus consolidates its leadership in underwater digital infrastructure while other countries remain focused on expanding the usual land-based centers.
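The cooling figures map neatly onto PUE (power usage effectiveness: total facility power divided by IT power). A minimal sketch of that relationship, under my simplifying assumption that cooling is the only non-IT load:

```python
def pue_from_cooling_share(cooling_share: float) -> float:
    """PUE = total / IT. If cooling takes a fraction `cooling_share` of the
    total draw and everything else goes to IT, PUE = 1 / (1 - cooling_share)."""
    return 1 / (1 - cooling_share)

# Conventional center, cooling at ~40% of total consumption:
print(round(pue_from_cooling_share(0.40), 2))  # -> 1.67
# Underwater center, cooling under 10% of total consumption:
print(round(pue_from_cooling_share(0.10), 2))  # -> 1.11, below the 1.15 target
```

Real facilities have other overheads (power conversion, lighting), so an actual PUE sits a bit above this idealized number; the quoted sub-1.15 target is consistent with cooling falling under 10%.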
Featured image | Hicloud
In Xataka | Saudi Arabia wants to become a new power in data centers. It is far from clear that it will manage it

Sam Altman claims that ChatGPT's water and energy consumption is tiny. The problem is that he gives no evidence for it

A 100-word email generated by GPT-4 consumes 519 milliliters of water. That was the conclusion researchers at the University of California reached a few months ago after analyzing this OpenAI model. Sam Altman, CEO of the company, has just offered his own estimate of the water and energy consumption of each ChatGPT query. And it is very different.

1,000 times less than what was claimed. According to Altman, an average ChatGPT query consumes far less than previous studies had indicated. His figures are striking, and to convey them he draws interesting analogies: "As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity." People are often curious how much energy a ChatGPT query uses; per Altman, the average query consumes about 0.34 Wh of energy and about 0.32 ml of water. A previous study by Epoch AI corroborates the figure Altman has now wielded.

And the evidence? The figures cited by the OpenAI CEO have a problem: they have no visible support. He throws them out without citing sources or explaining where he got them, which makes them hard to take at face value. A Meta executive answered the question of how much AI inference consumes a year and a half ago, replying that "only two nuclear reactors would be needed to cover it."

But previous studies agree with Altman. Although he cites no evidence, in February researchers at Epoch AI published a study attempting to estimate ChatGPT's energy consumption. They concluded that on average a ChatGPT query consumes far less than an earlier report by the researcher Alex de Vries had indicated. Since then, of course, many things have happened.

Too pessimistic. As the Epoch AI study noted, the difference comes from the fact that today's models are much more efficient than in 2023, when de Vries conducted his study.
So is the hardware these models run on, and that earlier estimate also used an "especially pessimistic" approach. The Epoch AI study likewise erred on the side of pessimism, pointing out that "most (ChatGPT) requests are much cheaper (energetically)."

More studies. Another independent analysis, published by Andy Masley in January 2025, reached a similar conclusion and claimed that "using ChatGPT is not bad for the environment." It was based on EPRI data from May 2024, which also used a high-end estimate of 2.9 Wh per ChatGPT query. Estimated water consumption in data centers, from a Sunbird study, was also very modest compared to other online activities.

(Chart: water consumption in data centers for various online activities. Source: Andy Masley.)

A fifteenth of a teaspoon. The water consumption figure was another striking element of Sam Altman's estimate. According to him, a ChatGPT query consumes barely 0.32 ml of water, "roughly one fifteenth of a teaspoon." The figure suggests that the water needed to cool the data centers that process these requests is far less than was thought only a year ago.

And what about training? These estimates focus on AI inference, that is, our use of ChatGPT, which receives a query and processes it by inferring (generating) a text result. Although Altman does not clarify it, he does not seem to include the energy and water cost of training AI models, which is very high and keeps thousands of GPUs running at full power for months, with the consequent water expense in data centers to cool all those components dissipating large amounts of heat. As the researcher Ethan Mollick pointed out, GPT-4 probably used more than 50 GWh to be trained, enough to power 5,500 homes for a year.

We still lack definitive data. Altman's claims are striking, as always, but the lack of clear evidence makes these figures hard to verify.
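The competing figures in this piece are easy to compare directly (all numbers are taken from the paragraphs above; the 4.93 ml US teaspoon is my own assumption for the conversion):

```python
# Water: the UC study's 519 ml per 100-word email vs Altman's 0.32 ml per query.
print(round(519 / 0.32))           # -> 1622, the origin of the "1,000 times less" framing

# Altman's "one fifteenth of a teaspoon" is consistent with 0.32 ml:
teaspoon_ml = 4.93                 # US teaspoon in ml (assumption)
print(round(teaspoon_ml / 15, 2))  # -> 0.33 ml

# Energy: the 2.9 Wh high-end estimate vs Altman's 0.34 Wh per query.
print(round(2.9 / 0.34, 1))        # -> 8.5x lower
```

Note the asymmetry: the "1,000 times" gap applies to the water estimates, while the energy estimates differ by roughly one order of magnitude.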
Other recent studies are more useful in reflecting this ever-lower energy and water cost of using AI, but there are still no accepted standards, nor a consensus on the true energy and water impact of using ChatGPT or other AI models.

Image | Lukáš Lehotský | Village Global
In Xataka | The price of electricity is negative again: it is a sign that the system needs a redesign

Meta knows that AI runs on energy. That is why it has sealed a 20-year commitment to nuclear power

Meta, the giant behind Facebook and Instagram, knows that artificial intelligence (AI) does not feed only on language models. It also needs energy, a lot of it and stable. That is why it has sealed a 20-year agreement with Constellation Energy to supply nuclear electricity to its data centers in the United States. It does not look like a symbolic gesture or a green bet for the gallery: it is a calculated play to shield its infrastructure and keep up the pace of AI development in an increasingly competitive world.

The move is no small thing. The tech company has reached an agreement for the Clinton Clean Energy Center nuclear plant, located in Illinois, to feed its data centers in the region from 2027. It is a reactor that was slated to close in 2017 after years of financial losses, but managed to keep operating despite the challenges. According to the Menlo Park company, the agreement will allow the facilities to stay open, preserving more than 1,100 jobs.

Nuclear enters Meta's equation. The company's investment will also help fund the improvements needed to increase energy production by 30 megawatts, reaching a total capacity of 1,121 megawatts. It is not clear what percentage of the electricity produced the tech giant will use to power its AI infrastructure, but it is known that current production serves about 800,000 American homes. While the energy demand of data centers grows exponentially, the technology industry is looking for solutions that compromise neither supply stability nor its environmental commitments. In that context, nuclear energy has re-emerged as a viable alternative: it emits no carbon dioxide during generation and offers constant production, unlike renewable sources such as solar or wind, which depend on variable weather conditions. Goldman Sachs had already anticipated this turn.
In a recent report, the investment bank warned that, if the pace of AI adoption holds, the energy demand linked to this technology could multiply by 160 between now and 2030. To sustain that growth, it will be necessary to build between 85 and 90 gigawatts of new nuclear capacity in the United States. And that is where nuclear energy comes into play as a high-output, low-carbon-footprint solution.

This bet by Meta is not an isolated one. Google has moved in the same direction with agreements like the one it maintains with Kairos Power, a company developing small modular reactors (SMRs) of up to 500 megawatts. Amazon has also taken the step, signing several agreements to boost the construction of these small, advanced reactors, including projects with Energy Northwest, Dominion Energy and the startup X-energy, which is working on new-generation reactors.

Nuclear energy is not free of challenges. Its infrastructure remains expensive, and concerns about safety and waste management persist, especially given the legacy of earlier technologies. But new generations of reactors, such as SMRs, promise to address many of these obstacles with safer designs, shorter deployment times and potentially more contained costs. For companies like Meta, the priority seems to be guaranteeing a long-term energy supply, without shocks and with the smallest possible carbon footprint. The agreement with Constellation Energy reflects that reality, but it remains to be seen whether these plans materialize as planned. It would not be the first time such a project runs into trouble: in 2022, one of Meta's previous attempts, in Idaho, was paused after the presence of a rare species of bee was detected on the land earmarked for the infrastructure.

Images | Meta | Constellation
In Xataka | World record in nuclear fusion: the German reactor Wendelstein 7-X has broken all records
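The Clinton figures above are internally consistent, which a quick estimate shows (the 1,121 MW capacity and ~800,000 homes come from the text; the 92% capacity factor, typical for US nuclear, and ~10.5 MWh per US home per year are my own assumptions):

```python
capacity_mw = 1121
capacity_factor = 0.92          # typical US nuclear fleet figure (assumption)
hours_per_year = 8760

annual_mwh = capacity_mw * hours_per_year * capacity_factor
homes = annual_mwh / 10.5       # ~10.5 MWh per home per year (assumption)

print(f"{annual_mwh / 1000:.0f} GWh/year")  # -> 9034 GWh/year
print(f"~{homes:,.0f} homes")               # in the ballpark of the quoted ~800,000
```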

AI consumes so much energy that the United States is building data centers directly at natural gas wells

What makes a startup dedicated to building data centers raise 11.6 billion dollars in financing? In the case of Crusoe Energy Systems, it all started with an idea as unsettling as it is profitable: building data centers at natural gas wells.

Crusoe Energy's idea. Bloomberg journalist Emily Chang visited the city of Abilene, Texas a few weeks ago, where Crusoe is building the monstrous Stargate data centers, the 500-billion-dollar project by OpenAI, SoftBank and Oracle to develop artificial general intelligence. Stargate came to Crusoe Energy thanks to the startup's demonstrated efficiency in building data centers specialized for AI. Chase Lochmiller, Crusoe's CEO, explained to Chang how the company was born: "When an oil company opens an oil well, one of the associated by-products is natural gas. And when they don't have access to a pipeline, all this associated gas is simply flared on site. So we had an idea: instead of trying to take that gas to a market where it can be sold, we could create a market for the gas. We could build mobile, modular data centers and take them directly to the gas."

Crusoe was born in the best possible place to materialize this idea: fracking country. But maybe not at the best time to do it. Initially, they chose to build GPU farms to mine Bitcoin. When the cryptocurrency market collapsed, they ended up pivoting to artificial intelligence. Like crypto mining, AI data centers are not built on CPUs but depend on the parallel processing capacity of thousands of GPUs, mainly Nvidia's specialized chips. These new data centers consume much more energy than traditional ones, so Crusoe started with a key advantage: direct access to fossil fuels obtained at bargain prices.

A booming business. Oil giants are not oblivious to this trend. ExxonMobil is developing off-grid gas plants specifically for data centers, with carbon-capture technology to reduce emissions. Chevron, meanwhile, has partnered with Engine No.
1 and GE Vernova to set up similar facilities. The first will open in 2026, also in Texas. The figures are eloquent: the demand for natural gas from data centers will grow by 47 GW between now and 2030. Currently, natural gas already feeds about 40% of data center load in the United States and is expected to remain the main source of supply until at least 2030. It is not the preferred energy source, but there are not enough renewables to feed artificial intelligence, and not every data center can be hooked up to a nuclear power plant, another trend gaining ground in the United States.

As for Crusoe, thanks to the initial push from the natural gas that oil companies were going to flare, they developed their own technologies, such as a closed-loop cooling system that does not need to replace the water that evaporates to cool the servers, or their own gas turbines, such as the 360 MW they are installing in the Stargate project as a backup energy source. The Stargate data centers will be fed mainly by solar panels and wind turbines, which abound around Abilene thanks to the confluence of wind and sunny hours. It is one of the reasons data centers are being built in this area of Texas, along with the tax exemptions local governments are willing to grant in exchange for job creation. It remains to be seen how many employees it all retains once built.

Image | W.Carter (CC0)
In Xataka | Microsoft will reopen a nuclear power plant closed since 2019. It needs it for its artificial intelligence

Germany is installing giant concrete spheres under the sea. It has a good reason: storing renewable energy

While France and Germany reinforce their energy alliances with a renewed bet on nuclear power, within Germany a completely different system is being developed. The focus is on the ocean depths, with the aim of redefining how renewable energy is stored.

Under the sea. A group of researchers at Germany's Fraunhofer Institute created the StEnSea (Stored Energy in the Sea) project. Since 2011, the team has worked on a solution to reduce land use, arriving at the idea of sinking huge concrete spheres to the seabed to store energy.

How it works. These spheres are sunk to between 600 and 800 meters deep, where water pressure is high enough to spin turbines with great efficiency. Each one measures about 9 meters in diameter and weighs about 400 tons. The idea is that they work as giant batteries: letting seawater in moves a turbine connected to a generator; to recharge, the water is pumped back out, using grid energy to overcome the ambient pressure.

(Render: a design of what a StEnSea plant would look like.)

A real test. The system has already proven successful at Lake Constance, and the next step is marked on the calendar for 2026: a real, 3D-printed prototype is expected to be installed off the coast of Long Beach, in California. This model will be able to generate about 0.5 megawatts and store up to 0.4 megawatt-hours, enough to cover the consumption of an average home in the United States for about two weeks. The future idea is ambitious: to scale the system with spheres of up to 30 meters in diameter, which would allow a much larger storage capacity. According to what the researchers have detailed to New Atlas, the estimated storage cost is around 5 cents per kilowatt-hour, a very competitive figure compared with other current solutions.

Renewables in Germany.
Although it seems contradictory given its climate, the country has been betting heavily on solar energy, especially in self-consumption installations. However, it faces an important challenge: intermittent production, or Dunkelflaute. For this reason, projects such as StEnSea can act as a buffer for the electrical system, storing excess renewable energy and releasing it when it is most needed.

What about hydroelectric plants? Unlike traditional pumped storage, which requires mountains and large freshwater reserves, this system does not need specific elevations or water resources. Its modular design allows it to be installed off coasts around the world. In addition, the system offers an economic advantage, since it allows energy arbitrage: buying electricity when it is cheap and selling it at moments of high demand.

Forecasts. The researchers believe this technology has barely shown the tip of the iceberg. They estimate that, deployed on a large scale, it could reach a global storage capacity of about 817,000 gigawatt-hours. Translated into something more tangible, that would be enough to supply about 75 million homes in Europe for a whole year. However, although the project is presented as a solution to avoid intensive land use, it does move that occupation to the seabed. So far the approach has been mainly technical, but future phases would be expected to include rigorous environmental assessments analyzing its impact on ocean ecosystems.

Image | StEnSea
In Xataka | Europe's turn to nuclear: Germany and France have signed a pact to reconfigure the continent
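The prototype's storage claim can be checked with basic physics: the recoverable energy of an emptied sphere at depth is roughly pressure times volume, E ≈ ρ·g·h·V. A rough sketch with my own assumed constants (standard seawater density, and 700 m as the midpoint of the quoted 600-800 m range):

```python
import math

rho = 1025        # seawater density, kg/m^3 (assumption)
g = 9.81          # gravitational acceleration, m/s^2
depth = 700       # m, midpoint of the quoted 600-800 m range
radius = 4.5      # m, for the 9 m diameter sphere

volume = (4 / 3) * math.pi * radius ** 3  # ~382 m^3
energy_j = rho * g * depth * volume       # ideal pressure-volume energy, joules
energy_kwh = energy_j / 3.6e6             # joules -> kWh

print(round(energy_kwh))  # -> 746 kWh ideal; with realistic turbine and
                          # round-trip losses this lands near the quoted 0.4 MWh
```

So the quoted 0.4 MWh corresponds to a round-trip efficiency of roughly 50-55% of the ideal figure, a plausible value for a pump-turbine of this size.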

Humanity's plan was to depend less on gas for generating energy. AI has other ideas

When the alarms were raised over rising global temperatures, the decarbonization plan was launched. Countries, big tech and car companies set targets to reduce their carbon emissions, with goals fixed for 2030 and 2050. They have been applying measures to that end, but what neither the industry nor the planet counted on was the rise of artificial intelligence and its energy voracity. One so extreme that some are rubbing their hands: the companies that build gas turbines.

The threat of renewables. 2024 was a good year for renewables. Although it triggered an authentic war between Chinese companies, market saturation has allowed panel prices to drop considerably. This makes installing self-consumption systems easier, but it has also allowed huge solar parks to flourish even in places as oil-dependent as Texas. We have witnessed the sorpasso of renewables in Europe, there are countries that have run for months on renewables alone, and that push from solar panels is driving progress in the race for green hydrogen.

Artificial intelligence. Companies have also adapted their systems to be more efficient, consume less water and even build more respectful, sustainable facilities, but just as 2024 was the year renewables exploded, it was also the year AI did. That is why the main tech companies have begun to expand and build data centers all over the world (something that is not welcomed everywhere) to meet the current demand for this technology.

Change of plans. That high energy consumption has pushed some of the Big Tech companies toward something striking: operating their own nuclear power plants. Giants such as Amazon, Google and Microsoft have unveiled plans to create or reactivate nuclear plants, but that is not all. European oil companies have readjusted their renewables strategies, and during AI consumption peaks we have already seen both coal and gas being burned to meet energy demand.
Gas interest. That renewed interest in gas is already having consequences, and Siemens Energy is a perfect example. As Bloomberg reports, the German company had estimated a roughly break-even financial result for this fiscal year, but after the growing interest in gas, it now estimates that its income could grow by up to 15%. Siemens Energy manufactures, among other things, gas turbines, and in recent months it has seen how this avalanche of investment in data centers with high energy demand has boosted turbine orders. In fact, contrary to what we might expect given the stated goal of weaning ourselves off gas, the company has seen orders double during the first three months of the year.

And prices rising. According to the International Energy Agency, the energy demand of data centers will double by 2030 due to AI workloads and, although renewables are clearly expanding, their supply is intermittent, so there are times when they cannot meet the constant demand of these data centers. Gas acts here as a safety net for companies, since it provides constant energy for artificial intelligence infrastructure, while coal would be used for demand peaks. And some already predict that this increase in gas demand will translate into higher prices for the coming winters. And into higher carbon emissions too, as we are already seeing with cases such as Microsoft and Google, whose emissions have risen roughly 30% and 50% respectively in recent years.

Images | Pexels, Balticservers
In Xataka | Putin's not-so-secret plan to survive without Europe: a giant gas pipeline to China
