Energy companies are switching from oil to megawatts. The new gold mine is supporting data centers

Gluttonous artificial intelligence and its power-hungry data centers are reshaping decarbonization plans. Just as the world had begun its journey toward renewables, with countries like China and the Europeans betting big, and even some US states getting on board, data centers arrived with needs that were almost impossible to satisfy. By the end of December 2024, data center consumption had skyrocketed, pushing big technology companies to bet not so much on renewables as on immediately available energy such as gas and even coal. Some were even turning to nuclear to keep operating. Shortly after, in January 2025, a Reuters report noted that European energy companies that had committed to renewables were doubling down on oil and gas: giants like BP and Shell slowed their investments in clean energy to return to fossil fuel projects.

But the story is not only about where data centers get their energy; it is also about who provides them with infrastructure. And that, rather than oil or gas, may be the next energy mine.

The new oil mine. An article in the Financial Times suggests that the explosive growth of data centers is creating a market that energy companies do not want to miss. As demand for traditional drilling weakens (though this varies by region), energy services groups such as Baker Hughes, Halliburton and SLB are taking the opportunity to pivot to the data center sector. Not by building them, and not just by supplying energy, but by supporting their operations.

Leveraging their knowledge of the energy sector, these companies are providing equipment such as turbines and power generation systems to data center owners, along with generators, batteries, heat dissipation systems and everything needed to maintain proper energy efficiency. They would also maintain and oversee that equipment. It is, in short, what they already know how to do, applied to a new sector: data centers.
These three companies are not typical oil producers but technology providers that enable other companies to extract gas or oil. All three provide services to operators of oil fields, but they also supply technology such as gas turbines, compressors and LNG systems, and they were already active in "new energy" sectors such as carbon capture and storage.

All of this echoes the approach 'Big Tech' took when it began building huge data centers, until it realized that increasingly demanding equipment needed more immediate and stable sources of energy.

Data centers = El Dorado. It is estimated that US electricity demand will increase by 90 GW, a staggering figure, between now and 2030 just to power data centers. Traditional electrical grids may not support this load, and that is where these energy services companies become key players. Pivoting toward artificial intelligence infrastructure is "key to the evolution of oil and gas," said Lorenzo Simonelli, CEO of Baker Hughes. That makes sense given that the number of active US oil rigs contracted 7% year-over-year in 2025, margins have shrunk and demand for drilling services is in question.

At the business level, it is a masterstroke. Hypothetically, when the next oil crisis arrives and the market for both crude and gas falls, companies that have pivoted to data centers, going from service providers for energy companies to service providers for 'Big Tech', will not need to change strategy, because they will already be where the money is. That leaves one open question: whether the new megawatt gold rush for AI will be a lasting business or a passing fever.

Image | Freepik and Harpagornis

In Xataka | The problem with renewables is what to do when there is excess energy. China believes it has the answer with a unique turbine

The artificial intelligence race is pushing the US towards an unexpected energy solution: looking to the military sphere

The artificial intelligence race is not being fought only in laboratories, chips and data centers; it is increasingly being played out in the field of energy. In the United States, the accelerated growth in electricity demand associated with AI has exposed a barely visible fragility: the grid is not expanding at the same pace as technological ambitions. This imbalance is forcing a look beyond conventional solutions and reopening debates that seemed closed, including some that connect directly with the military sphere.

What has been put on the table. HGP has submitted a formal application to the United States Department of Energy to redirect two nuclear reactors removed from Navy ships to a civilian project linked to data centers in Oak Ridge, Tennessee. The request was channeled through a letter addressed to the Department's Office of Energy Dominance Financing and is part of the so-called Genesis Mission promoted by the White House. According to the documentation, the installation could provide between 450 and 520 megawatts of continuous electricity, aimed at intensive, stable consumption.

The main argument in favor of this idea is time. Compared with building new civilian reactors, whether large plants or smaller designs, which tend to move on long schedules, or with starting up large gas plants, also conditioned by permits and infrastructure, reusing existing reactors is proposed as a way to gain speed. The logic is simple: start from equipment that is already manufactured and tested, and convert it into firm supply for the grid. It is, at least on paper, a way to add baseload power while other solutions mature.

Behind the scenes of the proposal. The initiative does not come from a newly created startup or from an unknown actor in the energy sector.
HGP Intelligent Energy is a recently created division, but it is presented as part of a developer with previous experience in the US market, backed, according to the company itself, by projects in energy storage, electric mobility and grid-scale asset development. At the helm is Gregory Alvaro Forero, president of the division, who appears on his LinkedIn profile as president of HGP Storage since November 2013. That detail helps frame the proposal outside the pattern of an improvised company.

What technology would be reused, and at what price. The reactors cited in the proposal come from the US naval nuclear fleet, where aircraft carriers operate with two reactors and submarines typically with one. The models mentioned are the A4W, manufactured by Westinghouse, and the S8G, developed by General Electric. Adaptation for civilian use would cost an estimated one to four million dollars per megawatt, and the project would also require between 1.8 and 2.1 billion dollars in private capital for associated infrastructure. The proposal includes revenue sharing with the government, a fund for future decommissioning and the intention to request a loan guarantee from the Department of Energy, with a first phase "as soon as 2029".

Just because the idea sounds straightforward doesn't mean the path is. Bloomberg notes that reusing military reactors for civilian purposes would be unexplored territory, and inevitable questions arise: how is it authorized, who operates it, under what standards, and with what responsibilities if something fails. Coordination between federal agencies and regulators also comes into play, as does the logistics of moving and adapting equipment designed for ships, not for a grid-connected plant. For now, everything remains at the proposal stage.

Energy sovereignty as a security argument. HGP tries to support its approach with a framework that goes beyond electricity for data centers.
In its materials, the company summarizes the idea with an explicit equation, "Energy Supply Chain Sovereignty = National Defense," and links supply chain resilience to the country's ability to secure strategic infrastructure, even noting how geopolitical events or executives' social media posts can affect operations and investments. It is the narrative with which it seeks political and institutional legitimacy.

To reinforce the idea that naval nuclear power is not synonymous with improvisation, the World Association of Nuclear Operators enters the picture. According to WANO, the US Navy has accumulated more than 6,200 reactor-years of experience without radiological incidents, across 526 reactor cores, as of 2021. The association attributes that record to the standardization of systems, maintenance and quality of training. It is a relevant fact for the public debate, but it does not settle it: a solid record in a military environment does not automatically mean the jump to civilian use will be immediate or easy.

Images | General Dynamics Electric Boat | Igor Omilaev | İsmail Enes Ayhan

In Xataka | The race to bring data centers to space promises a lot. Physics says otherwise
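A quick back-of-the-envelope check of the figures cited in the proposal (450 to 520 MW of output, one to four million dollars per MW of conversion cost, plus 1.8 to 2.1 billion dollars of infrastructure capital) gives a rough total project envelope. The ranges are the proposal's own; the arithmetic below is merely illustrative.

```python
# Rough cost envelope for the HGP reactor-reuse proposal, using only
# the figures cited in the article (illustrative arithmetic).

MW_RANGE = (450, 520)            # continuous output, megawatts
CONV_COST_PER_MW = (1e6, 4e6)    # conversion cost, dollars per MW
INFRA_CAPITAL = (1.8e9, 2.1e9)   # private capital for infrastructure, dollars

low = MW_RANGE[0] * CONV_COST_PER_MW[0] + INFRA_CAPITAL[0]
high = MW_RANGE[1] * CONV_COST_PER_MW[1] + INFRA_CAPITAL[1]

print(f"Total project cost: ${low/1e9:.2f}B to ${high/1e9:.2f}B")
# Conversion alone spans roughly $0.45B-$2.08B, so the associated
# infrastructure dominates the low end of the estimate.
```

Even at the high end, the total stays well below the cost of a comparable new-build civilian plant, which is consistent with the proposal's "gain speed with existing equipment" argument.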

Something is going wrong with AI. The US is turning to energy solutions that it thought were buried to power data centers

The race to develop and operate increasingly powerful artificial intelligence models comes at a cost that is rarely at the center of the technology narrative. It is not in the chips or the software, but in the huge amount of electricity needed to keep data centers running around the clock. In the United States, this pressure is already translating into concrete decisions: polluting power plants that were slated for retirement are being restarted to cover rising peaks and tensions on the grid. The paradox is evident: the technology sector's most ambitious advance depends, for the moment, on energy solutions from another era.

The problem is not so much an absolute shortage of electricity as a timing mismatch. Demand from AI-linked data centers is growing much faster than new electrical generation, especially renewable generation, can be brought online. Building large energy infrastructure takes years, while these complexes can advance on much shorter timelines. Faced with this temporal shock, grid operators and electricity companies are turning to what already exists and can be activated immediately, even if it is more polluting.

PJM in context. The clash between electricity demand and supply is especially clear in the PJM region, the largest electricity market in the United States, which covers 13 states and concentrates a very significant share of the country's data centers. It can be understood as a large regional electricity exchange that coordinates generation, prices and grid stability in real time. There, the growth of AI-linked data centers is testing a system designed for a very different consumption pattern, making PJM the first thermometer of a problem beginning to appear in other areas.

What a peaker plant is.
So-called peaker plants are facilities designed to come online only during short periods of peak demand, such as heat waves or winter peaks, when the system needs immediate reinforcement. They are not designed to operate continuously, but to react quickly. According to a report by the US Government Accountability Office, these facilities generate just 3% of the country's electricity but account for nearly 19% of installed capacity, a reserve that is now being used far more often than intended.

South view of the Fisk plant in Chicago

The case of the Fisk plant, in the working-class neighborhood of Pilsen in Chicago, illustrates how this shift plays out on the ground. It is an oil-fueled facility, built decades ago and scheduled for retirement next year, that had been relegated to an almost symbolic role. The arrival of new electricity demand associated with data centers changed that equation. Matt Pistner, senior vice president of generation at NRG Energy, explained to Reuters that the company saw an economic case for keeping the units and therefore withdrew the closure notice, a decision that returns activity to a site many residents believed was permanently winding down.

When price rules. The change is not explained only by technical needs, but also by very clear market signals. In PJM, the prices paid to generators to guarantee supply at moments of maximum demand skyrocketed this summer, up more than 800% from the previous year. An analysis by the aforementioned agency shows that about 60% of the oil, gas and coal plants scheduled for retirement in the region postponed or canceled those plans this year, and most of them were peaker units, precisely the ones that best fit this new scenario of relative scarcity.

The bill for this energy shift is paid above all at the local level.
Peaker plants tend to be older facilities, with shorter smokestacks and fewer pollution controls than other plants, which increases the impact on their immediate surroundings when they operate more frequently.

Coal is also postponed. The phenomenon is not limited to oil- or gas-fired peakers. On a national scale, several utilities have begun delaying the closure of coal plants that were part of their climate commitments. A DeSmog analysis identified at least 15 retirements postponed since January 2025 alone, facilities that together account for about 1.5% of US energy emissions. Dominion Energy offers a clear example: in 2020 it promised to generate all its electricity from renewables by 2045, but after projecting that data center demand in Virginia will quadruple by 2038, it is now taking a step back.

Images | Xataka with Gemini 3 Pro | Theodore Kloba

In Xataka | A former NASA engineer is clear: data centers in space are a horrible idea
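The GAO shares quoted above (3% of generation from nearly 19% of installed capacity) imply a very low utilization rate for these plants, which is exactly why they are called peakers. A minimal sketch of that capacity-factor arithmetic, using only the article's percentages:

```python
# Implied average utilization of US peaker plants, from the GAO shares
# cited above: 3% of generation out of ~19% of installed capacity.
# Illustrative arithmetic only.

share_of_generation = 0.03   # fraction of US electricity they produce
share_of_capacity = 0.19     # fraction of installed capacity they represent

# If peakers ran at the grid-average utilization rate, their share of
# generation would equal their share of capacity (19%). Their actual
# utilization relative to that average is therefore:
relative_utilization = share_of_generation / share_of_capacity
print(f"Peakers run at roughly {relative_utilization:.0%} of the "
      f"grid-average utilization rate")
```

Running at about one sixth of the grid average is what makes these plants a cheap idle reserve, and also why pressing them into more frequent service changes the local pollution picture so sharply.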

AI doesn’t just live on chips, it also requires massive energy, so Google has bought an energy company

AI needs a lot of energy, and technology companies are already planning how to power their huge data centers. On the table are ideas as creative as taking them to space or submerging them in the sea to reduce their consumption. Google has opted for a more immediate solution: it has bought an electricity company for data centers.

The agreement. Google has purchased Intersect Power, a company dedicated to developing energy infrastructure, including renewable sources, for data centers. Google paid $4.75 billion for the San Francisco-based company, in addition to assuming its debt. According to Sundar Pichai: "Intersect will help us expand our capacity, operate with greater agility in the construction of new power generation facilities in line with the new load of data centers, and reinvent energy solutions to drive innovation and American leadership."

Why it matters. AI companies' deals usually focus on computing capacity, not energy. This agreement underscores the importance of energy in AI infrastructure, putting it on the same level as the very chips it powers. Data centers are being built at a brutal pace and energy is emerging as a bottleneck. Satya Nadella already said it: there is no power for so many chips. This is Google securing enough "food" for its chips.

Intersect. Google's relationship with Intersect began just a year ago, when the tech giant acquired a minority stake in the company. Under that collaboration, several projects for its data centers have come to light. Both those projects and all Intersect personnel are part of the agreement. What the deal does not include are other company assets, mainly located in Texas and California, worth $15 billion; these will continue to operate under the Intersect brand.

Energy.
In 2023, data centers already accounted for 4% of total US energy consumption, and at the rate at which they are being built, the figure will keep rising (there is talk of 12% by 2028). The problem is that US electrical infrastructure cannot support that pace, and consumers are feeling the consequences through higher electricity prices. Google says that with this agreement it will be able to guarantee "an abundant, reliable and affordable energy supply that allows the construction of data center infrastructures without passing on costs to network customers."

Image | Wikipedia, Intersect

In Xataka | Talking about artificial intelligence is talking about energy, and the fashionable term is 'bragawatts'
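To put the 4%-to-12% projection above in perspective, the data center share of US consumption would have to triple in five years. A minimal sketch of the implied compound growth of that share (the endpoints are the article's; the annualization is our illustrative arithmetic):

```python
# Implied annual growth in data centers' share of US energy consumption,
# from the figures above: 4% in 2023 to a projected 12% by 2028.
# Illustrative arithmetic only; the underlying totals also change.

share_2023 = 0.04
share_2028 = 0.12
years = 5

# Compound annual growth rate needed for the share to triple:
growth = (share_2028 / share_2023) ** (1 / years) - 1
print(f"The share must grow ~{growth:.1%} per year to triple in {years} years")
```

A share compounding at roughly a quarter per year, on top of overall demand growth, is the kind of trajectory that turns energy supply from a line item into a strategic acquisition target.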

When nuclear energy orbited the Earth. The day a Soviet satellite with a reactor fell in Canada and unleashed a crisis

In the late 1970s, the idea that a nuclear reactor could fall from space ceased to be science fiction and became a real problem on the table of several governments. A Soviet satellite with a reactor on board had lost control and was heading toward the Earth's atmosphere, and no one could say where its remains would end up or what the consequences of the impact would be. In the midst of the Cold War, secrecy and urgency marked the decisions. From there arose questions that remain uncomfortable today: what was a nuclear reactor doing in orbit, why was that risk accepted, and what happens when technology goes off script?

As CBC points out, on January 24, 1978, the Soviet satellite Kosmos-954 re-entered the Earth's atmosphere after weeks of tracking by American radars. No one knew with certainty where it would fall or in what state its remains would reach the ground. Eventually, fragments of the device were scattered over a vast region of northern Canada, from the Northwest Territories to areas that are now part of Nunavut and northern Alberta and Saskatchewan. What began as an orbital control problem suddenly became an international emergency with scientific, diplomatic and health implications.

The day the Cold War left radioactive remains over Canada. Kosmos-954 was neither a scientific satellite nor an isolated experimental mission, but one more piece of a Soviet military system designed to monitor the oceans. It was part of the US-A series, designed to locate large ships, especially American aircraft carriers, using radar. To power this very energy-hungry system, the Soviet Union turned to a compact nuclear reactor, a solution that allowed it to operate for long periods without depending on solar panels. That technical choice explains why the satellite had fissile material on board and why its loss generated so much concern.
The technological heart of Kosmos-954 was a BES-5 reactor, known as "Buk", developed specifically for Soviet military satellites. This type of reactor used uranium-235 and was designed to power the US-A system's radar for the life of the satellite. The BBC estimates that 31 devices with BES-5 reactors were launched for this family of satellites, and places the use of reactors in space through the end of the 1980s, with launches continuing until 1988. That history was not a clean line, according to the BBC: there were earlier failures and accidents, including serious problems on one of the first flights in 1970 and the fall of another reactor into the Pacific Ocean after a launcher failure in 1973. In addition, the safety plan contemplated moving the core into a disposal orbit to prevent its return to Earth.

Arctic Operational Histories explains that the signs that something was wrong came weeks before re-entry. Tracking systems detected that Kosmos-954 was progressively losing altitude, an anomaly that indicated a serious failure in its orbital control. The United States began following its trajectory with special attention, aware that the satellite carried a nuclear reactor. The big unknown was not only when it would fall, but whether the Soviet safety system would manage to separate the core and send it to a safe orbit before the device entered the atmosphere.

When it was confirmed that the debris had fallen on Canadian territory, the problem took on a completely new dimension. Authorities knew the fragments were scattered over a vast, largely remote, snow-covered region, making any quick assessment difficult. The first measurements detected radiation at some points, though without a clear map of the contamination. Faced with this uncertainty, Canada had to decide quickly how to protect the population and how to locate potentially hazardous materials in an extreme environment.
To confront an unprecedented situation, Canada turned to international cooperation. Operation Morning Light mobilized Canadian and American military personnel, scientists and technicians, many of them from units specialized in nuclear emergencies. From improvised bases in the north, flights were organized with sensors capable of detecting radiation from the air. Each anomalous signal led to more detailed inspections, in a race against time marked by extreme cold and a lack of infrastructure.

As the search continued, it became clear that the contamination was more complex than expected. Not only did visible fragments of the satellite appear, but also much smaller radioactive particles, difficult to detect and remove. This forced the teams to take extreme precautions and expand the tracking areas. At the same time, delicate communication work began with the northern communities, who wanted to know what real risks existed for their health, their water and the wildlife on which they depended.

As the weeks passed, the operation narrowed its objectives. The official Morning Light phase lasted 84 days, although CBC describes the search effort as extending through most of 1978 and covering an area of 124,000 square kilometers. In the process, 66 kilograms of debris were recovered and Canada considered the immediate threat to the population and the environment contained. The bill was then presented: Ottawa claimed 6.1 million dollars from the Soviet Union, which in 1981 agreed to pay half, opening an unusual diplomatic process for an incident of this type.

The case of Kosmos-954 did not close with the removal of the debris from the ground. In the months that followed, the incident reached international forums and fueled an uncomfortable debate about the use of nuclear power in space. Several countries demanded greater safety guarantees and more transparency in programs that, until then, had been developed under strong secrecy.
The episode reinforced the idea that space accidents know no borders and that their consequences can directly affect third countries.

Images | Arctic Operational Histories

In Xataka | Mars is left with one less line of coverage: NASA loses contact with its key orbital repeater

Lots of energy and very cheap

It seemed like the United States had the upper hand in the AI race. Having the most advanced chips is undoubtedly an important asset, but there is something even more critical: having the energy to power those chips. And if anyone has energy, it's China.

Master move. The control the US exercises over NVIDIA and other advanced semiconductor manufacturers seemed to make it the clear favorite to win the AI race. However, in this game of geopolitical chess, China has moved a piece that challenges that reality. The Asian giant's strategic advantage is not in chips, but in something more fundamental and massive: a colossal, enviable energy supply.

Lots of energy and very cheap. Between 2010 and 2024, China increased its energy production more than the rest of the world combined. Last year alone it generated more than twice as much electricity as the United States, which is saying something. That difference has led OpenAI to speak of the "electron gap", and it translates into a brutal cost advantage for data centers: while an operator in Virginia pays between 7 and 9 cents per kWh, its Chinese counterparts pay 3 cents.

The long term works. China has shown that its long-term strategy continues to bear fruit. This energy advantage is no accident either, but the result of state planning that crystallized in the 2021 plan known as "Eastern Data, Western Computing". The idea was to take advantage of the vast energy resources of the country's interior, especially regions like Inner Mongolia, to power data centers serving demand in the more populated eastern part of the country. What were once empty steppes are now, in many cases, endless wind farms and transmission lines supplying energy to more than 100 data centers in operation or under development.

Power makes up for the lack of advanced chips. For Chinese companies, access to cheap energy is especially important.
In fact, since they cannot match the performance of advanced chips like the H100 with domestic silicon, what Chinese companies do is group thousands of their own less advanced chips together, taking advantage of the fact that energy is what they have to spare. The perfect example is Huawei's CloudMatrix 384 cluster, built on its Ascend chips. It consumes about four times more energy, and although that would be unsustainable waste in the US, for China it is a viable way to compete.

Satya Nadella already warned of the problem. China continues to invest in expanding its grid, and that electron gap may widen. Morgan Stanley predicts around 560 billion dollars will be spent through 2030, and Goldman Sachs says that by 2030 China will have 400 GW of capacity, triple what global data centers will need, leaving plenty of room to keep expanding without problems. Meanwhile, executives like Microsoft's Satya Nadella warned weeks ago that it doesn't matter if the US has the most advanced components when there is no power for so many chips.

Time is on China's side. The contrast between the two powers is clear. The US has the technology, but its energy expansion is hampered by bureaucracy and insufficient transmission capacity. That is precisely why AI companies are seeking a way out with solutions like SMRs, but time is on China's side, because it continues to work tirelessly on developing its own advanced AI chips and latest-generation manufacturing technologies. The longer that race lasts, the more opportunities the Asian giant will have to close the component gap.

Image | Antonio Garcia

In Xataka | In the midst of a trade war, there is a battle that China has already won: that the world depends on its new energy
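A rough way to see why the 4x energy penalty of clusters like CloudMatrix is tolerable is to multiply it by the electricity prices quoted above (7-9 cents per kWh in Virginia versus 3 cents in China). The prices and the 4x figure are the article's; the comparison itself is our illustrative arithmetic.

```python
# Energy cost per unit of compute: a US cluster of advanced chips vs a
# Chinese cluster that burns ~4x the energy but pays far less per kWh.
# Prices and the 4x factor come from the article; the rest is illustrative.

us_price = (0.07, 0.09)              # $/kWh range, Virginia operator
cn_price = 0.03                      # $/kWh, Chinese counterpart
energy_penalty = 4                   # Chinese cluster energy use vs US cluster

# Cost to deliver the same compute, per kWh the US cluster would use:
us_cost_low, us_cost_high = us_price # 1 kWh at US prices
cn_cost = energy_penalty * cn_price  # 4 kWh at Chinese prices

print(f"US: ${us_cost_low:.2f}-${us_cost_high:.2f} per unit of compute; "
      f"China: ${cn_cost:.2f}")
# The 4x penalty lands China at $0.12, only modestly above the US range,
# so cheap power largely neutralizes the chip-efficiency gap.
```

This is, of course, only the energy side of the ledger (chip prices, cooling and interconnect are ignored), but it illustrates why abundant cheap electricity makes brute-force clustering a viable competitive strategy.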

A data center that will run on wind energy

In the silent race the world is waging to dominate digital infrastructure, every move matters. And Brazil, far from being a spectator, once again occupies a strategic place. The arrival of the TikTok project in Brazil's northeast confirms a shift in the world technology map: critical infrastructure is no longer concentrated only in the United States, Europe or Asia, but is beginning to expand toward regions that offer abundant renewable energy and direct international connectivity.

The announcement. TikTok has decided to install a mega data center in the Pecém Industrial and Port Complex, in the state of Ceará. The company said in its press release that it will allocate more than 200 billion reais, about 32 billion euros, the largest investment it has made in Latin America. Of that amount, 108 billion will go exclusively to high-tech equipment through 2035; the rest will finance infrastructure, energy systems and future expansions. Operations are planned for 2027, and local authorities estimate the creation of more than 4,000 jobs.

The infrastructure the AI era demands. Data centers have become the engine that makes AI, cloud and streaming possible. As Wired recalls, the push of artificial intelligence has sent demand for computing soaring and opened a global competition to build larger and more efficient infrastructure. Brazil's appeal for data centers rests both on its renewable energy matrix, cheap and abundant, and on the connectivity Fortaleza offers as the entry point for most of the submarine cables linking the country with the United States, Europe and Africa.

A data center powered only by wind. For the initial phase, TikTok will work with Omnia, a local data center operator, and with Casa dos Ventos, one of the country's largest renewable energy developers. The project is presented as an example of digital infrastructure powered entirely by clean energy.
TikTok and its partners will build dedicated wind farms to supply the center, which will allow it to avoid drawing energy from the public grid. According to the platform, this will avoid any pressure on local supply. Technically, the company says it will use a closed water-reuse circuit combined with air cooling to reduce water consumption. However, as the Government of Ceará has pointed out, cooling will be 100% air-based, and water use will be limited to human activities and maintenance. In addition, the installation will incorporate PG25 technology, which allows servers to operate at higher temperatures with less need for cooling, substantially reducing energy expenditure.

The voices questioning the project. Not everything is celebration. The main resistance comes from the Anacé indigenous people, who denounce, as reported by El País, that part of the complex would occupy territories they consider ancestral. Their organizations say no prior consultation was carried out and express concern about possible socio-environmental impacts, both on water use and on the transformation of the territory. TikTok maintains that it complies with Brazilian regulations and stresses that its energy and cooling model will minimize any pressure on natural resources. The Government of Ceará adds that the companies involved must invest 15 million reais per year in the communities around the Pecém complex.

On the global board of digital infrastructure. The megaproject is part of a broader strategy. Lula's government has approved measures to reduce taxes and attract data centers, with the intention of turning Brazil into a regional digital hub. In parallel, the United States promotes initiatives such as the Stargate project to maintain competitiveness in artificial intelligence, while China accelerates the expansion of its technology companies abroad.
TikTok, of Chinese origin, thus fits into a delicate diplomatic balance that Brazil tries to maintain. Beyond the economic investment, a data center of this scale raises debates about privacy, digital sovereignty and local data storage, dimensions increasingly present on the Brazilian legislative agenda.

The speed of digitization. The TikTok megaproject in Ceará symbolizes the tension of a world digitizing at unprecedented speed: it promises clean energy, employment and modernization, but it also reopens discussions about territory, regulation and environmental memory. Between the technological ambition of a digital power and the concerns of a community defending its land, Brazil once again stands at the intersection of global forces and local demands. The contrast is inevitable: while institutions celebrate the promise of a future powered by wind and data, indigenous communities in the northeast remember that the technology connecting the world also leaves footprints on the ground it stands on. At this intersection between progress and grievances, the true impact of TikTok's new digital heart in Latin America will be defined.

Image | PXHere and Greenwish

In Xataka | Researchers removed Instagram and TikTok from 300 young people to see if their anxiety decreased. The results speak for themselves

Lava rises hundreds of meters in Hawaii. Under it, a much bigger plan: reactivate geothermal energy

The heat from the depths of the Earth is in the news again. And not only because of the almost unreal images of Kilauea launching jets of lava hundreds of meters high on Hawaii's Big Island. Also because, while the volcano chains together increasingly spectacular eruptive episodes, the United States is rediscovering the energy those same volcanoes hide beneath the surface. Geothermal energy had been in the background for years. Suddenly, it matters again.

Quite a spectacle. First of all, the United States Geological Survey (USGS) has warned that Kilauea is preparing for another high-energy eruptive episode. These are not isolated events: according to ABC News, the volcano has already reached eruptive episodes 36 and 37 since December of last year. In some phases the lava fountains have reached 300 meters, and in others 457 meters, a height comparable to a 100-story skyscraper. Even so, the entire phenomenon remains contained: all activity stays within the crater, away from homes and structures. That does not detract from the power of the figures: according to the USGS, episode 37 expelled 6.3 million cubic meters of lava in just nine hours, at a rate of around 190 m³ per second. But behind the show, another debate is making its way.

Hawaii's untapped potential. As the Hawaii Tribune-Herald recalls, since 1993 the state has had a commercial geothermal plant, Puna Geothermal Venture, located precisely in the East Rift Zone of Kilauea. The University of Hawaii estimates that this facility produces five times more electricity than one of the state's leading solar parks while using 80% less land. The problem is that Hawaii has never fully tapped that potential. The reasons combine real volcanic risks, exploration costs and the cultural resistance of communities for which drilling is a desecration of Pele, the volcano goddess. However, the context has changed.
Kilauea's continued activity brings back to the table a question that seemed shelved: should Hawaii use the heat that fuels its volcanoes to power its electrical grid?

A door that begins to open. The University of Hawaii has been insisting on it for years. According to its analyses, all the major islands could hold usable geothermal resources, although knowledge outside Kilauea remains limited. Its Play Fairway project, funded by the Department of Energy, has already drawn the first deep heat maps beyond Puna. The pressure is now political: according to the Hawaiian media, three state agencies are competing for funding to re-explore the islands in search of new deposits. They are requesting 80 million dollars in public money to map resources, drill test wells and reopen the way to a geothermal expansion that has been stalled for decades. The plan includes drilling outside of Puna on the Big Island, but also on Maui and Oahu, where the resources would be deeper. As the volcano flares up and spills lava in nine-hour episodes, Hawaii looks under its feet: not at the magma, but at the heat that drives it.

America's geothermal renaissance. This local turn coincides with a national one. According to a report by Wood Mackenzie, geothermal investment in North America soared 85% in the first quarter of 2025 alone, with 1.7 billion dollars in public funds. The reason is not in the volcanoes, but in technology: the analysis points to three innovations that are transforming the sector. According to that same analysis, the United States could have 500 gigawatts of geothermal capacity, a figure capable of reconfiguring the country's energy matrix. However, there is still more.

The hidden engine: data centers and AI. As TechCrunch detailed, this underground energy could cover two-thirds of the electrical consumption of the new data centers to be built in the United States between now and 2030. And the technology giants are already taking positions.
In fact, the cases are beginning to multiply. Meta has signed an agreement with the Californian startup XGS Energy to generate 150 MW of geothermal electricity by 2030 using a closed-loop system that prevents water leaks, and Google has done the same by partnering with Fervo Energy. Geothermal energy is no longer a marginal experiment: it is an energy outlet for the infrastructure that supports artificial intelligence.

The question left by the volcano. As Kilauea continues its choreography, inflating, roaring and shooting lava to heights not seen since the 1980s, Hawaii and the rest of the country look downward toward the primeval heat pulsing beneath the crust. Where nature shows its wildest power, technology sees promise: a forgotten energy resurfacing just as the United States needs ever more continuous, abundant and clean electricity.

Image | Pexels and Rjglewis

In Xataka | Tenerife seeks to turn on its lights with the heat from the subsoil: this is its great commitment to geothermal energy

In 2011 Japan closed the largest nuclear power plant on the planet. Now it has decided to reopen it in the midst of the energy debate

The nuclear debate, which Japan thought closed, returns to the scene. The authorization of the governor of Niigata to reactivate Kashiwazaki-Kariwa, the largest atomic plant in the world, has set off alarms: citizen distrust, the shadow of Fukushima and doubts about whether TEPCO is the right company to lead the country's new energy stage.

A new nuclear revival? The Kashiwazaki-Kariwa plant, managed by Tokyo Electric Power Company (TEPCO), has not produced a single kilowatt since 2012. The closure was a direct consequence of the 2011 tsunami and the three meltdowns at Fukushima Daiichi, a blow that left reactors with similar designs under suspicion. That technical coincidence was enough to keep its seven reactors on hold for more than ten years, even though the plant was essential for the electricity supply of northeastern Japan. According to the Japan Times, Governor Hideyo Hanazumi has authorized a step-by-step reactivation that will start with reactor 6, one of the most recent and powerful, and will later also include reactor 7. Altogether, the complex exceeds 8,000 MW of capacity, a figure that not only impresses: it keeps it the largest nuclear facility on the planet.

A significant change for Japan. Kashiwazaki-Kariwa has gone from a technical project to a strategic move. As reported by the Financial Times, Tokyo trusts that its reactivation will help lower the electricity bill and secure lower-emission energy sources, at a time complicated by the Russian invasion of Ukraine and the fall of the yen, which makes fossil fuel imports more expensive. Japan, which before Fukushima generated almost 30% of its electricity with atomic plants, fell to practically zero after the disaster. Since then 14 reactors have reopened and others await local or regulatory approvals. The government aims for nuclear energy to once again represent 20% of the mix by 2040.
In addition, TEPCO would improve its annual accounts by around 100 billion yen thanks to the restart, according to Japan Forward, at a time when it continues to face enormous costs for the dismantling of Fukushima Daiichi.

The reactivation process. The restart will begin with unit 6, which already has fuel loaded and is expected to begin commercial operations before March of next year. To move forward, TEPCO must respond to the Government's demands, which include updating all safety systems and improving emergency evacuation plans. The process has not been easy. As detailed by the Japan Times, the plant passed safety reviews in 2017, but then suffered a veto from the Nuclear Regulation Authority due to deficiencies in anti-terrorist measures, lifted in 2023. TEPCO also had to incorporate biometric controls and correct security flaws after new internal incidents.

Is there controversy? Yes, and a lot. According to a survey cited by the BBC, 50% of Niigata residents support the revival, while 47% oppose it. Moreover, almost 70% express concern because the company operating the plant is the same one that caused the accident. The Japan Times adds that the rejection intensifies in some of the towns located within 30 kilometers of the plant, where the majority fear a new disaster or distrust the company. Another source of discomfort, also pointed out by that outlet, is that the electricity generated is not used in Niigata, but in the Tokyo region. The political dimension is equally tense. Hanazumi, aware of the sensitivity of his decision, has announced that he will submit his continuity as governor to the vote of the prefectural assembly, the only body that can remove him.

But there is something else at play. The reopening of Kashiwazaki-Kariwa is seen as a pillar of the country's energy security and a way to avoid possible power outages in Tokyo. It would also allow a reduction of electricity rates that have risen notably since 2011.
At the same time, Japan is not only restarting reactors: it is also planning the construction of new plants with fourth-generation reactors, which would mark a new chapter in the country's energy policy.

More than a return to the atom. The country that once vowed not to depend on atomic energy again has ended up returning to it, driven by necessity, geopolitics and the urgency to decarbonize. It remains to be seen whether this decision will also reignite the confidence of a citizenry that still carries the memory of Fukushima or whether, on the contrary, the return to the atom will deepen a division that has been open for more than a decade. Although the governor's approval is the decisive step, procedures remain: the prefectural assembly must debate and vote on the decision in December, and the Japanese nuclear regulator must complete the formal procedures for reactivation.

Image | IAEA Imagebank

In Xataka | In 2011, Japan promised itself not to bet on nuclear energy again. Until it met reality

Reopening nuclear power plants sounds very spectacular, but Google has a plan B in case it’s not enough: solar energy

Data centers are insatiable monsters that those responsible for them must keep fed. OpenAI, Meta, Microsoft, xAI, Anthropic and Google are burning money building colossal data centers for training and running artificial intelligence. But these installations are not only expensive to set up: they are also expensive to maintain. They require a considerable amount of energy to function, and Google has just received a 'shot' of renewables, thanks to a direct connection to the largest grid system in the United States.

Renewables to power AI. Google and TotalEnergies have just signed a 15-year power purchase agreement. The contract stipulates that the energy company will deliver 1.5 TWh of electricity to Google from its Montpelier solar plant in Ohio. The plant is still under construction and is estimated to have a capacity of 49 MW, but the most important detail is that it will be connected directly to the PJM electricity system.

PJM. It is the largest grid operator in the United States. It covers 13 states, and data centers represent a growing portion of the operator's pie: in its latest annual capacity auction, the load from these facilities helped drive the total to 7.3 billion dollars, 82% more than before.

Astronomical needs. In TotalEnergies' statement, the company says this agreement illustrates its ability to meet the growing energy demands of the major technology companies. The problem is that it is not enough. If we focus on Google, its data centers consumed 30.8 million megawatt-hours of electricity. The company has been focused on AI for years, but the recent 'boom' has more than doubled what its centers consumed in 2020 (14.4 million MWh). Currently, data centers are estimated to account for 95.8% of Google's total electricity budget. But it's not just Google: the International Energy Agency estimates that data centers worldwide consumed 415 TWh last year, approximately 1.5% of global electricity consumption.
That may seem small as a percentage, but consider that Spain consumed 231,808 GWh, about 232 TWh, in 2024. Global data centers alone consumed nearly twice as much as an entire country. And the estimate is that this consumption will double by 2030, reaching 945 TWh.

Renewables are not enough. Now, although renewables are a support for the total energy required by data centers, solar and wind power have two limitations: intermittency and variability. Generation depends on weather conditions and time of day, meaning it fluctuates dramatically even within the same day. This instability clashes head-on with the high reliability and availability requirements of data centers. These are installations that must operate continuously and cannot tolerate outages or unforeseeable drops in supply, since AI or cloud storage would suffer the consequences. Renewables therefore require backup batteries, but it is complicated and expensive to deploy such a large number of batteries just to power data centers.

Pulling on gas and looking at nuclear. That's where other sources come into play. On the one hand, nuclear: in October 2024, Google signed the world's first corporate agreement to acquire nuclear energy from SMR reactors. The first is due to come into operation in 2030 and, together, they are expected to supply the technology company with 500 MW of capacity by 2035. On the other hand, natural gas: in October of this year, the Broadwing Energy Center project began, a new natural gas power plant that will have a capacity of 400 MW and is scheduled to come online at the end of 2029.

Decarbonization and pressure. And the big question is: doesn't the use of gas for AI clash with the technology companies' decarbonization targets for both 2030 and 2050?
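The comparisons in the last few paragraphs follow directly from the figures quoted in the text (a back-of-the-envelope check of our own, using the IEA numbers and Spain's 2024 demand as cited above):

```python
# Figures as quoted in the text (IEA estimates and Spain's 2024 demand).
datacenters_2024_twh = 415      # global data center consumption last year
spain_2024_twh = 231.808        # Spain in 2024 (231,808 GWh)
datacenters_2030_twh = 945      # IEA projection for 2030
google_2024_mwh = 30.8e6        # Google data centers, recent annual figure
google_2020_mwh = 14.4e6        # Google data centers in 2020

print(f"vs Spain: {datacenters_2024_twh / spain_2024_twh:.2f}x")              # 1.79x
print(f"growth by 2030: {datacenters_2030_twh / datacenters_2024_twh:.2f}x")  # 2.28x
print(f"Google vs 2020: {google_2024_mwh / google_2020_mwh:.2f}x")            # 2.14x
```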
We have already seen that oil companies have been getting off the renewables bandwagon after noticing that fossil fuels are still relevant to the technology industry, but in Google's case the bet is that projects like the Broadwing Energy Center will have CCS systems. That means a carbon capture system able to permanently 'sequester' 90% of the emissions. It means burying the problem, literally, since the CO₂ will be stored a mile underground. In 2020, before the AI boom, the company set the goal of operating on carbon-free energy 24 hours a day, seven days a week, by 2030. It will be interesting to see how it plans to offset these emissions with renewables, but the IEA estimates that data center demand will not stop growing in the short term, and that adds another problem: increased pressure on the electrical grid, yet another element to manage. Because the big underlying problem is that demand for energy is growing faster than the capacity to generate new electricity, and that has an impact on companies' bills, but also on households'.

Images | Unsplash, Google Data Center

In Xataka | China does not have a spending problem with AI. What it has is a huge income gap compared to its main rival
