It took a hacker two and a half hours to steal personal data from thousands of Endesa customers. Endesa took a week to notify them

Endesa Energía has confirmed a cyberattack on its commercial platform that has exposed critical information belonging to millions of customers. The breach includes identity documents, bank accounts and data from electricity and gas contracts, which puts those affected at risk of fraud and identity theft.

What exactly happened. A cybercriminal managed to circumvent the security measures of Endesa's commercial platform and access sensitive customer information related to their energy contracts. According to what the company has acknowledged in communications sent to those affected, contact information, ID numbers and bank account IBANs were extracted during the breach. The company states that access passwords were not compromised.

The magnitude of the incident. The hacker responsible, who identifies himself as "Spain," posted details of the attack on January 4 on BreachForums, a popular dark web forum, claiming to have obtained more than 1 TB of information corresponding to more than 20 million people, as reported by the outlet Escudo Digital. The cybercriminal told the outlet that he had gained access in less than two and a half hours, and has gone so far as to leak data samples from a thousand customers to demonstrate the authenticity of the stolen information.

What type of data is at stake. The hacker claims to have obtained basic personal data (names, surnames, postal addresses and contact information), financial information (IBANs, billing data and account history), energy data (CUPS codes, active electricity and gas contracts, supply point information) and regulatory data.

The risks for customers. Although Endesa considers it "unlikely" that the theft will result in "a high-risk impact on the rights and freedoms of users," the company warns of several real dangers in its official statement. Cybercriminals could try to impersonate customers, post the data on digital forums, or use it for phishing and spam campaigns.
Josep Albors, Director of Research and Awareness at ESET Spain, explains that "the risk does not end with the notification of the breach" and that the exposed information can be reused for months or years to launch targeted fraud.

Endesa's response. The energy company took almost a week to publicly acknowledge the incident after the leak became known. The company claims to have immediately activated its security protocols, blocked the compromised access and notified the competent authorities. It has also set up phone lines to resolve doubts: 800 760 366 for Endesa Energía customers and 800 760 250 for those of Energía XXI, its retailer in the regulated market. We have contacted the company for more information and will update this article if there is news.

What should those affected do? The problem with this breach is that the data will most likely be used for phishing and targeted spam campaigns. As ESET explains, the first thing those affected should keep in mind is to distrust any communication that appears to come from Endesa and includes links, attachments or urgent requests, always contacting the company through official channels. It also never hurts to frequently review bank accounts to detect unauthorized movements and to change passwords, even if the company claims they have not been compromised, enabling two-factor authentication whenever possible. Free and useful websites like 'Have I Been Pwned' allow us to check whether our data has appeared in other known breaches by entering our email.

The extortion attempt. According to Escudo Digital, the hacker has tried to negotiate directly with Endesa via email, although for now he has not set a specific ransom figure.
The cybercriminal, who says he is not affiliated with any known ransomware group, has received offers from third parties of up to $250,000 for half of the database, although he claims not to have sold anything yet. "I prefer to wait for Endesa to decide," he told the outlet.

A worrying trend. As reported by Expansión, this attack places Endesa on the growing list of large IBEX 35 companies that have suffered cyberattacks in recent months. Companies such as Iberdrola, Iberia, Repsol and Banco Santander have been victims of similar incidents that compromised customer data. And they have not been the only ones, since cyberattacks and data leaks are now much more common. In the case of Endesa, it seems we will have to wait for the company to offer more information on the matter.

Cover image | Endesa

In Xataka | OpenAI just admitted an uncomfortable truth about AI browsers: there is one type of attack that is impossible to block
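For readers who want to automate the 'Have I Been Pwned' check mentioned above, here is a minimal sketch of how the service's free Pwned Passwords k-anonymity lookup works: only the first five characters of the password's SHA-1 hash are ever sent to `https://api.pwnedpasswords.com/range/<prefix>`, and the matching is done locally. Note that the per-email breach search discussed in the article requires an API key, and the sample response body below is illustrative, not real service output.

```python
import hashlib

def hibp_range_query(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 of a password into the 5-character prefix
    sent to the HIBP range endpoint and the suffix matched locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def times_pwned(suffix: str, range_response: str) -> int:
    """Match our suffix against the 'SUFFIX:COUNT' lines the API returns."""
    for line in range_response.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0

prefix, suffix = hibp_range_query("password")
# Illustrative response body for this prefix (the counts are made up):
sample_response = (
    "003D68EB55068C33ACE09247EE4C639306B:3\n"
    f"{suffix}:10434004\n"
    "012A7CA357541F0AC487871FEEC1891C49C:2\n"
)
print(prefix, times_pwned(suffix, sample_response))  # prints: 5BAA6 10434004
```

The key privacy property is that the full hash never leaves the machine; the server only sees a 5-character prefix shared by many unrelated passwords.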

Amazon’s new data center will be installed in La Puebla de Híjar

AWS (Amazon Web Services) has chosen La Puebla de Híjar (935 inhabitants) to build its first data center in the province of Teruel. The multinational has secured 70 hectares next to the N-232 and plans to start work in autumn 2027, as reported by Aragón Noticias, which depends on the public broadcaster CARTV.

Why it matters. The project places a historically neglected province on the European technological map and confirms Aragón's strategy as a hub of digital infrastructure. With 100 MW of power already guaranteed and access to water from the Ebro and the Gaén canal, the AI-specialized complex avoids the electrical saturation problems that grip the metropolitan area of Zaragoza.

The figure. The investment is around 5 billion euros, according to technology-sector sources consulted by local media such as Diario de Teruel. It would be the fourth major AWS hub in the region, after Huesca, Villanueva de Gállego and El Burgo de Ebro.

The context. Aragón has managed to mobilize more than 47 billion euros in data centers, according to a study by the Basilio Paraíso Foundation presented in September. The region could become the third-largest European market in the sector, behind only London and Frankfurt. No small feat.

Yes, but. The project arrives surrounded by conflict over water supply: Amazon needs 350,000 cubic meters of water per year for cooling, and although it plans to draw directly from the Ebro, it requires a backup source. Negotiations with the Gaén irrigators' community have been stuck for months. The irrigators insist that they will not take "any step" that compromises the territory's water future.

Between the lines. The choice of Teruel is not accidental. The metropolitan area of Zaragoza suffers a collapse in the capacity of its electrical substations that has slowed down other projects. The availability of energy (100 MW through the Endesa network with a connection to the Híjar substation) has been decisive.
What has Aragón done to become a leader? The region has developed a recruitment strategy based on three pillars:

Energy availability: Aragón has prioritized reserving electrical capacity for strategic industrial projects.

Administrative streamlining: the Declaration of General Interest of Aragón (DIGA) allows these megaprojects to be processed as Plans of General Interest, shortening bureaucratic deadlines. It already happened with Stellantis, CATL and Microsoft.

Logistics infrastructure: the N-232 functions as a backbone, connecting the data centers from Huesca to Bajo Martín.

The money trail. AWS has already mobilized more than 15 billion euros in Aragón, with a forecast of creating 6,800 direct jobs. Companies such as Microsoft, QTS and the Aragonese firm Forestalia have also bet on the region. Forestalia alone plans to invest an additional 12 billion euros in three new centers in Magallón, Botorrita and Alfamén.

And now what. AWS must submit the DIGA documentation to the regional government. The British engineering firm Arup, in charge of the project, will finalize the application in the coming weeks. The final agreement on water with the irrigators remains pending, an obstacle that could delay the planned schedule. The project will transform the Venta del Barro industrial estate, which already employs a thousand people from the Bajo Aragón regions.

In Xataka | The problem with data centers is not that they are running out of water or energy: it is that they are running out of copper

Featured image | La Puebla de Híjar

The US electrical grid cannot support so many data centers, so operators have had an idea: disconnect them to avoid blackouts

A third of all the world's data centers are in the US, and that is putting a huge burden on the electrical grid. One consequence consumers are already noticing is price increases on their bills, but electricity operators foresee another problem: blackouts.

What is happening. As the WSJ reports, the US power grid is beginning to strain, with grid operators expecting blackouts during periods of high demand. The solution they propose is to make data centers temporarily disconnect from the grid and use their own energy reserves. Tech companies are not amused and talk of "discriminatory measures."

Why it matters. In 2023, data centers already consumed 4% of all the country's electricity, and forecasts say that by 2028 that percentage will rise to 12%. The grid is not prepared to support so much demand and, although it is already being expanded, new data centers are being built faster. Grid operators face a difficult dilemma: powering data centers while maintaining supply to consumers.

'Kill switch'. PJM Interconnection is the organization that oversees the energy market in the Midwest, where the problem of price increases has already been felt. The concern that blackouts will occur is on the table, and PJM has proposed that tech companies create their own energy sources or accept that their supply will be cut off if the grid becomes too saturated. They are not the only ones to raise something like this. With demand expected to double by 2035, Texas passed a law last year that contemplates a 'kill switch' allowing large consumers, such as data centers, to be disconnected at times when the grid is under "extreme stress."

What the tech companies say. As we said, the companies that own these data centers have not been very happy with the proposal.
The Data Center Coalition, which includes companies such as Google, Microsoft and AWS, has stated that the proposal is discriminatory, since data centers need a reliable and stable network. They also warn that depending on their own energy reserves could have a negative environmental impact by forcing them to use solutions such as diesel generators.

Waiting times. There is an intermediate scenario in which tech companies can obtain benefits if they accept these conditions. Because the electrical infrastructure cannot support so much demand, data centers have to wait several years to be connected to the grid, normally between 3 and 5 years, although there have been cases of up to 8 years. Southwest Power Pool, a grid operator covering part of Texas, has offered data centers a deal: access to the grid sooner in exchange for agreeing to be disconnected during times of high demand. According to a recent study funded by Google, data centers with more flexible connections (i.e., those that build their own power sources and accept temporary disconnections) typically connect to the grid several years faster than those that do not.

Bring your own energy. Despite the reluctance toward that off switch, generating your own energy is the most realistic solution and the one the industry seems to be moving toward. Google recently bought an electrical company in order to secure its own energy. Other big tech companies such as Amazon, Microsoft, Oracle and xAI are also exploring creating their own energy solutions, such as natural gas and solar panels.

Image | Google

In Xataka | Drastically reducing data center consumption is crucial for AI. And China has had an idea: submerge them in the sea

It was the second-worst performer on the IBEX 35 in 2025, but it achieved its best portability figures in history

Telefónica has closed a busy 2025 with two opposite faces: on the stock market it plummeted 11% and ended up as the second-worst performer on the IBEX 35, ahead only of Puig. But on the street it won the battle: it captured almost 200,000 mobile lines from the competition, its best historical record in portability.

Why it matters. This contradiction illustrates well how the market no longer rewards commercial success alone. Investors demand financial visibility, robust cash flow and a clear roadmap. Telefónica has achieved the first; one year after the change of presidency, the rest is still a work in progress.

The turning point. Everything changed on November 4. At its Capital Markets Day, where it presented its five-year strategic plan, the operator announced a dividend cut in half (from 0.30 to 0.15 euros per share) and cash projections lower than expected. Investors punished it immediately: less dividend, less cash and little clarity about some operations.

The backdrop. The stock market punishment contrasts with Telefónica's best commercial year in a long time. The combined portability of Movistar and O2 quintupled compared to 2024 and consolidated the telecom's leadership in the premium segment of the Spanish mobile market. Digi led the overall market with 783,000 net lines, dominating the low-cost segment. MásOrange lost 513,000 mobile customers, its worst result. Vodafone Spain gave up 435,000 lines, even counting Finetwork. Telefónica's commercial success is explained by a pincer movement: Digi sweeps up customers with ever-cheaper rates, so Movistar and O2 have entrenched themselves in the highest-value segments, where customers pay more and remain loyal. But that victory has not translated into stock market metrics. Only Puig has had a worse 2025 on the IBEX than the telecom.

Yes, but. The theory of the "Dogs of the IBEX" suggests that 2026 should be a better year for Telefónica.
The most punished stocks usually recover the following year, and the analyst consensus sets a target price of 4.04 euros per share, 16% above current levels. Besides, the IBEX 35 has closed its best year since 1993: the index ended with a gain of 49%, driven mainly by banks and their record results.

And now what. Telefónica faces 2026 with a more austere discourse on balance sheet and debt. The key is no longer in the strategic announcement but in its execution. The market has priced in the dividend cut, and what remains is to demonstrate that the new remuneration policy, linked to cash flow, is sustainable over time. For now, the year starts with a layoff plan (ERE) that will cost 2.5 billion euros and save about 600 million annually starting in 2028. Will that be enough to convince investors without sacrificing the commercial capacity that has allowed it to win customers again?

In Xataka | Telefónica has gone from 67,000 workers in 1997 to 25,000 today. And its plan is clear: go even lower

Featured image | Telefónica
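As a quick back-of-the-envelope check on the figures above (a hypothetical illustration, not part of any analyst model), the size of the dividend cut and the share price implied by the consensus target can be worked out directly:

```python
# Figures from the article: dividend cut from 0.30 to 0.15 euros per share;
# target price of 4.04 euros said to be 16% above current levels.
old_dividend, new_dividend = 0.30, 0.15
cut_pct = (old_dividend - new_dividend) / old_dividend * 100

target_price = 4.04
upside = 0.16
implied_current = target_price / (1 + upside)  # price consistent with that upside

print(f"Dividend cut: {cut_pct:.0f}%")
print(f"Implied current share price: {implied_current:.2f} euros")
```

That is, the remuneration was halved, and a 4.04-euro target with 16% upside implies a share price of roughly 3.48 euros at the time of writing.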

A third of the world's data centers are in a single country

There are currently more than 11,000 data centers operating worldwide, which is no small number. Given the huge investment by technology companies, the figure will grow exponentially in the coming years. Now, thanks to the interactive map from Data Center Map, we know where they are. An overwhelming majority are in the northern hemisphere, with one country accounting for almost a third of the total.

The United States rules. To no one's surprise, the country with the largest number of data centers is the United States. Considering that the major cloud infrastructure companies are American, this is hardly shocking. In total it has 4,303 data centers spread across the territory, but not evenly: there are regions where the concentration is brutal. In the state of Virginia alone there are a whopping 668 data centers, more than in Germany, the second country on the list with 494.

The weather matters too. We already know that data centers consume a lot of energy, and much of it goes into cooling their components. The hotter it is outside, the more it costs to cool them and therefore the more energy is consumed, as well as water. According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), the ideal temperature for a data center is between 18 and 27 degrees Celsius. Location has a notable impact on electricity and water expenses, which is why technology companies usually choose places with lower temperatures to set up their infrastructure.

The south also wants its piece of the pie. It is striking that, despite the temperature recommendation, there are many data centers in countries where heat is a problem. Rest of World has done an extensive analysis of this phenomenon and estimates that at least 600 facilities are operating in areas outside the optimal range.
In fact, continuing down the list of countries with the most data centers, Brazil is in third place with 196 facilities, followed by Indonesia with 184. Both have an average temperature of more than 26 degrees Celsius, which means that for much of the year temperatures exceed that threshold.

A striking case is Singapore, where the average temperature is more than 28 degrees. It has 78 data centers, a low figure compared to the ones mentioned above, but they are concentrated in a very small area, making it one of the countries with the highest data center density. Other countries where demand for data centers is growing are India, Vietnam and the Philippines, all with quite hot climates.

The heat challenge. Why build in such hot areas? For many countries, keeping data within their own borders matters more than the optimal operating temperature. The risk is that, with temperatures rising year after year, what is now a manageable situation could become a difficult problem to solve, especially in areas such as Southeast Asia and the Middle East. Rest of World reports that precisely in Singapore there is an initiative involving more than 20 technology companies and universities with one objective: developing a cooling system specifically for hot, humid climates. The most common cooling system is air, but in these areas it is more effective to use a hybrid system that uses air when possible and water when it is hotter. In some areas with extreme temperatures, such as the United Arab Emirates, they are even considering building them underground. In China they are testing an even more radical solution: building a data center under the sea.

Image | ChatGPT, with data from Data Center Map

In Xataka | Aragón is not afraid of AI: it has just approved three more new mega data centers in full commitment to renewables
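To make the ASHRAE band mentioned above concrete, here is a small illustrative sketch: given a year of monthly average temperatures (the values below are hypothetical, roughly Singapore-like, not measured data), it counts how many months fall outside the recommended 18-27 °C range:

```python
ASHRAE_RECOMMENDED = (18.0, 27.0)  # recommended inlet-air band, in Celsius

def months_outside(monthly_avgs, band=ASHRAE_RECOMMENDED):
    """Count months whose average temperature falls outside the band."""
    lo, hi = band
    return sum(1 for t in monthly_avgs if t < lo or t > hi)

# Hypothetical monthly averages for a hot, humid location (illustrative only)
hot_humid = [26.5, 27.2, 27.6, 28.0, 28.4, 28.2,
             27.9, 27.8, 27.6, 27.4, 26.9, 26.6]
print(months_outside(hot_humid), "months outside the recommended band")
```

With these made-up figures, nine of twelve months sit above the 27 °C ceiling, which is the kind of sustained excess that forces the hybrid air/water cooling the article describes.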

Energy companies are switching from oil to megawatts. The new gold mine is supporting data centers

Gluttonous artificial intelligence and its demanding data centers are reshaping decarbonization plans. Just when the world had begun a journey toward renewables, with countries like China and European nations betting big, and even some US states getting on board, data centers arrived with needs that were almost impossible to satisfy. By the end of December 2024 it was already clear that data center consumption had skyrocketed, pushing big technology companies to bet not so much on renewables as, above all, on immediately available energy such as gas and even coal. Some were even aiming for nuclear to be able to operate. Shortly after, in January 2025, a Reuters report noted that European energy companies, which had embarked on a path of commitment to renewables, were doubling down on oil and gas. Giants like BP and Shell slowed their investments in clean energy to return to fossil fuel projects. But it is not only a question of where data centers draw their energy from, but of who provides their infrastructure. And that, more than oil or gas, may be the next energy mine.

The new oil mine. An article in the Financial Times suggests that the meteoric growth of data centers is generating a market that energy companies do not want to miss. As demand for traditional drilling weakens (although this varies by region), energy-sector groups such as Baker Hughes, Halliburton and SLB are taking the opportunity to pivot to the data center sector. Not by building them, and not just by supplying energy, but by supporting the logistics. Drawing on their knowledge of the energy sector, these large companies would provide equipment such as turbines and power generation systems to data center owners, but also generators, batteries, heat dissipation systems and all the framework needed to maintain proper energy efficiency. They would also maintain the equipment. It is, in short, what they already know how to do, applied to a new sector: data centers.
Because these three examples are not typical oil companies but technology providers that help other companies extract gas or oil. All three provide services to companies with oil fields, but they also supply technology such as gas turbines, compressors and LNG systems, and they were already inside sectors such as new energy, with carbon capture and storage systems. All of this resonates with the idea 'Big Tech' had when they began to build huge data centers, until they saw that increasingly demanding equipment needed more immediate and stable sources of energy.

Data centers = El Dorado. It is estimated that US electricity demand will increase by 90 GW (a staggering amount) between now and 2030 just to power data centers. Traditional electrical grids may not support this load, and it is at that point that these energy-services companies look like key players. Pivoting toward artificial intelligence infrastructure is "key to the evolution of oil and gas," said Lorenzo Simonelli, CEO of Baker Hughes. And it makes sense when we see that the number of US oil rigs contracted 7% year-over-year in 2025, margins have shrunk and demand for drilling services is in question. At a business level, it is a masterstroke. Hypothetically, when the next oil crisis arrives and the market for both crude oil and gas falls, the companies that have pivoted to data centers, going from service providers for energy companies to service providers for 'Big Tech', will not have to change course because they will already be where the money is. That is the other question: whether the new megawatt gold rush for AI will be a lasting business or a passing fever.

Image | freepik and Harpagornis

In Xataka | The problem with renewables is what to do when there is excess energy. China believes it has the answer with a unique turbine

Why OpenAI is installing Boeing 747 engines in its data farms

Just three years ago, Blake Scholl, CEO of aviation company Boom Supersonic, had a linear business plan: he would first build the supersonic plane of the future and, much later, retrofit its engines to generate power. However, a phone call changed the order of things and revealed the desperation of the technology industry. On the other end of the line was Sam Altman. The OpenAI CEO's message was a direct plea: "Please, please, please get us something." Altman wasn't looking for plane tickets; he was looking for electrical power. This anecdote, reported by the Financial Times, sums up the state of emergency in the sector: artificial intelligence is advancing at breakneck speed, but it has hit the wall of physical infrastructure. While AI evolves in months, permits to connect to the electrical grid can take up to ten years in some regions. Faced with this paralysis, the industry has opted for a "Plan B": bypassing the grid and producing its own energy on site.

The high price of urgency. This strategic shift has profound consequences. The first is economic: the shortcut is expensive. According to BNP Paribas analysts, power from a gas plant built for Meta in Ohio costs about $175 per megawatt-hour, nearly double the average cost for an industrial customer. The second is environmental. Mark Dyson, of the Rocky Mountain Institute, warns that the emissions of these plants are much worse than those of the general grid, which combines efficient gas with renewables. Despite this, the urgency is such that the authorities are giving in. Virginia, the world's data center heartland, is considering relaxing emissions rules to allow generators to run more frequently. Even polluting plants that had been retired, like the Fisk plant in Chicago, have had their closure reversed to feed the demand for AI.

From the sky to the data center. The most surprising solution comes from aeronautical engineering, through aeroderivative turbines.
The company ProEnergy is buying CF6-80C2 engine cores from the iconic Boeing 747 to rebuild them as ground power units. A single one of these turbines generates 48 megawatts, enough for a city of 40,000 homes. It is not an isolated case: GE Vernova already supplies this technology for the gigantic Stargate (OpenAI/Microsoft) data center in Texas. Blake Scholl himself confirmed that he will sell Crusoe turbines "practically identical" to those of his supersonic planes to finance his aeronautical project.

The return of diesel. Beyond aviation turbines, the sector is rescuing the most reviled fuel: diesel. The manufacturer Cummins has already sold 39 gigawatts of generating capacity to data centers, doubling its figure this year. What was once emergency equipment for power outages is now in demand as a primary energy source. The situation has escalated to the US government: Energy Secretary Chris Wright suggested on Fox News an almost wartime measure, requisitioning the backup generators of data centers or large stores like Walmart to turn them over to the grid when the general system falters.

The ignored alternative: is smoke necessary? Not everyone agrees that the return to fossil fuels is inevitable. A study by researchers at Stripe, Paces and Scale Microgrids maintains that the future lies in off-grid solar microgrids. According to their calculations, a system with 44% solar energy is already as cheap as gas, and one with 90% renewables would surpass nuclear projects in profitability. The advantage is speed, since these solar farms can be built in less than two years in desert areas of Texas or Arizona. Giants like Google have taken note, buying the electric company Intersect Power for 4.75 billion dollars to protect their clean supply and not depend on the grid. However, most of the industry prefers familiar diesel and gas out of technical inertia and the prosaic fear that the cloud will go dark if the sun does not shine.

AI goes physical.
The industry finds itself in a technical paradox: to power the most advanced software on the planet, big technology companies are resurrecting combustion engines and burning fossil fuels on a massive scale. Although these "bridge turbines" allow AI to keep growing today, experts cited by the Financial Times warn that this fever could cool as the tech giants rein in their capital spending. For now, the cloud has had to come down to earth. The future of artificial intelligence, ironically, depends not only on brilliant code, but on who controls the underground and who manages to turn on enough "plugs" so that the greatest technological revolution of our era is not left in the dark.

Image | freepik and Harpagornis

In Xataka | The exorbitant deployment of data centers for AI has a new problem: salt caverns
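As a quick sanity check on the turbine figure cited in this article (48 MW described as enough for 40,000 homes), the implied average draw per home can be worked out directly. The assumption that all 48 MW goes to households is ours, purely for illustration:

```python
turbine_mw = 48   # one rebuilt CF6-80C2 aeroderivative unit, per the article
homes = 40_000    # the city size the article pairs with that output

# Average continuous power per home if the whole turbine served households
watts_per_home = turbine_mw * 1_000_000 / homes
print(f"{watts_per_home:.0f} W per home")  # an average draw, not peak demand
```

That works out to 1.2 kW of continuous power per home, a plausible average household draw, which is why a single aeroderivative unit is in the same league as a small city.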

The exorbitant deployment of data centers for AI has a new problem: salt caverns

In the collective imagination, artificial intelligence is an ethereal cloud of algorithms. The reality is much more complex, and what we know for sure is that it is an energy eater that needs to "eat" constantly. Satya Nadella, CEO of Microsoft, summed it up with unusual bluntness: "The problem is no longer a lack of Nvidia chips, but that there are not enough plugs." And for those plugs to have power 24 hours a day with the 99.999% reliability the sector demands, Big Tech has ended up looking where no one expected: thousands of meters below ground, toward salt caverns.

When the bits hit the underground. The AI race has run into a slow start in the construction of these underground caverns, which could hinder the rollout of data centers. According to Fortune, the reason is mathematical: these digital infrastructures do not tolerate interruptions and require extreme reliability. To guarantee that constant flow, natural gas has become the indispensable backup. However, as they explain, it is not enough to produce gas; you have to store it. Industry projections indicate that only about half of the storage that will be needed to meet future demand has been planned. Without these artificial caves dug thousands of meters below the surface, hyperscalers (Google, Amazon, Meta) are left at the mercy of gas pipelines, vulnerable to corrosion, landslides and extreme weather events.

But why salt caverns? The technical answer lies in flexibility. As experts explain in Fortune, there are two ways to store gas: in depleted oil fields or in salt caverns. The former are cheaper but structurally slow: gas is injected in summer and extracted in winter, following a classic seasonal cycle. AI, on the other hand, does not understand seasons. Its demand peaks are constant, sudden and difficult to predict.
Salt caverns, created by injecting water to leach out the mineral, act as a high-pressure lung: they allow gas to be injected and withdrawn far more frequently, adapting to the volatility of the electrical grid that powers the servers.

The "supercycle 2.0". Given this scenario, companies like Enbridge have taken the lead. Greg Ebel, the company's CEO, has confirmed that they are expanding their facilities in Egan (Louisiana) and Moss Bluff (Texas). "This demand dramatically changes the economics of supply," he said. But it is not enough. Jack Weixel, an analyst at East Daley Analytics, warns that double the currently planned capacity is needed. Projects such as the Freeport Energy Storage Hub (FRESH) in Houston seek to connect up to 17 gas pipelines to a new salt dome by 2028, but construction times (often exceeding four years) clash with the urgency of AI. For his part, Jim Goetz, CEO of Trinity Gas Storage, calls it the "storage supercycle 2.0". His company has just reached a final investment decision (FID) to expand its capacity in East Texas, seeking to support critical infrastructure such as Stargate, the titanic $500 billion project from OpenAI and Microsoft.

The shadow of a doubt. The underlying question is not only whether salt caverns work (they do) but what type of energy system they are consolidating. Natural gas is fast, flexible and reliable, but it also introduces new dependencies and risks. According to analysts, gas infrastructure on the Gulf Coast is especially vulnerable to extreme weather events. A direct hurricane hit on Texas or Louisiana can disrupt production, exports and transportation at the same time. In that scenario, even with gas available in other regions, the lack of nearby storage can leave data centers without electrical backup. Added to this is the question of price. The sustained increase in demand to fuel data centers, LNG exports and reindustrialization is already pushing up gas and electricity bills.
Without enough storage capacity, that volatility is amplified. As the sector points out, storage acts as a buffer; when it is missing, the peaks are passed directly to the consumer. There is also a more structural criticism: AI is pushing to prolong dependence on fossil fuels just when governments and companies had committed to reducing it.

Looking beyond gas. Aware of this physical limit, large technology companies are no longer looking only at salt caverns and gas pipelines. They are looking for any firm source of electricity that does not depend exclusively on the traditional energy market. One example is Fervo Energy, a geothermal startup that has just closed one of the largest financing rounds in the sector, with Google as investor and client. Its bet on advanced geothermal (constant electricity 24 hours a day) reflects the extent to which AI is redrawing the energy map. It is not an immediate or universal solution, but it is a clear signal: the problem is no longer technological, but energetic.

A problem only in the United States? The United States is the epicenter, but not the only stage. The clash between AI and energy is global, although the responses vary. In Europe, the rise of AI is prompting a rethink of the closure of gas and coal plants. Some electricity companies are negotiating to convert old plants into data centers, taking advantage of their grid access, water and already depreciated infrastructure. The logic is the same: firm, immediate, available energy. China, for its part, has chosen another path. Beijing not only promotes underwater data centers and large energy clusters in interior provinces, but directly subsidizes the electricity that powers its AI. The objective is to cheapen the "fuel" of digital models and compensate for the lower energy efficiency of domestic chips compared to Nvidia's.

The return to the underground. In all cases, the pattern repeats itself.
Renewables are growing, but not fast enough or with the stability needed to sustain AI's demand in the short term. Gas, whether via salt caverns, temporary turbines or recycled plants, becomes the inevitable crutch. In our race to create an intelligence that lives on the plane of ideas, we have ended up returning to mining, drilling, and the depths of the Earth. The future …

The real reason why Musk, Bezos and Pichai want to build data centers in space: to bypass regulation

The construction of data centers is proliferating so much that, although the largest in the world are in Kolos (Norway), The Citadel (United States) and China, you can now find them even in Botorrita, in the province of Zaragoza. The limit is the sky. Or not even that: Silicon Valley has set its sights on putting data centers in space, and the main big tech companies are making moves to achieve it. Former Google CEO Eric Schmidt bought the rocket company Relativity Space with that objective. Nvidia has backed the startup Starcloud, which launched the first NVIDIA H100 GPU into space a few weeks ago, and Elon Musk has even condensed how he would do it into a tweet: "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links." And Jeff Bezos slipped it into a prediction at Italian Tech Week: we will see "giant training clusters" of AI in orbit in the next 10 to 20 years.

The moon is a gift from the universe. The next question would be: why? The reality is that there is no shortage of reasons. AI is a real energy guzzler, and as demand keeps growing, space offers a couple of differential advantages over Earth: almost unlimited energy and free cooling. On the one hand, in space there is a sun-synchronous orbit where solar panels receive energy almost continuously. On the other, you can install a radiator so large that space functions as a kind of "infinite heat sink at -270°C". The enormous amounts of water essential for cooling on Earth would not be needed. Let's be clear: there are no data centers in space today. But it may not be far off: Phil Metzger, a research professor at the University of Central Florida who previously worked at NASA, estimates that it could become economically viable within perhaps a decade.
And its viability seems so clear to him that he considers taking AI servers into space to be "the first real business case that will give way to many more" ahead of a future human migration beyond Earth. So, for now, they are trying it on Earth. Consequence: Donald Trump has declared an energy emergency over the enormous electricity demand expected in the coming years. While the power grid catches up (or tries to), AI companies have decided to move from a passive to a proactive position: Meta is going to become an electricity marketer; Elon Musk's xAI is using gas turbines as temporary energy sources; OpenAI is pushing the United States government to lend electricity companies a hand in adding 100 gigawatts per year. That figure may not say much, but it is astronomical: what OpenAI is asking for is that the United States build almost an entire Spain's worth of capacity (around 145 GW, counting the 129 GW consolidated at the end of 2024 plus the solar and wind deployed in 2025) every year and a half.

AI is growing faster than electrical bureaucracy can keep up. How could the Trump Administration help? With the eternal bureaucracy. Because on Earth these companies face great technical challenges, but they also face a legislative wall. To get more energy, the simplest and most immediate step is to build new power plants, but that means successfully navigating the tangle of procedures that slow the process down. There is just one small problem: in the United States, depending on the technology, it can take five to ten years… if you're lucky. Grid interconnection alone can take six years, after getting through an interconnection queue with more than 2,000 GW of projects already in line. Then come up to four years of federal and environmental permits, followed by another couple of years of state and local licenses. They call it the "permit stack".
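The "almost an entire Spain" comparison is simple arithmetic, sketched below with the article's own figures (the 16 GW of 2025 solar and wind additions is inferred here from the 145 GW total minus the 129 GW consolidated at the end of 2024, so treat it as an approximation):

```python
# Rough check of the "build a Spain every year and a half" comparison.
openai_request_gw_per_year = 100   # capacity OpenAI asks the US to add each year
spain_installed_gw = 129 + 16      # ~129 GW at end of 2024 plus ~16 GW of 2025 solar/wind

# At 100 GW/year, matching Spain's entire installed base takes ~1.45 years.
years_per_spain = spain_installed_gw / openai_request_gw_per_year
print(f"~{spain_installed_gw} GW, i.e. one 'Spain' every {years_per_spain:.2f} years")
```

Note the units: these are gigawatts of installed capacity (GW), not energy produced (GWh).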
And the journey does not end there: they must also contend with the "Not In My Backyard" (NIMBY) citizen movement, something like "yes, but not next to my house", which has already pushed back the Battle Born Solar Project (Nevada), which was going to be the largest solar plant in the United States, and the Danskammer gas plant (New York), among others. This can delay operations even further, as rights of way must be negotiated with individual owners who may refuse, sending the process back through the courts. The never-ending story. To avoid NIMBY processes that can last fifteen years or more, companies like OpenAI or Microsoft are buying plants that already exist, such as Three Mile Island, which is going to reopen exclusively for Microsoft, instead of trying to build new ones from scratch. Amazon has also signed up infrastructure that is already on the grid, like the Talen Energy campus, and has partnered with Dominion Energy and X-energy to develop small modular reactors (SMRs). SMRs are also Google's solution, in this case through an agreement with Kairos Power. All of it to avoid that tangle of "permit stack" procedures which, in practice and according to estimates, makes the space route faster than building a power plant here on good old Earth. At the end of the day, for AI companies "the moon is a gift from the universe", as Jeff Bezos already glimpsed.

In Xataka | Musk has created the perfect circle: Tesla's megabatteries power the AI that will define its next cars

In Xataka | Researchers have dismantled the batteries of Tesla and BYD. You already know which one performs better and is much cheaper.

Cover | İsmail Enes Ayhan and NASA

Something is going wrong with AI. To power data centers, the US is turning to energy solutions it thought were buried

The race to develop and operate increasingly powerful artificial intelligence models comes at a cost that is rarely at the center of the technological narrative. It is not in the chips or the software, but in the huge amount of electricity needed to keep data centers running around the clock. In the United States, this pressure is already translating into concrete decisions: polluting power plants that were headed for retirement are being restarted to cover growing peaks and tensions on the grid. The paradox is evident: the technology sector's most ambitious advance depends, for now, on energy solutions from another era. The problem is not so much an absolute shortage of electricity as a time lag. Demand from AI-linked data centers is growing much faster than new electrical generation, especially renewable, can be brought online. Building large energy infrastructure takes years, while these complexes can be up and running on much shorter timelines. Faced with this mismatch, grid operators and electricity companies are turning to what already exists and can be activated immediately, even if it pollutes more.

PJM in context. The clash between electricity demand and supply is seen with particular clarity in the PJM region, the largest electricity market in the United States, which covers 13 states and concentrates a very significant share of the country's data centers. We can think of it as a large regional electricity exchange that coordinates generation, prices and grid stability in real time. There, the growth of AI-linked data centers is testing a system designed for a very different consumption pattern, making PJM the first thermometer of a problem that is beginning to appear in other areas.

What is a peaker plant.
So-called peaker plants are facilities designed to come online only during short periods of peak demand, such as heat waves or winter cold snaps, when the system needs immediate reinforcement. They are not designed to operate continuously, but to react quickly. According to a report by the US Government Accountability Office, these facilities generate just 3% of the country's electricity but account for nearly 19% of installed capacity, a reserve that is now being used much more often than expected.

South view of the Fisk plant in Chicago

The case of the Fisk plant, in the working-class neighborhood of Pilsen in Chicago, illustrates well how this shift plays out on the ground. It is an oil-fired facility, built decades ago and scheduled to be retired next year, that had been relegated to an almost token role. The arrival of new electrical demand associated with data centers changed that equation. Matt Pistner, senior vice president of generation at NRG Energy, explained to Reuters that the company saw an economic argument for keeping the units and therefore withdrew the closure notice, a decision that brings activity back to a site many residents believed was permanently retired.

When price rules. The shift is not explained by technical needs alone, but also by very clear market signals. In PJM, the prices paid to generators to guarantee supply at moments of maximum demand skyrocketed this summer, up more than 800% on the previous year. An analysis by the aforementioned agency shows that about 60% of the oil, gas and coal plants scheduled for retirement in the region postponed or canceled those plans this year, and most of them were peaker units, precisely the ones that best fit this new scenario of relative scarcity. The bill for this energy shift is paid above all at the local level.
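The GAO figures quoted above (3% of generation from 19% of capacity) already say how idle this fleet normally is. A quick sketch of the implied utilization, under the simplifying assumption that both percentages refer to the same period:

```python
# Peakers' share of US electricity vs. installed capacity (GAO report cited above).
gen_share = 0.03   # ~3% of the country's electricity generated
cap_share = 0.19   # ~19% of installed capacity

# A fleet producing 3% of the power with 19% of the capacity runs at roughly
# 16% of the grid-average utilization rate, i.e. it sits idle most of the time.
relative_utilization = gen_share / cap_share
print(f"Relative utilization: {relative_utilization:.1%}")
```

That idleness is exactly why peakers respond so strongly to the capacity-price signals described next: they are paid chiefly for being available, not for the energy they produce.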
Peaker plants tend to be older facilities, with shorter stacks and fewer pollution filters than other plants, which increases the impact on their immediate surroundings when they operate more frequently.

Coal is also postponed. The phenomenon is not limited to oil- or gas-fired peakers. On a national scale, several utilities have begun to delay the closure of coal plants that were part of their climate commitments. A DeSmog analysis identified at least 15 retirements postponed since January 2025 alone, facilities that together account for about 1.5% of US energy emissions. Dominion Energy offers a clear example: in 2020 it promised to generate all of its electricity from renewables by 2045, but after projecting that data center demand in Virginia will quadruple by 2038, it is now taking a step back.

Images | Xataka with Gemini 3 Pro | Theodore Kloba

In Xataka | A former NASA engineer is clear: data centers in space are a horrible idea
