Data centers are so important that Meta has spent millions on advertising to change our perception of them

Meta spent 6.4 million dollars on an advertising campaign between November and December of last year to convince the American public of the benefits of its data centers, according to the New York Times. The ads, aired in eight state capitals and Washington, DC, featured idealized images of American towns revitalized by these facilities. There is growing social rejection of the installation of AI data centers, especially over their excessive consumption of basic resources such as electricity and water. And of course, the public first has to be convinced that these facilities are essential so that Meta and the rest of the big technology companies can continue operating. Meta’s campaign. According to the newspaper, the ads featured emotional stories about Altoona (Iowa) and Los Lunas (New Mexico), two locations where Meta operates data centers. With guitar music and shots of farms and football fields, the videos promised jobs and prosperity. “We are bringing jobs here, for ourselves and for our next generation,” the voiceover said. According to Michael Beach, CEO of Cross Screen Media, Meta “could have purchased these ads with the goal of influencing political decisions and reaching legislators.” Ryan Daniels, a Meta spokesperson, would only tell the NYT that the company pays the full cost of the energy used by its data centers, without commenting on the advertising campaign. Meta is not alone. As the NYT reports, Amazon is funding a similar campaign in Virginia through Virginia Connects, a nonprofit created by the Data Center Coalition. The Financial Times adds that other operators such as Digital Realty, QTS and NTT Data are also working harder to defend the construction of new facilities. Pushback. In the United States, social rejection has led to the cancellation of multimillion-dollar projects in Oregon, Arizona, Missouri, Indiana and Virginia.
Democratic Senator Chris Van Hollen told the NYT that the issue became “a priority on Capitol Hill” when his voters began to complain en masse about their electricity bills. As the outlet reports, this month Van Hollen introduced a bill to regulate the energy consumption of data centers. Even President Donald Trump weighed in on the matter: “The big tech companies that build them must pay their own way,” he wrote a few weeks ago on Truth Social. The electricity bill. Data centers have become critical infrastructure for the development of artificial intelligence, but social tension over their installation keeps growing. In October, Bloomberg reported that over the last five years the wholesale price of electricity in areas near large concentrations of data centers in the United States had increased by up to 267%. In Baltimore, residents paid $17 per megawatt-hour in 2020; in 2025 that figure reached $38. The outlet’s research also showed that 70% of the points where electricity price increases were recorded were less than 80 kilometers from data centers with significant activity. Bloomberg estimates that the energy demand of these facilities in the United States will double by 2035, the largest increase since the 1960s. The situation in Spain. Spain is also experiencing a boom in data center construction. The Community of Madrid, paradoxically the region with the greatest energy deficit in Spain, concentrates a good part of these projects and is expected to reach 1.7 gigawatts of capacity by 2030.
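The Baltimore figures above can be sanity-checked with a quick calculation (a sketch using only the numbers quoted in this article; the 267% figure refers to the hardest-hit areas Bloomberg surveyed, not to Baltimore itself):

```python
# Percentage increase implied by the Baltimore prices quoted above:
# $17/MWh in 2020 vs $38/MWh in 2025.
old_price, new_price = 17.0, 38.0
increase_pct = (new_price / old_price - 1) * 100
print(f"Baltimore increase: {increase_pct:.0f}%")  # ~124%

# Bloomberg's reported peak increase of 267% implies prices
# multiplying by this factor in the hardest-hit areas:
peak_multiplier = 1 + 267 / 100
print(f"Peak multiplier: {peak_multiplier:.2f}x")  # 3.67x
```

In other words, Baltimore’s roughly 124% rise sits well below the 267% peak Bloomberg found near the densest data center clusters.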
The consulting firm CBRE pointed out in a report that “there is no investor, operator or large technology company that does not have in its strategic plans to establish its data center project in the Iberian market.” Madrid, together with Barcelona, already competes with cities such as Milan, Zurich or Berlin, although it is still far from the leading European group in terms of power capacity, formed by Frankfurt, London, Amsterdam, Paris and Dublin. What awaits us. According to Bloomberg, forecasts indicate that data centers will consume more than 4% of the world’s electricity in 2035. If these facilities were a country, they would rank fourth in energy consumption, behind only China, the United States and India. Meanwhile, big technology companies are already exploring solutions such as small modular nuclear reactors (SMRs) to power their facilities, or even sending data centers into space. Cover image | Mark Zuckerberg (Meta) In Xataka | “The assemblies are not going to be done by AI”: we talk to the kids who have become carpenters, truck drivers and tinkerers
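The scale of that 4% forecast can be made concrete with a back-of-the-envelope estimate (the ~30,000 TWh figure for annual global electricity consumption is my own ballpark assumption, not a number from the article):

```python
# If data centers reach 4% of the world's electricity in 2035, the
# absolute consumption would be roughly (assuming ~30,000 TWh/year
# of global consumption, an approximate external figure):
world_twh_per_year = 30_000
dc_share = 0.04
dc_twh = world_twh_per_year * dc_share
print(f"Data centers: ~{dc_twh:,.0f} TWh/year")  # ~1,200 TWh/year
```

That is on the order of a large industrial country’s annual consumption, which is consistent with the article’s “fourth-largest consumer” framing.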

Who will put the most data centers into orbit?

The map of the world’s data centers shows that there is no decentralized internet and that they are proliferating like mushrooms. In fact, planet Earth has fallen short, and big tech companies already have their eyes set on the sky, aiming to plant a data center in space over issues such as energy demand, environmental impact and, why not say it, avoiding regulation. The “panacea” of space. Faced with the threat of energy consumption similar to that of Japan by 2030, according to data from the International Energy Agency, or the brutal density of Data Center Alley in Loudoun, in northern Virginia, with nearly 250 operational facilities, space offers the possibility of satellites equipped with solar panels that capture energy directly from the sun, thermal dissipation in space, and the absence of terrain limitations. Not long to go. For it to be viable, it will take at least a decade, as estimated by Phil Metzger, a research professor at the University of Central Florida and former NASA member. However, it is one thing for the numbers to work out economically and another to have to wait that long technologically. According to Josep Jornet, professor of computer and electrical engineering at Northeastern University and satellite researcher, in just a couple of years we will begin to see tests. And he is clear that space is the next frontier to conquer: “There was a gold rush in the West. Now there is a space race and everyone wants to place their technology in space.” Money galore. The Catalan scientist is clear that companies have incentives to move quickly and invest heavily to dominate the AI race in general and space in particular: “Everyone wants to say they have the first platform to reach this milestone (…) So companies are spending money like there is no tomorrow.” Google, SpaceX and Blue Origin are already working on developing technologies for this purpose, and they are not the only ones. SpaceX.
At the end of the year the Wall Street Journal uncovered Elon Musk’s company’s plan to build data centers in space. Its CEO explained in a tweet how he would do it: “It will be enough to scale the Starlink V3 satellites, which have high-speed laser links.” More specifically, they are working on modifying and improving their rockets to make them capable of hosting computing loads for AI. Blue Origin. The American outlet also highlighted Jeff Bezos’ project; Bezos predicted at Italian Tech Week that it is a matter of time before we see “giant training clusters” of AI in orbit in the next 10 to 20 years. The company has a team dedicated to developing the technology required for data centers in space. Google. Last November the Mountain View company spoke about its experimental Project Suncatcher: in 2027, in collaboration with Planet Labs, it will launch two test satellites with its own AI processing chips. Others. There are other smaller companies working in this area. The most notable is StarCloud, a startup backed by NVIDIA that a few weeks ago launched a satellite with an NVIDIA H100. That GPU is being used to run a version of Gemma, Google’s open language model. You need energy (and to know how to use it). Although the foundations have been laid, the road is not exactly downhill. Jornet explains that one of the big obstacles will be having enough energy for these orbital data centers to function: “The Sun can be a great source of energy, but to properly harness it, orbiting data centers would need huge solar panels kilometers long or a constellation of smaller panels that could number in the tens of thousands.” Life in space is hard. There are more open questions, such as how AI chips will withstand harmful space radiation, as well as heat dissipation and cooling. On Earth, thousands of liters of water are used. In space there is no such option and, although temperatures are low, there is no air to cool the chips naturally.
The bill for Earth. Even setting aside the environmental impact in space, the endeavor also leaves its mark on Earth, at least in the short term: rocket launches not only consume fossil fuels, but also damage ecosystems and wildlife in the surrounding area, as happens at Cape Canaveral, which now hosts about 80 launches a year. In Xataka | The real reason why Musk, Bezos and Pichai want to build data centers in space: bypass regulation In Xataka | The problem with data centers is not that they are running out of water or energy: it is that they are running out of copper Cover | Pixabay

The US electrical grid cannot support so many data centers, so operators have had an idea: disconnect them to avoid blackouts

One third of all data centers in the world are in the US, and that is putting a huge burden on the electrical grid. One of the consequences consumers are noticing is the price increase on their bills, but electricity operators already foresee another problem: blackouts. What is happening. The WSJ reports that the US power grid is beginning to become strained, with grid operators expecting blackouts during periods of high demand. The solution they propose to avoid this is to make data centers disconnect from the grid and temporarily use their own energy reserves. The technology companies are not amused and are talking about “discriminatory measures.” Why it matters. In 2023, data centers already consumed 4% of all the country’s electricity, and forecasts say that by 2028 that percentage will increase to 12%. The electrical grid is not prepared to support so much demand and, although it is already expanding, the pace of construction of new data centers is faster. Grid operators face a difficult dilemma: powering data centers while maintaining supply to consumers. ‘Kill switch’. PJM Interconnection is the organization that oversees the energy market in the Midwest, where the problem of price increases has already hit. The concern that blackouts will occur is on the table, and PJM has proposed that technology companies create their own energy sources or accept that their supply will be cut off if the grid becomes too saturated. They are not the only ones to raise something like this. With demand expected to double by 2035, Texas passed a law last year that contemplates a ‘kill switch’ allowing large consumers, such as data centers, to be disconnected at times when the grid is under “extreme stress.” What the tech companies say. As we said, the companies that own these data centers have not been very happy with the proposal.
The Data Center Coalition, of which companies such as Google, Microsoft and AWS are part, has stated that the proposal is discriminatory, since data centers need a reliable and stable grid. It also warns that depending on their own energy reserves could have a negative environmental impact by forcing them to use solutions such as diesel generators. Waiting times. There is an intermediate scenario in which technology companies can benefit if they accept these conditions. As the electrical infrastructure cannot support so much demand, data centers have to wait several years to be connected to the grid, normally between 3 and 5 years, although there have been cases of up to 8 years. Southwest Power Pool, a grid operator in Texas, has offered data centers a deal: give them access to the grid sooner in exchange for agreeing to be disconnected during times of high demand. According to a recent study funded by Google, data centers with more flexible connections (i.e., those that build their own power sources and accept temporary disconnections) typically connect to the grid several years faster than those that do not. Bring your own energy. Despite the reluctance about that off switch, generating your own energy is the most realistic solution and the one the industry seems to be moving toward. Google recently bought an electric utility in order to obtain its own energy. Other big tech companies such as Amazon, Microsoft, Oracle and xAI are also exploring their own energy solutions, such as natural gas and solar panels. Image | Google In Xataka | Drastically reducing data center consumption is crucial for AI. And China has had an idea: submerge them in the sea
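The jump from 4% of US electricity in 2023 to a forecast 12% in 2028, mentioned above, implies a steep compound growth rate for the data centers’ share; a short sketch makes the arithmetic explicit:

```python
# Implied compound annual growth of data centers' share of US
# electricity consumption, using the 4% (2023) and 12% (2028)
# figures quoted above.
share_2023, share_2028 = 0.04, 0.12
years = 2028 - 2023
cagr = (share_2028 / share_2023) ** (1 / years) - 1
print(f"Implied annual growth of the share: {cagr:.1%}")  # ~24.6%/year
```

A share growing roughly 25% per year against a grid that takes years to expand is exactly the mismatch that makes kill switches attractive to operators.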

A third of the world’s data centers are in a single country

There are currently more than 11,000 data centers operating worldwide, which is no small number. Given the huge investment by technology companies, the figure is going to grow exponentially in the coming years. Now, thanks to the interactive map from Data Center Map, we know where they are. An overwhelming majority are in the northern hemisphere, with one country accounting for almost a third of the total. The United States rules. To no one’s surprise, the country with the largest number of data centers is the United States. Considering that the major cloud infrastructure companies are American, this is hardly surprising either. In total it has 4,303 data centers spread across the territory, but not evenly: there are regions where the concentration is brutal. In the state of Virginia alone there are a whopping 668 data centers, more than Germany, the second country on the list with 494. Climate matters too. We already know that data centers consume a lot of energy, and much of it goes into cooling their components. The hotter it is outside, the more it costs to cool them and therefore the more energy is consumed, as well as water. According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers, the ideal temperature for a data center is between 18 and 27 degrees Celsius. Location has a notable impact on electricity and water costs, which is why technology companies usually choose places with lower temperatures to set up their infrastructure. The south also wants its piece of the pie. It is striking that, despite the temperature recommendation, there are many data centers in countries where heat is a problem. Rest of World has done an extensive analysis of this phenomenon and estimates that at least 600 facilities are operating in areas outside the optimal range.
In fact, looking at the list of countries with the most data centers, we see Brazil with 196 facilities and Indonesia with 184. Both have an average temperature of more than 26 degrees, which means that for much of the year temperatures exceed that threshold. Singapore. A striking case is that of Singapore, where the average temperature is more than 28 degrees. It has 78 data centers, a low figure compared to those already mentioned, but they are concentrated in a very small area, which makes it one of the countries with the highest data center density. Other countries where demand for data centers is increasing are India, Vietnam and the Philippines, all with quite hot climates. The heat challenge. Why build in such hot areas? For many countries, having data within their own borders matters more than the optimal operating temperature. The risk is that, with temperatures rising year after year, what is now a manageable situation could become a problem that is difficult to solve, especially in areas such as Southeast Asia and the Middle East. Rest of World reports that in Singapore, precisely, there is an initiative involving more than 20 technology companies and universities with one objective: to develop a cooling system specifically for hot, humid climates. The most common cooling system is air, but in these areas it is more effective to use a hybrid system that uses air when possible and water when it is hotter. In some areas with extreme temperatures, such as the United Arab Emirates, they are even considering building them underground. In China they are testing an even more radical solution: building a data center under the sea. Image | ChatGPT, with data from Data Center Map In Xataka | Aragón is not afraid of AI: it has just approved three more new mega data centers in full commitment to renewables
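Singapore’s density claim can be illustrated with the facility counts quoted above; the land areas used here (Singapore ~730 km², the United States ~9.8 million km²) are my own approximations, not figures from the article:

```python
# Rough data center density comparison from the counts quoted above.
# Land areas are approximate external assumptions, not article data.
sg_centers, sg_area_km2 = 78, 730
us_centers, us_area_km2 = 4303, 9.8e6

sg_density = sg_centers / sg_area_km2  # ~0.107 centers per km^2
us_density = us_centers / us_area_km2  # ~0.00044 centers per km^2
ratio = sg_density / us_density
print(f"Singapore is roughly {ratio:.0f}x denser than the US")
```

Even with only 78 facilities, Singapore packs two orders of magnitude more data centers per square kilometer than the United States.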

Energy companies are switching from oil to megawatts. The new gold mine is supporting data centers

Gluttonous artificial intelligence and its demanding data centers are reshaping decarbonization plans. Just as the world had begun its journey towards renewables, with countries like China and Europe betting big, and even some US states getting on board, data centers arrived with needs that were almost impossible to satisfy. By the end of December 2024, data center consumption had skyrocketed, pushing big technology companies to bet not so much on renewables as, above all, on immediately available energy such as gas and even coal. Some were even eyeing nuclear to keep operating. Shortly after, in January 2025, a Reuters report noted that European energy companies, which had embarked on a path of commitment to renewables, were doubling down on oil and gas. Giants like BP and Shell slowed their investments in clean energy to return to fossil fuel projects. But this isn’t only about where data centers get their energy from; it is also about who provides their infrastructure. And that, rather than oil or gas, may be the next energy gold mine. The new oil mine. An article in the Financial Times suggests that the meteoric growth of data centers is generating a market that energy companies do not want to miss. As demand for traditional drilling weakens (although it varies by region), energy sector groups such as Baker Hughes, Halliburton and SLB are taking the opportunity to pivot to the data center sector. Not building them, and not just supplying energy: supporting the logistics. Leveraging their knowledge of the energy sector, these large companies would provide equipment such as turbines and power generation systems to data center owners, but they would also supply generators, batteries, heat dissipation systems and the whole framework needed to maintain proper energy efficiency. They would also maintain the equipment. It is, in short, what they already know how to do, applied to a new sector: data centers.
Because these three examples are not typical oil companies, but technology providers that help other companies extract gas or oil. All three provide services to companies with oil fields, but also supply technology such as gas turbines, compressors and LNG systems, and they were already present in sectors such as new energy, with carbon capture and storage systems. All of this resonates with the idea ‘Big Tech’ had when they began to build huge data centers, until they saw that increasingly demanding equipment needed more immediate and stable sources of energy. Data centers = El Dorado. It is estimated that US electricity demand will increase by 90 GW (a staggering figure) between now and 2030 just to power data centers. Traditional electrical grids may not support this load, and it is at that point that these energy services companies look like key players. Pivoting toward artificial intelligence infrastructure is “key to the evolution of oil and gas,” said Lorenzo Simonelli, CEO of Baker Hughes. And it makes sense when we see that the number of US oil rigs contracted 7% year-over-year in 2025, margins have shrunk and demand for drilling services is in question. On a business level, it is a masterstroke. Hypothetically, when the next oil crisis arrives and the market for both crude oil and gas falls, companies that have pivoted to data centers, going from being service providers for energy companies to being service providers for ‘Big Tech’, will not have to change course, because they will already be where the money is. And that is the other question: whether the new megawatt gold rush for AI will be a lasting business or a passing fever. Image | freepik and Harpagornis In Xataka | The problem with renewables is what to do when there is excess energy. China believes it has the answer with a unique turbine

China decided to privatize its daycare centers in the 1980s. Unknowingly, it was creating its enormous birth crisis.

Not long ago, China had a problem of excess births. For more than three decades, the one-child policy halted the rapid growth of the population, but now its problem is just the opposite. The demographic crisis has flipped and China’s population is plummeting. The government has launched plans to encourage births, and its latest idea is to improve a critical piece of infrastructure. Target: daycare centers. As the South China Morning Post reports, China is reviewing what will be the first law regulating the childcare services sector. The measures will focus on children under three years of age, with the aim of building a “fertility-friendly” society. Among the key measures are improving the quality of the service, ensuring that professionals have the necessary qualifications for the job, and expanding the supply of more affordable childcare, which would reduce the cost of parenting. Who takes care of the children. China is encouraging couples to have children through different measures, and daycare centers were one of the key areas to improve. Since the 1980s, when the state stopped offering public daycare, the burden of care has fallen on families. Society adapted in the most predictable way: grandparents took care of the children (something that doesn’t always turn out well) or women reduced their working hours to take on caregiving. A question of money. The lack of regulation has left the supply of affordable daycare scarce and staffed by insufficiently qualified professionals. Quality daycare was a luxury available to a few, while for less well-off families it was a last resort. The new law seeks to promote the creation of new state centers at more affordable prices. And trust. The scandals over abuse cases in Chinese daycares are well known inside and outside the country’s borders, and there have also been cases of abuse by babysitters.
If, on top of being an expensive service, we add the problem of lack of trust, it is not surprising that care in the early years ends up being a deterrent for many families. In 2021, only 5.5% of Chinese children under three were in daycare, a figure that contrasts with the 88% schooling rate from ages 3 to 6. Other measures. Since the end of the one-child policy in 2015, the government has implemented several plans to correct the declining birth rate curve. Along with births, marriages also declined, so it was proposed to teach marriage and love classes and even act as a kind of matchmaker to help young people find a partner. Its latest measure is one of the most striking: putting a special tax on condoms. Image | note thanun on Unsplash In Xataka | If the question is how to reactivate birth rates, China believes it has the answer: finance painless births

The exorbitant deployment of data centers for AI has a new problem: salt caverns

In the collective imagination, artificial intelligence is an ethereal cloud of algorithms. The reality is much more complex, and what we know for sure is that it is an energy glutton that needs to “eat” constantly. Satya Nadella, CEO of Microsoft, summed it up with unusual bluntness: “The problem is no longer a lack of Nvidia chips, but a lack of plugs.” And so that those plugs have power 24 hours a day with the 99.999% reliability the sector demands, Big Tech has ended up looking where no one expected: thousands of meters underground, at salt caverns. When the bits hit the underground. The AI race has entered a phase of “slow starts” in the construction of these underground caverns, which could hinder the rollout of data centers. According to Fortune, the reason is mathematical: these digital infrastructures do not tolerate interruptions and require extreme reliability. To guarantee this constant flow, natural gas has become the indispensable backup. However, as they explain, it is not enough to produce gas; you have to store it. Industry projections indicate that only about half of the storage needed to meet future demand has been planned. Without these artificial caves dug thousands of meters below the surface, hyperscalers (Google, Amazon, Meta) are left at the mercy of gas pipelines, vulnerable to corrosion, landslides or extreme weather events. But why salt caverns? The technical answer lies in flexibility. As experts detail in Fortune, there are two ways to store gas: in depleted oil fields or in salt caverns. The former are cheaper, but structurally slow. Gas is injected in summer and extracted in winter, following a classic seasonal cycle. AI, on the other hand, does not understand seasons. Its demand peaks are constant, sudden and difficult to predict.
Salt caverns, created by injecting water to dissolve out the mineral, act as a high-pressure lung: they allow gas to be injected and extracted far more frequently, adapting to the volatility of the electrical grid that powers the servers. The “supercycle 2.0”. Given this scenario, companies like Enbridge have taken the lead. Greg Ebel, the company’s CEO, has confirmed that they are expanding their facilities in Egan (Louisiana) and Moss Bluff (Texas). “This demand dramatically changes the economics of supply,” he said. But it is not enough. Jack Weixel, an analyst at East Daley Analytics, warns that double the currently planned capacity is needed. Projects such as the Freeport Energy Storage Hub (FRESH) in Houston seek to connect up to 17 gas pipelines to a new salt dome by 2028, but construction times, often exceeding four years, clash with the urgency of AI. For his part, Jim Goetz, CEO of Trinity Gas Storage, calls it the “storage supercycle 2.0”. His company has just reached a final investment decision (FID) to expand its capacity in East Texas, seeking to support critical infrastructure such as Stargate, the titanic $500 billion project from OpenAI and Microsoft. The shadow of a doubt. The underlying question is not only whether salt caverns work (they do), but what type of energy system they are consolidating. Natural gas is fast, flexible and reliable, but it also introduces new dependencies and risks. According to analysts, gas infrastructure on the Gulf Coast is especially vulnerable to extreme weather events. A direct hurricane hit on Texas or Louisiana can disrupt production, exports and transportation at the same time. In that scenario, even with gas available in other regions, the lack of nearby storage can leave data centers without electrical backup. Then there is the question of price. The sustained increase in demand to fuel data centers, LNG exports and reindustrialization is already pushing up gas and electricity bills.
Without enough storage capacity, that volatility is amplified. As the sector points out, storage acts as a buffer; when it is missing, the peaks are passed directly on to the consumer. Moreover, the criticism runs deeper: AI is pushing to prolong dependence on fossil fuels just when governments and companies had committed to reducing it. Looking beyond gas. Aware of this physical limit, large technology companies are no longer looking only at salt caverns and gas pipelines. They are looking for any firm source of electricity that does not depend exclusively on the traditional energy market. One example is Fervo Energy, a geothermal startup that has just closed one of the largest financing rounds in the sector, with Google as an investor and client. Its bet on advanced geothermal (constant electricity 24 hours a day) reflects the extent to which AI is redrawing the energy map. This is not an immediate or universal solution, but it is a clear signal: the problem is no longer technological, but energy-related. A problem only in the United States? The United States is the epicenter, but not the only stage. The clash between AI and energy is global, although the responses vary. In Europe, the rise of AI is prompting a rethink of the closure of gas and coal plants. Some electricity companies are negotiating to convert old plants into data centers, taking advantage of their grid access, water and already depreciated infrastructure. The logic is the same: firm, immediate, available energy. China, for its part, has chosen another path. Beijing not only promotes underwater data centers and large energy clusters in interior provinces, but directly subsidizes the electricity that powers its AI. The objective is to cheapen the “fuel” of digital models and compensate for the lower energy efficiency of domestic chips compared to Nvidia’s. The return to the underground. In every case, the pattern repeats itself.
Renewables are growing, but not fast enough or with the stability needed to sustain AI’s demand in the short term. Gas, with salt caverns, temporary turbines or recycled plants, becomes the inevitable crutch. In our race to create an intelligence that lives on the plane of ideas, we have ended up returning to mining, drilling, and the depths of the Earth.

The real reason why Musk, Bezos and Pichai want to build data centers in space: bypass regulation

The construction of data centers is proliferating so much that, although the largest in the world are in Kolos (Norway), The Citadel (United States) and China, you can now find them even in Botorrita, in the province of Zaragoza. The sky is the limit. Or well, not even that: Silicon Valley has set its sights on putting data centers in space, and the main big tech companies are making moves to achieve it. Former Google CEO Eric Schmidt bought rocket company Relativity Space with that objective. Nvidia has backed the startup Starcloud in its project to launch the first NVIDIA H100 GPU into space a few weeks ago, and Elon Musk has even condensed how he would do it into a tweet: “It will be enough to scale the Starlink V3 satellites, which have high-speed laser links.” Then there is Jeff Bezos, who slipped it into a prediction at Italian Tech Week: we will see “giant training clusters” of AI in orbit in the next 10 to 20 years. The moon is a gift from the universe. The next question is “why?”. The reality is that there is no shortage of reasons. AI is a real energy guzzler and, as demand keeps growing, space offers a couple of distinct advantages over Earth: almost unlimited energy and free cooling. On the one hand, in a sun-synchronous orbit solar panels receive energy almost continuously. On the other, you can install a radiator so large that space functions as a kind of ‘infinite heat sink at -270°C’. The enormous amounts of water essential for cooling on Earth would not be needed. Let’s face it: today there are no viable data centers in space. But they are not so far off: Phil Metzger, a research professor at the University of Central Florida and former NASA member, estimates that within perhaps a decade they could be economically viable.
In fact, its viability is so clear to him that he considers taking AI servers into space “the first real business case that will give way to many more” ahead of a future human migration beyond Earth. So for now, they are trying it on Earth. Consequence: Donald Trump declared an energy emergency because of the enormous electricity demand expected in the coming years. As the power grid catches up (or tries to), AI companies have decided to move from a passive to a proactive stance: Meta is going to become an electricity marketer. Elon Musk’s xAI is using gas turbines as temporary energy sources. OpenAI is pushing the United States government to lend electricity companies a hand to add 100 gigawatts per year. That figure may not sound like much, but it is astronomical: what OpenAI is asking is that the United States build almost an entire Spain (around 145 GW, considering the 129 GW consolidated at the end of 2024 plus the solar and wind deployment of 2025) every year and a half in terms of infrastructure. AI is growing faster than electrical bureaucracy can advance. How could the Trump Administration help? With the eternal bureaucracy. On Earth, companies face great technical challenges, but they also face a legislative wall. To get more energy, the simplest and most immediate step is to build new power plants, but that means successfully navigating the tangle of procedures that slow down the process. There is only one small problem: in the United States, depending on the technology, it can take five to ten years… if you’re lucky. Interconnection to the grid alone can take six years, after surviving an interconnection queue with more than 2,000 GW of projects already in line. Then, up to four years of federal and environmental permits, followed by another couple of years of state and local licenses that must come through. ‘Permit stack’, they call it.
And the journey does not end there: they must also get past the 'Not in my backyard' (NIMBY) citizen movement, which has already forced the retreat of the Battle Born Solar Project (Nevada), which was going to be the largest solar plant in the United States, and the Danskammer gas plant (New York), among others. This can delay things even further, since rights of way must be negotiated with individual owners who may refuse, sending the process back through the courts. The never-ending story.

To avoid NIMBY processes that can last fifteen years or more, companies like OpenAI or Microsoft are buying plants that already exist, such as Three Mile Island, which is going to reopen solely for Microsoft, instead of trying to build new ones from scratch. Amazon has also signed up infrastructure that is already on the grid, like the Talen Energy Campus, and has partnered with Dominion Energy and X-energy to develop small modular reactors (SMRs). SMRs are also Google's solution, in this case through an agreement with Kairos Power. All of this is to avoid that tangle of 'Permit Stack' procedures which, in practice and according to estimates, makes it faster to take the space route than to build a power plant on the old, familiar Earth. At the end of the day, for AI companies "the moon is a gift from the universe", as Jeff Bezos already glimpsed.

In Xataka | Musk has created the perfect circle: Tesla's megabatteries power the AI that will define its next cars

In Xataka | Researchers have dismantled the batteries of Tesla and BYD. You already know which one performs better and is much cheaper.

Cover | İsmail Enes Ayhan and NASA

Something is going wrong with AI. The US is turning to energy solutions that it thought were buried to power data centers

The race to develop and operate increasingly powerful artificial intelligence models comes at a cost that is rarely at the center of the technological narrative. It is not in the chips or the software, but in the huge amount of electricity needed to keep data centers running around the clock. In the United States, this pressure is already translating into concrete decisions: polluting power plants headed for retirement are being restarted to cover growing peaks and tensions on the grid. The paradox is evident: the most ambitious advance in the technology sector depends, for the moment, on energy solutions from another era.

The problem is not so much an absolute shortage of electricity as a timing mismatch. Demand from AI-linked data centers is growing much faster than new electrical generation, especially renewable, can be brought online. Building large energy infrastructure takes years, while these complexes can go up in much shorter time frames. Faced with this mismatch, grid operators and electricity companies are turning to what already exists and can be activated immediately, even if it is more polluting.

PJM in context. The clash between electricity demand and supply is perceived with special clarity in the PJM region, the largest electricity market in the United States, which covers 13 states and concentrates a very significant share of the country's data centers. We can think of it as a large regional electricity exchange that coordinates generation, prices and grid stability in real time. There, the growth of AI-linked data centers is putting to the test a system designed for a very different consumption pattern, making PJM the first thermometer of a problem that is beginning to appear in other areas.

What is a peaker plant.
So-called 'peaker' plants are facilities designed to come online only during short periods of peak demand, such as heat waves or winter cold snaps, when the system needs immediate reinforcement. They are not designed to operate continuously, but to react quickly. According to a report by the US Government Accountability Office, these facilities generate just 3% of the country's electricity, but they account for nearly 19% of installed capacity: a reserve that is now being used much more frequently than expected.

South view of the Fisk plant in Chicago

The case of the Fisk plant, in the working-class neighborhood of Pilsen, in Chicago, illustrates well how this shift translates on the ground. It is an oil-fueled facility, built decades ago and scheduled for retirement next year, that had been relegated to an almost token role. The arrival of new electrical demand associated with data centers changed that equation. Matt Pistner, senior vice president of generation at NRG Energy, explained to Reuters that the company saw an economic argument for keeping the units, and that is why it withdrew the closure notice, a decision that returns activity to a site many residents believed was in permanent retirement.

When the price rules. The change is not explained by technical needs alone, but also by very clear market signals. In PJM, the prices paid to generators to guarantee supply at moments of maximum demand skyrocketed this summer, up more than 800% from the previous year. An analysis by the aforementioned agency shows that about 60% of the oil, gas and coal plants scheduled for retirement in the region postponed or canceled those plans this year, and most of them were peaker units, precisely the ones that best fit this new scenario of relative scarcity. The bill for this energy shift is paid above all at the local level.
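The gap between those two GAO figures (3% of electricity but 19% of capacity) already implies how rarely peakers run. A minimal sketch of that arithmetic, assuming an illustrative fleet-average capacity factor of 45% (that number is an assumption, not from the report):

```python
# What "3% of electricity but 19% of capacity" implies about peaker utilization.
# Share figures from the GAO report cited above; the fleet average is assumed.
energy_share = 0.03               # fraction of US electricity from peakers
capacity_share = 0.19             # fraction of installed capacity that is peakers
fleet_avg_capacity_factor = 0.45  # illustrative assumption for the whole fleet

relative_utilization = energy_share / capacity_share      # ~0.16
implied_peaker_cf = relative_utilization * fleet_avg_capacity_factor

print(f"Peakers run at ~{relative_utilization:.0%} of the fleet-average rate")
print(f"Implied peaker capacity factor: ~{implied_peaker_cf:.0%}")
```

In other words, under these assumptions a typical peaker sits idle more than nine hours out of ten, which is exactly why running them frequently is such a departure from their design.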
Peaker plants tend to be older facilities, with shorter chimneys and fewer pollution filters than other plants, which increases the impact on their immediate surroundings when they operate more frequently.

Coal is also postponed. The phenomenon is not limited to oil- or gas-fueled peaker plants. On a national scale, several utilities have begun to delay the closure of coal plants that were part of their climate commitments. A DeSmog analysis identified at least 15 retirements postponed since January 2025 alone, facilities that together represent about 1.5% of US energy emissions. Dominion Energy offers a clear example: in 2020 it promised to generate all its electricity from renewables by 2045, but after projecting that data center demand in Virginia will quadruple by 2038, it is now taking a step back.

Images | Xataka with Gemini 3 Pro | Theodore Kloba

In Xataka | A former NASA engineer is clear: data centers in space are a horrible idea

A former NASA engineer is clear: data centers in space are a horrible idea

Artificial intelligence has turned energy into the new technological bottleneck. And faced with that limit, some of the largest companies in the world have begun to look up. To give some examples: Jeff Bezos has spoken of "giant AI clusters orbiting the planet" in a decade or two; Google has experimented with running artificial intelligence calculations on solar-powered satellites; Nvidia backs startups that want to launch GPUs into space; even OpenAI has explored the purchase of a rocket company to secure its own path off Earth. The promise is seductive: solar-powered data centers running around the clock, without power grids or cooling towers. The problem is that, when you move from the story to the physics, the engineering and the numbers, the idea begins to break down.

Data centers in space. There is a question hanging over this whole issue: why do technology companies want to send data centers to space? The motivation, at first glance, is clear. According to data from the International Energy Agency, data center electricity consumption could double by 2030, driven by the explosion of generative AI. Training and running models like ChatGPT, Gemini or Claude requires massive amounts of electricity and huge volumes of water for cooling. In many places, these projects are already running into local opposition or the physical limits of the grid. In this context, space appears as a tempting solution. In certain orbits, solar panels can receive almost constant light, without clouds or night cycles. Besides, as Bezos and other proponents explain, the vacuum of space seems to offer an ideal environment for dissipating heat without resorting to cooling towers or millions of liters of fresh water. By this argument, space data centers would be more efficient, more sustainable and, over time, even cheaper than terrestrial ones. For some executives, this would not be an eccentricity but the "natural evolution" of an infrastructure that already began with communications satellites.
When engineers raise their hands. Faced with the enthusiasm of corporate statements, several space engineering experts have been much more forceful. In one of the most cited texts on the subject, a former NASA engineer with a PhD in space electronics and direct experience with AI infrastructure at Google sums up his position bluntly: "This is a terrible idea and it doesn't make any sense." His criticism is not ideological, but technical. And it starts with the first great myth: the supposed abundance of energy in space.

Solar energy is not magic. The largest solar system ever deployed outside of Earth is on the International Space Station. According to NASA data, its panels cover about 2,500 square meters and, under ideal conditions, generate between 84 and 120 kilowatts of power, part of which goes to charging batteries for periods in shadow. To put that in context, a single modern AI GPU consumes on the order of 700 watts, and in practice around 1 kilowatt once losses and auxiliary systems are taken into account. With those figures, an infrastructure the size of the ISS could barely power around a hundred GPUs. As this engineer explains, a modern data center can house tens or hundreds of thousands of GPUs. Matching that capability would require launching hundreds of structures with the size, and complexity, of the International Space Station. And even then, each would be equivalent to just a few racks of terrestrial servers. Nor does the nuclear alternative solve the problem: the nuclear generators used in space, RTGs, produce between 50 and 150 watts. In other words, not even enough to power a single GPU.

Space is not a refrigerator. The second big argument against orbital data centers is cooling. It is frequently repeated that space is cold, and that this would make it easier to dissipate heat from the servers. According to engineers, this is one of the most misleading ideas in the entire debate.
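The power mismatch can be sketched with the figures above (the 100,000-GPU data center size is an illustrative assumption within the "tens or hundreds of thousands" range the engineer cites):

```python
# How many AI GPUs could an ISS-sized solar array power?
# Rough figures from the article; the data center size is an assumption.
iss_power_kw = 120          # upper end of the ISS array output, ideal conditions
gpu_power_kw = 1.0          # ~700 W per GPU, ~1 kW with losses and auxiliaries
data_center_gpus = 100_000  # illustrative large terrestrial AI data center

gpus_per_iss = int(iss_power_kw / gpu_power_kw)
iss_structures_needed = data_center_gpus / gpus_per_iss

print(f"One ISS-sized array powers ~{gpus_per_iss} GPUs")
print(f"A {data_center_gpus:,}-GPU data center needs ~{iss_structures_needed:.0f} ISS-scale structures")
# → One ISS-sized array powers ~120 GPUs
# → A 100,000-GPU data center needs ~833 ISS-scale structures
```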
On Earth, cooling relies on convection: air or water carries heat away. In the vacuum of space, there is no convection. All heat must be removed by radiation, a much less efficient process that requires enormous surfaces. NASA itself offers a compelling example: the active thermal control system of the International Space Station. It is an extremely complex network of ammonia loops, pumps, heat exchangers and giant radiators, and even so, its dissipation capacity is on the order of tens of kilowatts. According to the calculations of the aforementioned engineer, cooling the heat generated by high-performance GPUs in space would require radiators even larger than the solar panels that power them. The result would be a colossal satellite, larger and more complex than the ISS, to do a job that is solved far more simply on Earth.

And there is a third factor: radiation. In orbit, electronics are exposed to charged particles that can cause bit errors, unexpected reboots, or permanent damage to chips. Although some tests, such as those carried out by Google with its TPUs, show that certain components can withstand high doses, the failures do not disappear; they only multiply as you scale up. Shielding reduces the risk, but adds mass, and every extra kilo increases the cost of launch. Furthermore, AI hardware has a very short useful life, becoming obsolete within a few years. On Earth it is replaced; in space, it is not. As critics point out, an orbital data center would have to operate for many years to amortize its cost, but it would do so with hardware left behind much sooner.

So why do they keep insisting? The answer seems to lie less in current engineering and more in long-term strategy. All of these projects hinge on launch costs falling drastically. Some estimates put the threshold at about 200 dollars per kilogram for space data centers to compete economically with terrestrial ones.
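Why radiators end up so large follows from the Stefan-Boltzmann law, which governs radiative heat rejection in vacuum. A minimal sketch, assuming an idealized one-sided radiator at 300 K with emissivity 0.9 and ignoring incoming sunlight and Earth albedo (real radiators do worse):

```python
# Radiator area needed to reject heat purely by radiation (Stefan-Boltzmann:
# P = emissivity * sigma * A * T^4). Idealized assumptions: one-sided radiator,
# no absorbed sunlight or albedo, uniform 300 K surface.
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)
emissivity = 0.9        # good radiator coating (assumption)
radiator_temp_k = 300.0 # ~27 °C, a workable electronics temperature
heat_load_w = 1_000_000 # 1 MW, roughly a thousand 1-kW GPUs

flux_w_per_m2 = emissivity * SIGMA * radiator_temp_k**4
area_m2 = heat_load_w / flux_w_per_m2

print(f"Radiative flux at 300 K: {flux_w_per_m2:.0f} W/m^2")
print(f"Area to shed 1 MW: {area_m2:.0f} m^2 (ISS solar arrays: ~2,500 m^2)")
```

Even under these generous assumptions, rejecting a single megawatt takes a radiator comparable in area to the entire ISS solar array, which is the engineer's point: the radiators scale with the heat load just as fast as the panels scale with the power draw.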
That scenario relies on fully reusable rockets like Starship, which have not yet demonstrated that capability at operational scale. Meanwhile, terrestrial renewables keep getting cheaper and storage systems improve year after year. Furthermore, the space narrative serves another function, because it positions …
