A third of the world's data centers are in a single country

Currently there are more than 11,000 data centers operating worldwide, which is no small number. Given the huge investments technology companies are making, the figure is set to keep growing rapidly in the coming years. Now, thanks to the interactive map from Data Center Map, we know where they are. The overwhelming majority are in the northern hemisphere, with one country accounting for almost a third of the total.

The United States rules. To no one's surprise, the country with the largest number of data centers is the United States. Considering that the major cloud infrastructure companies are American, this is hardly shocking. In total it has 4,303 data centers spread across the country, but not evenly: in some regions the concentration is extreme. The state of Virginia alone hosts a whopping 668 data centers, more than Germany, the second country on the list with 494.

The weather matters too. We already know that data centers consume a lot of energy, and much of it goes into cooling their components. The hotter it is outside, the more it costs to cool them, and therefore the more energy is consumed, along with water. According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers, the ideal temperature for a data center is between 18 and 27 degrees Celsius. Location has a notable impact on electricity and water costs, which is why technology companies usually choose cooler places to set up their infrastructure.

The south also wants its piece of the pie. It is striking that, despite the temperature recommendation, there are many data centers in countries where heat is a problem. Rest of World has published an extensive analysis of this phenomenon and estimates that at least 600 facilities are operating in areas outside the optimal range. In fact, looking down the list of countries with the most data centers, Indonesia is in third place with 184 facilities, followed by Brazil with 196. Both have an average temperature above 26 degrees, which means that for much of the year temperatures exceed that threshold.

Singapore. A striking case is Singapore, where the average temperature is above 28 degrees. It has 78 data centers, a modest figure compared to the countries mentioned above, but they are concentrated in a very small area, making it one of the countries with the highest data center density. Other countries where demand for data centers is rising are India, Vietnam and the Philippines, all of them with quite hot climates.

The heat challenge. Why build in such hot areas? For many countries, keeping data within their own borders matters more than the optimal operating temperature. The risk is that, with temperatures rising year after year, what is now a manageable situation could become a difficult problem to solve, especially in regions such as Southeast Asia and the Middle East. Rest of World reports that Singapore, precisely, hosts an initiative in which more than 20 technology companies and universities are participating with one objective: to develop a cooling system designed specifically for hot, humid climates. The most common cooling system is air-based, but in these areas it is more effective to use a hybrid system that relies on air when possible and water when it is hotter. In areas with extreme temperatures, such as the United Arab Emirates, they are even considering building data centers underground.
In China they are testing an even more radical solution: building a data center under the sea.

Image | ChatGPT, with data from Data Center Map

In Xataka | Aragón is not afraid of AI: it has just approved three more new mega data centers in full commitment to renewables

Energy companies are switching from oil to MW. The new gold mine is supporting data centers

Gluttonous artificial intelligence and its demanding data centers are reshaping decarbonization plans. The world had begun a journey towards renewables, with countries like China and the Europeans betting big, and even some US states getting on board, when data centers arrived with needs that were almost impossible to satisfy. By the end of December 2024 we already knew that data center consumption had skyrocketed, pushing big technology companies to bet not so much on renewables as, above all, on immediately available energy such as gas and even coal. Some were even turning to nuclear to be able to operate. Shortly after, in January 2025, a Reuters report noted that European energy companies, which had embarked on a path of commitment to renewables, were doubling down on oil and gas. Giants like BP and Shell slowed their investments in clean energy to return to fossil fuel projects. But it is not only about where data centers get their energy from, but also about who provides their infrastructure. And that, rather than oil or gas, may be the next energy mine.

The new oil mine. An article in the Financial Times suggests that the meteoric growth of data centers is creating a market that energy companies do not want to miss. As demand for traditional drilling weakens (although this varies from region to region), energy-sector groups such as Baker Hughes, Halliburton and SLB are taking the opportunity to pivot to the data center sector. Not by building them, and not just by supplying energy, but by supporting the logistics. Drawing on their knowledge of the energy sector, these large companies would provide equipment such as turbines and power generation systems to data center owners, along with generators, batteries, heat dissipation systems and all the framework needed to maintain proper energy efficiency. They would also oversee the equipment. It is, in short, what they already know how to do, applied to a new sector: data centers. Because these three examples are not typical oil companies, but technology providers that help other companies extract gas or oil. All three serve companies with oil fields, but they also supply technology such as gas turbines, compressors and LNG systems, and they were already active in areas such as new energy, with carbon capture and storage systems. All of this echoes the idea 'Big Tech' had when it began building huge data centers, until it realized that increasingly demanding equipment needed more immediate and stable sources of energy.

Data centers = El Dorado. It is estimated that US electricity demand will increase by 90 GW (a staggering figure) between now and 2030 just to power data centers. Traditional electrical grids may not support that load, and that is where these energy-services companies look like a key player. Pivoting toward artificial intelligence infrastructure is "key to the evolution of oil and gas," said Lorenzo Simonelli, CEO of Baker Hughes. And it makes sense when we see that the number of US oil rigs contracted 7% year-over-year in 2025, margins have shrunk and demand for drilling services is in question. At the business level, it is a masterstroke.
Hypothetically speaking, when the next oil crisis arrives and the market for both crude oil and gas falls, the companies that have pivoted to data centers, going from being service providers for energy companies to being service providers for 'Big Tech', will not have to change course, because they will already be where the money is. Although that is another question: whether the new MW gold rush for AI will be a lasting business or a passing fever.

Image | freepik and Harpagornis

In Xataka | The problem with renewables is what to do when there is excess energy. China believes it has the answer with a unique turbine

Why OpenAI is installing Boeing 747 engines in its data farms

Just three years ago, Blake Scholl, CEO of the aviation company Boom Supersonic, had a linear business plan: first he would build the supersonic plane of the future and, much later, he would adapt its engines to generate power. However, a phone call changed the order of things and revealed the desperation of the technology industry. On the other end of the line was Sam Altman. The OpenAI CEO's message was a direct plea: "Please, please, please get us something." Altman was not looking for plane tickets; he was looking for electrical power. This anecdote, reported by the Financial Times, sums up the state of emergency in the sector: artificial intelligence is advancing at breakneck speed, but it has hit the wall of physical infrastructure. While AI evolves in months, permits to connect to the electrical grid can take up to ten years in some regions. Faced with this paralysis, the industry has opted for a "Plan B": bypassing the grid and generating its own energy on site.

The high price of urgency. This strategic shift has profound consequences. The first is economic: the workaround is expensive. According to BNP Paribas analysts, power from a gas plant built for Meta in Ohio costs about $175 per megawatt hour, nearly double the average cost for an industrial customer. The second is environmental. Mark Dyson, from the Rocky Mountain Institute, warns that the emissions of these plants are much worse than those of the general grid, which combines efficient gas with renewables. Despite this, the urgency is such that the authorities are giving in. Virginia, the world's data center heartland, is considering relaxing emissions rules to allow generators to run more frequently. Even polluting plants that were headed for retirement, like the Fisk plant in Chicago, have canceled their closure to feed the demand for AI.

From the sky to the data center. The most surprising solution comes from aeronautical engineering, through aeroderivative turbines. The company ProEnergy is buying CF6-80C2 engine cores from the iconic Boeing 747 to rebuild them as ground power units. A single one of these turbines generates 48 megawatts, enough for a city of 40,000 homes (a quick sanity check of that figure appears at the end of this piece). It is not an isolated case. GE Vernova already supplies this technology for the gigantic Stargate (OpenAI/Microsoft) data center in Texas. Blake Scholl himself confirmed that he will sell Crusoe turbines "practically identical" to those of his supersonic planes to finance his aeronautical project.

The return of diesel. Beyond aviation turbines, the sector is rescuing the most reviled fuel: diesel. The manufacturer Cummins has already sold 39 gigawatts of generating capacity to data centers, doubling its figures this year. What was once emergency equipment for power outages is now in demand as a primary energy source. The situation has escalated to the US Government. The Secretary of Energy, Chris Wright, suggested on Fox News an almost war-economy measure: requisitioning the backup generators of data centers or large stores like Walmart to turn them over to the grid when the general system falters.

The ignored alternative: is smoke necessary? Not everyone agrees that the return to fossil fuels is inevitable. A study by researchers at Stripe, Paces and Scale Microgrids maintains that the future lies in "off-grid" solar microgrids. According to their calculations, a system with 44% solar energy is already as cheap as gas, and one with 90% renewables would surpass nuclear projects in profitability.
The advantage is speed: these solar farms can be built in less than two years in desert areas of Texas or Arizona. Giants like Google have taken note, buying the electric company Intersect Power for 4.75 billion dollars to protect their clean supply and not depend on the grid. However, most of the industry prefers diesel and familiar gas, out of technical inertia and the prosaic fear that the cloud will go out if the sun does not shine.

AI goes physical. The industry finds itself in a technical paradox. To power the most advanced software on the planet, big technology companies are resurrecting combustion engines and burning fossil fuels on a massive scale. Although these "bridge turbines" allow AI to keep growing today, experts cited by the Financial Times warn that this fever could cool as the tech giants reduce their capital spending. For now, the cloud has had to come down to earth. The future of artificial intelligence, ironically, depends not only on brilliant code, but on who controls what lies underground and who manages to turn on enough "plugs" so that the greatest technological revolution of our era is not left in the dark.

Image | freepik and Harpagornis

In Xataka | The exorbitant deployment of data centers for AI has a new problem: salt caverns
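As referenced above, a quick sanity check of the "48 megawatts for 40,000 homes" figure. This is a minimal sketch, not from the article; the average US household consumption used (roughly 10,500 kWh per year) is an approximate, assumed figure.

```python
# Rough sanity check: can one 48 MW aeroderivative turbine cover ~40,000 homes?
TURBINE_MW = 48          # figure cited in the article
HOMES = 40_000           # figure cited in the article

implied_kw_per_home = TURBINE_MW * 1_000 / HOMES     # average draw implied by the claim
print(f"Implied average draw per home: {implied_kw_per_home:.2f} kW")   # ~1.2 kW

# Assumed (approximate) average US household consumption: ~10,500 kWh per year.
ASSUMED_KWH_PER_YEAR = 10_500
avg_kw_per_home = ASSUMED_KWH_PER_YEAR / 8_760       # hours in a year
print(f"Assumed average US household draw: {avg_kw_per_home:.2f} kW")   # ~1.2 kW
```

Both figures come out around 1.2 kW of average draw, so the "40,000 homes" framing is consistent with average consumption, although it says nothing about peak demand.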

The exorbitant deployment of data centers for AI has a new problem: salt caverns

In the collective imagination, artificial intelligence is an ethereal cloud of algorithms. The reality is much more complex, and what we know for sure is that it is an energy glutton that needs to "eat" constantly. Satya Nadella, CEO of Microsoft, summed it up with unusual bluntness: "The problem is no longer a shortage of Nvidia chips, but that there are not enough plugs." And for those plugs to have power 24 hours a day with the 99.999% reliability the sector demands, Big Tech has ended up looking where no one expected: thousands of meters below the ground, towards salt caverns.

When the bits hit the underground. The AI race has entered a "slow start" phase in the construction of these underground caverns, which could hinder the rollout of data centers. According to Fortune, the reason is mathematical: these digital infrastructures do not tolerate interruptions and require extreme reliability. To guarantee that constant flow, natural gas has become the indispensable backup. However, as they explain, it is not enough to produce gas; you have to store it. Industry projections indicate that only about half of the storage needed to meet future demand has been planned. Without these artificial caves dug thousands of meters below the surface, hyperscalers (Google, Amazon, Meta) are left at the mercy of gas pipelines, vulnerable to corrosion, landslides or extreme weather events.

But why salt caverns? The technical answer lies in flexibility. As experts explain in Fortune, there are two ways to store gas: in depleted oil and gas fields or in salt caverns. The former are cheaper, but structurally slow: gas is injected in summer and extracted in winter, following a classic seasonal cycle. AI, on the other hand, does not understand seasons. Its demand peaks are constant, sudden and hard to predict. Salt caverns, created by injecting water to leach out the mineral, act as a high-pressure lung: they allow gas to be injected and withdrawn far more frequently, adapting to the volatility of the electrical grid that powers the servers (a toy comparison of the two storage types appears at the end of this piece).

The "supercycle 2.0". Given this scenario, companies like Enbridge have taken the lead. Greg Ebel, the company's CEO, has confirmed that they are expanding their facilities in Egan (Louisiana) and Moss Bluff (Texas). "This demand dramatically changes the economics of supply," he said. But it is not enough. Jack Weixel, an East Daley Analytics analyst, warns that double the capacity currently planned is needed. Projects such as the Freeport Energy Storage Hub (FRESH), in Houston, seek to connect up to 17 gas pipelines to a new salt dome by 2028, but construction times, often exceeding four years, clash with the urgency of AI. For his part, Jim Goetz, CEO of Trinity Gas Storage, calls it the "storage supercycle 2.0". His company has just reached a final investment decision (FID) to expand its capacity in East Texas, seeking to support critical infrastructure such as Stargate, the titanic $500 billion project from OpenAI and Microsoft.

The shadow of a doubt. The underlying question is not only whether salt caverns work (they do), but what type of energy system they are consolidating. Natural gas is fast, flexible and reliable, but it also introduces new dependencies and risks. According to analysts, gas infrastructure on the Gulf Coast is especially vulnerable to extreme weather events. A direct hurricane over Texas or Louisiana can disrupt production, exports and transportation at the same time.
In that scenario, even with gas available in other regions, the lack of nearby storage can leave data centers without electrical backup. Added to this is the question of price. The sustained increase in demand to fuel data centers, LNG exports and reindustrialization is already pushing up gas and electricity bills. Without enough storage capacity, that volatility is amplified. As the sector points out, storage acts as a buffer; when it is missing, the peaks are passed straight on to the consumer. Furthermore, the criticism is more structural: AI is pushing to prolong dependence on fossil fuels just when governments and companies had committed to reducing it.

Looking beyond gas. Aware of this physical limit, large technology companies are no longer looking only at salt caverns and gas pipelines. They are looking for any firm source of electricity that does not depend exclusively on the traditional energy market. One example is Fervo Energy, a geothermal startup that has just closed one of the largest financing rounds in the sector, with Google as both investor and client. Its commitment to advanced geothermal, constant electricity 24 hours a day, reflects the extent to which AI is redrawing the energy map. This is not an immediate or universal solution, but it is a clear signal: the problem is no longer technological, but energetic.

A problem only in the United States? The United States is the epicenter, but not the only stage. The clash between AI and energy is global, although responses vary. In Europe, the rise of AI is prompting a rethink of the closure of gas and coal plants. Some electricity companies are negotiating to convert old plants into data centers, taking advantage of their access to the grid, water and already depreciated infrastructure. The logic is the same: firm, immediate, available energy. China, for its part, has chosen another path. Beijing not only promotes underwater data centers and large energy clusters in interior provinces, but directly subsidizes the electricity that powers its AI. The objective is to lower the cost of the "fuel" for its digital models and compensate for the lower energy efficiency of national chips compared to Nvidia's.

The return to the underground. In all cases, the pattern repeats itself. Renewables are growing, but not fast enough or with the stability needed to sustain the demand for AI in the short term. Gas, with salt caverns, temporary turbines or recycled plants, becomes the inevitable crutch. In our race to create an intelligence that lives on the plane of ideas, we have ended up returning to mining, drilling, and the depths of the Earth. The future …
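As mentioned above, a toy comparison of the two storage types. This is a minimal sketch with made-up, purely illustrative parameters (the working capacities and maximum daily flow rates below are assumptions, not figures from the article); the point is only that a facility that can move a larger fraction of its working gas per day can complete many more inject-withdraw cycles per year.

```python
# Toy model: how many full inject/withdraw cycles per year can a storage site complete?
# All numbers below are illustrative assumptions, not data from the article.

def max_cycles_per_year(working_capacity_bcf: float, max_daily_flow_bcf: float) -> float:
    """One full cycle = withdrawing and then re-injecting the whole working volume."""
    days_per_cycle = 2 * working_capacity_bcf / max_daily_flow_bcf  # drain + refill
    return 365 / days_per_cycle

# Depleted field (assumed): large volume, slow flow -> roughly one seasonal cycle.
print(f"Depleted field: ~{max_cycles_per_year(100, 0.6):.1f} cycles/year")

# Salt cavern (assumed): smaller volume, much faster flow -> many cycles.
print(f"Salt cavern:   ~{max_cycles_per_year(10, 1.0):.1f} cycles/year")
```

Under these assumptions the depleted field barely completes one seasonal cycle, while the cavern can turn over its working gas many times a year, which is exactly the flexibility the article describes.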

The real reason why Musk, Bezos and Pichai want to build data centers in space: bypass regulation

The construction of data centers is proliferating so much that, although the largest in the world are in Kolos (Norway), The Citadel (United States) and China, you can now find them even in Botorrita, in the province of Zaragoza. The limit is the sky. Or not even that, because Silicon Valley has set its sights on putting data centers in space, and the main big tech companies are making moves to achieve it. Former Google CEO Eric Schmidt bought the rocket company Relativity Space with that objective. Nvidia has backed the startup Starcloud in its project to launch the first NVIDIA H100 GPU into space a few weeks ago, and Elon Musk has even condensed how he would do it into a tweet: "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links." Jeff Bezos, for his part, had already slipped it into a prediction at Italian Tech Week: we will see "giant training clusters" of AI in orbit in the next 10 to 20 years.

The moon is a gift from the universe. The next question is "why?". The reality is that there is no shortage of reasons. AI is a real energy guzzler and, as demand keeps growing, space offers a couple of differential advantages over Earth: almost unlimited energy and free cooling. On the one hand, in a sun-synchronous orbit solar panels receive energy almost continuously. On the other, you can install a radiator so large that space functions as a kind of "infinite heat sink at -270°C". The enormous amounts of water essential for cooling on Earth would not be needed. Let's face it: today there are no data centers in space. But it may not be too far off: University of Central Florida research professor and former NASA member Phil Metzger estimates that within a decade or so it could be economically viable. He is so convinced of its viability that he considers taking AI servers into space to be "the first real business case that will give way to many more" ahead of a future human migration beyond Earth.

So, for now, they try it on Earth. The consequence: Donald Trump has declared an energy emergency because of the enormous electricity demand expected for the coming years. As the power grid catches up (or tries to), AI companies have decided to move from a passive to a proactive position. Meta is going to become an electricity marketer. Elon Musk's xAI is using gas turbines as a temporary energy source. OpenAI is pushing the United States government to lend electricity companies a hand so they can add 100 gigawatts per year. That figure may not mean much on its own, but it is astronomical: what OpenAI is asking is that the United States build almost an entire Spain's worth of generating capacity (around 145 GW, counting the 129 GW consolidated at the end of 2024 plus the solar and wind deployment of 2025) every year and a half (a quick check of that arithmetic appears at the end of this piece).

AI is growing faster than electrical bureaucracy is advancing. How could the Trump Administration help? With the eternal bureaucracy. On Earth, companies face great technical challenges, but they also face a legislative wall. To get more energy, the simplest and most immediate step is to build new power plants, but that means successfully navigating the tangle of procedures that slow the process down. There is only one small problem: in the United States, depending on the technology, it can take five to ten years... if you're lucky. Interconnection to the grid alone can take six years, after surviving an interconnection queue with more than 2,000 GW of projects already in line.
Then come up to four years of federal and environmental permits, followed by another couple of years of state and local licenses that must all come through. They call it the 'permit stack'. And the journey does not end there: they must also dodge the citizen movement 'Not In My Backyard' (NIMBY, roughly "yes, but not next to my house"), which has already pushed back the Battle Born Solar Project (Nevada), which was going to be the largest solar plant in the United States, and the Danskammer gas plant (New York), among others. This can delay operations even further, since rights of way must be negotiated with individual owners who may refuse, sending the process back through the courts.

The never-ending story. To avoid NIMBY processes that last fifteen years or more, companies like OpenAI or Microsoft are buying plants that already exist, such as Three Mile Island, which is going to reopen just for Microsoft, instead of trying to build new ones from scratch. Amazon has also signed up infrastructure that is already on the grid, like the Talen Energy Campus, and it has partnered with Dominion Energy and X-energy to develop mini reactors (SMRs). SMRs are also Google's solution, in this case thanks to an agreement with Kairos Power. All of it to avoid that tangle of 'permit stack' procedures which, in practice and according to estimates, makes it faster to opt for the space route than to build a power plant on the old, familiar Earth. At the end of the day, for AI companies "the moon is a gift from the universe", as Jeff Bezos already glimpsed.

In Xataka | Musk has created the perfect circle: Tesla's megabatteries power the AI that will define its next cars

In Xataka | Researchers have dismantled the batteries of Tesla and BYD. You already know which one performs better and is much cheaper.

Cover | İsmail Enes Ayhan and NASA
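As referenced above, a quick check of the "an entire Spain every year and a half" comparison, using only the figures quoted in the text (129 GW of Spanish installed capacity at the end of 2024, roughly 145 GW once 2025 solar and wind additions are counted, and the 100 GW per year OpenAI is asking for):

```python
# Sanity check of the "one Spain every year and a half" comparison, using the article's figures.
SPAIN_GW_END_2024 = 129       # installed capacity cited for the end of 2024
SPAIN_GW_WITH_2025 = 145      # rough total once 2025 solar and wind additions are included
OPENAI_ASK_GW_PER_YEAR = 100  # new capacity OpenAI wants the US to add each year

years_per_spain = SPAIN_GW_WITH_2025 / OPENAI_ASK_GW_PER_YEAR
print(f"At 100 GW/year, one 'Spain' of capacity every {years_per_spain:.2f} years")
# -> 1.45 years, i.e. roughly every year and a half, as the text says
```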

Something is going wrong with AI. To power data centers, the US is turning to energy solutions it thought it had buried

The race to develop and operate increasingly powerful artificial intelligence models comes at a cost that is rarely at the center of the technological narrative. It is not in the chips or the software, but in the huge amount of electricity needed to keep data centers running around the clock. In the United States, this pressure is already translating into concrete decisions: polluting power plants that were headed for retirement are being restarted to cover growing peaks and tensions on the grid. The paradox is evident: the most ambitious advance in the technology sector depends, for the moment, on energy solutions from another era. The problem is not so much an absolute shortage of electricity as a timing mismatch. Demand from AI-linked data centers is growing much faster than new electrical generation, especially renewable generation, can be brought online in the short term. Building large energy infrastructure takes years, while these complexes can go up in much shorter time frames. Faced with this mismatch, grid operators and electricity companies are turning to what already exists and can be activated immediately, even if it is more polluting.

PJM in context. The clash between electricity demand and supply is seen with particular clarity in the PJM region, the largest electricity market in the United States, which covers 13 states and concentrates a very significant share of the country's data centers. We can think of it as a large regional electricity exchange that coordinates generation, prices and grid stability in real time. There, the growth of AI-linked data centers is testing a system designed for a very different consumption pattern, making PJM the first thermometer of a problem that is beginning to appear elsewhere.

What a peaker plant is. So-called peaker, or peaking, plants are facilities designed to come online only during short periods of peak demand, such as heat waves or winter demand spikes, when the system needs immediate reinforcement. They are not designed to operate continuously, but to react quickly. According to a report by the US Government Accountability Office, these facilities generate just 3% of the country's electricity but account for nearly 19% of installed capacity, a reserve that is now being used far more often than expected (a rough estimate of what that implies for their utilization appears at the end of this piece).

South view of the Fisk plant in Chicago.

The case of the Fisk plant, in the working-class neighborhood of Pilsen in Chicago, illustrates well how this shift plays out on the ground. It is an oil-fired facility, built decades ago and scheduled to be retired next year, that had been relegated to an almost token role. The arrival of new electricity demand associated with data centers changed that equation. Matt Pistner, senior vice president of generation at NRG Energy, told Reuters that the company saw an economic case for keeping the units and therefore withdrew the closure notice, a decision that brings activity back to a site many residents believed was permanently shutting down.

When the price rules. The change is not explained only by technical needs, but also by very clear market signals. In PJM, the prices paid to generators to guarantee supply at times of maximum demand skyrocketed this summer, up more than 800% from the previous year.
An analysis by the aforementioned agency shows that about 60% of the oil, gas and coal plants scheduled for retirement in the region postponed or canceled those plans this year, most of them peaker units, precisely the ones that best fit this new scenario of relative scarcity. The bill for this energy shift is paid above all at the local level. Peaker plants tend to be older facilities, with lower chimneys and fewer pollution filters than other plants, which increases the impact on their immediate surroundings when they operate more frequently.

Coal is also postponed. The phenomenon is not limited to oil- or gas-fired peaker plants. At a national scale, several utilities have begun to delay the closure of coal plants that were part of their climate commitments. A DeSmog analysis identified at least 15 retirements postponed since January 2025 alone, facilities that together account for about 1.5% of US energy emissions. Dominion Energy offers a clear example: in 2020 it promised to generate all its electricity from renewables by 2045, but after the company projected that data center demand in Virginia will quadruple by 2038, it is now taking a step back.

Images | Xataka with Gemini 3 Pro | Theodore Kloba

In Xataka | A former NASA engineer is clear: data centers in space are a horrible idea
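As referenced above, a rough estimate of what "3% of electricity from 19% of capacity" implies about how little peaker plants actually run. This is a minimal sketch; the US totals used (around 4,200 TWh generated per year and roughly 1,250 GW of installed capacity) are approximate, assumed figures, not from the article.

```python
# Rough estimate of the implied capacity factor of US peaker plants.
# US totals are approximate assumptions; the 3% / 19% shares come from the GAO report cited above.
US_GENERATION_TWH = 4_200     # assumed annual US generation, order of magnitude
US_CAPACITY_GW = 1_250        # assumed total installed capacity

peaker_energy_twh = 0.03 * US_GENERATION_TWH      # 3% of electricity generated
peaker_capacity_gw = 0.19 * US_CAPACITY_GW        # 19% of installed capacity

hours_per_year = 8_760
capacity_factor = (peaker_energy_twh * 1_000) / (peaker_capacity_gw * hours_per_year)
print(f"Implied peaker capacity factor: {capacity_factor:.1%}")   # roughly 6%
```

In other words, under these assumptions the peaker fleet runs the equivalent of only a few hundred hours a year at full output, which is why using it "far more often than expected" is such a notable shift.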

Data centers in space are a horrible idea

Artificial intelligence has turned energy into the new technological bottleneck. And faced with that limit, some of the largest companies in the world have begun to look up. To give some examples: Jeff Bezos has spoken of "giant AI clusters orbiting the planet" within a decade or two. Google has experimented with running artificial intelligence calculations on solar-powered satellites. Nvidia is backing startups that want to launch GPUs into space. Even OpenAI has explored the purchase of a rocket company to secure its own path off Earth. The promise is seductive: solar-powered data centers running around the clock, without power grids or cooling towers. The problem is that, when you move from the story to the physics, the engineering and the numbers, the idea begins to break down.

Data centers in space. There is a question hanging over all of this: why do technology companies want to send data centers to space? The motivation, at first glance, is clear. According to data from the International Energy Agency, data center electricity consumption could double by 2030, driven by the explosion of generative AI. Training and running models like ChatGPT, Gemini or Claude requires massive amounts of electricity and huge volumes of water for cooling. In many places, these projects are already running into local opposition or the physical limits of the grid. In this context, space appears as a tempting solution. In certain orbits, solar panels can receive almost constant light, without clouds or night cycles. Besides, as Bezos and other proponents explain, the vacuum of space seems to offer an ideal environment for dissipating heat without resorting to cooling towers or millions of liters of fresh water. By this argument, space data centers would be more efficient, more sustainable and, over time, even cheaper than terrestrial ones. For some executives, it would not be an eccentricity, but the "natural evolution" of an infrastructure that already began with communications satellites.

When engineers raise their hands. Against the enthusiasm of corporate statements, several space engineering experts have been much more blunt. In one of the most cited texts on the subject, a former NASA engineer with a PhD in space electronics and direct experience in AI infrastructure at Google sums up his position bluntly: "This is a terrible idea and it doesn't make any sense." His criticism is not ideological, but technical. And it starts with the first great myth: the supposed abundance of energy in space.

Solar energy is not magic. The largest solar array ever deployed outside Earth is on the International Space Station. According to NASA data, its panels cover about 2,500 square meters and, under ideal conditions, generate between 84 and 120 kilowatts of power, part of which is used to charge batteries for periods in shadow. To put that in context, a single modern AI GPU consumes on the order of 700 watts, and in practice around 1 kilowatt once losses and auxiliary systems are taken into account. With those figures, an infrastructure the size of the ISS could barely power a few hundred GPUs. As this engineer explains, a modern data center can house tens or hundreds of thousands of GPUs. Matching that capability would require launching hundreds of structures with the size, and complexity, of the International Space Station. And even then, each would be equivalent to just a few racks of terrestrial servers.
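A minimal back-of-the-envelope check of that comparison, using only the figures quoted above (ISS array output of 84 to 120 kW, and 0.7 to 1 kW per GPU once overheads are included):

```python
# How many AI GPUs could an ISS-sized solar array power, using the figures cited above?
ISS_POWER_KW = (84, 120)        # NASA-cited output range under ideal conditions
GPU_DRAW_KW = (0.7, 1.0)        # per-GPU draw, from chip spec to practical all-in figure

best_case = ISS_POWER_KW[1] / GPU_DRAW_KW[0]   # most power, leanest GPUs
worst_case = ISS_POWER_KW[0] / GPU_DRAW_KW[1]  # least power, all-in draw
print(f"GPUs per ISS-sized array: roughly {worst_case:.0f} to {best_case:.0f}")
# -> on the order of 100 to 170 GPUs, versus tens or hundreds of thousands in a large
#    terrestrial data center, hence the "hundreds of ISS-sized structures" point.
```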
The nuclear alternative does not solve the problem either: the nuclear generators used in space, RTGs, produce between 50 and 150 watts. In other words, not even enough to power a single GPU.

Space is not a refrigerator. The second big argument against orbital data centers is cooling. It is frequently repeated that space is cold, and that this would make it easier to dissipate heat from the servers. According to engineers, this is one of the most misleading ideas in the entire debate. On Earth, cooling is based on convection: air or water carries the heat away. In the vacuum of space, there is no convection. All heat must be removed by radiation, a much less efficient process that requires enormous surfaces (a rough Stefan-Boltzmann estimate follows at the end of this piece). NASA itself offers a compelling example: the active thermal control system of the International Space Station. It is an extremely complex network of ammonia loops, pumps, exchangers and giant radiators, and even so its dissipation capacity is on the order of tens of kilowatts. According to the calculations of the aforementioned engineer, rejecting the heat generated by high-performance GPUs in space would require radiators even larger than the solar panels that power them. The result would be a colossal satellite, larger and more complex than the ISS, to carry out a task that is solved much more simply on Earth.

And there is a third factor: radiation. In orbit, electronics are exposed to charged particles that can cause bit errors, unexpected reboots or permanent damage to chips. Although some tests, such as those carried out by Google with its TPUs, show that certain components can withstand high doses, the failures do not disappear; they only multiply. Shielding reduces the risk, but adds mass. And every extra kilo increases the cost of the launch. Furthermore, AI hardware has a very short lifespan, as it becomes obsolete within a few years. On Earth it is replaced; in space, it is not. As critics point out, an orbital data center would have to operate for many years to amortize its cost, but it would do so with hardware that falls behind much sooner.

So why do they keep insisting? The answer seems to lie less in current engineering and more in long-term strategy. All of these projects depend on launch costs falling drastically. Some estimates talk about thresholds of about 200 dollars per kilo for space data centers to compete economically with terrestrial ones. That scenario relies on fully reusable rockets like Starship, which have not yet demonstrated that capability at operational scale. Meanwhile, terrestrial renewables keep getting cheaper and storage systems improve year after year. Furthermore, the space story fulfills another function, because it positions …
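As referenced above, a rough Stefan-Boltzmann estimate of the radiator area a 1 MW cluster (roughly a thousand GPUs at 1 kW each) would need. This is a minimal, idealized sketch: the radiator temperature, emissivity and effective sink temperature are assumptions, and real designs need considerably more area once solar loading, Earth infrared, view factors and redundancy are accounted for.

```python
# Idealized radiator sizing for rejecting 1 MW in orbit via thermal radiation alone.
# Radiator temperature, emissivity and sink temperature below are assumptions.
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9        # assumed radiator emissivity
T_RADIATOR_K = 330.0    # assumed radiator temperature (~57 degC coolant loop)
T_SINK_K = 255.0        # assumed effective sink (Earth IR + albedo, not 3 K deep space)
HEAT_LOAD_W = 1_000_000 # ~1,000 GPUs at ~1 kW each, from the figures above

flux_per_side = EMISSIVITY * SIGMA * (T_RADIATOR_K**4 - T_SINK_K**4)  # W/m^2
area_both_sides = HEAT_LOAD_W / (2 * flux_per_side)                   # radiating from both faces
print(f"Net flux per side: {flux_per_side:.0f} W/m^2")
print(f"Idealized radiator area for 1 MW: ~{area_both_sides:.0f} m^2")
```

Even this optimistic estimate, around 1,300 square meters for just 1 MW of IT load, is roughly half the area of the ISS's entire solar array, which is why the engineer's conclusion that radiators end up rivaling or exceeding the panels themselves is plausible once real-world losses are included.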

AI data centers are sending your electricity bill skyrocketing

Data centers consume a lot of electricity; hence proposals as wild as taking them to space or submerging them in the sea to cut their consumption. Technology companies face a shortage of electrical energy, but the real problem is something else: data centers are driving up the electricity bill for all citizens. Now three US senators want to investigate it thoroughly.

A political question. The New York Times reports that three Democratic senators have announced they are going to investigate big technology companies over their role in rising electricity bills. The senators have sent letters to Microsoft, Google, Meta, Amazon, CoreWeave and other companies asking them to detail exactly what their data centers consume. Bill increases have become a political issue and have played an important role in elections in several states. In Virginia, home to the largest concentration of data centers in the world, Governor Abigail Spanberger's campaign included proposals to require data centers to "pay their fair share."

The problem. For the past 20 years, the US electricity system had been stuck with flat demand or very modest increases. Data centers have brought very abrupt growth. In 2023, data centers consumed 4% of all electricity in the United States, and that share is estimated to rise to 12% by 2028. This sudden jump in demand has forced electricity companies to modernize the grid. Technology companies assume part of the cost, but not all of it, and the way to recover that investment is through the bills of all grid users.

The discount trick. Tech companies such as Amazon insist that their data centers are not raising bills and that they cover all the costs, helping to improve the grid for everyone. What they do not say is that they benefit from enormous discounts, like the one Amazon itself requested from regulatory authorities in Ohio in 2024, where it is building a data center: a discount on the electricity rate. The problem is that the agreement is opaque and we do not know how large the discount was, but it is estimated that it could be 135 million dollars per year, over a period of 10 years.

Who really pays? In many cases, technology companies pay for the infrastructure needed to expand the grid, but what about those discounts? According to a paper published by the Harvard Electricity Law Initiative, which reviewed more than 50 regulatory cases, it is very common for electricity companies to offer subsidies to attract technology companies, and the way to make up for those discounts is to pass them on to all grid subscribers, which ends up raising the bill.

Unaffordable increases. According to the United States Energy Information Administration, in September electricity prices rose 7% compared to the same period of the previous year. Things change near the data centers, where increases have reached 267%, unaffordable figures for many citizens.

Proposals. Some states are already legislating to prevent grid customers from ending up footing the bill for data centers. This is the case of Michigan, which has set special rules for data centers. Companies must sign a contract of at least 15 years, face fines if they cancel early and pay for at least 80% of the contracted power even if they do not use it (a small sketch of how that minimum-take rule plays out follows below). In addition, they must pay all the costs of the lines and services built to serve them.
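As referenced above, a small sketch of how a minimum-take rule like Michigan's 80% floor works in practice. This is a hypothetical illustration: the contracted capacity, actual usage and price used here are made-up numbers, not figures from the article or from Michigan's actual tariff.

```python
# Hypothetical illustration of an 80% minimum-take clause for a data center contract.
# Contracted capacity, usage and price below are made-up numbers for illustration only.
MIN_TAKE = 0.80  # the data center pays for at least 80% of contracted power

def monthly_energy_bill(contracted_mw: float, used_mw_avg: float, usd_per_mwh: float) -> float:
    """Bill based on average demand over a 730-hour month, with the minimum-take floor."""
    billed_mw = max(used_mw_avg, MIN_TAKE * contracted_mw)
    return billed_mw * 730 * usd_per_mwh

# A hypothetical 300 MW contract that is only half used still gets billed as 240 MW.
print(f"${monthly_energy_bill(contracted_mw=300, used_mw_avg=150, usd_per_mwh=70):,.0f}")
```

The point of the rule is that underused connections still pay for most of the grid capacity reserved for them, instead of shifting that fixed cost to other customers.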
However, these proposals could run into difficulties because of the executive order signed by Trump, which prevents states from enacting laws that could slow the advance of AI, all in order to win the battle against China.

Image | Google

In Xataka | The United States may win the AI race, but its problem is different: China is winning all the others

Google is serious about putting data centers in space. Elon Musk and Jeff Bezos are rubbing their hands

While some municipalities are still debating whether to let big technology companies install data centers in their territory, Google wants to go a step further: taking data centers to space.

Google. The company revealed its intentions a few weeks ago, and its Suncatcher project aims to put two prototype satellites into orbit before 2027. Curiously, Elon Musk and Jeff Bezos are more than delighted with their rival's idea.

Project Suncatcher. Pushing the capabilities of artificial intelligence requires training it and, for that, huge data centers with spectacular computing power are needed. The problem is that the energy needs of these facilities are astronomical; they have become resource sinks, prompting oil companies to set aside their renewable energy plans and even floating the opening of "private" nuclear power plants. Suncatcher could not have a more appropriate name. In space, without the influence of the atmosphere, solar panels capture the light spectrum differently, enough to feed those seemingly insatiable data centers, and what Google proposes is to build constellations of dozens or hundreds of satellites orbiting in formation at about 650 kilometers of altitude. Each of them would be fitted with Trillium TPUs (processors designed specifically for AI calculations) and they would be connected to each other via optical laser links.

Pichai brings it up everywhere. Although 2027 is the key date, it is evident that Google is very interested in airing its plans, because it is both a show of technological power and an invitation for interested parties to invest in the process (and a way of keeping everything around AI inflated). And the person practicing this speech the most is the company's own CEO, Sundar Pichai. Since we learned of Google's plans, Pichai has brought the topic up in every interview he has given. He does not say anything new beyond the hope of having TPUs in space in 2027 and the ambition that, within a decade, extraterrestrial data centers will be the norm.

Musk and Bezos: competitors, but allies. If Google is interested in selling its narrative, so are two of its most direct competitors: Elon Musk and Jeff Bezos. Both Musk, with several of his companies, and Bezos, with Amazon Web Services, are in the race for data centers and artificial intelligence. They have some of the largest on the planet, but they also have something the rest of the competitors do not: the ability to launch things into space. Musk with SpaceX and Bezos with Blue Origin have the tools to put satellites into orbit, charging for each kilo they launch. And that is the point: the more credible it seems that the future of computing is in low Earth orbit, the more economic and political sense they will make. SpaceX and Blue Origin are Google's competition, but also Google's option for achieving its objective. Ultimately, we keep seeing rival companies renting each other's services.

Data center fever in space. The truth is that, at first, building these extraterrestrial data centers sounds like a crazy plan, but from the most pragmatic point of view (leaving aside the logistics and the money that both development and each launch will cost), it is a plan that makes sense. In space, a panel can produce up to eight times more than on the Earth's surface, in addition to generating electricity continuously because it does not depend on day/night cycles (a quick check of that factor appears at the end of this piece).
That would eliminate the need for huge batteries, but also for complex water-based cooling systems. And, as we said, Google is not alone in this: there is currently a fever for space data centers, with the big technology companies in the spotlight.

Considerable challenges. Now, Google itself admits it will not be easy to pull off this strategy. First, the costs. The company argues that launch prices could fall from several thousand dollars per kilo to just $200/kg by the mid-2030s if the industry consolidates. It notes that, in that case, the cost of launching and operating a space data center could be comparable to the energy costs of an equivalent terrestrial data center. Another difficulty will be keeping the satellites in a tight formation: they would have to stay within 100-200 meters of each other for the optical links to be viable. And most important of all: the radiation tolerance of the TPUs. Google has been experimenting with this for years, but it still has to test the effects of radiation on sensitive components such as HBM memory. Astronomers will surely be delighted with this strategy, just as they are with Starlink.

Image | ESA

In Xataka | We are launching more things into space than ever before. And the next problem is already on the table: how to pollute less
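As referenced above, a quick check of the "up to eight times more" figure. This is a minimal sketch: the solar constant (about 1,361 W/m² above the atmosphere) and the 1,000 W/m² standard test irradiance are physical reference values, while the terrestrial capacity factor and orbital sunlight fraction used here are assumptions.

```python
# Rough check of "a panel in orbit can yield ~8x more than on the ground".
SOLAR_CONSTANT = 1361.0        # W/m^2 above the atmosphere (AM0)
STC_IRRADIANCE = 1000.0        # W/m^2, standard rating condition for panels on Earth
GROUND_CAPACITY_FACTOR = 0.17  # assumed: clouds, night and sun angle at a decent site
ORBIT_SUNLIGHT_FRACTION = 0.99 # assumed: near-continuous sun in a dawn-dusk orbit

# Annual energy per watt of rated panel capacity, relative to its rating.
ground_yield = GROUND_CAPACITY_FACTOR
orbit_yield = (SOLAR_CONSTANT / STC_IRRADIANCE) * ORBIT_SUNLIGHT_FRACTION
print(f"Orbit vs ground yield: ~{orbit_yield / ground_yield:.1f}x")   # ~7.9x
```

With those assumptions the ratio lands right around the "up to eight times" Google quotes; a sunnier terrestrial site or a less favorable orbit would move it in either direction.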

We have been talking theoretically about data centers in space for months. A company already has a plan to set one up in 2027

The Californian startup Aetherflux has announced that it will launch its first data center satellite in the first quarter of 2027. It is the initial node of a constellation the company has named "Galactic Brain", designed to offer in-orbit computing capacity powered by continuous solar energy.

The underlying promise. Aetherflux presents itself as an alternative to the years of construction that terrestrial data centers require. According to Baiju Bhatt, the company's founder and co-founder of the financial firm Robinhood, "the race toward artificial general intelligence is fundamentally a race for computing power and, by extension, energy." The company's bet is to put sunlight next to silicon and bypass the electrical grid entirely.

How the project works. The Galactic Brain satellites will operate in low Earth orbit, taking advantage of solar radiation 24 hours a day, something impossible on land. Advanced thermal systems would remove the constraints faced by terrestrial data centers, which require large amounts of water and electricity for cooling. In addition, the constellation fits within Aetherflux's original plans: transmitting energy from space to Earth using infrared lasers.

The competition is already underway. Aetherflux is not alone in this bet. Google presented its Suncatcher project in November, a plan to launch AI chips into space on solar-powered satellites. Jeff Bezos has also expressed his optimism about large data centers operating in space in the next decade or two, a goal Blue Origin has been working on for more than a year. SpaceX is also working on using Starlink satellites for AI computing loads; Musk himself has written about it.

The real obstacles. Although launch costs have fallen considerably, they remain prohibitive. According to recent estimates, launching a kilogram with SpaceX's Falcon Heavy costs around $1,400. Google calculates that if these costs drop to about $200 per kilogram by 2030, as projected, the expense of establishing and operating space data centers would be comparable to that of terrestrial facilities (a small per-satellite illustration appears at the end of this piece). In addition, the chips will have to withstand more intense radiation and avoid collisions in an increasingly congested orbit.

The urgency. Big tech is colliding with physical limits on Earth. Since 2023, dozens of data center projects have been blocked or delayed in the United States due to local opposition over electricity consumption, water use and associated pollution. According to the consulting firm CBRE, limitations in electricity generation have become the main inhibitor of data center growth around the world.

The Aetherflux calendar. The company, founded in 2024 and having raised $60 million in financing, plans first to demonstrate the feasibility of transmitting space energy with a satellite it will launch in 2026. If all goes according to plan, the first Galactic Brain node will arrive in 2027. The company anticipates launching about 30 satellites at a time on a SpaceX Falcon 9 or equivalent, although if Starship becomes an option, it could orbit more than 100 data center satellites in a single launch.

The long-term strategy. Aetherflux has not revealed pricing yet, but it promises multi-gigabit bandwidth with near-constant uptime. Its approach is to continually launch new hardware and quickly integrate the latest architectures. Older systems would run lower-priority tasks until the life of the high-end GPUs is exhausted, which under high utilization and radiation might not be more than a few years.
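As referenced above, a minimal sketch of what those launch prices mean per satellite. The $1,400/kg and $200/kg figures are the ones quoted in the text; the satellite mass is a made-up, purely illustrative assumption, since Aetherflux has not published one.

```python
# Illustrative launch-cost comparison at the two $/kg figures quoted above.
# The satellite mass is a made-up assumption; Aetherflux has not published specs.
PRICE_TODAY_PER_KG = 1_400   # approx. Falcon Heavy figure cited above
PRICE_TARGET_PER_KG = 200    # threshold cited for space data centers to be competitive
ASSUMED_SAT_MASS_KG = 1_500  # hypothetical mass of one data center satellite

cost_today = ASSUMED_SAT_MASS_KG * PRICE_TODAY_PER_KG
cost_target = ASSUMED_SAT_MASS_KG * PRICE_TARGET_PER_KG
print(f"Launch cost per satellite today:  ${cost_today:,.0f}")    # $2,100,000
print(f"Launch cost per satellite target: ${cost_target:,.0f}")   # $300,000
print(f"Ratio: {cost_today / cost_target:.0f}x cheaper at the target price")
```

Whatever the real mass turns out to be, the gap between today's prices and the $200/kg threshold is a factor of seven, which is why every one of these plans leans on Starship-class reusability actually materializing.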
Cover image | İsmail Enes Ayhan and NASA

In Xataka | OpenAI launches GPT-5.2 weeks after GPT-5.1: a maneuver aimed at closing the gap with Google's Gemini 3
