Data centers in space are a horrible idea

Artificial intelligence has turned energy into the new technological bottleneck. Faced with that limit, some of the largest companies in the world have begun to look up. Jeff Bezos has spoken of “giant AI clusters orbiting the planet” within a decade or two. Google has experimented with running AI calculations on solar-powered satellites. Nvidia backs startups that want to launch GPUs into space. Even OpenAI has explored buying a rocket company to secure its own path off Earth. The promise is seductive: solar-powered data centers running around the clock, with no power grids or cooling towers. The problem is that when you move from the story to the physics, the engineering and the numbers, the idea begins to break down. Data centers in space. The obvious question is: why do technology companies want to send data centers to space? The motivation, at first glance, is clear. According to the International Energy Agency, data center electricity consumption could double by 2030, driven by the explosion of generative AI. Training and running models like ChatGPT, Gemini or Claude requires massive amounts of electricity and huge volumes of water for cooling. In many places, these projects are already running into local opposition or the physical limits of the grid. In this context, space appears as a tempting solution. In certain orbits, solar panels can receive almost constant light, with no clouds or night cycles. Besides, as Bezos and other advocates explain, the vacuum of space seems to offer an ideal environment for dissipating heat without resorting to cooling towers or millions of liters of fresh water. By this argument, space data centers would be more efficient, more sustainable and, over time, even cheaper than terrestrial ones. For some executives, this would not be an eccentricity but the “natural evolution” of an infrastructure that already began with communications satellites.
When engineers raise their hands. Faced with the enthusiasm of corporate statements, several space engineering experts have been far more blunt. In one of the most cited texts on the subject, a former NASA engineer with a PhD in space electronics and direct experience with AI infrastructure at Google sums up his position plainly: “This is a terrible idea and it doesn’t make any sense.” His criticism is not ideological but technical, and it starts with the first great myth: the supposed abundance of energy in space. Solar energy is not magic. The largest solar installation ever deployed off Earth belongs to the International Space Station. According to NASA data, its panels cover about 2,500 square meters and, under ideal conditions, generate between 84 and 120 kilowatts of power, part of which goes to charging batteries for the periods spent in shadow. To put that in context, a single modern AI GPU consumes on the order of 700 watts, and in practice around 1 kilowatt once losses and auxiliary systems are taken into account. With those figures, an infrastructure the size of the ISS could barely power a few hundred GPUs. As this engineer explains, a modern data center can house tens or hundreds of thousands of GPUs. Matching that capacity would require launching hundreds of structures of the size, and complexity, of the International Space Station. And even then, each would be equivalent to just a few racks of terrestrial servers. The nuclear alternative does not solve the problem either: the nuclear generators used in space, RTGs, produce between 50 and 150 watts. In other words, not even enough to power a single GPU. Space is not a refrigerator. The second big argument against orbital data centers is cooling. It is frequently repeated that space is cold and that this would make it easier to dissipate heat from the servers. According to engineers, this is one of the most misleading ideas in the entire debate.
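The power gap is easy to verify with the article’s own figures. A quick back-of-envelope sketch, where the 100,000-GPU data center size is a round-number assumption for illustration:

```python
# Back-of-envelope power budget for an ISS-sized orbital platform.
# ISS and GPU figures are from the article; the data center size
# is a round-number assumption.

ISS_POWER_KW = 120        # best-case ISS solar array output (84-120 kW range)
GPU_SYSTEM_W = 1000       # ~1 kW per GPU once losses and auxiliaries count

gpus_supported = ISS_POWER_KW * 1000 // GPU_SYSTEM_W
print(f"GPUs an ISS-sized array could power: {gpus_supported}")

# A modern terrestrial AI data center hosts on the order of 100,000 GPUs.
DATA_CENTER_GPUS = 100_000
stations_needed = DATA_CENTER_GPUS / gpus_supported
print(f"ISS-sized platforms needed to match it: {stations_needed:.0f}")
```

Even with the most optimistic ISS output, a station of that scale powers barely over a hundred GPUs, and matching a single large terrestrial facility would take more than 800 of them.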
On Earth, cooling relies on convection: air or water carries the heat away. In the vacuum of space, convection does not exist. All heat must be removed by radiation, a much less efficient process that requires enormous surfaces. NASA itself offers a compelling example: the active thermal control system of the International Space Station. It is an extremely complex network of ammonia loops, pumps, heat exchangers and giant radiators, and even so, its dissipation capacity is on the order of tens of kilowatts. According to the calculations of the aforementioned engineer, cooling the heat generated by high-performance GPUs in space would require radiators even larger than the solar panels that power them. The result would be a colossal satellite, bigger and more complex than the ISS, to carry out a task that is solved far more simply on Earth. And there is a third factor: radiation. In orbit, electronics are exposed to charged particles that can cause bit errors, unexpected reboots or permanent damage to chips. Although some tests, such as those Google carried out with its TPUs, show that certain components can withstand high doses, the failures do not disappear; they multiply. Shielding reduces the risk but adds mass, and each extra kilo increases the cost of the launch. Furthermore, AI hardware has a very short lifespan, becoming obsolete within a few years. On Earth it is replaced; in space, it is not. As critics point out, an orbital data center would have to operate for many years to amortize its cost, but it would do so with hardware that falls behind much sooner. So why do they keep insisting? The answer seems to lie less in current engineering and more in long-term strategy. All of these projects depend on launch costs falling drastically. Some estimates put the threshold at about $200 per kilogram for space data centers to compete economically with terrestrial ones.
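The radiator problem the engineer describes follows directly from the Stefan-Boltzmann law. A minimal sketch, where the radiator temperature, emissivity and GPU count are illustrative assumptions rather than figures from the article:

```python
# Rough radiator sizing in vacuum: all heat must leave by radiation,
# q = eps * sigma * T^4. Parameter choices are illustrative.

SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
EPSILON = 0.9       # assumed emissivity of the radiator coating
T_RAD_K = 300.0     # assumed radiator temperature (~27 C)

# Flux radiated per square meter, ignoring absorbed sunlight and
# earthshine (which make the real balance worse):
flux = EPSILON * SIGMA * T_RAD_K**4
print(f"Radiated flux: {flux:.0f} W/m^2")

# Heat from 1,000 GPUs at ~1 kW each:
heat_w = 1000 * 1000
area_m2 = heat_w / flux
print(f"Radiator area for 1 MW of GPU heat: {area_m2:.0f} m^2")
```

Roughly 2,400 m² just to reject 1 MW, on the order of the ISS’s entire 2,500 m² solar array, which is why the engineer expects radiators to end up larger than the panels that power the chips.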
That scenario relies on fully reusable rockets like Starship, which have not yet demonstrated that capability at operational scale. Meanwhile, terrestrial renewable energy keeps getting cheaper and storage systems improve year after year. Furthermore, the space narrative serves another function, because it positions …

AI data centers are skyrocketing your electricity bill

Data centers consume a lot of electricity; hence proposals as wild as taking them to space or submerging them in the sea to cut their consumption. Technology companies face a shortage of electrical energy, but the real problem is something else: data centers are driving up the electricity bill for all citizens. Now three US senators want to investigate it thoroughly. A political question. The New York Times reports that three Democratic senators have announced an investigation into big technology companies over their role in rising electricity bills. The senators have sent letters to Microsoft, Google, Meta, Amazon, CoreWeave and other companies asking them to detail exactly what their data centers consume. Bill increases have become a political issue and played an important role in elections in several states. In Virginia, home to the largest concentration of data centers in the world, Governor Abigail Spanberger’s campaign included proposals to require data centers to “pay their fair share.” The problem. For the past 20 years, the US electricity system had been stuck with stable demand or very modest increases. Data centers have grown abruptly: in 2023 they consumed 4% of all electricity in the United States, and that share is estimated to rise to 12% by 2028. This sudden jump in demand has forced utilities to modernize the grid. The technology companies assume part of the cost, but not all of it, and the way to recover that investment is through the bills of all grid users. The discount trick. Tech companies such as Amazon insist that their data centers are not raising bills, that they assume all the costs and that they contribute to improving the grid for everyone.
What they don’t say is that they benefit from enormous discounts, like the electricity-rate discount Amazon itself requested from regulators in Ohio in 2024, where it is building a data center. The problem is that the agreement is opaque and we do not know exactly how large the discount is, but it is estimated at around $135 million per year over a period of 10 years. Who really pays? In many cases, technology companies pay for the infrastructure needed to expand the grid, but what about these discounts? According to a paper published by the Harvard Electricity Law Initiative, which reviewed more than 50 regulatory cases, it is very common for utilities to offer subsidies to attract technology companies and to compensate for those discounts by passing them on to all grid subscribers, which ends up increasing everyone’s bill. Unaffordable increases. According to the United States Energy Information Administration, electricity prices in September were up 7% compared to the same period of the previous year. Things change in the cities near the data centers, where increases have reached 267%, unaffordable figures for many citizens. Proposals. Some states are already legislating to prevent grid customers from ending up footing the bill for data centers. This is the case of Michigan, which has established special rules for data centers: companies must sign a contract of at least 15 years, face fines if they cancel early, and pay for at least 80% of the contracted power even if they do not use it. In addition, they must pay all the costs of the lines and services built to serve them. However, these proposals could run into difficulties because of the executive order Trump signed that prevents states from enacting laws that could slow the advance of AI, all in order to win the battle against China.
Image | Google. In Xataka | The United States may win the AI race, but its problem is different: China is winning all the others

Google is serious about putting data centers in space. Elon Musk and Jeff Bezos are rubbing their hands

While some municipalities are still debating whether to let big technology companies install data centers on their turf, Google wants to go a step further: taking data centers to space. Google. The company revealed its intentions a few weeks ago, and its Suncatcher project aims to place two prototype satellites in orbit before 2027. Curiously, Elon Musk and Jeff Bezos are more than delighted with their rival’s idea. Suncatcher Project. Pushing the capabilities of artificial intelligence requires training it, and that calls for huge data centers with spectacular computing power. The problem is that the energy needs of these facilities are astronomical: they have become resource sinks, leading companies to shelve renewable energy plans and even prompting the opening of “private” nuclear power plants. Suncatcher could not have a more appropriate name. In space, without the influence of the atmosphere, solar panels capture the light spectrum differently, enough to feed those seemingly insatiable data centers. What Google proposes is to build constellations of dozens or hundreds of satellites flying in formation at about 650 kilometers altitude. Each would be equipped with Trillium TPUs (processors specifically designed for AI calculations) and connected to the others via laser optical links. Pichai brings it up everywhere. Although 2027 is the key date, Google is clearly very interested in airing its plans, both as a display of technological power and as an invitation for interested parties to invest in the process (and a way to keep inflating everything around AI). And the person pushing this message hardest is the company’s own CEO, Sundar Pichai. Since we learned of Google’s plans, Pichai has brought the topic up in every interview he has given.
He says nothing new beyond the hope of having TPUs in space in 2027 and the ambition that, within a decade, extraterrestrial data centers will be the norm. Musk and Bezos: competitors, but allies. If Google is interested in selling its narrative, two of its most direct competitors are just as interested: Elon Musk and Jeff Bezos. Both Musk, with several of his companies, and Bezos, with Amazon Web Services, are in the race for data centers and artificial intelligence. They run some of the largest on the planet, but they also have something the rest of the competition does not: the ability to launch things into space. Musk with SpaceX and Bezos with Blue Origin have the tools to put satellites into orbit, charging for each kilo they launch. And there lies the point: the more credible it seems that the future of computing is in low Earth orbit, the more economic and political sense their launch businesses make. Both SpaceX and Blue Origin are Google’s competition, but also Google’s route to achieving its objective. Once again, we see rival companies renting services from one another. Data center fever in space. The truth is that, at first, building these extraterrestrial data centers sounds like a crazy plan, but from a pragmatic point of view (setting aside the logistics and the money that development and each launch will cost), it is a plan that makes sense. In space, a panel can generate up to eight times more energy than on the Earth’s surface, in addition to producing electricity continuously by not depending on day/night cycles. That would eliminate the need for huge batteries, as well as for complex water-based cooling systems. And, as we said, Google is not alone in this: there is currently a fever for space data centers, with big technology companies in the spotlight. Considerable challenges. Google itself admits that this strategy will not be easy to pull off. On the one hand, the costs.
The company estimates that prices could fall from several thousand dollars per kilo to just $200/kg by the mid-2030s if the industry consolidates. In that case, they note, the price of launching and operating a space data center could be comparable to the energy costs of an equivalent terrestrial one. Another difficulty will be maintaining a tight orbit between the satellites: they would have to stay within 100-200 meters of each other for the optical links to be viable. And most important of all: the radiation tolerance of the TPUs. Google has been experimenting with this for years, but it still has to test the effects of radiation on sensitive components such as HBM memory. Astronomers will surely be “delighted” with this strategy, just as they are with Starlink. Image | ESA. In Xataka | We are launching more things into space than ever before. And the next problem is already on the table: how to pollute less
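The “up to eight times” figure the article quotes for orbital panels is plausible on a back-of-envelope basis. In this sketch the irradiance values are standard figures, while the duty-cycle factors are illustrative assumptions, not Google’s numbers:

```python
# Why an orbital panel can out-produce the same panel on the ground.
# Irradiance values are standard; duty cycles are assumed for illustration.

SPACE_IRRADIANCE = 1361    # W/m^2, solar constant above the atmosphere
GROUND_IRRADIANCE = 1000   # W/m^2, standard test condition at sea level

SPACE_SUN_FRACTION = 0.99  # assumed: near-continuous sun in a dawn-dusk orbit
GROUND_CAPACITY = 0.20     # assumed: night, clouds and seasons combined

ratio = (SPACE_IRRADIANCE * SPACE_SUN_FRACTION) / (GROUND_IRRADIANCE * GROUND_CAPACITY)
print(f"Orbital vs ground annual energy yield: ~{ratio:.1f}x")
```

Around 7x with these assumptions; a poorer ground site pushes the ratio toward the 8x the article cites, while a sunny desert site pulls it down.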

We have been talking about data centers in space in theoretical terms for months. One company already has a plan to set one up in 2027

The Californian startup Aetherflux has announced that it will launch its first data center satellite in the first quarter of 2027. It is the initial node of a constellation the company has named “Galactic Brain,” designed to offer in-orbit computing capacity powered by continuous solar energy. The underlying promise. Aetherflux presents itself as an alternative to the years of construction that terrestrial data centers require. According to Baiju Bhatt, founder of the company and co-founder of the financial firm Robinhood, “the race toward artificial general intelligence is fundamentally a race for computing power and, by extension, energy.” The company’s bet is to place sunlight next to silicon and bypass the electrical grid entirely. How the project works. The Galactic Brain satellites will operate in low Earth orbit, taking advantage of solar radiation 24 hours a day, something impossible on the ground. Advanced thermal systems would eliminate the limitations faced by terrestrial data centers, which require large amounts of water and electricity for cooling. In addition, the constellation fits Aetherflux’s original plans: transmitting energy from space to Earth using infrared lasers. The competition is already underway. Aetherflux is not alone in this bet. Google presented its Suncatcher project in November, a plan to launch AI chips into space on solar-powered satellites. Jeff Bezos has also expressed optimism about large data centers operating in space within the next decade or two, a goal Blue Origin has been working toward for more than a year. SpaceX, for its part, is working on using Starlink satellites for AI computing loads. The real obstacles. Although launch costs have decreased considerably, they remain prohibitive. According to recent estimates, launching a kilogram with SpaceX’s Falcon Heavy costs around $1,400.
Google calculates that if those costs drop to about $200 per kilogram by 2030, as projected, the expense of setting up and operating space data centers would be comparable to that of terrestrial facilities. In addition, the chips will have to withstand more intense radiation and avoid collisions in an increasingly congested orbit. The urgency. Big tech is colliding with physical limits on Earth. Since 2023, dozens of data center projects have been blocked or delayed in the United States by local opposition over electricity consumption, water use and associated pollution. According to the consulting firm CBRE, limitations in electricity generation have become the main inhibitor of data center growth around the world. The Aetherflux calendar. The company, founded in 2024, has raised $60 million in financing and plans first to demonstrate the feasibility of transmitting space energy with a satellite it will launch in 2026. If all goes according to plan, the first Galactic Brain node will arrive in 2027. The company anticipates launching about 30 satellites at a time on a SpaceX Falcon 9 or equivalent, although if Starship becomes an option, it could orbit more than 100 data center satellites in a single launch. The long-term strategy. Aetherflux has not revealed pricing yet, but it promises multi-gigabit bandwidth with near-constant uptime. Its approach is to continually launch new hardware and quickly integrate the latest architectures. Older systems would run lower-priority tasks until the useful life of the high-end GPUs is exhausted, which under high utilization and radiation might not be more than a few years. Cover image | İsmail Enes Ayhan and NASA. In Xataka | OpenAI launches GPT-5.2 weeks after GPT-5.1: a maneuver that aims to close the gap with Google’s Gemini 3
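The gap between today’s roughly $1,400/kg and the projected $200/kg can be made concrete with a toy launch budget. The satellite mass here is a hypothetical round number; the 30-satellite batch size is the article’s:

```python
# Launch-bill sensitivity to price per kilogram.
# Satellite mass is hypothetical; batch size is from the article.

SAT_MASS_KG = 1500      # assumed mass of one data center satellite
N_SATELLITES = 30       # one Falcon 9-class batch, per the article

def launch_cost(price_per_kg):
    """Total launch bill for the batch, in dollars."""
    return SAT_MASS_KG * N_SATELLITES * price_per_kg

today = launch_cost(1400)   # ballpark Falcon Heavy price today
target = launch_cost(200)   # the projected 2030 threshold

print(f"At $1,400/kg: ${today / 1e6:.1f}M")
print(f"At $200/kg:   ${target / 1e6:.1f}M")
```

A 7x drop in the per-kilo price cuts the batch’s launch bill from tens of millions of dollars to single digits, which is the economic case these projects ultimately rest on.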

A data center that will run on wind energy

In the silent race the world is waging to dominate digital infrastructure, every move matters. And Brazil, far from being a spectator, once again occupies a strategic place. The arrival of the TikTok project in the Brazilian northeast confirms a shift in the world technology map: critical infrastructure is no longer concentrated only in the United States, Europe or Asia, but is beginning to expand toward regions that offer abundant renewable energy and direct international connections. The announcement. TikTok has decided to install a mega data center in the Pecém Industrial and Port Complex, in the state of Ceará. The company detailed in its press release that it will invest more than 200 billion reais (about 32 billion euros), the largest investment it has made in Latin America. Of that amount, 108 billion will go exclusively to high-tech equipment through 2035; the rest will finance infrastructure, energy systems and future expansions. Operations are planned for 2027, and local authorities estimate the creation of more than 4,000 jobs. The infrastructure the AI era demands. Data centers have become the engine that makes AI, cloud and streaming possible. As Wired notes, the push of artificial intelligence has sent computing demand skyrocketing and opened a global competition to build larger and more efficient infrastructure. Brazil’s interest in attracting data centers is supported both by its renewable energy matrix, cheap and abundant, and by the connectivity Fortaleza offers as the entry point for most of the submarine cables linking the country with the United States, Europe and Africa. A data center powered only by wind. For the initial phase, TikTok will work with Omnia, a local data center operator, and with Casa dos Ventos, one of the largest renewable energy developers in the country. The project is presented as an example of digital infrastructure powered entirely by clean energy.
TikTok and its partners will build dedicated wind farms to supply the center, which will allow it to avoid drawing energy from the public grid. According to the platform, this will avoid any pressure on local supply. Technically, the company states that it will use a closed water-reuse circuit combined with air cooling to reduce water consumption. However, as the Government of Ceará has pointed out, cooling will be 100% air-based, and water use will be limited to human activities and maintenance. Furthermore, the installation will incorporate PG25 technology, which allows servers to operate at higher temperatures with less need for cooling, substantially reducing energy expenditure. The voices questioning the project. Not everything is celebration. The main resistance comes from the Anacé indigenous people, who denounce, as reported by El País, that part of the complex would occupy territories they consider ancestral. Their organizations affirm that no prior consultation was carried out and express concern about possible socio-environmental impacts, both on water use and on the transformation of the territory. TikTok maintains that it complies with Brazilian regulations and emphasizes that its energy and cooling model will minimize any pressure on natural resources. The Government of Ceará adds that the companies involved must invest 15 million reais per year in the communities around the Pecém complex. On the global board of digital infrastructure. The megaproject is part of a broader strategy. Lula’s government has approved measures to reduce taxes and attract data centers, with the intention of transforming Brazil into a regional digital hub. In parallel, the United States promotes initiatives such as the Stargate project to maintain competitiveness in artificial intelligence, while China accelerates the expansion of its technology companies abroad.
TikTok, of Chinese origin, thus fits into a delicate diplomatic balance that Brazil is trying to maintain. Beyond the economic investment, a data center of this scale raises debates about privacy, digital sovereignty and local data storage, dimensions increasingly present on the Brazilian legislative agenda. The speed of digitization. The TikTok megaproject in Ceará symbolizes the tension of a world digitizing at unprecedented speed: it promises clean energy, employment and modernization, but it also reopens discussions about territory, regulation and environmental memory. Between the technological ambition of a digital power and the concerns of a community defending its land, Brazil once again sits at the meeting point of global forces and local demands. The contrast is inevitable: while institutions celebrate the promise of a future powered by wind and data, indigenous communities in the northeast remind us that the technology connecting the world also leaves footprints on the ground it stands on. At this intersection between progress and protest, the true impact of TikTok’s new digital heart in Latin America will be defined. Image | PXHere and Greenwish. In Xataka | Researchers took Instagram and TikTok away from 300 young people to see if their anxiety decreased. The results speak for themselves

In Finland they already know how to deal with excess heat from data centers: convert it into district heating

Helsinki has found an unexpected ally in decarbonizing its heating amid the rise of artificial intelligence: waste heat from data centers. The heat that servers generate while processing millions of queries, training AI models or moving Internet traffic is no longer wasted. In the Finnish capital, this thermal flow, which is growing at the same rate as the digital world, is beginning to warm tens of thousands of homes. A digital sector that now heats cities. For years, data centers were known for one uncomfortable characteristic: they generated a lot of heat and needed huge cooling systems to dissipate it. Now that residual heat is being channeled into the Helsinki heating network, thanks to agreements signed with operators such as Equinix, Telia and Elisa. Data Center Dynamics notes that the utility has been testing this model for more than a decade (the first pilots date back to 2010), but the scale is now completely different: the city’s thermal demand is enormous and the volume of heat generated by the digital economy keeps growing. The result is already visible: a single data center can heat up to 20,000 homes, according to official figures from Helen. The Telia plant, for example, already recovers up to 90% of the heat generated by its servers, enough to heat 14,000 apartments, and within a few years it could double that figure to 28,000. A change in the way heat is produced. Digital heat recovery is more than a technological curiosity; it represents a change in how district heating is conceived. In the words of the Finnish company, “the electricity consumed by data centers always ends up being converted into heat.” The difference is that now that heat is no longer vented outside: it is reused. The engineering behind urban heat.
Finland can convert digital heat into district heating because it has an especially advanced district heating network: a web of pipes that distributes hot water to homes, schools and public buildings. The process works as follows. A data center generates heat: the servers run 24/7 and are continuously cooled. That heat, instead of being dissipated outside, is captured. It is then recovered and transferred; to do this, data centers can install their own recovery systems or use those offered by the energy company. The heat is sent to an “energy platform,” where heat pumps raise it to useful temperatures. The temperature is then adjusted to the 85-90 °C needed for the water to circulate through the urban network. This is where high-temperature heat pumps come into play; some, like Patola’s, work even with outside air at -20 °C. Finally, the heat is injected into the grid and distributed across the city to warm thousands of buildings. Closing the energy circle. To understand why Finland leads this model, we must look at an essential technological element: heat pumps. Not only domestic ones, but also large-scale industrial units capable of raising waste heat to temperatures useful for an urban network. Europe, and especially the Nordic countries, has become a world leader in this technology. Finland has 524 heat pumps per 1,000 homes, a figure second only to Norway’s, and its cities have been electrifying heating for decades. This combination (cold climate, district heating tradition, a heat pump industry and the need to decarbonize quickly) turns Finland into an urban-scale energy laboratory. A model with limits. Although the system works, it is not a panacea.
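The heat pump stage in that process is the energy-critical one. A minimal sketch of the temperature lift it performs, where the 85-90 °C network figure is the article’s but the source temperature and the Carnot fraction are illustrative assumptions:

```python
# Estimating the coefficient of performance (COP) of the heat pump
# that lifts recovered server heat to district heating temperature.
# Source temperature and Carnot fraction are illustrative assumptions.

T_SOURCE_C = 30.0    # assumed temperature of recovered server heat
T_NETWORK_C = 88.0   # mid-range of the 85-90 C district network

t_hot = T_NETWORK_C + 273.15    # convert to kelvin
t_cold = T_SOURCE_C + 273.15

cop_ideal = t_hot / (t_hot - t_cold)   # Carnot upper bound
cop_real = 0.5 * cop_ideal             # assumed half of the ideal

print(f"Carnot COP: {cop_ideal:.1f}, realistic COP: {cop_real:.1f}")
```

With a COP around 3, each kWh of electricity delivers roughly 3 kWh of network heat, most of it recovered from the servers rather than generated from scratch, which is what makes the economics of the model work.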
As Middle Parenthesis points out, not all data centers are close to areas with thermal demand, not all generate enough heat to justify the investment, heat recovery improves efficiency but does not reduce the data centers’ electricity consumption, and in hot climates or sprawling cities the model is much harder to replicate. Still, the trend is clear. With the expansion of AI and the growth of the cloud, the amount of heat available will only increase. The Nordic countries (Sweden, Norway, Denmark) already take advantage of it, and large operators such as Microsoft and Google are exploring similar systems across Europe. From silicon to the stove. The Finnish model shows that even at the heart of digital infrastructure, those data centers that power our online lives, there can lie a useful, concrete source of energy for everyday life. The heat produced by our searches, our videos or our conversations with AI can be transformed, with the right infrastructure, into heating for a home in Helsinki. In a world desperately seeking clean heat, Finland has already found a tangible, scalable and surprisingly logical answer: turning the thermal problem of the digital age into a solution for the Nordic climate. A quiet reminder that, sometimes, the energy transition advances with a simpler approach: taking advantage of the heat that servers already produce tirelessly. Image | Freepik. In Xataka | Someone cut five undersea cables in the Baltic. Finland already points to a ship from the “shadow fleet” as responsible

The world’s first fast neutron nuclear reactor destined for AI data centers is on its way

The growth of artificial intelligence is driving global electricity demand to historic levels. The expansion of data centers, advancing electrification and the industrial rebound are straining aging grids already suffering saturation in multiple countries. In this scenario, the digital sector, a huge consumer of electricity for AI development, faces a paradox: it needs much more energy, but must get it without increasing its emissions. And there arises a proposal that until recently would have sounded like science fiction: data centers powered by a compact fast neutron nuclear reactor. The Stellaria-Equinix deal no one saw coming. The French startup Stellaria, born out of France’s Atomic Energy Commission (CEA) and Schneider Electric, has announced a pre-purchase agreement with Equinix, one of the largest global data center operators. According to the press release, the agreement secures for Equinix the first 500 MW of capacity from the Stellarium, the molten salt, fast neutron reactor the company plans to deploy starting in 2035. The reservation is part of Equinix’s push to diversify toward “alternative energies” for AI-ready data centers. Autonomy, zero carbon and waste management. That is a brief summary of the first breed-and-burn reactor intended to supply data centers. As Stellaria explains, it offers: completely carbon-free, dispatchable energy, enough to make a data center autonomous; an underground design with no exclusion zone, thanks to operation at atmospheric pressure and a liquid core; ultra-fast response to load variations, essential for generative AI; virtually infinite regeneration of fuel, part of which can come from existing nuclear plant waste; and multi-fuel capability, from uranium-235 and 238 to plutonium-239, MOX, minor actinides and thorium. For Equinix, this means solving one of its great challenges: operating on guaranteed clean energy 24/7 without depending on the grid.
For Europe, it marks the entry into a new generation of ultra-compact reactors: the Stellarium occupies just four cubic meters. The technology behind the reactor. The Stellarium is a fourth-generation liquid chloride salt reactor, cooled by natural convection and equipped with four physical containment barriers. It operates on a closed fuel cycle, capable of sustaining fission for more than 20 years without refueling. Stellaria’s roadmap calls for a first fission reaction in 2029 and, six years later, commercial deployment and delivery of the reactor to Equinix. According to the company, the energy density of this type of reactor is “70 million times higher than that of lithium-ion batteries,” which would allow a single Stellarium to supply a city of 400,000 inhabitants. While fusion advances, fast fission arrives first. To understand why a fast neutron reactor is reaching the world of AI before fusion, one only has to compare where each technology stands. Fusion is making spectacular progress, such as the record of the French WEST reactor, which maintained a stable plasma for 22 minutes, or the Wendelstein 7-X, which sustained a high-performance plasma for 43 seconds, but it remains experimental. ITER will not be operational this decade and commercial prototypes will not arrive until well into the 2030s. Advanced fission, on the other hand, is much closer to market. Reactors like Stellaria’s, with molten salt and fast neutrons, do not require fusion’s extreme conditions and can be deployed sooner. The company plans its first reaction in 2029 and commercial deployment in 2035. The data centers of the future will no longer depend on the grid. Equinix already operates more than 270 data centers in 77 metropolitan areas. In Europe they run on 100% renewables, but future AI demand will require a constant, carbon-free source that does not congest the electrical grid.
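Stellaria’s “70 million times” comparison is at least the right order of magnitude. A quick check against textbook energy densities, where both values are standard reference figures rather than Stellaria’s, and the fission number assumes complete burn-up:

```python
# Order-of-magnitude check: fission fuel vs lithium-ion energy density.
# Textbook values; complete U-235 burn-up is assumed.

U235_J_PER_KG = 8.0e13     # ~80 TJ/kg released by full fission of U-235
LI_ION_J_PER_KG = 0.9e6    # ~250 Wh/kg lithium-ion cell, in joules

ratio = U235_J_PER_KG / LI_ION_J_PER_KG
print(f"Fission vs Li-ion energy density: ~{ratio:.1e}x")
```

About 9 × 10⁷ in the ideal case, so the quoted 70 million is plausible once real-world burn-up fractions are taken into account.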
According to Stellaria, this agreement "lays the foundation for data centers with lifetime energy autonomy." And, if the company meets its schedule, Europe will become the first region in the world where artificial intelligence is powered by compact reactors that recycle their own nuclear waste. The technological race between advanced fission and fusion is far from over, but, today, the first fast-neutron reactor intended for AI does not come from ITER or from an industrial giant: it comes from a French startup. Europe has just opened a door that could transform, at the same time, the future of energy and computing.

Image | Freepik and Stellaria

In Xataka | Google hit the red button when ChatGPT came along. Now it is OpenAI who has pressed it, according to the WSJ

Sam Altman is trying to buy his own rocket company to compete with SpaceX. The key: data centers

The rivalry between Sam Altman and Elon Musk has just reached its highest point: space. And all so that OpenAI can deploy its own data centers in orbit.

The news. As revealed by the Wall Street Journal, the CEO of OpenAI has been exploring the purchase of Stoke Space, a Seattle startup that develops reusable rockets, with the goal of building data centers in space. Although talks with Stoke Space cooled in the fall, the move confirms a trend we have been observing for months: Silicon Valley is outgrowing the Earth to fuel AI.

Sam's plan. According to the Journal's sources, Sam Altman was not looking for a launch provider, but rather an investment that would give OpenAI majority control of Stoke Space. Stoke Space, founded in 2020 by former Blue Origin engineers, is developing a fully reusable rocket called 'Nova' to compete with SpaceX's Falcon 9.

Why. Altman maintains a tense rivalry with Elon Musk, so the logic of this move would be to reduce OpenAI's dependence on Musk's rockets should it decide to deploy servers in space. But beyond that there is a purely energy-driven motivation. The computing demand of AI is so insatiable that the environmental cost of keeping it on Earth may become unsustainable. In certain orbits, however, solar energy is available 24/7 and the vacuum of space offers an infinite heat sink to cool equipment without consuming water.

The fever of space data centers. Altman is not alone in this race. What until recently seemed like an eccentricity has become a serious project for big technology companies.

And what does Musk say? The irony of Altman pursuing his own rocket company is that the industry's undisputed leader, Elon Musk's SpaceX, already has the infrastructure in place.
While his competitors design prototypes and seek financing, Musk has cut the debate short with his usual bluntness: faced with the discussion about the need to build new orbital data centers, he assured that there is no need to reinvent the wheel: "It will be enough to scale the Starlink V3 satellites… SpaceX is going to do it."

Images | Brazilian Ministry of Communications | Village Global

In Xataka | Building data centers in space was the new hot business. Elon Musk just broke it with a tweet

Data centers consume a lot of water, but probably less than we thought. A book is to blame

We can criticize the AI boom for many reasons, but there is one that has deeply resonated with society: its environmental impact, more specifically the water consumed by each interaction with AI, needed to cool the servers. The problem is real, but everything indicates that it has been magnified, and the origin appears to be a miscalculation in a popular book.

The book. It is 'Empire of AI', written by Karen Hao, which we have already discussed in Xataka. After interviewing hundreds of former employees and people close to the company, the author constructs a detailed and highly critical account of OpenAI, and more specifically of its CEO, Sam Altman. Among the criticisms of this 'AI empire', Hao mentions the excessive water consumption of AI, going so far as to state that a data center would consume 1,000 times more water than a city of 88,000 inhabitants.

The criticism. Andy Masley tells the story in his newsletter The Weird Turn Pro. According to his calculations, the data center would in reality consume 22% of what the city consumes, or 3% of the entire municipal system. Furthermore, Masley argues that the book confuses water withdrawal (a temporary extraction that is returned to the supply) with actual consumption.

The calculation error. The author herself has responded to Masley's article, citing the email she sent to the Municipal Drinking Water and Sewage Service of Chile (SMAPA), from which she requested the total water consumption of Cerrillos and Maipú, the towns she used for the consumption comparison. The problem is that Hao requested the amount in liters, but SMAPA responded without specifying the units, and everything indicates the figures were actually in cubic meters, hence the large discrepancy. The author has consulted SMAPA again to clarify, and it seems that there was indeed an error.

Estimates. How much water AI consumes has been a recurring question in recent years.
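The scale of that unit mix-up is easy to reproduce. A minimal sketch, using made-up round numbers (not the actual SMAPA figures), shows how reading cubic meters as liters understates a city's consumption by a factor of 1,000 and inflates the comparison accordingly:

```python
# Hypothetical round numbers, purely illustrative (not SMAPA's real data):
# a city's water use of 5,000,000 m^3/year, and a data center drawing
# 1,100,000 m^3/year from the same resource.
city_m3 = 5_000_000
dc_m3 = 1_100_000

# Correct comparison, both sides in cubic meters:
correct_ratio = dc_m3 / city_m3  # 0.22 -> the data center uses 22%

# The error: treating the city's m^3 figure as if it were liters
# divides the city's true consumption by 1,000...
city_misread_m3 = city_m3 / 1000
wrong_ratio = dc_m3 / city_misread_m3  # 220 -> "hundreds of times more"

print(correct_ratio, wrong_ratio)
```

With these illustrative numbers, a data center that actually uses a fifth of the city's water suddenly appears to use hundreds of times more, which is exactly the shape of the discrepancy Masley identified.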
In September 2024, a study published by The Washington Post calculated that generating a 100-word text with ChatGPT required 519 milliliters of water. The calculation was based on the total annual consumption of data centers and the type of cooling used. A truly outrageous figure.

What companies say. AI companies are not very transparent about the water and energy consumption of their data centers. The big technology companies publish total annual consumption figures in their sustainability reports. We know that a large part of that consumption goes to data centers, but it is not possible to derive the real consumption of each query. Google has been the only one to publish specific energy and water consumption data for its AI. According to the company, the water consumed by each Gemini query was 0.26 milliliters, or in other words, about five drops of water. We cannot extrapolate this figure to all data centers or all companies, but it does suggest that previous estimates were quite exaggerated.

Water controversy. None of this means there is no problem with water and AI. In fact, the Cerrillos data center at the center of the alleged calculation error was never built, because the Chilean courts halted it over the climate impact it would have had, especially given the drought the region was experiencing. Data centers need a lot of water, so much so that initiatives are emerging to cool them by submerging them in the ocean.

The other problem. Water is just one of the challenges data centers face; energy demand poses an even greater one. In 2024, data centers already accounted for 4% of total electricity consumption in the United States, and around some of these behemoths the electricity bill has risen 267% in recent years. Big tech is already warning that there is no power for so many chips, and the options being floated range from building nuclear power plants to taking data centers to space.
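The gap between the two public estimates is striking. A quick sketch comparing the Washington Post figure with Google's own number, under the rough assumption (ours, not the article's) that one 100-word response is comparable to one Gemini query:

```python
# Two published per-response water estimates, as quoted in the article:
wapo_ml_per_response = 519.0  # WaPo, 100-word ChatGPT text, Sept 2024
google_ml_per_query = 0.26    # Google, per Gemini text query

# Assuming the two units of work are roughly comparable, the estimates
# differ by about three orders of magnitude:
factor = wapo_ml_per_response / google_ml_per_query
print(f"WaPo estimate is ~{factor:.0f}x Google's figure")
```

A disagreement of roughly 2,000x between estimates of the same quantity is itself a sign of how little hard data companies have disclosed.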
Image | Google

In Xataka | What is happening in the US is a warning for Spain: data centers driving up electricity bills in homes

Aragón has just activated its second major data center project. The bet runs into a challenge that is difficult to ignore

Aragón is going through a unique moment: in just a few years it has gone from competing to attract data centers to announcing three new mega facilities, promoted by Forestalia, that aim to strengthen its position on the European cloud map. The regional government's announcement comes in the midst of a race to attract technological investment, but also in a territory where the electrical grid operates at its limit and every large project depends on decisions that have not yet been made. The result is a scenario as ambitious as it is full of unknowns, which will determine the real impact of this expansion.

How these digital complexes work. A data center is, in essence, a technological heart that stores and processes information for millions of users and companies. Every streamed series and every operation carried out in the cloud passes through servers that require stable power and constant cooling. That is why the choice of location matters so much: both electrical capacity and operational security are needed. Aragón has been gaining ground on that map and is today seen as a strategic option for new facilities.

The project. The Government of Aragón has detailed that the Búfalo Project includes three data centers in Magallón, Botorrita and Alfamén, backed by an investment of 12.048 billion euros. The deployment comprises DCM Data, DCM Dédalo and DCM Blue, whose construction would begin between 2028 and 2029 and extend for approximately eight years. According to official estimates, construction will generate about 30,000 temporary jobs. In the operational phase, each facility will add hundreds of workers, for a total that clearly exceeds a thousand stable positions.

Aragón on the international board. Accumulated investments in data centers exceed 70 billion euros and place the region in the same conversation as consolidated European hubs.
According to the President of the Government of Aragón, Jorge Azcón, the computing capacity being configured rivals that of Dublin and Paris and aspires to approach that of Frankfurt. The regional executive also states that the data to be managed will have a European scope, from Germany and France to Italy and the United Kingdom, reinforcing the international dimension of the project.

Distributed renewable self-consumption. The Government of Aragón presents self-consumption as a distinctive element of the Búfalo Project, since approximately half of the energy consumption will be covered by wind and photovoltaic parks developed by Forestalia. This volume of generation allows for a renewable supply, although it does not eliminate dependence on the general grid, which will provide the rest of the energy. The underlying idea is to combine own generation with existing infrastructure to sustain large-scale facilities.

The word "self-consumption" may suggest that the data centers and the renewable plants share the same physical space, but that is not the case. Forestalia is building parks in various areas of Zaragoza and Teruel, located where the natural resource is most favorable. The data centers, as noted, will be in Magallón, Botorrita and Alfamén, and the connection between the two worlds runs entirely through the Red Eléctrica network. It is a distributed scheme that coordinates generation and consumption without a single energy campus.

A grid at its limit. Aragón produces more electricity than it consumes and exports about 54% of its generation, but that abundance contrasts with a distribution network operating practically at full capacity. A report published in September 2025 puts its occupancy level at 94.3%, well above the national average of 84.3%. This saturation leaves little room to incorporate large consumers such as data centers.
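To see why 94.3% occupancy matters, here is a small sketch with an assumed, purely hypothetical distribution capacity (the cited report gives percentages, not absolute figures), showing how little headroom remains compared with a grid at the national average:

```python
# Hypothetical total distribution capacity, purely illustrative:
capacity_mw = 10_000

aragon_occupancy = 0.943    # September 2025 report, from the article
national_occupancy = 0.843  # national average, from the article

# Headroom left for new large consumers under each occupancy level:
aragon_headroom = capacity_mw * (1 - aragon_occupancy)      # ~570 MW
national_headroom = capacity_mw * (1 - national_occupancy)  # ~1,570 MW

# Against the >6,000 MW requested by data centers in the pipeline,
# the remaining margin covers under a tenth of the demand.
requested_mw = 6_000
print(aragon_headroom, national_headroom, aragon_headroom / requested_mw)
```

Whatever the real absolute capacity, the ten-point gap in occupancy translates into roughly a third of the slack a comparable grid at the national average would have.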
The result is a paradox: energy is available, but the infrastructure is incapable of delivering it to all projects.

Projects that have already hit the ceiling. The bottleneck is not a future hypothesis but a reality that already affects several operators. According to Heraldo, the data centers in the pipeline have requested more than 6,000 MW and only a fraction has firm access, with cases such as Vantage, which has 90 MW authorized despite aiming for 300. Microsoft also depends on tenders in saturated nodes. The regional government itself acknowledges that everything will hinge on Red Eléctrica's planning and the decisions of the central executive.

Water, a debate that is still open? The cooling of data centers has raised concern in Aragón since Amazon asked in late 2024 for 48% more water for the complexes it already operates in the region. Ecologistas en Acción and the Tu Nube Seca Mi Río platform warned at the time of the water impact of these facilities in the midst of a structural drought. Azcón maintains that the future Forestalia centers will use a closed circuit with "practically imperceptible" consumption and declares that the debate "is over." In any case, everything indicates that the matter remains under public scrutiny.

To smooth the path of the Búfalo Project, the Government of Aragón has declared the initiative to be of Autonomous General Interest, a designation that simplifies procedures and improves coordination among the administrations involved. The declaration speeds up paperwork, but it does not resolve the main point of friction: the available electrical capacity. Hence the regional executive's insistence on working with the central government and Red Eléctrica, the only actors that can modify grid planning. Real progress will depend on those decisions.
The announcement of the three new data centers, together with the rest of the initiatives in the pipeline, places Aragón at a decisive moment to consolidate its presence on the European cloud map. The investment is notable and so is the promised employment, but much of the outcome will depend on decisions that are not entirely in the community's hands. The region has shown intent and movement, although the real scope of this bet remains to be seen.

Images | İsmail Enes Ayhan | Jorge Azcón (X)

In Xataka | The European Commission's pendulum with AI is real: it will sacrifice privacy to …
