Your dream of putting AI data centers in space is probably not feasible

The idea of setting up data centers for artificial intelligence (AI) in space is very attractive. So much so that several CEOs of the largest US technology companies have not hesitated to stick their necks out and publicly back the strategy. Jeff Bezos predicted in early October 2025 that data centers will reach space within the next two decades, solving in one fell swoop the power supply problems these facilities currently pose on Earth.

Elon Musk quickly stoked the discussion further. Shortly after Bezos' statement, he posted on X that SpaceX only needed to scale up its Starlink V3 satellites, equipped with high-speed laser links, to bring the idea to fruition. He closed the post with a forceful statement: "SpaceX is going to do it". However, the laws of physics are implacable, and SpaceX has had no choice but to acknowledge to its investors the daunting challenges this project entails.

Orbital data centers may not come to fruition. According to Reuters, SpaceX has delivered an official document to its investors in which it recognizes that both orbital AI data centers and human settlement on the Moon and Mars depend on technologies that have not yet been developed or tested and that, therefore, may not be commercially viable. SpaceX is preparing its IPO, and this assessment reflects the legal obligation to be scrupulously honest about risks in order to avoid future lawsuits from new shareholders.

"Our efforts to develop orbital AI computing and in-orbit, lunar and interplanetary industrialization are in the early stages and involve significant technical complexity and the use of technologies that have not yet been tested. For these reasons they may not be able to achieve commercial viability," SpaceX clarifies. There is no doubt that the challenges that need to be solved for data centers to reach space are colossal.
The challenges that need to be solved for data centers to reach space are colossal. One of them is the impact of ionizing radiation on the hardware. This form of radiation, which includes X-rays and gamma rays as well as alpha and beta particles, is capable of tearing electrons from atoms and thus altering the structure of molecules. In space, server chips are not protected by the Earth's atmosphere and magnetic field, which leaves them very vulnerable to ionizing radiation, which can permanently degrade them. Solving this problem will require developing some type of shielding capable of protecting the server hardware from cosmic radiation.

This requirement leads us to the next critical challenge: in space it is not possible to cool servers using convection, as on Earth, because in the vacuum of space there is neither air nor water. The only mechanism left is radiation, which would require enormous radiators. Several solutions can be proposed for these problems, but it is crucial to minimize the weight and complexity of the material that needs to be put into orbit; otherwise commercial viability will be non-existent.

The two challenges we just covered are probably the hardest to solve, but orbital data centers pose more difficulties. One of them is that delivering the gigawatts of power they require would demand enormous solar panels. Furthermore, in some applications the latency these space installations would introduce would probably be unacceptable. And, on top of that, maintaining an orbital data center would be extremely expensive. It probably wouldn't even be economically feasible, forcing its owners to build in massive redundancy that would push it further away from profitability.

Image | Freepik

More information | Reuters

In Xataka | Elon Musk knows that TSMC is overwhelmed: Terafab is his idea to completely change the global chip industry
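The cooling constraint described above can be put in rough numbers with the Stefan-Boltzmann law. This is a minimal sketch under illustrative assumptions (radiator temperature, emissivity and heat load are ours, not SpaceX figures, and solar or Earth infrared loading is ignored):

```python
# Sketch: how much radiator area would an orbital data center need?
# Stefan-Boltzmann law: P = eps * sigma * A * (T^4 - T_env^4)
# All numbers below are illustrative assumptions, not mission figures.

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W/(m^2*K^4)
EMISSIVITY = 0.9      # assumed high-emissivity radiator coating
T_RADIATOR = 330.0    # K, assumed radiator surface temperature (~57 C)
T_SPACE = 4.0         # K, deep-space background

def radiator_area_m2(heat_load_w: float) -> float:
    """Radiator area needed to reject heat_load_w by radiation alone."""
    flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)  # W/m^2
    return heat_load_w / flux

# A single 1 MW server hall, modest by terrestrial standards:
area = radiator_area_m2(1_000_000)
print(f"{area:,.0f} m^2 of radiator per MW")  # roughly 1,650 m^2
```

At these assumptions, every megawatt of IT load needs on the order of 1,600 square meters of radiator, before accounting for sunlight, which only makes things worse. That is the mass and complexity penalty the article alludes to.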

Much of the world economy right now consists of setting up data centers. And there is already a game on Steam that simulates it

Surely what you want most when you get home from work is to turn on your PC or console to play a game about working. There is not an ounce of sarcasm in that sentence: for some time now, games that are about exactly that, working, have become popular. And I don't mean the farm management of 'Stardew Valley' or paying off the mortgage in 'Animal Crossing': I mean games that are, directly, a second job. There are games about cleaning, about being the IT person at a company, about working in a supermarket or on a construction site. And also, of course, about running a data center.

With the data center boom that has drained the RAM and SSD market, you may not be able to build a new PC because RAM prices are through the roof, but you can always fulfill the fantasy of being the person with the power to set up servers and wire everything with their own hands. The game is called 'Data Center', and as a way to learn how data centers work while switching your brain off, it is... interesting.

The game of having an after-work job setting up data centers. Don't think of this as a construction game like 'The Sims' and the like. Here you already have the space, and what you do is internal management. You buy the frames to install the racks, servers and switches, not at random, but according to the needs of the clients who hire your services. Once you have the equipment, it is time to interconnect it with Ethernet cables that link systems within the same rack, but that must also physically run to other platforms. The easiest approach is to route those cables through aluminum trays hanging from the ceiling, and once you think you have everything ready, it's time to power it on.

This is when your customers' traffic is represented by balls of light that travel along the cables. Those little balls have a purpose, because as things progress your clients will ask for more and more bandwidth, and you will have to start managing and prioritizing.
Equipment also breaks, so you will have to go to the in-game PC to order spare parts or upgrades for greater computing capacity. The idea is to build the perfect system with the best possible data flow, without bottlenecks and without wasting resources, scaling carefully to give each client what they need without oversizing.

Those little balls represent data traffic; each color is a customer.

It is, in short, a work game that can be repetitive, but that is exactly why it works so well. In titles like this you don't have to solve puzzles, you don't have to be skillful with the controls or think too hard. They are ideal for switching off the brain while doing a repetitive task, simply focusing on what we have to do and what the clients ask of us. It sounds like the most boring thing in the world, that second job I mentioned at the beginning of the article, but these games are perfect to play while a podcast runs in the background. In the comments on this particular 'Data Center', players highlight its "teaching" side and, despite the limitations of some systems, how realistic it feels.

The store from which we must order the components.

Now, it is not a simulator. In the comments, players who say they work in data centers point out that, although it is curious and represents some things very well, other things do not match reality, and technical options are missing, such as VLANs or managing something as basic as power cabling. The good news is that it costs nine euros and, if it doesn't click with you in the first two hours, you can request a refund on Steam very easily. In the end, it is not a game for everyone. No game really is, but 'Data Center' is one more entry in that much-discussed wave of work games appearing lately. Managing a data center may not be your thing, but you can always, for example, restore retro games or run the last video store in town before Amazon eats it.
Images | 'Data Center' on Steam

In Xataka | It seemed like a game of imitating movements. It was actually diagnosing autism better than many clinical tests

Ford has been slow to adapt to the electric car, so it is going to start manufacturing batteries for… data centers

Ford has decided to convert its electric vehicle battery manufacturing capacity into a large-scale energy storage business. The move has a name: Ford Energy, a new division with $2 billion in investment planned over the next two years and the stated objective of supplying batteries to data centers, electricity companies and large industrial consumers.

The starting point is not exactly ideal for the company. Ford's electric division had accumulated net losses of $11.1 billion through the fourth quarter of 2025, according to Reuters. For this year, the company expects to lose an additional $4 billion to $4.5 billion in its electric and software division. "I think the customer has already spoken," Ford CEO Jim Farley told investors. With battery factories operating at low capacity and the US electric vehicle market in free fall, especially after the elimination of the $7,500 incentive last September, Ford has chosen not to dismantle that infrastructure, but to redirect it.

What Ford Energy is and how it will work. The bet is built around the Glendale, Kentucky, plant, which will be converted to manufacture grid-scale energy storage systems. As Ford explained late last year, the facility will produce LFP (lithium iron phosphate) cells and storage modules. The cell technology is licensed from the Chinese firm CATL, with which Ford already had agreements for its electric vehicle line. The plan, according to the company itself, is to have initial operational capacity within 18 months and reach at least 20 GWh of annual production by the end of 2027. In parallel, the BlueOval Battery Park Michigan plant, in Marshall, will continue producing LFP cells for Ford's upcoming midsize electric truck, but will also make lower-amperage cells aimed at residential storage.
Lisa Drake, the executive who heads Ford Energy, explained that the "predominant" business opportunity will be commercial electric grid customers, with data centers as the second priority and the residential segment as the third leg. Drake also noted that when the company went to market to explore demand, it became clear that the technology customers preferred was precisely the containerized prismatic LFP system, something Ford could easily manufacture thanks to its licenses. For his part, John Lawler, Ford's vice chair, said in the statement that Ford Energy's core purpose is to "capture the growing demand for reliable energy storage that reinforces the stability and resilience of the electric grid for utilities and large consumers."

The market it wants to conquer. With the explosion of artificial intelligence, electricity consumption in data centers is skyrocketing on a global scale. The International Energy Agency puts the demand of these centers at around 945 TWh by 2030, approximately 3% of global electricity consumption, with projected growth of 15% annually. In the United States alone, according to the Battery Council International, this consumption could double to between 400 and 600 TWh by the same date. In that scenario, large-scale energy storage becomes critical infrastructure, and Ford, like many other manufacturers converting capacity, sees a great business opportunity.

Ford is late, but it is not alone. The problem is that Tesla has a decade's head start. Its energy storage business deployed 46.7 GWh in 2025 alone, 48% more than the previous year according to TechCrunch, and was also more profitable than its own electric car division, with gross margins close to 30% compared to 15% for automobiles. General Motors has also made a move: its joint venture with LG Energy Solution has just invested $70 million to convert its Tennessee plant, south of Nashville, to the production of batteries for storage.
The transition, however, is neither easy nor cheap. Switching a factory from nickel chemistry, common in electric car batteries, to LFP can take up to 18 months and cost several hundred million dollars, according to Reuters. Added to this is technological dependence on China, which dominates the LFP supply chain, and 35% US tariffs on cathode and anode materials of Chinese origin.

What this means in the long term. As the same outlet notes, although demand for energy storage in North America is expected to almost double in five years, going from 76 to 125 GWh, that is not enough to absorb the more than 275 GWh of production capacity the auto industry has installed with EVs in mind. Storage alleviates the problem, but does not completely solve it. Even so, this same reorientation is what many other carmakers have opted for in order to take advantage of their infrastructure and contain the losses of their electric car businesses, especially in the United States, where things are weakest.

Cover image | Hans and Ford

In Xataka | Australia has a 150-kilometer straight highway. And to keep drivers from falling asleep, it has put trivia on the signs
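The storage mismatch described above is easy to make concrete with the article's own figures:

```python
# Quick arithmetic on the figures in the text: North American storage demand
# vs. the battery capacity carmakers built with EVs in mind.

demand_now_gwh = 76       # current annual storage demand (per the article)
demand_5y_gwh = 125       # expected in five years
installed_gwh = 275       # auto-industry battery production capacity

growth = demand_5y_gwh / demand_now_gwh - 1
unused_share = 1 - demand_5y_gwh / installed_gwh

print(f"demand growth over five years: {growth:.0%}")        # ~64%
print(f"capacity still idle even then: {unused_share:.0%}")  # ~55%
```

Even after demand nearly doubles, more than half the installed capacity would sit unused, which is exactly why storage alleviates the problem without solving it.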

We believed that data centers in space were a thing of the future. Kepler has already activated the largest orbital cluster

For years, talk of data centers in space sounded like the kind of idea that always seemed a few years away. The conversation existed, of course, but it was almost always sustained by long-term plans, ambitious announcements and an industry that had not yet shown much real muscle in orbit. That is why what has just happened deserves attention: TechCrunch explains that Kepler Communications has launched the largest computing cluster currently operating in space, a sign that this race is beginning to leave the realm of promises and enter, little by little, the realm of infrastructure.

What Kepler has put into orbit. It is not a large facility suspended above our heads, but a distributed cluster made up of 10 operational satellites. Together they add up to around 40 Nvidia Orin processors aimed at edge computing, connected to each other by laser links. That set, launched in January of this year, is today the largest active computing cluster in orbit. The company itself also frames this network as a constellation designed to move data in space almost in real time.

What it really is. We are not looking at a massive orbital data center that replicates the terrestrial model, but at a distributed architecture that combines connectivity and processing directly in the space environment. This difference matters because it separates two planes that are often mixed up: one thing is the large-scale vision defended by players like SpaceX or Blue Origin, and quite another is this first step, much more tied to immediate uses and the specific needs of missions in orbit.

The immediate business. If this orbital computing is starting to be interesting, it is because it addresses a fairly clear problem: it does not always make sense to send all the data to Earth to process it later.
The initial value of these systems lies in working with the information right where it is generated, something especially useful for more advanced sensors and for applications that require a faster response. Kepler also maintains that its network can serve as a basis for future processing and connectivity services between different space assets, and the outlet adds that the company already transports and processes data uploaded from the ground, as well as information collected by payloads hosted on its own satellites.

Sophia Space. Here a startup enters the picture that wants to upload its proprietary operating system to one of the satellites in the constellation and try to deploy and configure it on six GPUs spread across two spacecraft. In a terrestrial data center that would be almost routine, but it would be the first time we see something like it in orbit. For Sophia, moreover, the test has clear risk-reduction value ahead of its first launch, scheduled for the end of 2027. And this is no minor detail: the company is developing space computers with passive cooling, an approach that attacks one of the big problems in this sector, overheating.

Kepler doesn't want to be that. Amid so much noise around orbital data centers, the company is trying to position itself in a somewhat different place on the map. Its corporate presentation insists on a mission much more linked to communications, with a hybrid optical constellation designed to modernize the flow of data in low orbit and beyond. In this sense, it does not define itself as a data center company, but as infrastructure for space applications.

The journey has begun. If this step by Kepler makes anything clear, it is that orbital computing no longer belongs only to the realm of grand presentations.
SpaceX wants to deploy a massive network of satellites for AI, Google is preparing in-orbit tests with solar-powered chips and Blue Origin has announced a constellation of more than 5,000 satellites. In parallel, Starcloud already launched a satellite in 2025 with an Nvidia H100 GPU, and Aetherflux is targeting 2027 for its first node.

Images | Kepler Communications | Sophia Space

In Xataka | The mystery of the misinflated balloon: the more we calculate the size of the Universe, the less sense it all makes

We had a perfect plan to decarbonize the electrical grid. The brutal consumption of data centers has blown it up

The daily headlines tout multi-million dollar investments in new language models and cutting-edge chips. Venture capital investors have pumped more than half a trillion dollars into AI startups over the last five years. But, as a revealing TechCrunch analysis warns, the smart money has begun to change sides: today, the best investment in artificial intelligence is no longer software.

The reality on the ground has become extremely harsh. Putting up walls and stacking servers in a giant data center has become the easy part of the equation. The real wall the tech sector is crashing into is finding the electrons needed to power it. According to a report by the analysis firm Sightline Climate, up to 50% of data center projects announced for 2026 could face delays. Of the 190 gigawatts (GW) of capacity the firm tracks globally, just 5 GW are actually under construction today. The bottleneck is no longer the microchips; it is access to the electrical grid.

The tyranny of 24/7. Consumption has run amok at a pace that 20th-century infrastructure cannot absorb. A Goldman Sachs analysis projects that AI will drive the energy consumption of data centers up 175% by 2030. The figures all point in the same direction: the Open Energy Outlook predicts that the combined electricity demand of data centers and crypto mining will grow 350% this decade. As a result, the pristine image of the technological cloud is evaporating. Google's emissions have increased 48% in the last five years, and Microsoft's 31% since 2020. The reason? What is known in the industry as the "tyranny of 24/7": the algorithms do not sleep and require a continuous, steady power supply; they cannot be turned off simply because the wind stops blowing or the sun sets. Given the global lack of mass storage systems, the fuel covering this urgent gap is not green. It is natural gas, which has come out of retirement as the sector's great structural support.

A global collapse with two faces.
The pressure has already broken market equilibria. In the PJM region, which supplies 13 eastern US states and has the highest density of data centers in the world, capacity prices went from $30 to $270 in a single auction at the end of last year. As John Ketchum, CEO of NextEra Energy, noted, we are facing a "golden era of energy demand", but with an insurmountable physical limit: "the new electrons cannot reach the network quickly enough."

This electrical asphyxiation is redrawing the global map, and Europe is the best example. Historically, the European market was dominated by the "FLAP-D" markets (Frankfurt, London, Amsterdam, Paris and Dublin). But the grid in these cities can no longer keep up. According to data from Greenpeace, data centers accounted for almost 80% of electricity consumption in Dublin, forcing Ireland to impose a moratorium. The market share of these traditional capitals will fall sharply by 2035, causing a mass exodus to the Nordic countries (with uncongested grids and cold climates) and to southern Europe, to countries such as Spain, Greece and Italy, in search of green megawatts.

The hardware and grid problem. When we scratch beneath the surface of this collapse, we discover that the physical problem splits into two large gaps. First, the machinery to generate the energy is missing. Since intermittent renewables are not enough, companies turn to gas. However, gas turbines have become a rare commodity. Three years ago, Siemens Energy executives considered this market "dead"; today, the factories are so overwhelmed that delivery times for these turbines can stretch to seven years. Second, the "plumbing" is missing. Once the electricity is generated, the task of taming it inside the building falls to the transformers, an iron-and-copper technology that has barely changed in 140 years. As TechCrunch explains, as servers demand more power, traditional electrical equipment will take up twice as much space as the servers themselves.
It is mathematically unsustainable.

'Smart money' changes sides. Against this backdrop, venture capital is pivoting. Big tech companies (Amazon, Google, Oracle) are starting to behave like energy giants, devising alternatives to minimize their dependence on an outdated public grid through hybrid or on-site generation approaches. The solutions are divided into several fronts:

The nuclear resurgence: Google has signed a pioneering agreement with Kairos Power to develop seven small modular reactors (SMRs) by 2030, and Amazon tried (although regulators temporarily blocked it) to connect a data center directly to the Susquehanna nuclear power plant.

Super batteries: Google is collaborating in Minnesota with the utility Xcel Energy and the startup Form Energy to install batteries capable of discharging energy for 100 hours, thus smoothing the peaks of renewables.

Hardware innovation: Dozens of startups (such as Amperesand or DG Matrix) backed by investment funds are developing silicon-based "solid state" transformers, seeking to finally retire old iron and copper and save vital space in facilities.

Regulatory surgery: In southern Europe, bodies such as Spain's CNMC are applying "flexible access permits", forcing centers to accept curtailment in emergencies so as not to bring down the entire grid.

The paradox: AI as savior of the electrical system. However, the story has a fascinating twist. The same technology that today threatens to burn out half the world's cables could be the one that ends up saving the electrical system. According to estimates from the consultancy Deloitte, applying artificial intelligence to optimize industrial systems and electrical networks will save more than 3,700 TWh globally by 2030. That is, AI would save almost four times the energy consumed by all the data centers on the planet combined.
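That "almost four times" claim is easy to sanity-check against the figures cited earlier in the article; the 2024 baseline in the second calculation is our assumption, not a number from the text:

```python
# Sanity-checking the closing ratio: Deloitte's projected AI-driven savings
# (3,700 TWh by 2030) against the IEA's projected data center demand (945 TWh).

savings_twh = 3_700
datacenter_demand_twh = 945

ratio = savings_twh / datacenter_demand_twh
print(f"savings / data center demand = {ratio:.1f}x")  # about 3.9x

# And the 15%-per-year growth the article cites, compounded to 2030
# (assuming a ~415 TWh baseline in 2024, our assumption, not the article's):
baseline_twh = 415
projected = baseline_twh * 1.15 ** 6
print(f"compounded demand by 2030: {projected:.0f} TWh")  # about 960 TWh
```

Both checks come out consistent: the ratio lands near 4x, and 15% annual growth compounded over six years does reproduce a figure in the neighborhood of the IEA's 945 TWh.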
A report by Ember on Southeast Asia (ASEAN) supports this, calculating that integrating AI into the management of its grids will save more than $67 billion and avoid the emission of almost 400 million tons of CO2. But to get to that future of efficiency, the machines first have to be switched on today. And what is at stake is the world economic map. Hosting these centers is…

Data centers are real “heaters”. And they are settling in regions as hot as Aragón

Data centers are a black hole in several senses. They are draining global NAND chip manufacturing capacity (which affects SSDs, RAM and SD cards), the companies that make batteries cannot keep up, and yes, they consume water, but far more alarming is their energy consumption. They are insatiable and, in the end, the thousands of pieces of equipment generating heat are causing another unexpected effect: they are turning these facilities into heat islands. And it is something with the potential to affect 340 million people.

What's happening. Andrea Marinoni is an associate professor in the Earth Observation group at the University of Cambridge, and the coordinator of a group of researchers from that center and Nanyang Technological University who have published a study called "Heat Island Data: Measuring the Impact of Data Centers on Climate Change." In it, they present the results of measuring more than 6,000 data centers located far from dense urban areas, with the aim of identifying whether these facilities, by themselves, are a notable heat source. The result? "A greater impact than expected," according to the researchers. They compared historical temperature measurements at those data center locations over the last 20 years to see how things have changed recently and whether the data centers have had any influence. And, as we said, the impact seems to have been strong: an average of 2°C, with maximums of up to 9°C in some cases.

The place doesn't matter. This generates a heat island effect, which occurs when a large amount of heat is concentrated in an area where it should not be. It is common in big cities, which is why the most efficient urban architecture seeks to combat the phenomenon. And it doesn't matter where the data center is.
In the study they present several examples:

Bajío region, Mexico: high data center density and a stable climate, but a land surface temperature increase of 2°C over the last two decades, something not identified in nearby areas without data centers.

States of Ceará and Piauí, Brazil: an increasing trend of 2.8°C, with a projection of reaching 3.5°C in the next five years, while this is not observed in the surrounding areas.

Aragón, Spain: an anomalous increase of 2°C in surface temperature that stands out compared to neighboring provinces.

Potential damage. Aragón is a worrying example because the region is consolidating itself as one of the 'lungs' of hyperscalers in Europe, as well as one of the Spanish regions key to data center expansion and European technological sovereignty. And the problem is that, according to the study, the impact of this increase in surface temperature reaches up to 10 kilometers from the hyperscalers. They detail that in surrounding areas some 4.5 kilometers from the data centers an increase of 1°C can be measured, which seems small, but when we talk about these climatic effects it is a lot. Furthermore, they estimate that the increased temperatures from this broad heat island effect have the potential to affect 340 million people.

Yes, but. This has not been the only recent research on the effect of data centers on the land where they are installed. Researchers at Arizona State University installed sensors on cars driving near these centers to capture measurements and observed the same thing as the Cambridge researchers. One thing to keep in mind, though: both studies show measurements, but they have not been peer-reviewed.
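The comparison the study describes, fitting a long-term land-surface temperature trend at a data center site against a nearby control area, can be sketched in a few lines. The data below is synthetic, built only to illustrate the method, and is not taken from the study:

```python
# A minimal sketch of site-vs-control trend comparison: fit a linear trend
# to ~20 years of land-surface temperature at a data center site and at a
# nearby control site, then compare slopes. Data is synthetic.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2005, 2025)

# Synthetic series: both sites warm slightly; the data center site warms more.
control = 15.0 + 0.02 * (years - 2005) + rng.normal(0, 0.15, years.size)
dc_site = 15.0 + 0.12 * (years - 2005) + rng.normal(0, 0.15, years.size)

slope_control = np.polyfit(years, control, 1)[0]  # degrees C per year
slope_dc = np.polyfit(years, dc_site, 1)[0]

anomaly_20y = (slope_dc - slope_control) * 20
print(f"excess warming attributable to the site over 20 years: "
      f"{anomaly_20y:.1f} C")
```

With the synthetic slopes chosen here, the excess comes out around 2°C over two decades, the same order as the anomaly the researchers report for Aragón or the Bajío.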
And there are experts, such as Ralph Hintemann, principal investigator at the Borderstep Institute for Innovation and Sustainability, who point out that, although the results are there and are interesting, some figures "seem very high." In fact, he focuses not so much on the heat concentrated around data centers as on the bigger problem: the amount of energy they need and the return to fossil fuels to meet peak demand.

Image | Tedder

In Xataka | Data centers in space promise to save the planet. And also ruin the Earth's orbit

The French AI startup profiting from geopolitical chaos just raised $830 million. For European data centers

The French startup Mistral has raised $830 million, and it has done so with one objective: to create AI data centers in Europe based on NVIDIA chips and technology. That is good news, but it also has a disturbing side.

Merci, Monsieur Trump. There is a geopolitical irony in the rise of Mistral. The French AI startup has become a reference in Europe, but it has done so not so much because of its models or technology (that too) but because of Donald Trump. Since the American president returned to power and began to dismantle the era of globalization, demand for "sovereign" European alternatives to the large US technology platforms has skyrocketed. Governments and companies that previously turned to Microsoft, Amazon or Google without a second thought are now looking for options that free them from those dependencies. In AI, Mistral is precisely the obvious alternative.

$830 million for its own infrastructure. The round Mistral has raised is not venture capital but debt financing, granted by banks such as Bpifrance, BNP Paribas, HSBC and MUFG. It is a telling detail: the company no longer needs to convince investors, but rather to finance the infrastructure required to scale its business. Those $830 million are destined for its future European data centers, starting with its facility in Bruyères-le-Châtel, near Paris. That center will house 13,800 GB300 chips from NVIDIA and will begin operating before the end of June.

Debt, not equity. There is an important difference between the venture capital rounds that have financed Mistral until now and this new debt round. Venture capital is not paid back: investors bet on a stake in the company and get paid if the company grows and is sold or goes public. Debt is repaid, with interest, regardless of how the business is going.
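To make that difference concrete, here is a hypothetical amortization sketch for $830 million of fully amortizing debt. The interest rates and the seven-year term are illustrative assumptions; the article discloses neither:

```python
# Illustrative only: what servicing $830M of debt might look like.
# Rates and term are our assumptions, not disclosed deal terms.

def annual_payment(principal: float, rate: float, years: int) -> float:
    """Standard fixed annuity payment for a fully amortizing loan."""
    if rate == 0:
        return principal / years
    return principal * rate / (1 - (1 + rate) ** -years)

principal = 830e6
for rate in (0.04, 0.06, 0.08):          # hypothetical interest rates
    pay = annual_payment(principal, rate, years=7)
    print(f"at {rate:.0%}: ~${pay / 1e6:.0f}M per year for 7 years")
```

Under these assumptions the annual bill lands somewhere between roughly $140M and $160M, a fixed cost that falls due whether or not revenue keeps growing, which is exactly the pressure the next paragraph describes.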
That Mistral has opted for this mechanism suggests it is optimistic about the future, but it also adds pressure on a company that will not be able to afford consecutive quarters of losses. Betting with other people's equity has its problems, but doing so with borrowed money brings serious problems of its own.

The coup of the 13,800 chips. That this French data center will get 13,800 GB300 chips, NVIDIA's most advanced, is no minor detail. These AI accelerators have many companies on waiting lists, and here Mistral competes with hyperscalers like Microsoft, Google or xAI, which buy tens of thousands of units and have priority agreements. That this European startup has managed to secure that amount seems to demonstrate real negotiating capacity, or a special relationship with NVIDIA and its CEO, Jensen Huang.

European AI ecosystem. Mistral is little by little becoming the perfect European ecosystem for companies that do not want to be exposed to dependencies on North American partners. Having everything under European control is what more and more governments in Europe are looking for, and here we are facing an effort that wants to offer that certain independence... which, of course, is anything but complete. Be that as it may, Mistral has become the great European seller of sovereignty as a product.

But. Mistral expects to reach 200 MW of computing capacity by the end of 2027, including a €1.2 billion facility in Sweden with 23 MW that will begin operating next year. These are decent numbers in a European Union that has barely raised its head in this segment, but they are very far from those of China and especially the United States. OpenAI and its partners have infrastructure agreements worth several hundred billion dollars, and while here we talk about megawatts, there they talk about gigawatts. The distance is still enormous. And the dependency still exists.
The paradox that no one seems willing to mention is important: the European "sovereign" infrastructure that Mistral is building depends entirely on chips designed by an American company and manufactured in Taiwan. If for any reason Washington decided to declare Europe off-limits for its technology and prohibit the export of GB300 chips, Mistral's expansion would be paralyzed. The quest for digital sovereignty is interesting, but the reality is that Europe will continue to depend on US technology and Taiwan's manufacturing capacity to an even greater extent than the US or China depend on their rivals. The old continent has taken some measures to mitigate the problem, but that will not keep it from persisting in the long term.

Paris, European capital of AI. The French startup has turned France into one of the great European references in AI. Mistral was valued at $12 billion after raising $1.7 billion in financing led by ASML, and it expects to exceed $1 billion in annual recurring revenue. It is now joined by the startup recently launched by Yann LeCun: Advanced Machine Intelligence Labs (AMI Labs) has already raised more than 1 billion dollars and will also be based in Paris. One more detail is worth highlighting: Bpifrance, the French public investment bank, is leading that round, which is significant because it means the French state itself is backing the initiative.

In Xataka | Mistral does not generate hype, it is a discreet AI, it does not boost the shares of any company, but it already makes more money than Grok

Data centers have eaten up the world’s RAM. Now they threaten to eat the batteries

If the question is "what are data centers hungry for," the answer is a simple "yes." We stopped talking about the RAM crisis not because it had ended, but because there was little point in repeating the same thing. The summary: things are still as bad as they were a few weeks ago and, although factories are running at full capacity to produce more, everything is going to the same place: the AI platforms of the data centers. But it is no longer just that they have broken the market for RAM, SSDs, hard drives and everything chip-related: now they are going after batteries.

The Panasonic case. The Japanese giant announced a few hours ago its plan to triple its lithium-ion cell production capacity. It will expand the facilities already dedicated to this, but it will also convert some of its plants that manufacture components for the automotive industry so they can produce more batteries. Every extra battery it can make will still not be enough, to the point that the conversion is proposed not only for its Japanese plants but also for foreign ones like the one in Kansas.

Why? The short answer is AI. The long answer is that AI cannot stop working for even a second, which is why the computers need backup power sources. That energy comes from batteries installed between the racks which, in the event of any outage or demand peak, the equipment 'pulls' from in order to keep operating. And since the equipment requires an insane amount of energy, many, many backup batteries must be made. These are modules with hundreds of cell "stacks" embedded in the racks.

All sold. The forecast is such that the Japanese company estimates that, in the next fiscal year, it can sell batteries worth 800 billion yen, about 5 billion dollars. That would quadruple its current sales, and it implies something else: everything is already sold. Its existing customers have already bought 80% of Panasonic's output, leaving everyone else to fight over just a fifth of the volume.
That will raise prices, generate shortages and trigger the same dynamic we are seeing with RAM and other components: there are no units, prices skyrocket, companies see the demand and redirect their production, and the consumer market suffers the consequences. It is exactly what we have seen with hard drives, with Seagate and Western Digital pointing out that everything they were going to produce over the coming months was already sold. And it has also happened with RAM, where the situation became so desperate that the main manufacturers have begun asking for payments three years in advance. As the boss of SMIC – one of the largest foundries in China – pointed out a few days ago, everyone wants the infrastructure of the next decade by... yesterday.

Supercapacitors. Beyond the "bad" news, Panasonic is also working on something new: supercapacitors for data centers. These store far more energy than traditional capacitors; they are less energy-dense than batteries, but can deliver their power much faster, which is expected to make them reliable elements for supporting data center equipment during outages or load peaks. Panasonic expects to have them ready by 2027.

The renewables angle. In the end, these batteries from Panasonic (and other manufacturers) are simply safety elements that guarantee an uninterrupted flow of power to the hyperscalers' racks. How does this affect us? The capacitors and components Panasonic manufactures are also found in consumer hardware, and if production now focuses on data centers, the same thing will happen as with NAND chips and everything that uses a memory chip. And, in the background, there are also the more conventional batteries needed to store large amounts of energy from renewables.
We have already covered how much data centers consume: so much that the industry has even turned to coal, gas is a common resource, and some companies are reopening nuclear power plants. But if operators opt for renewables, data centers will need to be equipped with hundreds of batteries capable of absorbing those surges of energy. In fact, some car battery manufacturers are already converting their lines. In short: bad news for everyone... except for the companies that make these components.

Images | Panasonic

In Xataka | If you were thinking about setting up a NAS to create your own cloud, we have bad news: AI has other plans

Shopping centers seemed condemned to a slow death. In reality, they keep growing, fueled by multimillion-euro investments

The outlook looked bad. Very bad. Competition from online commerce, changing consumer habits, the pressure that platforms such as Netflix or Amazon Prime were beginning to exert on cinemas and (as the cherry on top) the blow the pandemic dealt to crowded spaces led some analysts back in 2020 to announce the "retail apocalypse". The shopping center model, so prosperous in its day, seemed exhausted. After all, who would want to go out shopping with Amazon a click away, or pay for a movie ticket with Netflix at home? Time has shown that those predictions were wrong.

Retail apocalypse? It may sound strange today, but there was a time (not so long ago) when the press frequently spoke of a "retail apocalypse". Not all analysts saw it that way, and some even warned that the formula, imported from the United States, was not transferable to a market like Spain's, much less dense than the American one. But the logic seemed overwhelming: with ecommerce growing and platforms like HBO or Amazon muscling into leisure, weren't shopping centers doomed? The answer is no. Quite the opposite.

A magnet for large investors. In 2025 the sector already showed signs of good health, starting the year with five purchase, sale or transfer operations underway that amounted, in total, to about 1 billion euros. That was the first proof that shopping centers still whet investors' appetite, and the attraction appears to have strengthened. elEconomista.es publishes a report today suggesting that, a priori (and depending on what happens at the macroeconomic level), the sector is heading for a record year of investment. To be more precise, the newspaper speaks of operations worth about 3 billion euros, an estimate from the consultancy Colliers.
Beyond the forecasts, the data already closed for 2025 confirm that large commercial areas are experiencing a moment that has little to do with an economic "apocalypse". In 2025 they accounted for 59.5% of all investment directed at retail, which translates into 1,484 million euros out of a total of 2,494 million. Not only is this a high figure, it far exceeds the capital allocated to other popular commercial formats, such as retail parks (352 million euros), small stores (524 million) or supermarkets (135 million).

Is it the only sign? No. There is more, and it confirms that investors seem increasingly willing to bet on commercial areas in search of profitability. The investment flow has been rising for several years, going from the 406 million euros recorded in 2022 to 1,484 million in 2025. Furthermore, the map of large retail spaces keeps expanding across the country. A few days ago El Periódico revealed that, if nothing goes wrong, by 2028 Spain will add 28 new retail parks with a total gross leasable area (GLA) of around 626,079 square meters. To these are added eight planned shopping centers that will reinforce the stock with another 308,500 m2.

Getting into detail. The list includes projects as ambitious as Valdebebas Shopping (Madrid), Infinity (Valencia), Breogán Park (A Coruña), Sur Córdoba Shopping (Córdoba), Promenade Lleida (Lleida) and Metropolitan (Madrid), among others. "The majority of the spaces planned for the next three years are 20,000 m2 or less, that is, small or medium-sized, so their promotion and development is easier," explains Eduardo Ceballos, of the Spanish Association of Shopping Centers and Parks (AECC). Even greater than the investment in new facilities are the funds dedicated to renovations.

A percentage: 6%. That capital flows to shopping centers is no coincidence.
According to data shared by the AECC in February, the sector closed 2025 with growth in both visits and billing. Specifically, the association estimates that footfall in shopping centers and parks grew 2.4% and sales rose 6%. Translated into hard figures, that means 1,995 million visits and just over 58,500 million euros in sales. The increase was driven largely by restaurants (+10.8%), followed by clothing and accessories (+6.9%).

Pre-pandemic levels. The AECC also says it registered 32 purchase and sale operations involving shopping centers and parks for a total of 2,000 million euros, which puts the industry back at 2018 levels, prior to the pandemic. The operations involving Bonaire, Parque Corredor, Intu Xanadú, Espacio Mediterráneo and Ballonti stand out above all. According to the sector association's calculations, there are currently around 592 shopping centers in Spain totaling 16.9 million m2 of GLA, a figure boosted by the 132,000 m2 added in 2025 through five new projects.

Why this interest? The big question. If the factors that not so long ago made analysts fear a "retail apocalypse" have not disappeared (on the contrary, ecommerce keeps growing), why are new shopping centers still opening? Why did we visit them more often and spend more money in them in 2025? Why on earth do they attract multimillion-euro investments? For Ceballos, one of the keys is the format's demonstrated ability to adapt to local markets. At the end of the day, large centers keep playing the card of combining commerce, hospitality and leisure, adapting to each market, which explains why they hold up while other, more rigid formats (hypermarkets, for example) are in the doldrums.

In full reinvention. Another key is that shopping centers and parks have not stood idly by.
The context may have changed, but so have they, especially in the most contested markets, where it is not unusual to find centers that have pivoted toward a clear commitment to luxury, big brands, the outlet concept or the leisure and restaurant offering. Increasingly, shopping centers are becoming less "commercial" and more "experiential": what they seek is to guarantee experiences, to present themselves as spaces to be lived in, marking distance with...

The largest data centers on the planet are guarded by dogs. By robot dogs

The deployment of data centers to train artificial intelligence is a sign of technological power, but also of economic power. This year alone, the big American tech companies are going to spend more money than NASA invested in putting man on the Moon: more than $670 billion between Meta, Amazon, Microsoft and Google to build gigantic data centers. And within that investment, a meaningful part goes to security with dogs. Robot dogs, specifically. It is the culmination of the science fiction dystopia.

In short. In the age of AI, data centers are the holy grail. We continually see companies signing contracts worth billions of dollars with NVIDIA or AMD (especially NVIDIA) to supply the platforms on which they train their models. That is only part of the equation, since there is another monumental investment in power, storage, RAM, cooling and everything needed to make these small cities work. Within that investment there is security, and BI has published a report detailing that some companies are already including in the budget robots that patrol both the perimeter and the internal corridors. The goal is security in every sense: patrolling to detect threats, but also identifying any problem with the equipment before it escalates into something more serious.

Brand-name dogs. The report points to two companies: Boston Dynamics with its dog Spot (which we were able to play with a few years ago) and Ghost Robotics with its Vision 60. Boston Dynamics, the company owned by Hyundai, has told American media that it has been visiting data centers for some time because there is great interest. "We have seen an increase in interest in data centers in the last year, which is probably not surprising given the investment in that space," Merry Frayne, the company's senior director of product management, tells the outlet.
For these companies it is tremendous advertising, but also a potential customer in a "new" sector. The police may not have the budget to acquire many, but within the billions being invested in data centers, robot dogs are just another line in the accounting spreadsheet.

Patrolling the center. And what is their task? Quite a lot, actually. The Boston Dynamics representative, and other operators, point out that the dogs are not limited to acting as a "mobile surveillance camera"; they have other tasks:

- Patrol exterior perimeters to ensure there are no problems with fences and access points.
- Walk through server rooms, cooling rooms and power rooms looking for anomalies such as water leaks, hot spots that may indicate a short circuit, or accumulations of moisture.
- Carry sensors to detect gases, microphones to analyze noise and, ultimately, whatever sensor you want to mount on them.
- Capture visual data from everything, such as analog pressure gauges or level indicators.
- Constantly map their surroundings with LiDAR as they pass, much like some robot vacuum cleaners, to check that nothing is out of place.

Some specific centers where they are already being tested are Novva Data Centers in Utah and Oracle's Industry Lab in Chicago. Beyond cameras, the dogs carry thermal sensors, noise-level measurement, object identification and even conversational interfaces based on models like ChatGPT to interact with people.

They pay off. This is really nothing new: we have already seen robot dogs in other industrial sectors such as oil, mining or manufacturing, and with security forces. In China, in fact, they are being deployed to assist firefighters in extreme situations or in high schools. But if in those scenarios they are seen as a tool, here they look more like a substitute.
Because some have done the math: in a market like the American one, a couple of full-time human guards can cost about $300,000 annually. The initial cost of a Spot ranges from $175,000 to $300,000 depending on the equipment; a Vision 60 costs $165,000. And, as we have seen, being packed with sensors they do much more than a security guard. Frayne says, "Clients typically start to see a payback on their investment in about 18 months." Michael Subhan, business director at Ghost Robotics, comments that "instead of having two human guards for $300,000, you can have one human guard and one robot."

They also get tired. These robots have their needs too. The standard battery lasts less than two hours, so batteries must be swapped and charging points installed, and the environment must be well structured so that routes are efficient and sensors such as the LiDAR work properly. They can climb stairs and avoid obstacles, but performance suffers in some environments, and the placement of fixed cameras and sensors in the building still has to be planned. In other words, it is not as simple as saying "I build the center however I want, buy four robodogs and it will work": the traditional elements and the dogs have to be planned together to achieve good integration.

A HUGE market. Although we have discussed two specific cases in which these robo-guard dogs are being tested, neither Boston Dynamics nor Ghost Robotics has gone into more detail. In the end, this is security, and it falls under confidentiality agreements.
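The payback math above is easy to verify with the article's own figures. A minimal sketch, using the quoted prices and the assumption (from Subhan's quote) that one robot stands in for one of two $300,000-a-year guards; it deliberately ignores maintenance, operator fees and battery replacements:

```python
# Illustrative break-even calculation for a security robot vs. a human guard,
# using only the figures quoted in the article. Simplified: ignores
# maintenance, software subscriptions and charging infrastructure.

TWO_GUARDS_PER_YEAR = 300_000          # ~$300k/year for two guards (article)
ONE_GUARD_PER_YEAR = TWO_GUARDS_PER_YEAR / 2

def payback_months(robot_price: float) -> float:
    """Months until the robot's upfront cost equals the salary it replaces."""
    return robot_price / ONE_GUARD_PER_YEAR * 12

for name, price in [("Vision 60", 165_000),
                    ("Spot (base)", 175_000),
                    ("Spot (max)", 300_000)]:
    print(f"{name}: ~{payback_months(price):.0f} months to break even")
```

A base-configuration robot breaks even in roughly 13 to 14 months under these assumptions, while a fully equipped $300,000 Spot takes about two years, which brackets Frayne's "about 18 months" figure nicely.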
Boston Dynamics describes it as an "emerging market," and Subhan has noted that "in the United States alone there are 5,000 data centers and 800 to 1,000 are currently being built, so we see it as a great market for us." According to some estimates, the market for robot dogs and industrial drones currently stands at around 500,000 units, but is expected to double by 2030, generating a market of 21 billion...
