Data centers for AI are an energy sink. Jeff Bezos's solution: build them in space

In the next two decades we will see gigawatt-scale data centers orbiting the Earth. Or at least that is the prediction launched by Jeff Bezos, founder of Amazon and Blue Origin. He made it during his appearance at Italian Tech Week in Turin, in a conversation with John Elkann, chairman of Ferrari and Stellantis.

Bezos's proposal. Space data centers would take advantage of solar energy 24 hours a day, with no clouds, rain or night cycles interrupting the supply. According to Bezos, these "giant training clusters" for artificial intelligence would be more efficient and, eventually, cheaper than terrestrial facilities. "We can beat the cost of terrestrial data centers with data centers in space in the coming decades," he said.

Why he is talking about this now. The infrastructure demand of AI is becoming a huge drain on the planet. Current data centers consume massive amounts of electricity and water to cool their servers, a problem that gets worse with each new artificial intelligence model. Under this pressure, big tech companies are exploring alternatives: from locating them on ships or in Nordic countries to sinking them into the ocean. And, of course, if we have capacity problems on Earth, some companies are already thinking of going further and sending them to space.

The technical advantages. In space, temperatures range between -120 °C under direct sunlight and -270 °C in shadow, which would greatly simplify equipment cooling. Constant solar energy would remove the dependence on terrestrial power grids. Bezos frames this development as the "natural evolution" of a process that has already begun with weather and communications satellites. "The next step will be data centers and then other types of manufacturing," he explained.

The real challenges. As Tom's Hardware points out, building a one-gigawatt space data center would require solar panels covering between 2.4 and 3.3 million square meters, with an estimated weight of 9,000 to 11,250 metric tons in photovoltaic material alone (a back-of-the-envelope version of that sizing follows this piece). Transporting all that equipment to space would cost between 13.7 and 25 billion dollars with current technology and would need more than 150 launches. On top of this come the difficulty of maintenance and upgrades and the inherent risk of space launches.

A parallel with the dot-com era. Bezos compared the current moment of artificial intelligence with the dot-com bubble of the early 2000s. "We should be extremely optimistic about the social and beneficial consequences of AI," he said, although he warned of the possibility of speculative bubbles. His message: do not confuse possible market excesses with the reality of technological advances, whose benefits he believes "will spread widely and reach everywhere."

When will it become reality? Bezos places the time horizon at "more than 10 years, but no more than 20." Today the project is commercially unfeasible, but his vision rests on the premise that launch costs will keep falling and the technology will mature. It remains to be seen whether, two decades from now, part of our digital infrastructure will be in orbit beyond what is already there.

In Xataka | Nvidia has control of the most powerful AI chips: OpenAI, Broadcom and TSMC want to end that with their XPUs
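As a rough check on the Tom's Hardware figures, the array size follows from the solar irradiance in Earth orbit and the panel efficiency. A minimal sketch, assuming a 30% efficiency and an areal density of 3.5 kg/m² (both illustrative values, not figures from the article):

```python
# Back-of-the-envelope sizing of the solar array for a 1 GW orbital data center.
# The efficiency and areal density are assumptions typical of space-grade
# photovoltaics, used only to illustrate the order of magnitude.

SOLAR_CONSTANT_W_M2 = 1361.0   # solar irradiance in Earth orbit (W/m^2)
PANEL_EFFICIENCY = 0.30        # assumed efficiency of space-grade cells
AREAL_DENSITY_KG_M2 = 3.5      # assumed mass per m^2 of deployed array
TARGET_POWER_W = 1e9           # 1 gigawatt of delivered electrical power

area_m2 = TARGET_POWER_W / (SOLAR_CONSTANT_W_M2 * PANEL_EFFICIENCY)
mass_t = area_m2 * AREAL_DENSITY_KG_M2 / 1000

print(f"Array area: {area_m2 / 1e6:.2f} million m^2")
print(f"Array mass: {mass_t:,.0f} metric tons")
# With these assumptions the result (~2.4 million m^2, ~8,600 t) lands in the
# same range as the 2.4-3.3 million m^2 and 9,000-11,250 t cited above.
```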

There is a perfect storm brewing around AI and data centers. And it will turn DRAM and NAND memory into a luxury

A sharp rebound in the prices of NAND memory (used in SSDs) and of the DRAM in our PCs and laptops is on its way. For two years users have benefited from a period of plenty in these components, but that is coming to an end. And the blame lies with AI and the fever for building more and more data centers.

When memory was cheap. In March 2023 the prices of NAND and DRAM memory were in free fall. The pandemic had caused extraordinary demand, but once lockdowns ended the situation reversed. Manufacturers had produced too much, expecting demand to hold up, and found themselves with a bloated inventory. People no longer wanted so many memory modules or so many SSDs, and prices collapsed.

AI changes everything (and this too). The effects of that imbalance dragged on for two years, but the arrival of ChatGPT sparked an AI fever that has ended up causing another fever: that of data centers. These facilities use thousands of GPUs, and those GPUs use enormous amounts of memory. Above all HBM memory, which since its creation has been aimed at business applications: much more expensive, but also much more powerful.

Price history of the 1 TB Samsung 980 Pro SSD. In mid-2023 the drive hit its lowest price; from there, prices began to rise. Source: CamelCamelCamel.

Price increases will get worse. SSDs such as the 1 TB Samsung 980 Pro are a good example of what is happening. On CamelCamelCamel we can see a price history that bottoms out in mid-2023 and then climbs. That drive has been replaced by the 1 TB 990 Pro, whose increase has so far been less pronounced (in fact it sits around 100 euros, an attractive price), but everything suggests that this curve will soon follow the trend of its predecessor. The forecasts of the consultancy TrendForce are clear: DRAM and NAND prices are going to climb a lot, and very fast.

And DRAM will also go up. The prediction is the same for the DRAM market, which supplies, for example, the DDR4 and DDR5 memory modules in our PCs and laptops. According to TrendForce, in the third quarter of 2025, which has just begun, we will see a rise of more than 40% in DDR4 memory modules. DDR5 memory will get something of a break, with the increase reaching 8%.

More expensive PCs and gaming. These increases especially affect end users who buy PCs and laptops to work, but also to play. Memory for graphics cards will also feel it notably this quarter, according to TrendForce: GDDR6 will rise by up to 33% and GDDR7 by up to 10%, per its estimates.

HBM memory at full throttle. The data centers that all the big tech companies are now rushing to build need huge amounts of memory, and that is tilting the balance between supply and demand both in the business market and in the consumer market. In fact, memory manufacturers are increasingly focusing production on HBM memory, used in AI accelerators, and moving away from traditional DRAM and NAND. Micron says its HBM production for all of 2026 is already sold out, and SK Hynix appears to be in a similar situation: demand for these modules is extraordinary.

The Raspberry Pi as an example. We are already seeing the consequences of these moves. Raspberry Pi, which had stockpiled memory modules during the good times, was forced a few days ago to raise the prices of its newer models because of the memory shortage.
Thus, the Raspberry Pi Compute Module 4 and 5 in their 4 GB variants went up five dollars, and the 8 GB versions went up ten. The company's own CEO, Eben Upton, explained that "memory costs about 120% more than it cost a year ago" (the small projection below shows what these percentages mean in practice).

Why not just make more memory? The solution seems obvious: if more memory is needed, build more factories. However, manufacturers are reluctant, for several reasons. First, the enormous cost of these plants, which runs to tens of billions of dollars. Second, these factories take years to start producing. And third, they fear that there is an "AI bubble" and that if it bursts they will once again find themselves with excess inventory and factories they no longer need. Bad news.

Image | Samsung

In Xataka | Samsung has its greatest competitor at home. Its future in chips depends on its rivalry with SK Hynix
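For reference, this is what the quoted percentages do to prices. The starting prices in this sketch are hypothetical placeholders; only the percentage increases come from TrendForce's forecasts and from Upton's quote:

```python
# Illustrative arithmetic for the price moves quoted above. Starting prices
# are hypothetical placeholders; only the percentages come from the article.

def after_increase(price: float, pct: float) -> float:
    """Price after applying a fractional increase (0.40 means +40%)."""
    return price * (1 + pct)

# TrendForce Q3 2025 forecasts applied to hypothetical module prices.
print(f"DDR4 module: 60.00 EUR -> {after_increase(60.0, 0.40):.2f} EUR (+40%)")
print(f"DDR5 module: 90.00 EUR -> {after_increase(90.0, 0.08):.2f} EUR (+8%)")

# Upton: memory costs "about 120% more than a year ago", i.e. a part that
# cost 10 USD then costs roughly 22 USD now.
print(f"RPi memory:  10.00 USD -> {after_increase(10.0, 1.20):.2f} USD (+120%)")
```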

Drastically reducing the power consumption of data centers is crucial for AI. And China has had an idea: submerge them in the sea

China is about to submerge a data center in the sea near Shanghai, as a solution to a problem we will gradually see more of: the enormous energy consumption of AI. The facility, which will come into operation in mid-October, is one of the first commercial projects of this type in the world and points to a new way of cooling servers without relying on traditional cooling systems that devour electricity.

The underlying problem. Data centers are the backbone of the Internet and of AI, but they generate huge amounts of heat. Keeping them cool with air conditioning or water evaporation consumes a brutal amount of energy, and with the rise of artificial intelligence the demand for these facilities has soared. China wants to reduce the carbon footprint of this critical infrastructure, and its bet is to sink it underwater.

How it works. The yellow capsule built near Shanghai houses servers that stay cold thanks to ocean currents, without the need for active cooling systems. According to Yang Ye, vice president of Highlander, the marine technology company developing the project with state enterprises, "underwater operations have inherent advantages" and can save approximately 90% of the energy used for cooling (a rough estimate after this piece translates that cooling figure into overall savings). The facility will draw almost all its electricity from nearby offshore wind farms, with more than 95% renewable energy.

The technical challenges. Putting servers under the sea is not easy. They must be protected from saltwater corrosion, for which a special glass-flake coating is applied to the steel capsule. An elevator has also been installed connecting the main structure with a section that remains above the water, allowing maintenance crews access. Another challenge is building the Internet connection between the underwater center and dry land, a more complex process than with conventional facilities. Researchers at universities in Florida and Japan have also warned that these centers could be vulnerable to attacks using sound waves propagated through the water.

Environmental doubts. Although the project promises to reduce emissions, questions remain about its ecological impact. The heat emitted by the servers could alter the surrounding marine ecosystem, attracting some species and driving away others. Andrew Want, a marine ecologist at the University of Hull, points out that "these are unknown aspects at this time; sufficient research is not yet being carried out." Highlander says that an independent 2020 evaluation of its test project in Zhuhai found that the water remained well below acceptable temperature thresholds, but Shaolei Ren, an expert at the University of California, Riverside, warns that as these centers scale up, so will the heat they emit.

There are few precedents. Microsoft tested this technology off the coast of Scotland in 2018, recovering the capsule in 2020 and declaring the project a success. However, it never commercialized it. The Chinese project is moving forward with the support of government subsidies: Highlander received 40 million yuan for a similar project in the province of Hainan in 2022, which is still operational. The Shanghai facility will serve clients such as China Telecom and a state-owned AI computing company.

What comes next. Experts agree that these underwater centers will probably not replace traditional ones, but will complement existing infrastructure in specific niches.
According to Ren, current projects seek to demonstrate "technological viability," but much remains to be resolved before a massive deployment. What is clear is that if these kinds of projects overcome the technological challenges and manage to greatly reduce the energy consumed by data centers, it will be a big point in favor of whichever company manages to provide that solution in the AI race.

Cover image | AFP

In Xataka | China was the planet's great polluter: now it is emerging as the first "electrostate" in history
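Note that the 90% figure refers to cooling energy, not total energy. A rough way to translate it into overall savings is via PUE (total facility power divided by IT power); the baseline PUE and the share of overhead devoted to cooling below are assumed typical values, not numbers from the article:

```python
# Translate "90% savings on cooling energy" into an overall-energy estimate
# using PUE. The baseline PUE of 1.5 and the cooling share of overhead are
# assumed typical values for an air-cooled facility, not article figures.

it_power_mw = 10.0               # hypothetical IT load of the capsule
baseline_pue = 1.5               # assumed conventional facility
cooling_share_of_overhead = 0.8  # assume most of the overhead is cooling

baseline_total = it_power_mw * baseline_pue
cooling_power = it_power_mw * (baseline_pue - 1) * cooling_share_of_overhead
other_overhead = it_power_mw * (baseline_pue - 1) * (1 - cooling_share_of_overhead)

# Underwater: cooling energy cut by 90%, per Highlander's claim.
underwater_total = it_power_mw + other_overhead + cooling_power * 0.10

print(f"Conventional facility: {baseline_total:.1f} MW (PUE {baseline_total / it_power_mw:.2f})")
print(f"Underwater facility:   {underwater_total:.1f} MW (PUE {underwater_total / it_power_mw:.2f})")
print(f"Total energy saved:    {(1 - underwater_total / baseline_total) * 100:.0f}%")
# With these assumptions, a 90% cut in cooling energy works out to roughly a
# quarter of total facility energy, not 90% of it.
```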

Madrid has 23.4 billion euros riding on data centers. The risk of losing them lies in the electrical infrastructure

Madrid has managed to position itself as the great digital hub of southern Europe for the data center industry, but electrical infrastructure from the twentieth century cannot support the growth of the twenty-first.

Why it matters. The Community of Madrid leads Spain in data centers, with 23.4 billion euros in investments planned through 2028. But the grid's 82% saturation puts that leadership at risk against other European regions.

In figures: Madrid concentrates 54.8% of the national data center capacity, with 216 MW in operation. Forecasts point to 522 MW once the facilities under construction are completed, and up to 1.7 GW in 2030. The sector grew 33% last year and will generate 35,000 jobs in six years.

The threat. Ayuso is preparing objections against what she considers "over-regulation" by the Ministry of Ecological Transition, but the real problem is the grid. Electricity distributors denied six out of ten access requests last year. Without immediate improvements, Spain may already have lost 60 billion euros in investments, according to calculations by the industry association Spain DC reported by Digital Economy.

Between the lines. The Madrid paradox is evident: the region produces just 1,334 GWh... but consumes 27,487 GWh per year (a quick calculation below puts these two figures side by side). It is an energy black hole that works because Spain exports electricity and tech companies sign long-term contracts. But that does not solve the saturation of the distribution network.

What is happening. The Government has opened a consultation, until September 15, on a Royal Decree that will force data centers to report their environmental footprint, energy consumption and water use. Madrid believes this could hurt competitiveness, but it is a minor problem compared with the lack of electrical capacity.

Going deeper. Spain DC is calling for an urgent modernization plan, and the utilities are asking the CNMC to raise the remuneration rate from 6.46% to 7.5% in order to invest in the network. The cost will be paid by consumers on their electricity bills, but without that investment Madrid will lose the data center race against Frankfurt, Amsterdam or Paris.

In Xataka | "Emptied Spain" has been filled with wind turbines and solar panels, but wastes energy for a simple reason: there are no cables

Featured image | Community of Madrid
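A quick sketch of what those production and consumption figures imply; the round-the-clock estimate for 1.7 GW of data centers at the end is a derived upper bound, not a figure from the article:

```python
# Madrid's energy balance with the figures quoted above (2024 production vs
# consumption). Everything else is derived arithmetic.

production_gwh = 1_334     # annual electricity produced in the Community of Madrid
consumption_gwh = 27_487   # annual electricity consumed

deficit_gwh = consumption_gwh - production_gwh
self_sufficiency = production_gwh / consumption_gwh

print(f"Deficit: {deficit_gwh:,} GWh/year imported from the rest of Spain")
print(f"Madrid covers only {self_sufficiency:.1%} of its own consumption")

# Rough context: 1.7 GW of data centers running flat out all year would add
# 1.7 GW * 8,760 h of extra demand. This is an upper bound; real load factors
# are lower.
extra_demand_gwh = 1.7 * 8760
print(f"1.7 GW of data centers, 24/7: up to {extra_demand_gwh:,.0f} GWh/year")
```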

Spain is becoming a real mecca for data centers. Uruguay has some lessons for it

Spain is in fashion among the Big Tech companies. Practically all of them have chosen our country to build new data centers. Investments are notable across several regions, but Aragon is without doubt one of the places that has attracted most of these facilities. There is, however, (at least) one problem: water.

That is the focus of a report in El País discussing the risks these new data centers pose, not only in Spain but also in other countries such as Mexico or Chile, where there are strong investments of this type as well.

Aragon rolls out a red carpet for Amazon. The Spanish case is well illustrated by what has happened in recent months in Huesca, where Amazon already had three data centers for its AWS platform (in El Burgo de Ebro, Villanueva de Gállego and the Plhus business park in Huesca capital), but is planning a new one in Walqa. The company announced last year an investment of 15.7 billion dollars in the region between 2024 and 2033.

This project raised considerable controversy at the beginning of the year. It was then that residents of the rural district of Cuarte began to receive letters notifying them of an expropriation of land next to the Walqa Technology Park. Among the neighbors' concerns was the routing of a new high-voltage power line that would cross the town, in addition to the high consumption of water resources. The neighbors met with Amazon representatives in February and eventually got the company to reroute the high-voltage line outside the town. Amazon also reached an agreement to finance infrastructure to supply water to Cuarte and other settlements through new pipelines from the San Julián de Banzo spring.

The energy problem is still striking. These data centers, together with the one planned in La Cartuja, in Zaragoza, will consume 10,800 GWh, a huge figure that in fact exceeds the electricity consumption of the entire province in 2024, which was about 10,540 GWh. To solve that problem the company has paid 1.5 million euros to extend the power grid to all its data centers.

But water consumption is even more remarkable. Carlos López, a member of Ecologistas en Acción in Aragon, explained in El País how Amazon will install several wells inside its plots to extract water from the subsoil and cool the equipment. It is estimated that these data centers will consume more than 755,000 cubic meters of water a year for cooling, but according to López there will be no control and "it will not be possible to demonstrate how much water they extract." An Amazon spokesperson clarified in that report that these wells "are subject to regulatory supervision" and are planned as a backup water source. The company already indicated this year that it is using 48% more water than expected for a simple reason: the heat.

It remains to be seen, of course, what happens when these centers are operational: only then will it be possible to assess those energy and water consumption figures and their real impact on Aragon, both on the consumption of its citizens and of the rest of industry (and especially irrigation) and on the environment.

That makes it very difficult to assess the true return of this type of project for countries such as Spain. Although it is true that employment is generated during construction, operation does not usually require so many positions.
In the recent data center project that Meta is building in Talavera de la Reina (Toledo), some 5,000 jobs are expected to be created during construction. However, once operational, Meta will use about 250 professionals for its management and maintenance. Documents obtained by El País show that in October 2021, across the three data centers that existed in Aragon, "the total direct employees in each of the three centers did not exceed twenty at that time." The red carpet with which some autonomous communities are receiving these investments may end up causing plenty of headaches.

A similar case: Uruguay. Everything seemed promising in the new data center that Google wanted to install in the Science Park, in the Uruguayan department of Canelones, next to Montevideo.

Google data center in Storey County, Nevada. Source: Google.

This data center, the company's second in Latin America, began construction in August 2024 with an investment of more than 850 million dollars. However, the project has been surrounded by significant controversy since its inception. Something did not add up for Daniel Pena, a researcher at the Faculty of Social Sciences of the University of the Republic (Uruguay), in the search giant's project. In July 2022 he analyzed the proposal Google had presented and noticed something important: at no point were there details about the water or energy consumption the data center would impose. The Uruguayan Ministry of Environment denied access to that data, and in December he filed a lawsuit with the help of lawyer Carolina Neme.

Months later Pena gained access to the information and discovered that in a first stage the data center would need 3.8 million liters of water per day (3,800 cubic meters). In a second stage that requirement would double: it would need 7.6 million liters (7,600 cubic meters). And not just any kind of water: drinking water. Pena said the data center's water needs were "considerable." The average monthly consumption of a household of three or four people is 15 cubic meters, which means the data center would consume the daily equivalent of about 55,000-60,000 people (the calculation is reproduced below).

Google ended up modifying several aspects of the project, among them that use of drinking water. The company eventually obtained permission to build it after, among other things, committing to use a chiller-based cooling system with closed circuits that recirculate...
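The household-equivalent comparison can be reproduced directly from the figures quoted above; the only assumption is a household size of 3.5 people, the midpoint of "three or four":

```python
# Reproduce the household-equivalent comparison for the Uruguayan data
# center's water demand. All inputs are the figures quoted in the text;
# the household size of 3.5 people is an assumed midpoint.

daily_demand_m3 = 7_600       # second-stage demand: 7.6 million liters/day
household_monthly_m3 = 15     # average household consumption (3-4 people)
people_per_household = 3.5    # assumed midpoint

per_person_daily_m3 = household_monthly_m3 / people_per_household / 30
equivalent_people = daily_demand_m3 / per_person_daily_m3

print(f"Per-person drinking water use: {per_person_daily_m3 * 1000:.0f} liters/day")
print(f"Data center equivalent:        ~{equivalent_people:,.0f} people")
# With these inputs the result lands around 53,000 people, in line with the
# 55,000-60,000 range cited in the article.
```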

The region with the largest energy deficit in Spain is the one getting the data centers

Spain is filling up with data centers. A report from the real estate consultancy CBRE reveals the interest of large technology companies in the Iberian Peninsula. The fact is striking, but even more striking is that the main focus of these companies is a region that a priori would not seem ideal for these facilities: Madrid.

Hyperscalers. The CBRE study, cited in Cinco Días, points out this singular concentration in Spain of data center projects from the so-called "hyperscalers". A hyperscaler is a mass provider of cloud services that operates a gigantic network of data centers distributed across the planet. Amazon is a good example of this type of company, but there are more, and they all seem to be focusing their attention on the Iberian Peninsula.

Big Tech bets on Spain... Elliot Zounon, responsible for the report, explained that "there is no investor, large operator or tech company that does not have in its strategic plans to establish its data center project in the Iberian market."

...but especially on Madrid. Especially striking was the list of projects indicating the current and expected future capacity in the Community of Madrid, which amounts to a total of 203 MW. Some of the most important companies in the sector, such as Microsoft, Google, Oracle, IBM, Kyndryl and OVHcloud, have data centers in the region. Various projects, with an investment of 23.4 billion euros through 2028, promise notable growth in this area, and by 2026 Madrid's capacity is expected to rise to 222 MW.

Madrid, close to the "FLAP-D". In the European Union this market has been dominated by the group known as FLAP-D, an acronym for Frankfurt, London, Amsterdam and Paris, joined in recent times by Dublin, with a capacity of 328 MW. Madrid belongs to the so-called Tier 2, a kind of "second division" of cities with significant data center capacity. The capital is ahead of Milan, Zurich, Berlin and Oslo, and Barcelona is also in this group, occupying tenth position in the Tier 2 with 42 MW installed.

And what about the energy? This proliferation of data centers in the Community of Madrid is paradoxical, especially since it is the region that produces the least energy in all of Spain and depends almost completely on external supply. In 2024 Madrid produced 1,334 GWh, more or less the same as in 2021, while its annual electricity consumption in 2024 was 27,487 GWh. The region thus concentrates 11% of national electricity demand. Of course, Spain is becoming a genuine power exporter, and that favors Madrid's role as a focus of attention for future data centers.

"Emptied Spain" produces, the big cities consume. The truth is that Madrid's energy deficit is logical given its great population and industrial density. Here, as in other large Spanish capitals, the energy inequality is clear: energy is produced in far more depopulated regions (Aragon's wind power is a notable example) and ends up being used in large cities. Our country has bet heavily on renewables, but Madrid is a case apart: there are no wind farms in Madrid.

Not everything is megawatts. The choice of Madrid depends not only on raw megawatts, but also on a combination of intangible advantages that tech companies take into account.
The capital concentrates interconnection nodes and a dense network of operators that facilitate the exchange of data traffic (something crucial for cloud services and AI applications). The presence of corporate headquarters also plays a role, as does the fact that logistics costs are lower than in remote locations that may have cheaper energy but are more isolated in terms of network and services.

The human factor. There is also the labor market and its technical profiles. For companies, deploying infrastructure near where the talent is pays off, and professionals in the sector tend to settle in large cities like Madrid, precisely because that is where the job offers are concentrated. The same happens with that "first division" of large European data center capitals: Frankfurt, London, Amsterdam and Paris also concentrate that range of technical profiles.

The risk of being an energy black hole. Its practically nonexistent self-production turns the Community of Madrid into a kind of "energy black hole": it absorbs resources generated far away and depends totally on the strength of the Spanish grid, which recently suffered a worrying, though unlikely to be repeated, general blackout.

But. Even with that energy deficit, hyperscalers secure supply with long-term contracts (PPAs, Power Purchase Agreements), prior agreements with grid operators and even investments in renewables. The idea is to decouple the siting decision for these data centers from where local energy production happens. Madrid must of course ensure its interconnection and supply capacity, perhaps with grid reinforcement where necessary, but Spain's energy production (it even has to throw energy away at times) is a guarantee for this type of facility.

Image | Kyndryl | Community of Madrid

In Xataka | Spain was supposed to have an "anti-blackout" plan. It has run into an insurmountable obstacle: politics

Europe will invest 30 billion euros in data centers for AI

Europe cannot miss the artificial intelligence (AI) train. It cannot afford to. This technology already has a very deep impact on a country's economy, scientific and technological capacity, and military development, and right now the US and China lead decisively in this area. So far Europe seemed content to trail in the wake of the two great powers disputing world supremacy, but its strategy is about to change.

According to CNBC, the European Union plans to invest 10 billion euros in the construction of thirteen data centers for AI, as well as 20 billion euros in a network of "gigawatt-class" facilities. These latter data centers are the largest and most ambitious, and their name indicates that, given their size, they consume enormous amounts of electricity. In fact, a gigawatt is equivalent to one billion watts, roughly what a small city can consume. For the moment sixteen European countries have expressed interest in hosting these facilities and, according to CNBC, the first of these large data centers will be located in Munich (Germany). Each gigawatt-class facility will cost between 3 and 5 billion euros and will bring together no fewer than 100,000 cutting-edge GPUs for AI (possibly NVIDIA H100 chips); a quick per-GPU breakdown of those numbers follows this piece. All of this sounds very good, but it raises a doubt we cannot ignore: it is not clear how the countries involved in this plan will resolve the electricity supply for such demanding facilities.

It will cost Europe a lot to keep pace with the US and China. The US government led by Donald Trump is determined to lead in AI whatever the cost. In principle this initiative, baptized by the new administration as the 'Stargate project', will cost 500 billion dollars. The money will come from the Japanese investment group SoftBank; from OpenAI, the creators of ChatGPT; from Oracle; and, finally, from the Emirati investment firm MGX. These companies will fund the construction, over the next four years, of an advanced network of data centers housing the high-performance computing infrastructure needed to sustain US leadership in the field of AI. The spearhead of these facilities is already being built in Texas (USA), in a town called Abilene. And it is colossal: according to OpenAI, this first 'Stargate' data center will bring together more than two million AI chips.

The 'Stargate' infrastructure should be fully ready before President Trump's current term expires. When the US government announced this plan with great fanfare, it left a big question open: how did it plan to solve the electricity supply required by the new facilities? Large data centers for AI consume a lot of electricity, which has led some tech companies to invest in nuclear power plants to guarantee the supply these facilities require. At the moment this question is not completely resolved. And it is not, because the 'Stargate' infrastructure should be completely ready before President Trump's current term expires, and a new nuclear power plant can hardly come online within four years. Even so, OpenAI and Oracle have made official an agreement to build the infrastructure needed to deliver an additional 4.5 GW to their data centers. Interestingly, SoftBank is not participating in the financing of this expansion, although, as mentioned a few lines above, it does participate in the 'Stargate' project.
In any case, there is another unknown in this equation that also has a lot to say: China. "We expect China to significantly increase its investments in AI and semiconductors in response to US dominance in AI," analysts at the consultancy CBM foresee. It makes sense. These two great powers are disputing world supremacy, so it is understandable that each significant step by one of them receives a more or less forceful answer from the other. We can be sure that 2025 will be an even more agitated year than 2024 in the geopolitical and technological arenas, so we will be watching closely the steps that the US and China will surely take. And Europe. Europe too.

Image | Christina Morillo

More information | CNBC

In Xataka | Huawei attacks Nvidia's positions in China: it wants to have the dominant hardware for AI inference
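A back-of-the-envelope look at what those gigawatt-class numbers imply per GPU; the per-GPU power draw used at the end is an illustrative assumption, not a figure from CNBC:

```python
# Rough per-GPU figures for the EU's "gigawatt-class" facilities, derived
# from the numbers quoted above (3-5 billion euros and 100,000 GPUs per site).
# The per-GPU power draw is an assumed illustrative value.

gpus_per_site = 100_000
cost_range_eur = (3e9, 5e9)    # 3 to 5 billion euros per facility

low, high = (c / gpus_per_site for c in cost_range_eur)
print(f"Implied budget per GPU (incl. building, power, networking): "
      f"{low:,.0f}-{high:,.0f} EUR")

# Power sanity check: assuming ~1.5 kW per GPU once cooling and overhead are
# included, 100,000 GPUs draw about 150 MW, so a full gigawatt-class site
# implies substantially more hardware or headroom than the GPUs alone.
assumed_kw_per_gpu = 1.5
site_power_mw = gpus_per_site * assumed_kw_per_gpu / 1000
print(f"Estimated draw of 100,000 GPUs: ~{site_power_mw:.0f} MW")
```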

The most monstrous data centers on the planet, gathered in one chart

The development of AI has set off a new 'arms race' on a global scale. The aim is not to dominate another territory, but to amass as much computing power as possible. The main technology companies are deploying data centers around the world with one goal in mind: training artificial intelligence. Some data centers are true behemoths, and in this chart we can see the most powerful compute clusters in the world, with one standout protagonist: Elon Musk.

Cluster. Before getting into the numbers, a nuance. When we talk about compute power, we can talk about a computer cluster or a supercomputer. The latter is an extremely powerful system that can be built with processors specially designed to reach extreme levels of computation or, more commonly, from thousands of high-performance servers. They are used for scientific simulations and tasks that require enormous amounts of computation, and their cost is brutal. On the other hand, we have the "affordable" version of a supercomputer: the computer cluster. It is a set of interconnected workstations that work in parallel to solve problems. It is similar to a supercomputer, but the advantage is that it is a more flexible system: as you need more machines, you can expand the cluster. In addition, the components are more standard, which also keeps costs lower. That said, the distinction has blurred in recent years.

The 100,000 club. With that out of the way, back to the chart produced by Visual Capitalist with data from Epoch AI. In it we can see the most powerful clusters today, though with a caveat: it mixes planned and operational systems. X, Elon Musk's company, switched on xAI Colossus Memphis Phase 1 last year, a huge data center with 100,000 NVIDIA H100 GPUs aimed at training 'Grok', its AI model. It was something that even surprised Jensen Huang, CEO of Nvidia. It is a computing monster with enormous calculation power, and the figure is expected to grow to 200,000 GPUs. We will look at the energy consequences of this later. Behind Musk's company we have Meta, which states that it has a cluster "bigger than 100,000 H100 GPUs" for its 'Llama 4' model. Then there are those who keep a bit more mystery: Microsoft, for example, with its cluster for Azure, Copilot and OpenAI's models, estimated at 100,000 GPUs split between H100 and H200.

Two worlds. Outside that 100,000 club we have Oracle with its 65,536 NVIDIA H200s, another Musk company, Tesla, with Cortex Phase 1 and its 50,000 GPUs, and the United States Department of Energy with El Capitan, the most powerful supercomputer in the world. Official or estimated, what this chart makes clear is that one country has taken AI compute seriously: the United States. They are the ones pushing hardest with their data centers (of the 10 clusters, the first nine are in the US and the last in China), and they are not only building inside their borders but also abroad. One example is the plan to build data centers in Spain, or the one that is practically the size of Manhattan.

European expansion. In the chart we can see two European clusters. On the one hand, JUPITER at the Jülich Supercomputing Centre in Germany, with its confirmed GPUs. On the other, NexGen in Norway, with about 16,300 GPUs. Europe has launched several financing initiatives aimed at boosting its competitiveness through programs such as GenAI4EU and its budget of 700 million euros between 2024 and 2026.
The objective is to build large data centers and, for the 2025 call, 76 proposals were submitted from 16 different countries. Now, that development of European AI must be aligned with the AI Act, the regulation in force since February 2025 that ensures transparency and an ethical AI.

Quantity vs. efficiency in China. Beyond US companies, the one that has really gotten its act together on AI is China. Following a roadmap very different from the West's, China is focusing on having (supposedly) fewer GPUs at work, operating with greater efficiency, much lower costs than the American companies and equivalent results. DeepSeek or the more recent Kimi are two examples of this.

Nvidia rubs its hands. And in all this battle for AI there is one clear winner: Nvidia. Like it or not, and beyond who has more or fewer GPUs to do the job, the clear winner is Nvidia. In China it is not so clear because of the trade veto, but the main data centers in the world use Nvidia's architecture with its H100 and H200 chips. And that is just the "normal" AI cards, since the company also has the B200, with four times the performance of the H100. In fact, the company seems so focused on the AI race that it may have neglected the area where it led AMD for years: its gaming cards.

Those are servers in Lenovo data centers. Companies seek to reduce their footprint by reusing hot water after heat dissipation to, for example, fill pools or feed showers. Image | Xataka

The planet, not so much. And the consequence of this expansion of AI is that data centers not only need huge amounts of energy to function, but also water to dissipate the heat from the equipment. There is one notable absentee from the chart, Google, which also operates its own data centers for AI and which, together with others such as Meta or Microsoft, needs nuclear power plants to feed its facilities. Consumption is so extreme that renewables are insufficient during demand peaks, forcing the use of fossil fuels such as coal or gas (it is estimated that the 200,000 Colossus GPUs consume 300 MW, enough to power 300,000 homes; a quick sanity check on those figures follows this piece) and, as we said, water use has become a matter of debate in the territories that are candidates to host new data centers. So much heat needs dissipating that China is already building at the bottom of the ocean.

In Xataka | China wants to become the world engine of AI...
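Everything in the following check is derived from the 300 MW, 200,000 GPU and 300,000 home figures quoted above; nothing else is assumed:

```python
# Sanity check on the Colossus figures: 200,000 GPUs drawing 300 MW,
# "enough to power 300,000 homes".

total_power_mw = 300
gpu_count = 200_000
homes = 300_000

watts_per_gpu = total_power_mw * 1e6 / gpu_count
watts_per_home = total_power_mw * 1e6 / homes

print(f"Average draw per GPU (incl. cooling/overhead): {watts_per_gpu:,.0f} W")
print(f"Implied average draw per home:                 {watts_per_home:,.0f} W")
# ~1.5 kW per GPU is consistent with an H100 server plus facility overhead,
# and ~1 kW of continuous draw per home is a plausible average figure.
```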

Madrid has already put a date on slamming the brakes on screens in the classroom. The scissors will not cut all schools equally

The Community of Madrid has just put a date on an idea that had been circulating for some time: limiting screen use among the youngest pupils. From the 2025/26 school year, children in infant and primary education at public and state-subsidized schools will no longer be able to work with digital devices individually. The official objective is to protect them from excessive or inappropriate use, something that, the regional government says, will benefit more than half a million children.

What changes in infant and primary education? In practice, this means that in infant and primary school each child will no longer have their own tablet or laptop. According to the Community of Madrid, teachers will not be able to assign homework that requires a screen outside school, and within the classroom only shared use will be allowed, always with a pedagogical purpose. That use will be strictly rationed: in the first cycle of infant education (up to 3 years) there will be no contact with screens; in the second cycle (3 to 6 years) it will be limited to one hour per week; in 1st and 2nd of primary also one hour; in 3rd and 4th an hour and a half; and up to two hours per week in the last two years.

The secondary school exception. Secondary education (ESO) plays by other rules. Instead of a veto, the Community of Madrid leaves the decision in the hands of each school, which will decide whether tablets, laptops or mobiles are used in class, adapting the rules to the reality of their students, their maturity and the way subjects are taught.

Private schools: the big nuance. Not all schools are in the same boat. As Diario de Madrid points out, this regulation applies only to publicly funded schools. Fully private schools are left out, although they are encouraged to apply their own criteria to regulate the use of technology in the classroom.

Exceptions and special cases. The rules leave some doors open. For example, students with special needs can use tablets or computers without limits if a psycho-pedagogical report recommends it. In addition, the decree allows devices to be used in subjects or programs that cannot be taught without technology, such as some digital or robotics projects.

Supervision and gradual adaptation. It will not be an overnight change. The education inspectorate will be responsible for checking that the decree is applied correctly, but also for helping schools make the transition as smooth as possible. In addition, schools with programs in which each student has their own device will have an extra year, until 2026/27, to adapt and reduce screen use.

Context and open debate. Not everyone sees this change the same way. The more than 400 objections received show that the debate is still open: are we protecting children or limiting their contact with tools that will be key to their future? The Madrid government is committed to the first reading, but the tug-of-war between traditional education and digitalization is far from settled.

Images | Freepik (1, 2)

In Xataka | The icing on the cake for Madrid's roadworks: the city has become a gymkhana of renovations, closures and inconvenience

Meta is already using a concrete mix created by algorithms in its data centers

Meta has used a concrete mix designed by algorithms in one of its data centers. According to the company, this formula promises to be more sustainable and faster to apply, and it has been developed with open source tools. With this approach it is not only seeking to move toward zero emissions, but also to accelerate the construction of infrastructure that is growing against the clock, as the data center it is erecting under temporary structures demonstrates.

The invisible weight of concrete. Few materials are as omnipresent as concrete. It is used in roads, bridges, homes... and also in the data centers where a good part of our digital lives is housed. The problem is that manufacturing its components, especially cement, generates an enormous amount of CO2. The World Economic Forum indicates that it accounts for about 8% of global emissions. Meta has set out to reduce that footprint without compromising strength or construction speed. And that is where its new model comes in.

An AI that does not create chatbots, but mixes. To develop this system, Meta teamed up with Amrize, one of the world's largest cement manufacturers, and with the University of Illinois Urbana-Champaign. Together they have created an AI model that proposes concrete compositions. The model is based on Bayesian optimization and is built with BoTorch and Ax, two open source tools developed by Meta itself (a minimal sketch of such an optimization loop appears at the end of this piece).

A slab test at the Rosemount data center. The challenge was not minor: each mix involves combining different types of cement, aggregates, water, admixtures and supplementary materials such as slag or fly ash. The exact proportion, its origin or even the time of year can alter the result. Traditionally, they explain, validating a new formula has taken weeks. With AI, this process speeds up because the system learns from previous data, proposes promising new combinations and refines its predictions after each test.

From the laboratory to the field. One of the first large-scale validations was carried out at the data center Meta is building in Rosemount, Minnesota. There, the contractor Mortenson applied the new mix in one of the building's support slabs. The objective was not only to check its strength, but also its workability and final finish: these slabs must be perfectly smooth and durable. The result, according to the firm, exceeded all technical standards. The formula designed by AI not only met the strength and curing requirements, but also behaved well on site: it poured without problems and gave an adequate surface. After two iterations, and with minimal human adjustments, the model had generated a recipe that improved on the usual industrial formulas in speed, strength and potential for emissions reduction.

An open model. The system developed by Meta is not a commercial product or a closed tool. The company has published the code, data and technical approach in an open GitHub repository called SustainableConcrete. The idea is not to hold on to the formula, but to share the method: a way of applying artificial intelligence to concrete design that can be adapted to other projects, suppliers or materials. We will have to wait and see whether more initiatives like this appear; it could facilitate the adoption of alternative mixes in all kinds of construction. As we have seen, Meta has not invented a new material. What it has done is use AI to find new concrete formulas.
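To illustrate the approach, here is a minimal sketch of what a Bayesian-optimization loop over mix proportions can look like with Ax, which builds on BoTorch. The parameter names, bounds, single strength objective and the synthetic test function are illustrative assumptions and are not taken from Meta's SustainableConcrete repository:

```python
# Minimal sketch of a Bayesian-optimization loop over concrete mix proportions
# using Ax (which uses BoTorch under the hood). Parameters, bounds, the single
# strength objective and the toy test function are illustrative assumptions;
# the real work also targets emissions and uses lab measurements.
from ax.service.ax_client import AxClient, ObjectiveProperties

ax_client = AxClient()
ax_client.create_experiment(
    name="concrete_mix_demo",
    parameters=[
        {"name": "cement_frac", "type": "range", "bounds": [0.10, 0.20]},
        {"name": "fly_ash_frac", "type": "range", "bounds": [0.00, 0.10]},
        {"name": "water_binder_ratio", "type": "range", "bounds": [0.35, 0.55]},
    ],
    objectives={"strength_28d_mpa": ObjectiveProperties(minimize=False)},
)

def simulated_strength(p: dict) -> float:
    """Stand-in for a real 28-day compressive-strength lab test (toy formula)."""
    binder = p["cement_frac"] + p["fly_ash_frac"]
    return 80.0 * binder / p["water_binder_ratio"]

for _ in range(12):  # each iteration would correspond to one lab batch
    params, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(
        trial_index=trial_index,
        raw_data={"strength_28d_mpa": simulated_strength(params)},
    )

best_parameters, _ = ax_client.get_best_parameters()
print("Most promising mix so far:", best_parameters)
```

In practice each "trial" is a physical batch that takes days or weeks to cure, which is why reducing the number of batches needed to reach a good recipe is where the optimization pays off.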
Images | Xataka with Gemini 2.5 Flash | Mark Zuckerberg | Meta (1, 2)

In Xataka | Nvidia says that China has the best open source AI in the world. That praise has a very clear intention
