Making cell towers mini data centers for AI

A few days ago we heard the news that NVIDIA had invested $1 billion in Nokia, taking a roughly 2.9% stake in the Finnish company. Although the check itself is striking news, since for many people Nokia had been off the map for years, the move makes perfect sense: it is the Western response to the Chinese technology companies that have been investing in the deployment of 6G for years. And of course, with NVIDIA behind them, telephony base stations can do much more than just provide coverage to millions of devices: they can become small distributed data centers for AI.

The plan behind the investment. NVIDIA and Nokia are not just designing equipment for mobile networks. They are redefining what a cell tower is. The idea is that each base station (the towers and small installations that we see on buildings and streets) becomes a computing node capable of executing AI workloads in real time. "An AI data center in everyone's pocket", according to Justin Hotard, CEO of Nokia. The key here is to bring processing closer to the user in order to cut latency, which is usually one of the most frequent problems in AI applications that require real-time processing, such as instant translation, augmented reality or autonomous vehicles.

Without latency, everything changes. When we ask an AI to translate a conversation or analyze live images, every millisecond counts. Sending that data to a distant server, processing it, and returning it introduces a significant delay that mars the final experience. The most logical solution is to decentralize: the AI should live close to the user, in the telecommunications infrastructure itself. In this sense, NVIDIA will contribute chips and specialized software, while Nokia will adapt its 5G and 6G equipment to integrate that computing capacity. As announced, the first commercial tests will begin in 2027 with T-Mobile in the United States.

The Nokia effect on the stock market.
Nokia shares shot up 21% after the news broke, reaching highs not seen since 2016. NVIDIA and OpenAI have become the King Midas of technology: everything they touch goes up. The investment is also a boost to the strategy of Hotard, who since his arrival in April has accelerated Nokia's shift towards data centers and AI. The company, which already acquired Infinera for $2.3 billion to strengthen its position in data center networks, is now positioned as the only Western supplier capable of competing with Huawei in the complete supply of telecommunications infrastructure.

Another space race. While Europe and the United States accelerate their 6G plans, China has been investing aggressively in this technology for years. This alliance between NVIDIA and Nokia is a somewhat late, but necessary, response. Jensen Huang, CEO of NVIDIA, explained in his speech in Washington that the goal is "to help the United States bring telecommunications technology back to America." It is not just about infrastructure, but about strategic control. Whoever dominates this network of brains distributed throughout cities and roads will control the AI applications of the future.

And now what. The McKinsey consulting firm estimates that investment in data center infrastructure will exceed $1.7 trillion by 2030, driven by the expansion of AI. Nokia and NVIDIA want their piece of the pie, but they are also betting on a structural change: that mobile networks stop being mere data pipes and become intelligent computing platforms. It remains to be seen whether this model works commercially and whether operators are willing to upgrade their infrastructure.

Cover image | NVIDIA

In Xataka | Xi Jinping wants two things: first, to create a global center that regulates AI. The second, that it is in Shanghai
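The latency argument in the piece above is, at bottom, simple arithmetic. A minimal sketch with made-up distances and processing times (none of these figures come from NVIDIA or Nokia) of why moving compute from a distant data center to a nearby base station shaves off round-trip time:

```python
# Back-of-envelope latency comparison: distant cloud vs. edge node at a
# base station. All figures are illustrative assumptions, not measurements.

def round_trip_ms(distance_km: float, processing_ms: float) -> float:
    """Rough round trip: signal travels at ~200,000 km/s in fiber."""
    propagation_ms = (2 * distance_km / 200_000) * 1000
    return propagation_ms + processing_ms

# Hypothetical distant data center (2,000 km away) vs. a tower 2 km away,
# with the same 30 ms of model processing time in both cases.
cloud = round_trip_ms(2000, processing_ms=30)
edge = round_trip_ms(2, processing_ms=30)

print(f"cloud: {cloud:.1f} ms, edge: {edge:.1f} ms")
```

Propagation is only one component of real latency (queuing, the radio link and routing add more), but the distance term is the one that edge computing eliminates.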

The “foodies” have turned the historic centers of Italy into hell, so the cities are getting serious

Italy is at war. Not a particularly unusual one: it is the battle against mass tourism, shared with other countries and cities. It is trying everything: higher rates, entrance fees that doubled after initial success, a veto on key lockboxes and even taxes on tourist dogs. Now, several cities have agreed on one thing: stopping the 'foodies'. How? By prohibiting the opening of new restaurants in historic centers.

In short. Walking through the historic center of any Italian city is like entering a culinary amusement park. Not only are there restaurants wherever you look, but they form a fairground where eye-catching posters appealing to tradition, and artisans preparing fresh pasta in shop windows like circus animals, are a constant. Now, cities like Rome, Turin, Florence, Palermo and Bologna have introduced restrictions on opening new restaurants in their historic centers.

Displacing the population. Although Italians love their traditional cuisine as much as anyone, they are getting tired of their city centers becoming theme parks. There are particularly glaring streets, like Via Maqueda in Palermo or Via del Pellegrino in Rome (to a lesser extent), which are basically an unbroken string of such establishments. As The New York Times reports, hundreds of new restaurants have opened over the last decade in just a few streets of those tourist spots, establishments that dress themselves up as traditional but are not, and that displace the local population far from their homes. It is something seen in many other cities around the world, where tourism drives up land prices in very specific spots, and rents with them, and locals watch traditional businesses disappear while others linked to that kind of consumerism flourish.

"We must protect the center".
In the case of Italy, the aim is to fight gastronomic gentrification, which is replacing historic markets and local stores with businesses aimed at mass tourism, and to protect the authenticity and daily life of citizens. The cities also want to preserve tradition and diversity against more homogeneous or franchised models. Luisa Guidone, Councilor for Commerce of Bologna, comments that "the center must be protected, maintaining the mix of existing stores that allow citizens to have their daily shopping experience."

Each city fights its own war. As we say, the prohibition or limitation on opening premises is not part of a national initiative, but comes from each municipality. In Palermo, new restaurant licenses have been expressly prohibited in emblematic areas such as Via Maqueda. In Florence, there will be no new openings of bars, restaurants or any food establishments in more than 50 streets in the center and some on the periphery. In Bologna, until June 2028, new commercial projects wanting to open in the historic center will be carefully studied, and in Rome and Turin it is more of the same (in Rome, especially around the Vatican). Then there are exceptions. For example, Florence allows the opening of establishments such as art galleries, bookstores or craft shops, anything not focused on mass hospitality.

Not just food. But this goes beyond gastronomic gentrification. In the Corriere di Bologna we can read that the restrictions mean that, until 2028, it will be prohibited to open new currency exchange shops, call centers (telephone centers, Internet connection points and money transfer points), 'cash for gold' shops, ATMs or slot machine parlors in the historic center.

Debate. Now, promoting something like this is complicated when tourism represents almost 12% of the Italian economy and gastronomic tourism is an important source of income.
In fact, the NYT article includes statements from tourists who just want to eat. Also from the heads of FIPE, the Italian Federation of Food and Tourism Companies, who point out that "sometimes, the Colosseum is an excuse for an American between a cacio e pepe and an amatriciana". There is also criticism that each city is waging this war on its own and that there is no law promoted at the national level. In any case, as we said at the beginning, it is evident that Italy has a problem with a mass tourism that is displacing the population that actually lives in those cities. Traditional businesses have closed or been converted, going from selling useful goods for residents to traditional dishes wrapped up in eye-catching ways for tourists. And finding the balance seems tremendously complicated.

Images | Anna Church, Maxime Steckle, Matej Buchla

In Xataka | "Fodechinchos free": in a bar in Galicia, tourismphobia is being redirected against Spaniards from other regions

Building data centers in space was the new hot business. Elon Musk just broke it with a tweet

The debate over the feasibility of building gigantic data centers in orbit had been heating up for months. It is Silicon Valley's new big idea to solve the insatiable energy appetite of artificial intelligence. Until, as usual, Elon Musk entered the conversation with the subtlety of a hammer.

Elon Musk has joined the chat. After weeks of debate about the feasibility of building servers in space, Eric Berger, editor of Ars Technica, argued that it will end up being a more plausible option once the technology exists to assemble satellites in orbit autonomously. It was the moment Elon Musk chose to enter the conversation. "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links," wrote the CEO of SpaceX. "SpaceX is going to do it," he said. A phrase that has probably landed like a bucket of cold water on the startups that are riding the momentum of AI to go out in search of financing.

Why on earth do we want servers in space? The idea of moving computing to Earth orbit responds to a very real crisis: AI is an energy monster, and demand for data centers keeps growing. Given this panorama, space offers two advantages that are impossible on Earth. Almost unlimited energy: in a sun-synchronous orbit, solar panels receive sunlight almost continuously (more than 95% of the time). Free cooling: land-based data centers consume millions of liters of fresh water to cool down, whereas with a large enough radiator, space can be "an infinite heatsink at -270°C." The heat would be radiated into the vacuum without wasting a single drop of water.

The new titans of space AI. Musk is not the first to see the business. In fact, he arrives at a party where the first contracts are already being handed out. Jeff Bezos predicted during Italian Tech Week that we will see "giant training clusters" of AI in orbit in the next 10 or 20 years. Eric Schmidt, the former CEO of Google, bought rocket company Relativity Space precisely for this purpose.
And NVIDIA, the undisputed king of AI hardware, has actively backed the startup Starcloud, which plans to launch the first NVIDIA H100 GPU into space this November, with the goal of eventually building a monster 5-gigawatt orbital data center.

Why Musk would win. The vision of Bezos, Schmidt and Starcloud faces two colossal obstacles: the cost of launch and the construction of the servers themselves. By some calculations, a 1 GW data center would require more than 150 launches with current technology. And Starcloud's plan for a 4-kilometer-wide array is a logistical nightmare. Elon Musk has Starship, the giant rocket on which all of his competitors' business models depend to be profitable. And he doesn't need to build a new orbital data center. He just needs to adapt and scale the one he already has.

10,000 satellites and counting. SpaceX's Starlink constellation no longer competes against satellite internet; it is going after terrestrial fiber. Musk's company has already launched 10,000 satellites and is preparing the deployment of the new V3 satellites, designed for Starship and equipped with high-speed laser links. According to SpaceX itself, each Starship launch will add 60 terabits per second of capacity to a network that is already, in practice, a global computing and data mesh. While Starcloud needs to hire a rocket and assemble solar and cooling panels 4 km wide, Musk simply needs Starship to finish development so he can keep launching satellites.

In Xataka | Starlink stopped competing with satellite Internet companies a long time ago: now it is going for something much bigger
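SpaceX's own figure — 60 Tbps of added network capacity per Starship launch — makes the scaling argument easy to sketch. The target capacity below is an arbitrary illustration, not a SpaceX goal:

```python
import math

# SpaceX's cited figure: each Starship launch adds 60 Tbps to the network.
TBPS_PER_LAUNCH = 60

def launches_for(target_tbps: float) -> int:
    """Launches needed to reach a target capacity, rounding up."""
    return math.ceil(target_tbps / TBPS_PER_LAUNCH)

# Hypothetical target: one petabit per second of orbital capacity.
print(f"{launches_for(1000)} launches for 1,000 Tbps")  # 17 launches
```

At SpaceX's current launch cadence, that order of magnitude is weeks of work rather than a new vehicle program, which is the core of the "just scale Starlink" argument.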

NVIDIA has risen to the top for its AI data centers. Its next big leap: cars

NVIDIA has unveiled its Drive AGX Hyperion 10 platform, a computing and sensor system designed for any manufacturer to produce Level 4 autonomous vehicles. Uber has already signed an agreement to deploy 100,000 units across its global network starting in 2027, and Stellantis, Lucid and Mercedes-Benz have also joined the project.

Why it is important. For years, autonomous driving has been a persistent promise often wrapped in marketing. NVIDIA has turned that promise into an industrial offering with standardized architecture, certified chips, and out-of-the-box simulations. It does not sell autonomous cars, but it does sell the operating system that will make them possible.

The contrast. Tesla has been selling autonomy as a leap of faith for a decade, with permanent updates, its own fleet and promises of "millions of autonomous Teslas" every year. NVIDIA, on the other hand, offers an open platform where any manufacturer can plug in their hardware. Tesla wants to be the Apple of cars. NVIDIA prefers to be something closer to Windows.

Between the lines. Automotive accounts for only 1.3% of NVIDIA's revenue, but that segment is growing faster than the rest. In any case, no real timetable for Uber's 100,000 units has been made public. Waymo, which has been developing its robotaxis for years and is already on its sixth generation, with the financial muscle of Alphabet behind it, barely operates 2,000 of them. There is a considerable gap between ambition and reality.

The backdrop. Drive Hyperion 10 is based on two Thor chips (2,000 teraflops each), fourteen cameras, nine radars, one LiDAR and twelve ultrasonic sensors. NVIDIA has designed it with full redundancy: if a component fails, the vehicle stops safely to avoid chain errors that multiply the potential damage. Lucid will be one of the first to offer Level 4 autonomous driving to individual customers and not just fleets.
Its interim CEO has admitted that so far they have disappointed in terms of driving assistance. Their bet on NVIDIA is the classic implicit admission: it is better to buy the brain than to build it.

The money trail. NVIDIA is not going to build robotaxis for now; it sells infrastructure: chips, simulation software, synthetic data... And it charges for each vehicle that uses its platform. It's a more predictable revenue model than depending on full autonomy arriving one day. Huang, in any case, has said that that moment is near. The interesting thing is not whether he is right, but that his definition no longer depends on blind faith. It depends on regulators, certifications and industrial tests. Autonomy has ceased to be science fiction and has become an engineering problem. And those problems are solved with processes, not with promises.

In Xataka | China has turned the electric car market into a crazy race. And Porsche pays for it with billion-dollar losses

Featured image | Xataka

Data centers do not want to depend on the conventional electrical grid. The solution: build their own plants

AI data centers have sparked a new fever: the so-called "bring your own power." The demand and consumption pressure these facilities impose is so enormous that they do not want to depend on external sources. The solution is theoretically simple, and we are already seeing how, when a new data center is built, it is normal for some type of power plant to be built next to it.

We are seeing it now. The data centers that OpenAI and Oracle are building in West Texas are accompanied by the creation of a natural gas power plant. Both xAI's Colossus 1 and Colossus 2 in Memphis rely on gas turbines. And as The Wall Street Journal also reports, more than a dozen Equinix data centers across the US are powered by stand-alone fuel cells. If the conventional electrical grid cannot be used, no problem: you build a power plant and that's it.

The US has an electrical problem. The technology giants would prefer to connect to the conventional grid, but bottlenecks in the supply chain, bureaucracy – permits, licenses – and the slowness in building the necessary transmission infrastructure prevent it. According to the firm ICV, the United States would need to add about 80 GW of new generation capacity per year to keep pace with AI, but right now less than 65 GW per year are being built. There is another direct consequence of this problem: rising electricity bills.

Data centers that look like cities. The needs and ambition of AI companies have turned data centers into monsters of computation and resource consumption. A single one can consume as much electricity as 10,000 Walmart stores, the WSJ estimates. Before 2020, data centers represented less than 2% of US energy consumption. By 2028 they are expected to represent up to 12%. A 1.5 GW data center, for example, would have consumption similar to that of the city of San Francisco, with about 800,000 inhabitants.

China has a big advantage over the US here.
While the US deals with that lack of power, China keeps investing in new energy generation. According to data from the National Energy Administration, the Asian country added 429 GW of new generation capacity in 2024, while the US only added 50 GW. It is true that China has four times the population, but its centralized planning is helping it avoid the problems that affect the US electrical grid.

The white knight to the rescue. Faced with this shortage, natural gas has become the preferred resource for on-site energy generation. Although large turbines have long delivery times, smaller turbines and natural gas fuel cells are being used because of their rapid availability and installation.

Renewables lose steam. Meanwhile, things are not promising for renewable energies (solar and wind, especially). There are about 214 GW of new generation theoretically in the pipeline, but spending on these technologies could decline due to the potential loss of tax credits: the Trump administration argues that those clean energies do not provide the constant flow of power necessary for AI.

The nuclear alternative. Faced with this apparent decline of renewables, there is growing interest in small modular reactors (SMRs), which offer the advantages of this type of plant with a flexibility that can be very attractive for AI data centers. Amazon, Google, Meta and Microsoft are betting part of their future on nuclear power, but that doesn't mean there aren't challenges to overcome.

Image | Wolfgang Weiser

In Xataka | World record in nuclear fusion: the German Wendelstein 7-X reactor has broken all records
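The capacity gap the article describes is stark when written out. A quick sketch using the figures cited above (80 GW/year needed, under 65 GW/year built, and a 1.5 GW data center for a city of ~800,000):

```python
# US generation gap, per the figures cited in the article (firm ICV).
needed_gw_per_year = 80
built_gw_per_year = 65  # upper bound; the article says "less than 65 GW"

shortfall = needed_gw_per_year - built_gw_per_year
print(f"annual shortfall: at least {shortfall} GW")

# A 1.5 GW data center vs. San Francisco (~800,000 inhabitants):
per_capita_kw = 1.5e9 / 800_000 / 1_000  # continuous watts per person, in kW
print(f"~{per_capita_kw:.2f} kW of continuous draw per inhabitant equivalent")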

Data centers are shooting up household electricity bills

Spain is betting heavily on the development and creation of new data centers. The AI boom has swept our country, and although that attracts investment and economic capital, it can also lead to serious problems for consumers. Especially one very clear one: that we pay more for electricity.

Spain, careful with data centers. In recent months we have seen data center construction projects multiply in our country. It is estimated that the Community of Madrid will host 1.7 GW of capacity by 2030, which is paradoxical, because it is the region with the greatest energy deficit in Spain and yet the one capturing a good part of these projects.

Aragon, in another league. Aragon has so many data center projects that it expressed its disappointment when it learned that the reinforcement of the electricity grid for these facilities across all of Spain will be 3.8 GW. Officials of the Aragonese government described the figure as "scarce", especially considering that this region alone has projects that would exhaust that capacity.

The US shows us a (worrying) future. A Bloomberg investigation reveals how, in the last five years, the creation of new data centers has been driving electricity bills up noticeably. Those centers, previously dedicated to expanding cloud infrastructure and now totally focused on the AI boom, are behind that increase. Energy consumption at these facilities is soaring, and it ends up affecting electricity prices in surrounding regions.

Prices that almost quadruple. In 2020, Baltimore residents paid an average of $17 per MWh. In 2025 that price is $38 per MWh. In Buffalo things are even worse, and prices have tripled in five years: they have gone from $11 to $33 per MWh. In the areas of the United States close to large concentrations of data centers, the wholesale price of electricity has risen by as much as 267% in the last five years.
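Those wholesale figures can be sanity-checked with a couple of lines (prices in $/MWh, straight from the article):

```python
# Percentage increases implied by the article's wholesale prices ($/MWh).
def pct_increase(before: float, after: float) -> float:
    return (after - before) / before * 100

baltimore = pct_increase(17, 38)  # 2020 -> 2025
buffalo = pct_increase(11, 33)    # 2020 -> 2025

print(f"Baltimore: +{baltimore:.0f}% (more than doubled)")
print(f"Buffalo: +{buffalo:.0f}% (tripled)")
print(f"a +267% rise means prices multiplied by {1 + 267 / 100:.2f}")
```

So the headline "267% rise" means a multiplication by roughly 3.7x, which is what the article calls "close to quadruple."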
LMP nodes are points of the electricity grid where the wholesale price of electricity is determined. Almost three out of four have seen price increases when they are close to data centers. Those in more distant areas have even seen their prices fall. Source: Bloomberg.

Unequal climb. The study reveals how wholesale electricity prices in the US have increased significantly in recent years, although these increases have been distributed unequally across geography: certain areas have seen modest increases, but others have watched electricity prices shoot up by as much as that aforementioned 267%, close to quadrupling.

The curse of data centers. 70% of the points where price increases were recorded are less than 80 kilometers from data centers with significant activity. It is a figure that makes the impact of these data centers on residents' electricity bills clear. And this is set to grow. Current estimates, BNEF indicates, forecast that energy demand from data centers in the US will double by 2035 and will be the largest increase in energy demand since the 1960s. By then, that demand will represent 9% of the total. At the global level, data centers are expected to consume more than 4% of all electricity in 2035. If those facilities were a country, they would be the fourth-largest in energy consumption, behind only China, the US and India.

Perfect storm. The demand is also linked to the rise of cryptocurrencies, the push for manufacturing in the US and the "electrification of the economy", which includes areas such as electric vehicles and domestic heating systems. The retirement of traditional generation facilities in areas such as Baltimore has only aggravated the problem: there is less energy supply and more demand, which pushes prices up yet again.

The world already knows what is coming and is reacting. What is happening in the US is already causing reactions in other countries.
The Netherlands: water and energy needs led the Amsterdam City Council to impose a moratorium on the construction of new data centers in 2019. Singapore: it also established a pause on the creation of this type of facility between 2019 and 2022, although the government made clear it would be more selective with future projects. Ireland: in 2024 the country reached a worrying milestone, with data centers already consuming more electricity than households. Their share went from 5% of the country's total consumption in 2015 to 18% in 2022 and 21% in 2023. Household consumption represented 18% that year.

The solution: make Big Tech pay the bill. Public utilities in the US such as Dominion Power are clear that "data centers should pay the full cost of their energy consumption." Big tech companies know very well that these facilities create extraordinary energy demand, and they are exploring solutions such as the use of SMR reactors for their AI data centers. The idea is interesting, but complex.

Supply and demand. Spain faces a future in which energy supply and demand could become unbalanced, as is already happening in the United States. If data centers begin to impose more and more load on the grid, it is reasonable to expect the cost of electricity to increase and cause the least desirable effect for users: higher electricity bills. Renewables could help mitigate the problem, but only if the grid is capable of absorbing both the new generation and the new mass demand from the data centers.

Image | Microsoft

Data centers for AI are an energy hole. Jeff Bezos's solution: build them in space

In the next two decades we will see gigawatt-scale data centers orbiting the Earth. Or at least that is the prediction made by the founder of Amazon and Blue Origin, Jeff Bezos. He said it during his appearance at Italian Tech Week in Turin, in conversation with John Elkann, chairman of Ferrari and Stellantis.

Bezos's proposal. Space data centers would take advantage of solar energy 24 hours a day, with no clouds, rain or night cycles to interrupt the supply. According to Bezos, these "giant training clusters" for artificial intelligence would be more efficient and, eventually, more economical than terrestrial facilities. "We can beat the cost of terrestrial data centers in space in the coming decades," he said.

Why he is talking about this now. The infrastructure demand of AI is becoming a huge drain on the planet. Current data centers consume massive amounts of electricity and water to cool their servers, a problem that gets worse with each new artificial intelligence model. Given this pressure, big tech companies are exploring alternatives: from locating them on ships or in Nordic countries to sinking them into the ocean. And of course, if we have capacity problems on Earth, some are already thinking of sending them to space.

The technical advantages. In space, temperatures range between 120°C in direct sunlight and -270°C in shadow, which would greatly simplify equipment cooling. Constant solar energy would eliminate dependence on terrestrial electricity grids. Bezos frames this development as the "natural evolution" of a process that has already begun with weather and communications satellites. "The next step will be the data centers and then other types of manufacturing," he explained.

The real challenges.
As they point out at Tom's Hardware, building a 1 GW space data center would require solar panels covering between 2.4 and 3.3 million square meters, with an estimated weight of 9,000 to 11,250 metric tons in photovoltaic material alone. Transporting all that equipment to space would cost between $13.7 billion and $25 billion with current technology, requiring more than 150 launches. To this must be added the difficulty of maintenance and upgrades and the inherent risk of space launches.

A parallel with AI. Bezos compared the current moment of artificial intelligence with the dotcom bubble of the early 2000s. "We should be extremely optimistic about the social and beneficial consequences of AI," he said, although he warned of the possibility of speculative bubbles. His message: do not confuse possible market excesses with the reality of technological advances, whose benefits he believes "will spread widely and reach everywhere."

When will it become reality? Bezos places the time horizon at "more than 10 years, but no more than 20". Today, the project is commercially unfeasible, but his vision starts from the premise that launch costs will keep falling and the technology will mature. It remains to be seen whether, two decades from now, part of our digital infrastructure will be in orbit, beyond what is already there.

In Xataka | NVIDIA has control of the most powerful AI chips: OpenAI, Broadcom and TSMC want to end that with their XPUs
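As an illustration of the launch arithmetic behind the Tom's Hardware estimate, here is a quick sketch. The payload-per-launch figure is our assumption for illustration, not a number from the article:

```python
import math

payload_per_launch_t = 100  # assumed heavy-lift payload in tonnes (illustrative)

def launches_needed(mass_t: float) -> int:
    """Minimum launches to lift a given mass, rounding up."""
    return math.ceil(mass_t / payload_per_launch_t)

# 9,000-11,250 t is the article's range for photovoltaic material alone.
for mass in (9_000, 11_250):
    print(f"{mass} t -> at least {launches_needed(mass)} launches")
```

Under this assumption the panels alone would take on the order of 90-113 launches, which makes the article's figure of "more than 150 launches" for the complete station plausible once servers, radiators and structure are added.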

There is a perfect storm brewing with AI and data centers. And it will turn DRAM and NAND memory into a luxury

A remarkable rebound in the prices of NAND memory (used in SSDs) and the DRAM of our PCs and laptops is coming. For two years we users have benefited from a period of plenty in these components, but that is ending. And the AI fever, and the data center fever it has spawned, are to blame.

When memory was cheap. In March 2023 the prices of NAND and DRAM memory were in free fall. The pandemic had caused extraordinary demand, but once lockdowns ended, the situation reversed. Manufacturers had produced too much – expecting demand to hold – and found themselves with bloated inventories. People no longer wanted so many memory modules or so many SSDs, and prices collapsed.

AI changes everything (and this, too). The effects of that imbalance stretched over two years, but the arrival of ChatGPT set off an AI fever that has ended up causing another fever: that of the data centers. These facilities use thousands of GPUs, and those GPUs make use of huge amounts of memory. Above all, HBM memory, which since its creation has been oriented towards business applications: much more expensive, but also much more powerful.

Price evolution of the 1 TB Samsung 980 Pro SSD. In mid-2023 the units reached their lowest price. From there, the price began to rise. Source: Camelcamelcamel.

Price increases will get worse. SSDs such as the 1 TB Samsung 980 Pro are a good example of what is happening. On Camelcamelcamel we can see a price evolution that bottoms out in mid-2023 and then rises. These units have been replaced by the 1 TB 990 Pro, whose curve shows a less pronounced increase, true (in fact, it hovers around 100 euros, an attractive price), but everything indicates that it will soon follow the trend of its predecessor. The forecasts of the consultancy TrendForce are clear: DRAM and NAND memory prices are going to climb a lot, and very fast.

And DRAM will go up too.
The prediction is the same for the DRAM market which, for example, feeds the DDR4 and DDR5 memory modules of our PCs and laptops. According to TrendForce, in the third quarter of 2025 – which has just begun – we will see a rise of more than 40% in DDR4 memory modules. In the case of DDR5 memory we will get a breather, with the rise reaching 8%.

More expensive PCs and gaming. These increases especially affect end users who buy PCs and laptops to work, but also to play. Memory modules for graphics cards will also notice it markedly this quarter, according to TrendForce: GDDR6 memory will go up by as much as 33% and GDDR7 by up to 10%, per its estimates.

HBM memory at full throttle. The data centers that all the big tech companies are now rushing to build need huge amounts of memory, and that is conditioning the balance between supply and demand both in the business market and in the consumer market. In fact, memory manufacturers are increasingly focusing production on HBM memory – used in AI accelerators – and moving away from traditional DRAM and NAND. Micron points out that its HBM module production for all of 2026 is already sold out, and SK Hynix seems to be in a similar situation: demand for these modules is extraordinary.

The Raspberry Pi as an example. We are already seeing the consequences of these moves. Raspberry Pi, which had stockpiled memory modules during the good times, was forced a few days ago to raise the prices of its new models because of the memory shortage. Thus, the Raspberry Pi Compute Module 4 and 5 in their 4 GB variants rose five dollars, and the 8 GB versions rose ten dollars. The company's own CEO, Eben Upton, explained that "memory costs about 120% more than it cost a year ago."

Why not just make more memory? The solution seems obvious: if more memory is needed, more factories should be built. However, manufacturers are reluctant, for several reasons.
The first is the enormous cost of these plants, which runs to tens of billions of dollars. The second, that these factories take years to start producing. And the third, that if there is an "AI bubble" and it bursts, they would once again find themselves with excess inventory and factories they no longer need. Bad news. Image | Samsung In Xataka | Samsung has its greatest competitor at home: its future in chips depends on its rivalry with SK Hynix

Drastically reducing the energy consumption of data centers is crucial for AI. And China has had an idea: submerge them in the sea

China is about to submerge a data center in the sea near Shanghai, as a solution to a problem we will gradually see more of: the enormous energy consumption of AI. The installation, which will come into operation in mid-October, is one of the first commercial projects of its kind in the world and points to a new way of cooling servers without depending on traditional cooling systems that devour electricity. The underlying problem. Data centers are the backbone of the Internet and of AI, but they generate huge amounts of heat. Keeping them cool with air conditioning or evaporative water cooling consumes a brutal amount of energy, and with the rise of artificial intelligence, demand for these facilities has soared. China wants to reduce the carbon footprint of this critical infrastructure, and its bet is to sink it underwater. How it works. The yellow capsule built near Shanghai houses servers that stay cool thanks to ocean currents, with no need for active cooling systems. According to Yang Ye, vice president of Highlander, the marine engineering company developing the project with state-owned firms, "underwater operations have inherent advantages" and can save approximately 90% of the energy used for cooling. The installation will draw almost all of its electricity from nearby offshore wind farms, with more than 95% renewable energy. The technical challenges. Putting servers under the sea is not easy. They must be protected from saltwater corrosion, for which the steel capsule uses a special glass-flake coating. An elevator has also been installed connecting the main structure to a section that stays above the water, giving maintenance crews access. Another challenge is building the Internet connection between the underwater center and dry land, a more complex process than with conventional facilities.
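The claimed 90% cooling-energy saving can be put in rough numbers with a simple model. This is a sketch under stated assumptions: the IT load and the land-based cooling overhead (a PUE-like ratio) are hypothetical; only the 90% saving figure comes from the article:

```python
# Rough annual-energy model for the underwater capsule's claimed ~90%
# cooling-energy saving. IT load and cooling overhead are ASSUMPTIONS
# for illustration, not figures from the project.
def annual_energy_mwh(it_load_kw: float, cooling_overhead: float) -> float:
    """Total annual energy (MWh): IT load plus cooling as a fraction of it."""
    hours_per_year = 8760
    total_kw = it_load_kw * (1 + cooling_overhead)
    return total_kw * hours_per_year / 1000

# Hypothetical 500 kW facility; assume land-based cooling adds 40% overhead.
land = annual_energy_mwh(500, 0.40)
# Underwater: 90% of the cooling energy saved -> only 4% overhead remains.
sea = annual_energy_mwh(500, 0.40 * 0.10)
print(f"land: {land:.0f} MWh/yr, underwater: {sea:.0f} MWh/yr, "
      f"saved: {land - sea:.0f} MWh/yr")
```

Under these assumptions the capsule would shave roughly a quarter off the facility's total draw; the exact fraction depends entirely on how much cooling overhead the land-based baseline actually carries.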
Researchers at universities in Florida and Japan have also warned that these centers could be vulnerable to attacks using sound waves propagated through the water. Environmental doubts. Although the project promises to reduce emissions, questions remain about its ecological impact. The heat emitted by the servers could alter the surrounding marine ecosystem, attracting some species and driving away others. Andrew Want, a marine ecologist at the University of Hull, points out that "these are unknowns at this point; not enough research is being done yet." Highlander says an independent 2020 assessment of its pilot project in Zhuhai found that the water stayed well below acceptable temperature thresholds, but Shaolei Ren, an expert at the University of California, Riverside, warns that as these centers scale up, so will the heat they emit. Few precedents. Microsoft tested this technology off the coast of Scotland in 2018, recovering the capsule in 2020 after declaring the project a success. However, it never commercialized it. The Chinese project is advancing with government subsidies: Highlander received 40 million yuan for a similar project in Hainan province in 2022, which is still operational. The Shanghai installation will serve clients such as China Telecom and a state-owned AI computing company. What comes next. Experts agree that these underwater centers will probably not replace traditional ones, but will complement existing infrastructure in specific niches. According to Ren, current projects seek to demonstrate "technological viability," but much remains to be resolved before a massive deployment. What is clear is that if these projects overcome their technological challenges and manage to greatly reduce the energy consumed by data centers, it will be a big point in favor of whichever company manages to deliver that solution in the AI race.
Cover image | AFP In Xataka | China was the planet's great polluter: now it is emerging as the first "electrostate" in history

Madrid has 23.4 billion at stake with data centers. The risk of losing them lies in the electrical infrastructure

Madrid has managed to position itself as the great digital hub of southern Europe for the data center industry, but twentieth-century electrical infrastructure cannot support twenty-first-century growth. Why it matters. The Community of Madrid leads Spain in data centers, with 23.4 billion euros in investment planned through 2028. But 82% grid saturation puts that leadership at risk against other European regions. In figures: Madrid concentrates 54.8% of the national data center capacity, with 216 MW in operation. Forecasts point to 522 MW once current construction is completed, and up to 1.7 GW by 2030. The sector grew 33% last year and will generate 35,000 jobs in six years. The threat. Ayuso is preparing objections against what she considers "over-regulation" by the Ministry for Ecological Transition, but the real problem is the grid. Electricity distributors denied six out of ten access requests last year. Without immediate improvements, Spain would already have lost 60 billion in investment, according to calculations by the industry association Spain DC, reported by Digital Economy. Between the lines. The Madrid paradox is plain: the region produces just 1,334 GWh... but consumes 27,487 GWh a year. It is an energy black hole that works because Spain exports electricity and tech companies sign long-term contracts. But that does not solve the saturation of the distribution network. What is happening. The Government has opened a Royal Decree to public consultation until September 15 that will force data centers to report their environmental footprint, energy consumption and water use. Madrid believes it could hurt competitiveness, but that is a minor problem compared with the lack of electrical capacity. Going deeper. Spain DC is calling for an urgent modernization plan, and the utilities are asking the CNMC to raise the remuneration rate from 6.46% to 7.5% in order to invest in the grid.
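The scale of the "Madrid paradox" can be checked with a couple of lines. All the inputs below (GWh produced and consumed, MW capacities) are the figures quoted above; the derived deficit and growth ratio are simple arithmetic:

```python
# Madrid's energy gap and data-center growth, using the article's figures.
production_gwh = 1_334    # annual electricity produced in the region
consumption_gwh = 27_487  # annual electricity consumed in the region

deficit_gwh = consumption_gwh - production_gwh
self_sufficiency = production_gwh / consumption_gwh

capacity_now_mw = 216     # data-center capacity in operation
capacity_built_mw = 522   # once current construction finishes
capacity_2030_mw = 1_700  # forecast for 2030 (1.7 GW)

growth_factor = capacity_2030_mw / capacity_now_mw

print(f"Deficit: {deficit_gwh} GWh/yr "
      f"(the region covers only {self_sufficiency:.1%} of its own demand)")
print(f"Capacity: {capacity_now_mw} -> {capacity_built_mw} -> "
      f"{capacity_2030_mw} MW, roughly a {growth_factor:.1f}x jump by 2030")
```

In other words, the region generates under 5% of what it consumes, and data-center capacity is forecast to multiply nearly eightfold on a grid that is already 82% saturated.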
The cost will be passed on to consumers in their electricity bills, but without that investment Madrid will lose the data center race to Frankfurt, Amsterdam or Paris. In Xataka | "Empty Spain" has filled up with wind turbines and solar panels, but wastes energy for a simple reason: there are no cables Cover image | Community of Madrid
