a data center that will run on wind energy

In the silent race that the world is waging to dominate digital infrastructure, every movement matters. And Brazil, far from being a spectatoronce again occupies a strategic place. The arrival of the TikTok project in the Brazilian northeast confirms a shift in the world technology map: critical infrastructures are no longer concentrated only in the United States, Europe or Asia, but are beginning to expand towards regions that offer abundant renewable energy and direct international connection. The advertisement. TikTok have decided to install a mega data center in the Pecém Industrial and Port Complex, in the state of Ceará. The company detailed in its press release that it will allocate more than 200,000 million reais —about 32,000 million euros—, the largest investment it has made in Latin America. Of that amount, 108 billion will be allocated exclusively to high-tech equipment until 2035; the rest will finance infrastructure, energy systems and future expansions. Operations are planned for 2027, and local authorities estimate the creation of more than 4,000 jobs. The infrastructure that the AI ​​era demands. Data centers have become the engine that makes AI, cloud and streaming possible. As Wired remembersthe push of artificial intelligence has skyrocketed the demand for computing and has opened a global competition to build larger and more efficient infrastructures. Brazilian interest in attracting data centers is supported by both its renewable energy matrix – cheap and abundant – and connectivity what Fortaleza offersentry point for most the submarine cables that link the country with the United States, Europe and Africa. A data center powered only by wind. For the initial phase, TikTok will work with Omnia, a local data center operator, and with Casa dos Ventos, one of the largest renewable energy developers in the country. The project is presented as an example of digital infrastructure powered entirely by clean energy. TikTok and its partners will build exclusive wind farms to supply the center, which will allow them not to use energy from the public grid. Depending on the platformthis will avoid any pressure on local supply. Technically, the company states that it will use a closed water reuse circuit combined with air cooling to reduce water consumption. However, as the Government of Ceará has pointed outrefrigeration will be 100% air-based, and the use of water will be limited to human activities and maintenance. Furthermore, the installation will incorporate PG25 technologywhich allows servers to operate at higher temperatures with less need for cooling, substantially reducing energy expenditure. The voices that question the project. Not everything is celebrations. The main resistance comes from the Anacé indigenous people, who denounce, as reported by El Paísthat part of the complex would occupy territories that they consider ancestral. Their organizations affirm that no prior consultation was carried out and express concern about the possible socio-environmental impacts: both on the use of water and on the transformation of the territory. TikTok maintains that it complies with Brazilian regulations and emphasizes that its energy and cooling model will minimize any pressure on natural resources. The Government of Ceará add thatThe companies involved must invest 15 million reais per year in the communities around the Pecém complex. On the global board of digital infrastructure. The megaproject is part of a broader strategy. Lula’s Government approved measures to reduce taxes and attract data centers, with the intention of transforming Brazil into a regional digital hub. In parallel, the United States promotes initiatives such as the stargate project to maintain competitiveness in artificial intelligence, while China accelerates the expansion of its technology companies abroad. TikTok, of Chinese origin, thus fits into a delicate diplomatic balance that Brazil tries to maintain. Beyond the economic investment, a data center of this scale raises debates about privacy, digital sovereignty and local data storage, dimensions increasingly present on the Brazilian legislative agenda. The speed of digitization. The TikTok megaproject in Ceará symbolizes the tension of a world that is digitizing at unprecedented speeds: it promises clean energy, employment and modernization, but it also reopens discussions about territory, regulation and environmental memory. Between the technological ambition of a digital power and the concerns of a community that defends its land, Brazil once again places itself at the intermediate point of global forces and local demands. The contrast is inevitable: while institutions celebrate the promise of a future powered by wind and data, indigenous communities in the northeast remember that the technology that connects the world also leaves footprints on the ground they walk on. At this intersection between progress and complaints the true impact of TikTok’s new digital heart in Latin America will be defined. Image | PXHere and Greenwish Xataka | Researchers removed Instagram and TikTok from 300 young people to see if their anxiety decreased. The results speak for themselves

In Finland they already know how to deal with excess heat from data centers: convert it into district heating

Helsinki has found an unexpected ally to decarbonize its heating in the midst of the rise of artificial intelligence: waste heat from data centers. The same heat that servers generate when processing millions of queries, training AI models, or moving Internet traffic is no longer wasted. In the Finnish capital, this thermal flow – which is growing at the same rate as the digital world – is beginning to become shelter for tens of thousands of homes. A digital sector that is now heating up cities. For years, data centers were known for one uncomfortable characteristic: they generated a lot of heat and needed huge cooling systems to dissipate it. Now that residual heat is already being channeled to the Helsinki heating network, thanks to agreements signed with operators such as Equinix, Telia and Elisa. Data Center Dynamics remember that the company It has been testing this model for more than a decade – the first pilot tests date back to 2010 – but now the scale is completely different: the thermal demand of the city is enormous and the volume of heat generated by the digital economy is growing non-stop. The result can already be seen, a single data center can heat up to 20,000 homes, according to official figures from Helen. The Telia plant, for example, already recovers up to 90% of the heat generated by its servers, enough to heat 14,000 apartments, and in a few years it could double that figure to 28,000. A change in the way heat is produced. Digital heat recovery is more than just a technological curiosity. It represents a change in the way district heating is conceived. In the words of the Finnish company“the electricity consumed by data centers always ends up being converted into heat.” The difference is that now that heat is no longer released outside: it is reused. The engineering behind urban heat. Finland can convert digital heat into district heating because it has a network of district heating especially advanced: a network of pipes that distributes hot water to homes, schools and public buildings. The process is as follows. A data center generates heat: the servers run 24/7 and are continuously cooled. That heat, instead of being dissipated outside, is captured. It is then recovered and transferred; To do this, data centers can install their own recovery systems or use those offered by the energy company. The heat is sent to an “energy platform”, where heat pumps raise it to useful temperatures. Then, the temperature is adjusted to the 85–90 ºC necessary so that the water can circulate through the urban network. This is where high-temperature heat pumps come into play—some of which, like Patola’sthey work even with outside air at –20 ºC. Finally, the heat is injected into the grid and distributed throughout the city to heat thousands of buildings. Closing the energy circle. To understand why Finland leads this model, we must look at an essential technological element: heat pumps. Not only domestic ones, but also large-scale industrial ones, capable of raising waste heat to temperatures useful for an urban network. Europe—and especially the Nordic countries— has become a world leader of this technology. Finland has 524 heat pumps per 1,000 homes, a figure second only to Norway, and its cities have been electrifying heating for decades. This combination—cold climate, tradition of district heatingheat pump industry and the need to decarbonize quickly—turns Finland into an urban-scale energy laboratory. A model with limits. Although the system works, it is not a panacea. As Middle Parenthesis remembersnot all data centers are close to cores with thermal demand, not all generate enough heat to justify the investment, heat recovery improves efficiency but does not reduce the electrical consumption of data centers, and in hot climates or widely dispersed cities, replicating it is much more difficult. Still, the trend is clear. With the expansion of AI and the growth of cloudthe amount of heat available will only increase. The Nordic countries – Sweden, Norway, Denmark – already take advantage of it, and large operators such as Microsoft and Google They explore similar systems across Europe. From silicon to the stove. The Finnish model shows that, even at the heart of digital infrastructure – those data centers that power our online lives – there can be hidden a useful and concrete source of energy for everyday life. The heat produced by our searches, our videos or our conversations with AI can be transformed, with the right infrastructure, into heating a home in Helsinki. In a world desperately seeking clean heat, Finland has already found a tangible, scalable and surprisingly logical answer: turning the thermal problem of the digital age into a solution for the Nordic climate. A silent reminder that, sometimes, the energy transition advances with a simpler approach: taking advantage of the heat that servers already produce tirelessly. Image | freepik and freepik Xataka | Someone cut five undersea cables in the Baltic. Finland already points to a ship from the “shadow fleet” as responsible

We already have the world’s first fast neutron nuclear reactor. We are going to use it for AI data centers

The growth of artificial intelligence is driving global electricity demand to historical figures. The expansion of data centers, the advance of electrification and the industrial rebound are straining aging networks that are already suffering from saturation in multiple countries. In this scenario, the digital sector—a large consumer of electricity for the development of AI—faces a paradox: it needs much more energy, but it must do so without increasing its emissions. And there arises a proposal that until recently would have seemed like science fiction: data centers powered by a compact fast neutron nuclear reactor. The Stellaria–Equinix deal that no one saw coming. The French startup Stellaria, born from commissariat to the atomistic energy (CEA) and Schneider Electric, announced a pre-purchase agreement with Equinix, one of the largest global data center operators. According to the press releasethe agreement secures Equinix the first 500 MW of capacity of the Stellarium, the molten salt and fast neutron reactor that the company plans to deploy starting in 2035. This reserve is part of Equinix’s initiatives to diversify towards “alternative energies” applied to AI-ready data centers. Autonomy, zero carbon and waste management. It is a brief summary of the first reactor breed and burn intended to supply data centers. As explained by Stellariaoffers: Completely carbon-free and controllable energy, enough to make a data center autonomous. Underground design without exclusion zone, thanks to its operation at atmospheric pressure and its liquid core. Ultra-fast response to load variations, essential for generative AI. Virtually infinite regeneration of fuel, part of which can come from current waste from nuclear power plants. Multi-fuel capability, from uranium 235 and 238 to plutonium 239, MOX, minor actinides and thorium. For Equinix, this means solving one of its great challenges: operating with guaranteed clean energy 24/7 without depending on the grid. For Europe, it marks the entry into a new generation of ultra-compact reactors: the Stellarium occupies just four cubic meters. The technology behind the reactor. The Stellarium is a fourth-generation liquid chloride salt reactor, cooled by natural convection and equipped with four physical containment barriers. It operates on a closed fuel cycle, capable of maintaining fission for more than 20 years without recharging. Stellaria’s roadmap establishes that in 2029 there will be the first fission reaction and six years later a commercial deployment and delivery of the reactor to Equinix. According to the company, The energy density of this type of reactor is “70 million times higher than that of lithium-ion batteries”, which would allow a single Stellarium to supply a city of 400,000 inhabitants. As fusion progresses, fast fission arrives first. To understand why a fast neutron reactor comes to the world of AI before fusion, just compare the technological moment of each. The merger is making spectacular progress—such as the record of the French WEST reactorwhich maintained a stable plasma for 22 minutes, or the Wendelstein 7-Xwhich sustained a high-performance plasma for 43 seconds—but remains experimental. ITER will not be operational this decade and commercial prototypes will not arrive until well into the 2030s. Advanced fission, on the other hand, is much closer to the market. Reactors like Stellaria’s, with molten salt and fast neutrons, do not require the extreme conditions of fusion and can be deployed sooner. The company plans its first reaction in 2029 and a commercial deployment in 2035. The data centers of the future will no longer depend on the network. Equinix already operates more than 270 data centers in 77 metropolitan areas. In Europe they are powered by 100% renewables, but their future demand for AI will require a constant, carbon-free source that does not congest the electrical grid. According to Stellariathis agreement “lays the foundation for data centers with lifetime energy autonomy.” And, if the company meets its schedule, Europe will become the first region in the world where artificial intelligence is powered by compact reactors that recycle their own nuclear waste. The technological race between advanced fission and fusion is far from over, but, today, the first fast neutron reactor intended for AI does not come from ITER or an industrial giant: it comes from a French startup. Europe has just opened a door that could transform, at the same time, the future of energy and computing. Image | freepik and Stellaria Xataka | Google hit the red button when ChatGPT came upon it. Now it is OpenAI who has pressed it, according to WSJ

Sam Altman is trying to buy his own rocket company to compete with SpaceX. The key: data centers

The rivalry between Sam Altman and Elon Musk has just reached its highest point: space. And all so that OpenAI can deploy its own data centers in space. The news. As revealed by the Wall Street Journalthe CEO of OpenAI has been exploring the purchase of Stoke Space, a Seattle startup that develops reusable rockets, with the goal of building data centers in space. Although talks with Stoke Space cooled in the fall, the move confirms a trend we’ve been observing for months: Silicon Valley is outgrowing the Earth to fuel AI. Sam’s plan. According to the Journal’s sources, Sam Altman was not looking for a launch provider, but rather an investment that would ensure OpenAI majority control of Stoke Space. Stoke Space, founded in 2020 by former Blue Origin engineers, is developing a fully reusable rocket called ‘Nova’ to compete with SpaceX’s Falcon 9. So that. Altman maintains a tense rivalry with Elon Musk, so the logic of this move would be to reduce OpenAI’s dependence on Musk’s rockets in the event that it decided to deploy servers in space. But above that there is a purely energetic motivation. The computing demand for AI is so insatiable that the environmental consequences of keeping it on Earth will be unsustainable. In certain orbits, however, solar energy is available 24/7 and the vacuum of space offers an infinite heat sink to cool equipment without wasting water. The fever of space data centers. Altman is not alone in this race. What until recently seemed like an eccentricity has become a serious project for big technology companies: And what does Musk say? The irony of Altman pursuing his own rocket company is that the industry’s undisputed leader, Elon Musk’s SpaceX, already has the infrastructure in place. While his competitors design prototypes and seek financing, Musk has cut off the debate with his usual forcefulness: in the face of the discussion about the need to build new orbital data centers, He assured that there is no need to reinvent the wheel: “It will be enough to scale the Starlink V3 satellites… SpaceX is going to do it.” Images | Brazilian Ministry of Communications | Village Global In Xataka | Building data centers in space was the new hot business. Elon Musk just broke it with a tweet

Data centers consume a lot of water, but it is probably less than we thought. It’s a book’s fault

We can criticize the AI ​​boom for many reasons, but there is one that deeply affected society: the environmental impact, more specifically water consumption of each interaction with the AI, necessary to be able to cool the servers. The problem is realbut everything indicates that it has been magnified and the origin would be a miscalculation in a popular book. the book. It is ‘Empire of AI’ written by Karen Hao and which we already talked about in Xataka. After interviewing hundreds of former employees and people close to the company, the author constructs a detailed and highly critical account of OpenAI, more specifically its CEO Sam Altman. Among the criticisms of this ‘AI empire’, Hao mentions the excessive water consumption of AI, going so far as to state that a data center would consume 1,000 times more water than a city of 88,000 inhabitants. The criticism. Andy Masley tells it in his newsletter The Weird Turn Pro. According to their calculations, in reality 22% of what the city consumes or 3% of the entire municipal system. Furthermore, Masley states that the book confuses water extraction (temporary withdrawal that is returned to the network) with real consumption. The calculation error. The author herself has responded to the article de Masley citing the email he sent to the Municipal Drinking Water and Sewage Service of Chile (SMAPA), from whom he requested information on the total water consumption of Cerrillos and Maipu, the towns he used to make the consumption comparison. The problem is that Hao requested the amount in liters, but they responded without specifying the units and everything indicates that they were actually cubic meters, hence the large discrepancy. The author has consulted again with the SMAPA to clarify this information. It seems that, indeed, there is an error. Estimates. How much water AI consumes has been a recurring question in recent years. In September 2024, a study published by Washington Post He calculated that, to generate a 100-word text with ChatGPT, 519 milliliters of water were needed. The calculation was made taking into account the total annual consumption of data centers and the type of cooling used. It’s truly outrageous. What companies say. AI companies are not very transparent regarding the water and energy consumption of their data centers. The big technology companies give the total annual consumption data in their sustainability reports. We know that a large part of the consumption goes to data centers, but it is not possible to know the real consumption of each search. Google has been the only one that has published specific energy and water consumption data from its AI. According to the company, the water consumption for each Gemini consultation was 0.26 milliliters, or in other words, about five drops of water. We cannot extrapolate this data to all data centers or all companies, but it does seem that previous estimates are quite exaggerated. Water controversy. All of this doesn’t mean there isn’t a problem with water and AI. In fact, the Cerrillos data center where the alleged calculation error is It was never built because the Chilean justice system paralyzed it. due to the climatic impact it was going to have, especially in the context of drought in which the region found itself. Data centers need a lot of water, so much so that initiatives are emerging to cool them submerging them in the ocean. The other problem. Water is just one of the problems data centers face, energy demand poses an even greater challenge. In 2024, Data centers already accounted for 4% of total electricity consumption in the United States and in the surroundings of some of these beasts the electricity bill has risen 267% in recent years. Big tech is already warning: there is no power for so many chips and they are being raised since create nuclear power plants until take their data centers to space. Image | Google In Xataka | What is happening in the US is a warning for Spain: data centers driving up electricity bills in homes

Aragón has just activated its second major data center project. The bet goes through a challenge that is difficult to ignore

Aragón is going through a unique moment: in just a few years it has gone from competing to attract data centers to announce three mega facilities new ones promoted by Forestalia that aim to strengthen their position on the European cloud map. The announcement by the regional government comes in the midst of a race to attract technological investment, but also in a territory where the electrical network works to the limit and every great project depends on decisions that have not yet been made. The result is a scenario as ambitious as it is full of unknowns, which will determine the real impact of this expansion. How these digital complexes work. A data center is, in essence, a technological heart that stores and processes information for millions of users and companies. Every series that is streamed or every operation carried out in the cloud passes through servers that require stable power and constant cooling. That is why the choice of location is so relevant: electrical capacity and operational security are needed. Aragón has been gaining ground on that map and today is seen as a strategic option for new facilities. The project. The Government of Aragon has detailed that the Búfalo Project includes three data centers in Magallón, Botorrita and Alfamén, backed by an investment of 12,048 million euros. The deployment includes DCM Data, DCM Dédalo and DCM Blue, whose works would begin between 2028 and 2029 and will extend for approximately eight years. According to official estimates, the construction will generate about 30,000 temporary jobs. In the operational phase, each facility will add hundreds of workers, with a total that clearly exceeds a thousand stable positions. Aragón on the international board. The accumulated investments in data centers exceed 70,000 million euros and place the community in the same conversation as consolidated European hubs. According to the President of the Government of Aragon, Jorge Azcón, the computing capacity that is being configured rivals that of Dublin and Paris and aspires to approach that of Frankfurt. The regional Executive also states that the data that will be managed will have a European scope, from Germany or France to Italy and the United Kingdom, reinforcing the international dimension of the project. Distributed renewable self-consumption. The Government of Aragon presents self-consumption as a distinctive element of the Búfalo Project, since approximately half of the energy consumption will be associated with wind and photovoltaic parks powered by Forestalia. This volume of generation allows for a renewable supply, although it does not eliminate dependence on the general network, which will provide the rest of the energy. The underlying idea is to combine own generation with existing infrastructure to sustain large-scale facilities. Press to see the message in X The word “self-consumption” may lead one to think that data centers and renewable plants share the same physical space, but this is not the case. Forestalia is setting up parks in various regions of Zaragoza and Teruel, located where the natural resource is most favorable. The data centers, as we say, will be in Magallón, Botorrita and Alfamén, and the connection between both worlds is made entirely through the Red Eléctrica network. It is a distributed scheme that coordinates generation and consumption without a single energy campus. A network to the limit. Aragon produces more electricity than it consumes and exports about 54% of its generation, but that abundance contrasts with a distribution network that functions practically at maximum. A report published in September 2025 sets its occupancy level at 94.3%, well above the national average of 84.3%. This saturation leaves little room to incorporate large consumers such as data centers. The result is a paradox: available energy, but an infrastructure incapable of delivering it to all projects. Projects that have already reached their peak. The bottleneck is not a future hypothesis, but a reality that already affects several operators. According to Heraldothe data centers in the pipeline have requested more than 6,000 MW and only a part has firm access, with cases such as Vantage, which has 90 MW authorized despite aiming for 300. Microsoft also depends on tenders in saturated nodes. The Government itself recognizes that everything will be linked to Red Eléctrica’s planning and the decisions of the central Executive. Water, a debate that is still open? The cooling of data centers has generated concern in Aragon since Amazon asked for late 2024 48% more water for the complexes that already operate in the region. Ecologistas en Acción and the Tu Nube Seca Mi Río platform then warned of the water impact of these facilities in the midst of a structural drought. Azcón maintains that future Forestalia centers will use a closed circuit with “practically imperceptible” consumption and affirms that the debate “is over.” In any case, everything indicates that this matter remains under public scrutiny. To facilitate the path of the Buffalo Project, The Government of Aragon has declared the initiative as of Autonomous General Interest, a figure that allows procedures to be simplified and the different administrations involved better coordinated. This declaration speeds up procedures, but does not resolve the main point of friction: the available electrical capacity. Hence, the regional Executive insists on its willingness to work with the central Government and Red Eléctrica, the only actors that can modify the network planning. Real progress will depend on those decisions. The announcement of the three new data centers, together with the rest of the initiatives in the pipeline, places Aragón at a decisive moment to consolidate its presence on the European cloud map. The investment is notable and so is the promised employment, but much of the result will depend on decisions that are not entirely in the hands of the community. The region has shown intention and movement, although it remains to be seen what the real scope of this bet will be. Images | İsmail Enes Ayhan | Jorge Azcón (X) In Xataka | The European Commission’s pendulum with AI is real: it will sacrifice privacy to … Read more

We already know which will be the most expensive data center in the world. If Bill Gates paid it, it would be almost zero

Already in 2024 we saw that infrastructure spending for AI was being insane. The trend has not relaxed, quite the opposite. Big tech continues to burn money as if there was no tomorrow (literally) and most of that spending is going to most valuable asset in the AI ​​race: data centers. How much do they really cost? Data centers in numbers Epoch AI has published Frontier Data Centersa complete database about data centers being built in the United States. Through satellite images, public documents and permits, they have obtained information about the estimated construction cost, as well as energy consumption and computing power. The award for the most expensive data center goes to Microsoft Fairwater, whose total cost It could reach $106 billion when completed in 2028. To put it in context, Bill Gates’ fortune is estimated to be 107 billion dollars. It would be fair to pay it. The forecast for Microsoft Fairwater even surpasses Meta Hyperion, the data center that It will be as big as the island of Manhattan which would cost 72,000 million. Next on the list is Colossus 2, by xAIwhose estimated cost is 44 billion dollars. It is closely followed by Meta Prometheus with 43 billion and the Amazon and Anthropic data center in New Carlisle with 39 billion. Epoch AI has collected more data, such as how much computing power each facility will have. This data is measured using the NVIDIA H100 GPUs for reference. They have also calculated the energy demand and who will be the main user of each of them. Below we leave you a table with the key information: Estimate DATE ESTIMATED cost ($) computing (EN gpUS H100) energy demand intended primary user microsoft fairwater September 2027 106 billion 5.2 million 3328 MW OpenAI meta hyperion January 2028 72 billion 4.2 million 2262 MW Goal xai Colossus 2 February 2026 44 billion 1.4 million 1379 MW xAI meta prometheus October 2026 43 billion 1.2 million 1360MW Goal amazon new carlisle June 2026 39 billion 770,000 1229 MW Anthropic oracle stargate July 2026 32 billion 1 million 1180MW OpenAI microsoft fayetteville March 2026 29 billion 920,000 1065MW OpenAI/Microsoft amazon ridgeland September 2027 32 billion 630,000 1008MW Anthropic Dizzying climb Looking at the case of Microsoft Fairwater, and always according to Epoch AI’s forecast, in March 2026 the investment will be $18 billion. A year later, in February 2027, it rises to 35,000 million, just four months later it shoots up to 71,000 million, to reach 106 billion in 2028. The price increase is dizzying and responds to several factors. The first is that the computational cost of training models has been increasing. For example, GPT 4 cost OpenAI over 100 million and rumors before the release of GPT-5 pointed to training rounds of 500 million each. Epoch AI also did an analysis on this and they estimated that the cost of training has multiplied by 2.6 year after year. On the other hand, there is the demand for GPUs, necessary for training the models and the most expensive component of all. An NVIDIA H100 GPU costs 25,000 dollars and its successor, the NVIDIA B200 also known as Blackwell, could be between 30,000 and 40,000 dollars. And this is just the GPUs, many are needed more components to get a data center up and running, such as power generators, high-speed networks or refrigeration, among others. The initial bottleneck was the shortage of GPUs, but it has been overcome by a more fundamental constraint: there is not enough power for so many chips. data centers They consume a lot of energy, Seriously, a lot. To put it in context, in 2024, data centers were already the 4% of United States electricity consumption and it is expected that Demand will double in the next five years. Nobody wants to live near a data center for one reason: mass consumption is raising energy prices up to 267% in nearby areas. Power supply has become a new choke point for the industry. Microsoft is already considering producing its own energy by creating nuclear power plants and others like Google and Amazon are considering taking data centers into space. Image | Microsoft In Xataka | AI data centers are an energy hole. Jeff Bezos’ solution: build them in space

AI data centers consume too much energy. Google’s ‘moonshot’ plan is to take them to space

Training models like ChatGPT, Gemini or Claude requires more and more electricity and water, to the point that the energy consumption of AI threatens to exceed that of entire countries. Data centers have become real resource sinks. According to estimates by the International Energy Agencythe electrical expenditure of data centers could double before 2030, driven by the explosion of generative AI. Faced with this perspective, technology giants are desperately looking for alternatives. And Google believes it has found something that seems straight out of science fiction: sending its artificial intelligence chips into space. Conquering space. The company Project Suncatcher has been revealedan ambitious experiment that sounds like science fiction: placing its TPUs—the chips that power its artificial intelligence—on satellites powered by solar energy. The chosen orbit, sun-synchronous, guarantees almost constant light. In theory, these panels could work 24 hours a day and be up to eight times more efficient than the ones we have on Earth. Google plans to test its technology with two prototype satellites before 2027, in a joint mission with the Planet company. The objective will be to check if its chips and communication systems can survive the space environment and, above all, if it is feasible to perform AI calculations in orbit. The engineering behind the idea. Although it sounds like science fiction, the project has solid scientific bases. Google proposes to build constellations of small satellites—dozens or even hundreds—that orbit in compact formation at an altitude of about 650 kilometers. Each one would have chips on board Trillium TPU connected to each other by laser optical links. Such light beams would allow satellites to “talk” to each other at speeds of up to tens of terabits per second. It is an essential capability to process AI tasks in a distributed manner, as a terrestrial data center would do. The technical challenge is enormous: at these distances, the optical signal weakens quickly. To compensate, the satellites would have to fly just a few hundred meters apart. According to Google’s own studyKeeping them so close will require precise maneuvering, but calculations suggest that small orbit adjustments would be enough to keep the formation stable. In addition, engineers have already tested the radiation resistance of their chips. In an experiment with a 67 MeV proton beam, Trillium TPUs safely withstood a dose three times higher than they would receive during a five-year mission in low orbit. “They are surprisingly robust for space applications,” the company concludes in its preliminary report. The great challenge: making it profitable. Beyond the technical problems, the economic challenge is what is in focus. According to calculations cited by Guardian and Ars Technicaif the launch price falls below $200 per kilogram by the mid-2030s, an orbital data center could be economically comparable to a terrestrial one. The calculation is made in energy cost per kilowatt per year. “Our analysis shows that space data centers are not limited by physics or insurmountable economic barriers,” says the Google team. In space, solar energy is practically unlimited. A panel can perform up to eight times more than on the Earth’s surface and generate almost continuous electricity. That would eliminate the need for huge batteries or water-based cooling systems, one of the biggest environmental problems in today’s data centers. However, not everything shines in a vacuum. As The Guardian recallseach launch emits hundreds of tons of CO₂, and astronomers warn that the growing number of satellites “is like looking at the universe through a windshield full of insects.” Furthermore, flying such compact constellations increases the risk of collisions and space debris, an already worrying threat in low orbit. A race to conquer the sky. Google’s announcement comes in the midst of a fever for space data centers. It is not the only company looking up. Elon Musk recently assured that SpaceX plans to scale its Starlink satellite network—already with more than 10,000 units—to create its own data centers in orbit. “It will be enough to scale the Starlink V3 satellites, which have high-speed laser links. SpaceX is going to do it,” wrote Musk in X. For his part, Jeff Bezos, founder of Amazon and Blue Origin, predicted during the Italian Tech Week that we will see “giant AI training clusters” in space in the next 10 to 20 years. In his vision, these centers would be more efficient and sustainable than terrestrial ones: “We will take advantage of solar energy 24 hours a day, without clouds or night cycles.” Another unexpected actor is Eric Schmidt, former CEO of Google, who bought the rocket company Relativity Space precisely to move in that direction. “Data centers will require tens of additional gigawatts in a few years. Taking them off the Earth may be a necessity, not an option,” Schmidt warned in a hearing before the US Congress. And Nvidia, the AI ​​chip giant, also wants to try his luck: The startup Starcloud, backed by its Inception program, will launch the first H100 GPU into space this month to test a small orbital cluster. Their ultimate goal: a 5-gigawatt data center orbiting the Earth. The new battlefield. The Google project is still in the research phase. There are no prototypes in orbit and no guarantees that there will be any soon. But the mere fact that a company of such caliber has published orbital models, radiation calculations and optical communication tests shows that the concept has already moved from the realm of speculation to that of applied engineering. The project inherits the philosophy of others moonshots of the company —like Waymo’s self-driving cars either quantum computers—: explore impossible ideas until they stop being impossible. The future of computing may not be underground or in huge industrial warehouses, but in swarms of satellites shining in the permanent sun of space. Image | Google Xataka | While Silicon Valley seeks electricity, China subsidizes it: this is how it wants to win the AI ​​war

making cell towers mini data centers for AI

A few days ago we heard the news that NVIDIA had invested $1 billion in Nokiataking over 2.9% of the Finnish company. Although the check in itself is striking news, since for many people, Nokia had been lost off the map for many years, the movement makes all the sense in the world: it is the Western response to many of the Chinese technology companies that for years have been investing in the deployment of 6G. And of course, with NVIDIA behind them, telephony base stations can serve much more than just providing coverage to millions of devices: becoming small distributed data centers for AI. The plan behind the investment. NVIDIA and Nokia are not just designing equipment for mobile networks. They are redefining what a cell tower is. The idea is that each base station (the towers and small installations that we see on buildings and streets) become a computing node with the ability to execute operations involving AI technologies in real time. “An AI data center in everyone’s pocket”, according to Justin Hotard, CEO of Nokia. The key here is to bring processing closer to the user in order to eliminate latency, which is usually one of the most frequent problems in AI applications that require real-time processing, such as instant translation, augmented reality or autonomous vehicles. Without latency, everything changes. When we ask an AI to translate a conversation or analyze live images, every millisecond counts. Sending that data to a distant server, processing it, and returning it introduces a significant delay that mars the final experience. The most logical solution is to decentralize: that the AI ​​lives close to the userin the telecommunications infrastructures themselves. In this sense, NVIDIA will contribute chips and specialized software, while Nokia will adapt its 5G and 6G equipment to integrate that computing capacity. As announced, the first commercial tests will begin in 2027 with T-Mobile in the United States. The Nokia effect on the stock market. Nokia shares they shot up 21% after the news broke, reaching highs not seen since 2016. NVIDIA and OpenAI have become King Midas of technology: everything they touch goes up. The investment is also a boost to the strategy of Hotard, who since his arrival in April has accelerated Nokia’s shift towards data centers and AI. The company, which already acquired Infinera for 2.3 billion to strengthen its position in data center networks, it is now positioned as the only Western supplier capable of competing with Huawei in the complete supply of telecommunications infrastructure. EITHERafter space race. While Europe and the United States accelerate their 6G plans, China has been investing aggressively in this technology for years. This alliance between NVIDIA and Nokia is a somewhat late response, but necessary. Jensen Huang, CEO of NVIDIA, explained in his speech in Washington that the goal is “to help the United States bring telecommunications technology back to America.” It is not just about infrastructure, but about strategic control. And whoever dominates this network of brains distributed throughout cities and roads will control the AI ​​applications of the future. And now what. The McKinsey consulting firm esteem that investment in data center infrastructure will exceed $1.7 trillion by 2030, driven by the expansion of AI. Nokia and NVIDIA want their piece of the pie, but they are also betting on a structural change: that mobile networks stop being mere data tubes and become intelligent computing platforms. It remains to be seen if this model works commercially and whether operators are willing to update their infrastructure. Cover image | NVIDIA In Xataka | Xi Jinping wants two things: first, to create a global center that regulates AI. The second, that it is in Shanghai

It’s called ‘data poisoning’ and it’s poisoning them from within.

AI is everywhere and every time add more users. The logical step is that it would also be the target of malicious attacks. We have already talked about the dangers of ‘prompt injection’, a surprisingly easy attack to execute. He’s not the only one. AI companies are also fighting data poisoning. Poisoned data. It consists of introducing manipulated data into resources that will later be used for AI training. According to a recent investigationit does not take as many malicious documents to compromise a language model as previously believed. They found that with only 250 “poisoned” documents, models with up to 13 billion parameters were compromised. The result is that the model can be biased or reach erroneous conclusions. Prompt injection. It is one of the Problems AI Browsers Face like ChatGPT Atlas or Comet. By simply placing an invisible prompt in an email or a website, you can get the AI ​​to deliver private information by not being able to distinguish what is a user instruction and what is a malicious instruction. In the case of AI agents it is especially dangerous since they can execute actions on our behalf. AI to do evil. According to a Crowdstrike reportAI has become the weapon of choice for cybercriminals, who use it to automate and refine their attacks, especially ransomware. He M.I.T. analyzed more than 2,800 ransomware attacks and found that 80% used AI. The figure is overwhelming. Collaboration. They count in Financial Times that leading AI companies such as DeepMind, OpenAI, Microsoft and Anthropic are working together to analyze the most common attack methods and collaboratively design defensive strategies. They are turning to ethical hackers and other independent experts to try to breach their systems so they can strengthen them. Urgency. AI browsers and agents are already here, but we are on time because there has not yet been mass adoption. It is urgent to strengthen the systems, especially to prevent the injection of prompts that can so easily steal our data. Image | Shayna “Bepple” Take in Unsplash In Xataka | “The safety of our children is not for sale”: the first law that regulates ‘AI friends’ is here

Log In

Forgot password?

Forgot password?

Enter your account data and we will send you a link to reset your password.

Your password reset link appears to be invalid or expired.

Log in

Privacy Policy

Add to Collection

No Collections

Here you'll find all collections you've created before.