Meta spent a fortune on AI talent and data centers. Nine months later, the result is zero models

Mark Zuckerberg wanted to be the Florentino Pérez of AI. Last summer he began signing galácticos in this field, poaching talent by throwing around stacks of millions of dollars. The most famous, of course, was the AI wunderkind Alexandr Wang, who became the leader of its "Superintelligence" division. The funny thing is that the months go by and Meta doesn't seem to have absolutely anything to show for it. And that is very worrying.

Delays. Despite having invested billions of dollars in a restructuring of the company to bet (practically) everything on AI, three internal sources confirm that Meta is finding it very difficult to meet the planned deadlines. The race for generative AI waits for no one, and at company headquarters nerves are on edge because the roadmap is not being met.

Avocado, where are you? The new foundational AI model that Meta has been working on for months has been internally named Avocado, but for the moment it is not measuring up, something that recalls what happened with Llama 4. Internal tests reveal that although it manages to surpass the aforementioned Llama 4 and the old Gemini 2.5, it falls short of Gemini 3.0 (and, of course, the recent Gemini 3.1).

Patience. Launching a model that is clearly worse than its rivals makes no sense, so Meta has decided to wait and delay the launch. Avocado is now expected to hit the market in May at the earliest.

And meanwhile, Gemini. The situation is so critical that, according to these sources, the leaders of the AI division are considering something unthinkable: paying Google for a license to use Gemini in their own products, something that Apple, for example, will do with Siri. That would be a clear sign that, for now, their own model is not capable enough to power the AI functions of WhatsApp, Instagram and Threads.

Money does not equal speed.
The company has spent billions of dollars on AI researchers and has committed to investing $600 billion in building AI data centers. In January, Meta projected a capex of $135 billion dedicated almost entirely to these projects, almost double the $72 billion it spent last year. Despite these investments, the company is currently absent from an area in which its competitors continue to advance.

Internal tension. According to these sources, Meta is becoming a tinderbox. The "TBD Lab" (for "To Be Determined"), the unit led by Wang, is working under maximum pressure on models named after fruits (Avocado, Mango, Watermelon), but has clashed with old-school Meta executives like Chris Cox and Andrew Bosworth. The company is trying to integrate those models with Meta's advertising business, which is what pays for everything, but Wang doesn't seem to handle that part of the business very well.

Goodbye to open models. Meta stood out at the beginning of this AI race as the company whose open models (open weights, not Open Source) were above the rest. Llama became the standard in this area, but in this new stage that philosophy seems to be changing, and China is the one that now leads that segment. Thus, there is talk that both Zuckerberg and Wang lean toward closed models, like those of OpenAI (GPT) or Google (Gemini). Closed models give full control over the code, a competitive advantage that Meta does not seem willing to give up.

Few fruits on this tree. Despite the extraordinary deployment of resources, the current balance is poor. Meta's only tangible product from those investments is Vibes, an application similar to Sora that has never fully gelled. Meanwhile, those initial talent signings have turned into departures: the trickle of AI researchers leaving the company to join others (or found their own projects) keeps growing.

In Xataka | Meta has been buying chips from NVIDIA and AMD for years. Now it also makes its own so as not to fall short

Chips connected by laser instead of cable. It seems like science fiction, but it aims to revolutionize data centers

If you have ever built a PC, one of the things you surely had to pay the most attention to was the connections. Understanding the power of the processor, the GPU or the speed of the RAM is "easy", but the motherboard is what lets us interconnect all those components with 'highways' on which data can travel at maximum speed. In data centers and servers it is the same: the better the connections between chips and equipment, the lower the latency, the higher the bandwidth and the better the performance. These connections are physical, but there is a French startup that wants to change the rules of the game with NVIDIA. How? By connecting the chips with lasers.

Chips connected by laser, and NVIDIA taking out its wallet. Improving interconnect speed is no small feat, nor a whim. NVIDIA has begun manufacturing its next-generation platform, the one named Vera Rubin. It is a system that can be combined with others to multiply performance. That union, as we say, is physical, but there comes a point at which the physics is no longer enough. When that point arrives, NVIDIA wants to be ready and, a few days ago, Reuters reported on a $4 billion investment by NVIDIA in two companies aggressively researching new technologies to increase that interconnect speed: Lumentum and Coherent.

This is a rack, specifically the Wikimedia Foundation's, and the nightmare of those of us who hate cables. Now imagine that a large part of those cables runs outside the rack, because the systems are connected electrically.

Another company in which they have invested is Scintil Photonics. It is a French startup that is testing a technology that, if the industry adopts it, will mark a before and after in equipment-scale interconnection. Its LEAF Light Evaluation Kit is, as detailed, the first single-chip dense wavelength division multiplexing (DWDM) system to go from theory to practice.
It sounds like another language, I know, but it is basically what we were talking about: an optical chip interconnection system instead of copper. And that is the main advantage. With copper reaching physical limits of speed and density, optics are emerging as the solution for connecting clusters of thousands of processors. Each chip has an optical system responsible for emitting and receiving light, and in that light travels the data that currently moves through cables. The French company's chip is not the first based on photonic communication, but they claim their technology cuts the energy needed to operate by 50%, as well as the latency.

Results? Well, we'll see. The startup's CEO, Matt Crowley, has said he has "six or seven companies interested in implementing the technology by 2028," but that due to confidentiality agreements he cannot name names.

The Scintil Photonics prototype.

The complication will be securing supply of the photonic systems, since data center racks are built on the premise that they are scalable. That is, it is no longer just about power, but about how many tens of thousands of units you can interconnect, and a manufacturing bottleneck in any of the parties involved in the optics would amount to a supply shortage for their customers. For now, some prototypes have already been shipped to select companies for testing, but certainly, using light pulses instead of electrical signals is very interesting for superclusters in huge data centers that could then scale without the limitations of the physical connection.

Images | Victorgrigas, M.I.T., GlobeNewswire

In Xataka | Huawei no longer competes: it is building its own parallel reality

Building data centers in the Middle East seemed like a great deal. Until Iran arrived

A few days ago we reported that Iran had attacked two data centers in the United Arab Emirates and one in Bahrain. It is the first deliberate attack on a data center and proof that it has become critical infrastructure on the level of power plants. The question is who thought it was a good idea to build data centers in one of the most unstable areas on the planet.

A plan that comes from afar. On a trip to Saudi Arabia last year, Trump was accompanied by an entourage of technology leaders including Elon Musk, Jensen Huang, Sam Altman and Sundar Pichai, among others. At that meeting, massive investments in the region were announced, including the construction of a massive data center complex. However, although the push has been strengthened by this administration, it was the previous one that started down this path. In September 2024, Biden met with the leader of the Emirates to seek a strategic alliance that would allow them to develop their AI ecosystem.

The reason. What has led technology companies to build in the Middle East is obvious: savings. The Financial Times reports that the Gulf countries offered very attractive incentives, such as subsidies and cheaper energy. Building there also sidesteps all the problems they are having at home with the electrical grid, permits and resistance from many communities. The business seemed good.

The map of AI in the Middle East. The Emirates and Saudi Arabia are the countries with the most data centers, with 57 and 61 facilities respectively, according to Data Center Map. Many of them belong to American companies. Amazon alone has nine in the area, including facilities in the Emirates, Bahrain and Saudi Arabia. Microsoft has data centers in the United Arab Emirates and Qatar, and is building one in Saudi Arabia. Oracle, OpenAI and other partners are building a mega data center in Abu Dhabi that they expect to reach 5 GW.

The damage.
Although the Middle East has gained presence on the map of Big Tech data centers, the concentration of infrastructure is still tiny compared to that of the United States itself, which has more than 4,000 facilities. Even so, building a data center is not exactly cheap: Jensen Huang, CEO of NVIDIA, said a few months ago that each gigawatt costs about $50 billion.

The irony. The same leaders who posed for a photo with Trump on that trip now see their infrastructure threatened and suffering the consequences of a conflict triggered by the president himself. Investing in so much digital infrastructure in an unstable area was not such a good idea after all. The war against Iran looks like it will drag on, and nothing prevents Tehran from continuing to attack energy and technology facilities in the region. They were looking to cut costs and it may end up being expensive, although judging by the capex projected for this year, they can afford it.

Image | Data Center Map (edited)

In Xataka | The US is beginning to realize something worrying: AI data centers are skyrocketing its electricity bill
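A back-of-the-envelope check on the figures in the piece above; the $50 billion-per-gigawatt number is Huang's estimate, not an official price, and the 5 GW target is the Abu Dhabi project mentioned earlier:

```python
# Rough cost estimate for Gulf data center projects, using the
# $50B-per-gigawatt figure attributed to Jensen Huang above.
COST_PER_GW_USD = 50e9  # Huang's estimate, not a quoted price

def project_cost(gigawatts: float) -> float:
    """Estimated build cost in USD for a data center of the given capacity."""
    return gigawatts * COST_PER_GW_USD

# The Abu Dhabi mega data center mentioned above targets 5 GW.
abu_dhabi = project_cost(5)
print(f"5 GW project: ${abu_dhabi / 1e9:.0f} billion")  # 5 GW project: $250 billion
```

By that yardstick, the Abu Dhabi complex alone would cost a quarter of a trillion dollars, which puts the "saving" motivation for building in the Gulf in perspective.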

Oracle builds yesterday’s data centers with tomorrow’s debt

If Stargate smelled funny, it's because it did. That has just been demonstrated by the decision of Oracle and OpenAI, who have decided to halt the expansion plans for the data center that was going to be the flagship of the project. This is not just a setback for the project: it is a turning point in the narrative we have seen over and over, the one that seemed to argue that investment in AI could be unlimited. It's not like that.

OpenAI no longer trusts Oracle. According to sources close to the project, OpenAI's plans to expand its alliance with Oracle at its data center in Abilene (Texas) have been canceled. What initially seemed like a solid partnership to dominate the AI computing segment has collided head-on with a reality: the sector seems to be growing faster than its foundations.

Too slow. Bloomberg indicates that the decision responds to an inability to scale at the pace Sam Altman demands. OpenAI requires a compute density and deployment speed that Oracle cannot guarantee in the short term. That has forced OpenAI to look to other partners, including Microsoft, so as not to compromise its roadmap.

Technological gap. This brake is a symptom of a potentially critical problem for Oracle: the world demands data centers with the latest technology, the most modern chips and up-to-date liquid cooling systems, but Oracle seems stuck in a very slow update cycle. It is building yesterday's data centers with tomorrow's debt: although the infrastructures it is building were valid under previous standards, they are obsolete for the next generation of large language models (LLMs).

The numbers don't add up. And, as we said, Oracle's other problem is that all these projects are financed with very high leverage and economic risk. Larry Ellison's company is mortgaging future cash flows to create data centers that will be "old" by the time they come online.
If AI revenues don't materialize, Oracle will find itself in a dangerous position.

Bubble. All of this feeds, once again, the AI bubble debate. No one seems to deny that the bubble exists, and this slowdown raises ever more doubts about overinvestment in the sector. That OpenAI is now making this decision is a bad sign, and it reinforces the theory that investment in AI has been wildly overblown. This year alone, several AI giants have indicated that they will dedicate a capex of $650 billion to data centers.

The challenge of not being a Big Tech. OpenAI has a fundamental problem: it is trying to play with the big boys. Google, Amazon and Microsoft already had gigantic cloud infrastructures, but also a financial position that let them approach their strategy differently, while OpenAI has not stopped signing agreements involving astonishing figures. OpenAI keeps burning money, and not only its own: other people's too.

The danger of the domino effect. OpenAI hitting the brakes with Oracle could prove dangerously contagious. If one of the sector's leading companies steps back from its alliance with a key supplier, other clients could start thinking twice before signing similar agreements.

In Xataka | OpenAI says its deal with the Pentagon is secure. Seriously, really, you have to believe it, trust it, it assures you

The Line was going to be the city of the future. The new plan: create the mother of all data centers

Almost a decade ago we learned about Neom, a Saudi superproject designed to diversify the economy and reduce dependence on oil. Among the 'crazy things' inside Neom, The Line was the biggest: a linear city 170 kilometers long and 500 meters high meant to house nine million people. The project has been falling apart, but they have found a solution: convert The Line into a data center. You wouldn't expect anything else, would you? Let's go over the context.

Scaling, but downwards. The Line has gone from being the city of the future to something totally different. Over the years, the $500 billion utopian megaproject, car-free, automated, powered by renewables and built amid strong controversy over the forced displacement of native tribes, kept deflating. From 170 kilometers and nine million inhabitants, expectations dropped to 2.7 kilometers and a population of 300,000. The most recent independent reports indicated that The Line was unrealizable and that not even a country like Saudi Arabia could bear the cost. Some experts called it "unmoored from reality."

"New phase." The thing is, a certain amount of infrastructure has already been built and, the project being the failure it is (and is perceived to be by the rest of the world), the most sensible thing is to reuse what exists for whatever purpose. And within that 'whatever' comes the new gold mine: data centers. In the area where they were going to build the megacity there is plenty of space to house gigantic data centers, but there are also some operational advantages.

A small part of the land that had already begun to be moved to build The Line.
Something will have to be done with the work already done. The country has said nothing officially because, as we say, that would mean accepting a failure of biblical proportions, but for a few weeks now it has been suggested that this new phase, the conversion to data centers, would allow them to monetize what until now has been nothing but a money pit. They already have the land, the earthworks and part of the electrical connections, and building data centers is easier than raising two skyscrapers stretching for kilometers on end.

Neom AI. This new approach also fits Saudi Arabia's aspiration to become a global AI node. We have been reporting for months on how Saudi Arabia is pouring money into attracting companies that want to build data centers: for example, $7 billion in one fell swoop on NVIDIA, a huge investment to build a city-sized data center, and the creation of a company called Humain in which both NVIDIA and AMD are already involved. And the billion-dollar purchases are not limited to investments in Western Big Tech. In September last year, the Saudi fund (which ultimately belongs to the country) sank $55 billion into a legendary video game company: Electronic Arts. It didn't do so for the games (which, admittedly, are in the doldrums), but to buy cultural influence in millions of homes. Nor has it been the country's only billion-dollar move in gaming: they are now negotiating the purchase of a mobile games company for about $7 billion.

Access to the Red Sea. It is evident, then, that the country wants to diversify its economy, even if that means investing astronomical amounts which, admittedly, are still far smaller than The Line's initial budget. And beyond money, the Saudis have something equally important: the power to do whatever they want in terms of energy, territory and access to the Red Sea.
Data centers need water to dissipate heat and, although seawater is not suitable (in fact, there is controversy over their freshwater needs), the Red Sea means an outlet to the rest of the world. How? Through submarine cables. Cables are already being deployed, and that access to the Red Sea would allow data centers on The Line's land to connect to international fiber optic nodes in Europe or Africa.

"We are determined, by the grace and power of God, to achieve the transformation objectives. But we will also not hesitate to cancel or radically change any program or objective if we find that the public interest requires it" – Shura Council on Neom and The Line in September 2025

Challenges. They can also combine gas with renewables such as solar, for which the terrain has enormous potential, although there are difficulties ahead. For example, temperatures are high and fresh water is scarce, although seawater could be used in heat-exchange systems. Furthermore, the energy required to maintain the humidity and temperature conditions of the server rooms would be tremendous, complicating the design of the infrastructure.

Promises and realities. In the end, as different sources point out in the Financial Times, it is about making money and diversifying the economy, and data centers fit that equation. The location between three continents is good, there is plenty of land and access to energy that is both renewable and profitable (with projects like the green hydrogen one). And then there is the Red Sea. It certainly seems more likely that we will see a gigantic data center before anything else related to the Neom project. Current events show that Big Tech has billions to invest in artificial intelligence, and Saudi pockets are deep enough to attract anyone. Some of the largest, Amazon for example, which has just shut down its data centers in Saudi Arabia due to the Iranian attacks, may be tempted by the sovereign wealth fund. But of course, we will have to see if any of it materializes.
There we have the Jeddah Tower, the Mukaab or the pharaonic airport, other examples outside Neom that, for the moment, are nothing more than promises. And Big Tech, with its hunger for computing, needed the data centers of the next decade the day before yesterday.

Images | Neom

In Xataka | AI is bringing …

Data centers have made the electricity bill more expensive in the US. And the Government has said enough

Every time you ask a generative AI to solve a problem, a server on the other side of the world needs power to process it and cooling to keep from melting down. The problem is that the electricity meter spinning at full speed is not just that of the large technology companies: it belongs to the entire community. The AI revolution has a real physical and economic cost that has already begun to hit family budgets, unleashing a crisis that has forced the United States Government itself to slam its fist on the table.

The US government has said enough. According to federal data, residential electricity prices will rise a national average of 6% in 2025. Citizens, squeezed by the cost of living, have begun to connect the dots and point to the huge data centers proliferating in their neighborhoods. As Politico details, there are currently some 680 data centers planned in the country, gigantic infrastructures that will require energy equivalent to that of 186 large nuclear power plants. This brutal demand has provoked strong citizen opposition: as The Guardian explains, numerous communities have begun to reject and block these projects for fear that their bills will skyrocket. The pressure has been so strong that the rebellion has reached traditionally conservative strongholds. According to the Financial Times, Republican legislators in states such as Missouri, Ohio and Oklahoma have suggested halting the construction of data centers, while Florida Governor Ron DeSantis has pushed laws to regulate them and protect families from price increases. Faced with this scenario, Donald Trump's administration has been forced to intervene.

Washington's "historic pact." As The New York Times reports, executives from Google, Microsoft, Meta, Amazon, OpenAI, Oracle and xAI made the pilgrimage to Washington to meet with President Trump and sign the so-called "Ratepayer Protection Pledge."
The objective of the agreement is to shield consumers from rising electricity costs. Technology companies have committed to "build, provide or buy" the new electricity generation resources they need, assuming 100% of the costs of infrastructure and improvements to the transmission network. During the meeting, Trump left a phrase that perfectly sums up the sector's reputation crisis: "They need help with public relations, because people think that if a data center is installed, the price of electricity will go up." The president assured that, thanks to the pact, that "will no longer happen." For their part, executives such as Ruth Porat (Google) and Dina Powell McCormick (Meta) confirmed their commitment to pay for the infrastructure "whether or not they end up using that energy," according to statements published by The New York Times.

This move by Washington cannot be understood without looking at the electoral calendar. Politically, as the Financial Times points out, Republican strategists warned the White House that energy inflation was an imminent risk ahead of the midterm congressional elections. Democrats such as Senator Mark Kelly were already using citizen anger as a political weapon, calling Trump's pact a mere "handshake agreement" that fell short.

And the clash with reality: a grid at its limit. On paper, the promise sounds perfect. As the specialized outlet Engadget ironically puts it, "big tech agrees not to ruin your electricity bill." However, journalists and energy sector experts agree that there is a gigantic distance between words and actions. As Politico warns, the agreement is, in essence, a voluntary "handshake" with no binding legal force. Rob Gramlich, a former economic advisor quoted by CNBC, recalls that the White House has no direct jurisdiction over this matter: the rules of the electric grid are decentralized and depend on the public service commissions of the 50 states.
It is they, and not the federal government, who approve how costs are distributed. In some areas the damage has already been done. Argus Media reports that on the PJM network (the largest in the US, covering 13 states and including the world's largest data center cluster, in Virginia), capacity costs have skyrocketed by $23 billion, record rates locked in until 2028 that make it "virtually impossible" to lower prices for consumers in the short term. An independent watchdog went so far as to describe the situation as a "massive transfer of wealth" from citizens to corporations.

Competition for resources is fierce. Abe Silverman, a Johns Hopkins University researcher quoted by Politico, compares the situation to "a bidding war for a ticket to a Taylor Swift concert." There is a five-year waiting list for gas turbines, and their prices have doubled. This technological urgency not only makes the grid more expensive, it is stopping the green transition in its tracks. As Argus Media explains, the immense demand from servers cannot be covered quickly enough with renewable sources. This is forcing power companies to delay the closure of polluting coal plants and to invest heavily in natural gas generation, perpetuating dependence on fossil fuels. The greatest risk, Silverman warns, is what happens if Silicon Valley gets its growth calculations wrong: "You spend 3 billion to improve the network, and then the data center does not materialize (…) Who is left with the problem? Grandma."

Should Europe demand the same? Across the pond the situation is no less worrying, and the regulatory approach is drastically different. According to data from the European Commission, data centers currently consume 415 terawatt-hours (TWh) globally (1.5% of the world total), a figure that, driven by AI, will double to 945 TWh in 2030. In the European Union, consumption was around 70 TWh in 2024 and will jump to 115 TWh by the end of the decade.
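Taking the European Commission figures quoted above at face value, the implied annual growth rate can be sketched. Note the 2024 baseline for the global figure is an assumption, since the text only says "currently":

```python
# Implied compound annual growth of data center electricity demand,
# from the European Commission figures quoted above. Assumes a 2024
# baseline and a 2030 endpoint (the article does not fix the start year).

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

global_growth = cagr(415, 945, 6)   # TWh, worldwide
eu_growth = cagr(70, 115, 6)        # TWh, European Union

print(f"Global: {global_growth:.1%} per year")  # Global: 14.7% per year
print(f"EU:     {eu_growth:.1%} per year")      # EU:     8.6% per year
```

In other words, the projection behind the EU's monitoring push is demand growing around 15% per year worldwide, with the EU growing somewhat more slowly.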
Europe has launched a mandatory monitoring system under the Energy Efficiency Directive to demand transparency about this consumption and its water and carbon footprint. But in Spain, the problem is already a physical jam in the grid. As we have described in Xataka, the Spanish electrical grid is like a saturated highway at which, suddenly, "a convoy of trucks of industrial tonnage" has arrived. The technical rules of the National Markets and Competition Commission (CNMC) caused a "cascade effect" that blocked connection permits. The …

Iran has chosen the targets that hurt the West most: energy and data centers

When Iran's weapons come up, missiles are usually mentioned. However, a fundamental leg of the country's war machine is its kamikaze drones. The Shahed-136, introduced in 2020 and known as a "loitering munition", has been Iran's strategic spearhead in the Middle East for years, and also a weapon that Russia has used in the war in Ukraine. Since the beginning of the war against the United States and Israel, Iran has directed these drones at its enemies. Not at bases, but at the two pillars where it can do the most damage to the West: energy and data centers.

The drones. Since the war in Ukraine began, drones have proven to be the most fearsome weapon. Some are more homemade, some more sophisticated, but they all have something in common: destructive power, they can be operated from a good distance, they are very cheap, they are difficult to intercept and the most advanced can be launched in swarms without risk to their operators. But Shahed drones are not a street DJI with explosives strapped on: they are drones with a range of up to 2,000 kilometers, ideal for striking very effectively. The key is the price: they are launched in large numbers and, even if many are intercepted, the cost of that interception is extremely favorable to the attacker. It is estimated that a drone costs about $20,000, while an interceptor missile averages between $300,000 and $400,000. That ratio is why even the US is using them.

Ras Tanura. And it is these drones, and their variants, that Iran is using to attack critical infrastructure. They do not even have to hit the targets directly: it is enough for them to land nearby, or for the mere threat that they can reach key infrastructure. We have an example in Ras Tanura, one of the largest oil refineries in the world, which had to close its doors last Monday. Aramco (the owner) made the decision after debris from intercepted drones fell near the facilities in Saudi Arabia.
This caused a crisis in the crude oil market, with the price of a barrel rising meteorically and cargo ships piling up in the Strait of Hormuz.

Data centers. But if energy is critical, in the age of AI data centers have also become vital infrastructure. That is why these facilities are also in the crosshairs of an Iran that directly attacked two Amazon Web Services (AWS) installations on March 1 and 2. These are two data centers in the United Arab Emirates, while another Amazon facility in Bahrain also suffered damage from a third attack. Specifically, computing on EC2 and cloud storage on both S3 and DynamoDB began to experience high error rates. Amazon itself confirmed that "these attacks have caused structural damage, disrupted power to our infrastructure, and, in some cases, required fire suppression activities." They note that water damaged part of the equipment and that, as a consequence, their clients should migrate their workloads to servers in other parts of the world, because recovery "will be prolonged."

A market on edge. All of this has hit the market, of course. If in the energy and crude segment it is obvious that stopping a plant that 'produces' 550,000 barrels a day and cutting off a transit area through which 20% of the world's oil passes has consequences, data centers becoming a target has also shaken the market. Major companies tied to AI, semiconductors and storage suffered the consequences from Monday through Wednesday: NVIDIA, Micron, Western Digital, ASML, Applied Materials, SK Hynix and Samsung traded lower on their worst day in recent months. It is not known whether components can keep moving at the previous high rate if two of the planet's busiest container shipping corridors suffer traffic disruptions. But don't worry: they are already recovering so that the AI wheel can keep turning, one way or another.
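The cost asymmetry described above is easy to quantify. A minimal sketch using the article's own estimates ($20,000 per Shahed-type drone, $300,000 to $400,000 per interceptor missile); real procurement costs vary widely:

```python
# Cost asymmetry of drone attacks vs. interception, using the
# article's estimates. Real-world costs vary widely.
DRONE_COST = 20_000           # per Shahed-type drone (estimate above)
INTERCEPTOR_COST = 350_000    # midpoint of the $300k-$400k range above

def defender_cost_ratio(drones: int, intercepted: int) -> float:
    """Dollars the defender spends per attacker dollar,
    assuming one interceptor fired per intercepted drone."""
    attack_cost = drones * DRONE_COST
    defense_cost = intercepted * INTERCEPTOR_COST
    return defense_cost / attack_cost

# Even intercepting 90 of 100 drones costs the defender
# many times what the whole attack cost the attacker.
print(f"{defender_cost_ratio(100, 90):.2f}x")  # 15.75x
```

That roughly 16-to-1 spend ratio, before even counting the damage done by the 10 drones that get through, is what makes this weapon "extremely favorable to the attacker."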
Images | Meta, Tasnim News Agency

In Xataka | Ukraine has shown that wars are no longer won with tanks. They are won with something that Spain has in its hands: PAMOV

Amazon increases its investment in Spain to 33.7 billion euros. All, of course, for data centers

Amazon has announced that it will expand its investment in data centers in Spain, bringing the total to 33.7 billion euros. Today's announcement adds 18 billion euros to the 15.7 billion euros of investment announced in 2024.

Amazon doubles down on Spain. The company has taken advantage of the Mobile World Congress in Barcelona to make an announcement that significantly reinforces its strategy in the country. The announcement highlights plans to build facilities for manufacturing, storage and, interestingly, server recycling in Spain.

The promise of jobs. Amazon's forecast is that this Amazon Web Services (AWS) region, which reinforces its presence in Aragón, will contribute 31.7 billion euros to Spain's total GDP through 2035. They estimate it will contribute "the equivalent of 29,900 full-time jobs on average annually in local companies." Of that figure, 6,700 will be full-time jobs derived from Amazon's direct investment in areas such as data center operations, employees of AWS suppliers, or workers building the facilities.

Supply chain. This investment includes an important part of the business: facilities dedicated to the supply chain. These facilities, according to Amazon, will in theory generate 1,800 jobs in Aragón. Thus, there will be a factory dedicated to the assembly and final testing of servers, a logistics warehouse and a facility for the manufacturing and repair of AI servers.

Let's talk about energy… Amazon has not given many details about the energy and water needs these data centers will have. It does, however, say it is committed to achieving net-zero carbon emissions by 2040. To that end it is investing in 100 solar and wind projects across Spain, including seven new solar farms. According to its data, AWS data centers in Aragón have offset their electricity consumption with 100% renewable energy since opening in 2022.
It remains to be seen whether that is enough to keep the already saturated Spanish electrical infrastructure from bursting.

…and water. AWS also addresses the water consumption of these centers: "AWS is also committed to returning more water to communities than it uses in its direct operations by 2030. By 2024, AWS had reached 53% of that goal. In Aragón, AWS supports five water projects with an investment of 17.2 million euros."

A pinch of capex. This investment is part of the capex Amazon has planned for 2026. The total figure is $200 billion, a notable increase from the $131.8 billion of capex in 2025. Those 18 billion euros ($21.11 billion at the current exchange rate) represent just over 10% of that capex.

AWS is doing (very) well. Amazon may not stand out for having its own AI model, but its cloud infrastructure certainly delivers value. In the fourth quarter of 2025, AWS's revenue was $35.6 billion, its strongest year-over-year growth (24%) in the last three years. Investment in infrastructure is clearly paying off globally right now, and Spain has benefited from that momentum.

In Xataka | Amazon is negotiating to invest 50 billion in OpenAI. The money would go in through the door and out through the window.
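The capex share cited above is straightforward arithmetic; a minimal sketch, where the EUR→USD rate is inferred from the article's own figures rather than quoted from any official source:

```python
# Back-of-the-envelope check of the capex share cited above.
# The EUR->USD rate is inferred from the article's figures ($21.11B / EUR 18B);
# use a live exchange rate in practice.
eur_usd = 21.11 / 18  # ~1.173, an assumption derived from the article

spain_investment_eur = 18.0   # newly announced investment, billions of euros
spain_investment_usd = spain_investment_eur * eur_usd
total_capex_usd = 200.0       # Amazon's projected 2026 capex, billions of dollars

share = spain_investment_usd / total_capex_usd * 100
print(f"${spain_investment_usd:.2f}B is {share:.1f}% of projected 2026 capex")
```

Which confirms the "just over 10%" figure in the text.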

Spain had a completely saturated electrical grid. And then data centers arrived to blow it up even more

Imagine a highway that cannot fit a single additional vehicle. The problem is not a lack of asphalt, but that the cars do not drive efficiently and keep kilometer-long safety distances. The Spanish electrical grid was exactly that: it had been operating for years at the limit of its administrative capacity, and suddenly a convoy of heavy, voracious trucks has arrived at the on-ramp: data centers.

These megainfrastructures, pillars of artificial intelligence and the cloud, promise to irrigate the economy with millions, but their enormous need for supply threatened to burst the seams of an already saturated electrical system. To avoid collapse and not miss the reindustrialization train, the Government has had to react and radically change the technical rules of the game.

Cascading capacity collapse. To understand the collapse we have to look at how our way of consuming energy has changed. The energy transition is profoundly reconfiguring the model across the country. Requests to connect to the transmission and distribution networks have skyrocketed: in addition to the electrification of industry and renewable hydrogen, there is now the massive consumption associated with data centers for artificial intelligence.

The problem broke out when the National Markets and Competition Commission (CNMC) established a "dynamic criterion" to calculate how much access capacity was available in areas shared by several network nodes. As the Ministry for the Ecological Transition and Demographic Challenge (MITECO) details in its press release, applying this criterion means that a single access request at one node can trigger a "cascading effect that drains capacity in the rest of the nodes that share the area", blocking requests from dozens of kilometers away.
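The cascading effect can be illustrated with a toy model. This is not the CNMC's actual algorithm, and the capacities, node names and request sizes below are invented for illustration; it only shows why one large request at a single node drains the headroom of every node sharing the same area:

```python
# Toy model (illustrative only, not the CNMC's real "dynamic criterion"):
# several nodes draw on one shared pool of access capacity, so granting
# a large request at any node shrinks the headroom seen by all of them.
SHARED_AREA_CAPACITY_MW = 500  # hypothetical headroom shared by the area
NODES = ["Node A", "Node B", "Node C"]  # hypothetical nodes in that area

def headroom_after(request_mw: float, capacity_mw: float) -> float:
    """Capacity left for every node in the area once one request is granted."""
    return max(capacity_mw - request_mw, 0.0)

# A single 450 MW data-center request at one node...
left = headroom_after(450, SHARED_AREA_CAPACITY_MW)
# ...leaves only 50 MW for all the other nodes in the shared area, so a
# 100 MW factory dozens of kilometers away is administratively refused
# even if the cables near it physically have room to spare.
for node in NODES:
    print(f"{node}: {left:.0f} MW of headroom remaining")
```

The point of the sketch is that the bottleneck is accounting over a shared pool, not physical wire capacity, which is exactly what the regulatory changes described below try to fix.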
Basically, a large data center asks for passage and, automatically, the system administratively blocks neighboring nodes as a precaution, even if physically the cables have plenty of headroom.

Investments in the air and the ghost of the blackout. The consequences of this traffic jam directly affect the real economy and national security.

Real estate and industrial paralysis. The situation is so critical that, as we noted in our previous coverage citing the Asprima developers' association, last year only 12% of connection requests for new urban developments were granted. Some 350,000 homes are at risk simply for lack of electrical power.

The risk of an electrical "zero". The Official State Gazette warns that the growth of installations unable to withstand voltage dips poses a very high risk. If a disturbance causes these generators to disconnect en masse, exchange flows are produced that are incompatible with Spain's limited interconnections with Europe. As the newspaper El País recalls, the objective is to avoid at all costs a repeat of massive blackouts like the one the Iberian Peninsula suffered on April 28, 2025.

It is not enough to lay more cable. In areas limited by this dynamic criterion, it is no longer possible to unlock new capacity simply by spending money to reinforce the network with "more copper". Sector expert Joaquín Coronado sums it up: demand must be 100% active; it must provide flexibility and commit to the stability of the system.

The Government's emergency surgery. To cut this Gordian knot, the Government and regulators have launched a three-pronged shock plan:

The new Royal Decree from MITECO. The Ministry has put out for public consultation (until March 16) a standard that updates the technical requirements for connecting to the network.
The master key is that demand facilities are now required to withstand voltage dips, avoid introducing adverse oscillations, and maintain waveform quality. By forcing installations not to disconnect during small disturbances, the number of nodes affected in shared areas is reduced. This simple technical measure could unlock 50% more capacity at some 900 connection nodes on the high-voltage network.

The CNMC's "flexible permits". To end the binary model (either I give you all the capacity or I deny it), the CNMC has proposed four new types of permits, as we already broke down in Xataka. These range from allowing consumption only in certain time slots to "dynamic" permits under which the operator can remotely disconnect a data center if there is an emergency on the network.

The "technical amnesty" for data giants. In parallel, the Ministry of Industry has urgently removed the "off-peak" requirement. Previously, to receive aid you had to consume at night: an absurdity for a data center (which operates 24/7) and for today's Spain, where solar energy has driven prices down at midday.

The citizen cost and the fine print. The Government's maneuver not only responds to a national emergency, but also makes Spain a pioneer on the continent. The country is getting ahead of the update of the European network codes, deploying a battery of technical specifications simultaneously that is already considered a milestone worldwide, as El País details. The new regulations also settle a historical debt with energy storage: batteries will finally have their own specific regulatory framework, no longer treated administratively as mere "generation by analogy" facilities.

However, the deep digitalization needed for the network to support such a complex mode of operation will not come free, and the modernization bill will end up landing in consumers' pockets.
Forecasts for 2026 already estimate direct increases in household bills, with a 4% rise in tolls and a not inconsiderable 10.5% in electricity system charges. And while citizens assume the technical cost, the data giants, recipients of this regulatory red carpet, prefer to remain cautious in the face of the eternal Spanish bureaucratic obstacle. The technology sector warns that a key piece of the puzzle is missing: if the Government does not expressly include the National Code of Economic Activity (CNAE) corresponding to "Data Processing" in the official list of sectors entitled to the multimillion electro-intensive aid, all …

Data centers in space promise to save the planet. And also ruin the earth’s orbit

Wikipedia should update its page on the word "ambition" to include a photo of Elon Musk. The tycoon has announced a megaproject under which two of his companies, SpaceX and xAI, will work together to launch a constellation of one million satellites functioning as data centers in orbit. The problem is that although the idea has its advantages, it also has a potentially terrible impact on the future of our planet.

Energy efficiency. That is the great advantage of the space data centers Musk proposes. In space, solar panels can perform optimally without the obstacles posed by Earth's atmosphere and climate. According to SpaceX, the falling cost of launching its rockets makes space a perfect alternative for AI data centers.

The plan. The project presented to the US Federal Communications Commission (FCC) consists of placing these satellites in sun-synchronous orbits between 500 and 2,000 km high. That would allow the satellites to act as nodes interconnected among themselves, and with the satellites of the Starlink network, through optical laser links. The plan, of course, will have to overcome major challenges such as cooling. Dissipating the heat generated by millions of chips in the vacuum of space is complex, since satellites act as "natural thermoses".

And what about radiation? The problem of cosmic radiation will also have to be solved. Advanced chips are very vulnerable to processing errors caused by energetic particles. AI processors seem surprisingly resistant to this type of problem, but deploying such chips on a massive scale in space could introduce new conflicts.

On-site repair? Forget it. In today's data centers, if a problem arises a technician can physically travel to solve it. In space, physical repair is not feasible, which forces a strategy of assuming that chips that fail will be completely lost.
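The "natural thermoses" problem comes down to radiative physics: in vacuum, a satellite can shed heat only by radiating it away, per the Stefan-Boltzmann law. A back-of-the-envelope estimate (the radiator area, temperature and emissivity below are illustrative assumptions, not SpaceX specifications):

```python
# Rough radiator sizing in vacuum (illustrative assumptions, not SpaceX
# figures): the only way to dump heat is radiation, P = e * sigma * A * T^4.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 * K^4)

def radiated_power_w(area_m2: float, temp_k: float, emissivity: float = 0.9) -> float:
    """Watts a radiator of the given area can radiate to deep space."""
    return emissivity * SIGMA * area_m2 * temp_k**4

# A hypothetical 10 m^2 radiator held near room temperature (300 K)...
p = radiated_power_w(10, 300)
print(f"{p / 1000:.1f} kW")  # prints "4.1 kW"
```

A 10 m² panel at 300 K sheds only about 4 kW, while a single rack of AI accelerators can draw tens of kilowatts, which is why radiator area (or much hotter radiators) dominates any serious design for computing in orbit.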
SpaceX would have to continuously launch replacements to compensate for this "mortality" of components, which complicates logistics and costs. There are optimistic takes on this, and for some the numbers do work out.

Kessler syndrome. But above all there is a latent concern in the field of space safety. Launching a million new satellites into already congested orbits multiplies the probability of chain collisions, validating the scenario described by the Kessler syndrome. A single major collision could generate a cloud of debris that would take decades to clear, threatening climate monitoring missions or even global communications. There are already ideas to coordinate and "regulate orbital traffic", and SpaceX has its own "situational awareness" system, Stargaze, to avoid problems, but of course no system is perfect.

Air pollution. Nor should we forget that the atmospheric impact is equally worrying. An estimated 25,000 Starship flights would be needed, and the re-entry of satellites that end their life cycle or die prematurely would release metals and particles into the upper atmosphere. According to experts, these chemical residues could damage the ozone layer and cause uncertain climate consequences.

You can't see anything. Astronomers, who had already protested about Starlink, will have an even bigger problem with this new idea. The threat to astronomy is clear: given the altitude and size of these satellites, they are likely to form a bright band visible even to the naked eye, hindering scientific observation and even changing the way we see the sunset.

Orbital computing may have advantages, but before launching it we should remember that space, especially the space we see, is a shared and finite resource.

In Xataka | Starlink's dominance in space begins to shift: another company already has permission for a constellation of 4,000 satellites
