Spain's electrical grid was already completely saturated. Then data centers arrived to push it past the breaking point

Imagine a highway with no room left for a single additional vehicle. The problem is not a lack of asphalt, but that the cars don't know how to drive efficiently and keep kilometer-long safety distances between them. The Spanish electrical grid was exactly that: it had been operating for years at the limit of its administrative capacity when, suddenly, a convoy of heavy trucks with a voracious appetite pulled up to the on-ramp: data centers. These mega-infrastructures, pillars of artificial intelligence and the cloud, promise to pour millions into the economy, but their enormous supply requirements threatened to burst the seams of an already saturated electrical system. To avoid collapse, and to avoid missing the reindustrialization train, the Government has had to react and radically change the technical rules of the game.

Cascading capacity collapse. To understand the gridlock, we have to look at how our way of consuming energy has changed. The energy transition is profoundly reconfiguring the model across the national territory, and requests to connect to the transmission and distribution networks have skyrocketed. On top of the electrification of industry and renewable hydrogen, there is now the massive consumption associated with AI data centers. The problem erupted when the National Markets and Competition Commission (CNMC) established a "dynamic criterion" to calculate how much access capacity was available in areas shared by several network nodes. As the Ministry for the Ecological Transition and the Demographic Challenge (MITECO) detailed in its press release, applying this criterion means that a single access request at one node can trigger a "cascading effect that drains capacity in the rest of the nodes that share the area," blocking requests dozens of kilometers away.
In essence, a large data center requests access and the system automatically blocks neighboring nodes as a precaution, even if the cables physically have plenty of spare capacity.

Investments in the air and the ghost of the blackout. The consequences of this gridlock directly affect the real economy and national security.

Real estate and industrial paralysis. The situation is so critical that, as we noted in our previous coverage citing the Asprima developers' association, only 12% of connection requests for new urban developments were granted last year. Some 350,000 homes are at risk simply for lack of electrical power.

The risk of an electrical "zero". The Official State Gazette (BOE) warns that the growing number of installations unable to ride through "voltage dips" poses a very high risk. If a disturbance causes these generators to disconnect en masse, the resulting exchange flows are incompatible with Spain's limited interconnections with Europe. As El País recalls, the goal is to avoid at all costs a repeat of massive blackouts like the one the Iberian Peninsula suffered on April 28, 2025.

It is not enough to lay more cables. In areas limited by this dynamic criterion, new capacity can no longer be unlocked simply by investing money in reinforcing the network with "more copper." Sector expert Joaquín Coronado sums it up perfectly: demand must be 100% active; it must provide flexibility and commit to the stability of the system.

The Government's emergency surgery. To cut this Gordian knot, the Government and regulators have launched a three-pronged shock plan:

The new MITECO Royal Decree. The Ministry has put out to public hearing (until March 16) a standard that updates the technical requirements for connecting to the network.
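The blocking mechanism described above can be pictured with a toy model: several nodes draw on one shared pool of access capacity, so a grant at any node drains headroom for all of them. This is an illustrative sketch only; the node names, capacities and single shared limit are invented and this is not the grid operator's actual algorithm.

```python
# Toy model of the CNMC's "dynamic criterion" for shared-area capacity.
# Node names, capacities and the single shared limit are invented;
# this is NOT the real grid-operator calculation.

class SharedArea:
    def __init__(self, area_capacity_mw: float, nodes: list[str]):
        self.area_capacity_mw = area_capacity_mw
        self.granted_mw = {node: 0.0 for node in nodes}

    def headroom(self) -> float:
        # Capacity left for the WHOLE area, shared by every node in it.
        return self.area_capacity_mw - sum(self.granted_mw.values())

    def request(self, node: str, mw: float) -> bool:
        # A grant at any node drains the common pool, so it can block
        # requests at nodes tens of kilometers away (the "cascade").
        if mw <= self.headroom():
            self.granted_mw[node] += mw
            return True
        return False

area = SharedArea(area_capacity_mw=300, nodes=["A", "B", "C"])
print(area.request("A", 250))  # large data center at node A -> True
print(area.request("B", 100))  # modest factory at node B -> False (blocked)
print(area.headroom())         # only 50 MW left for the entire area
```

The point the sketch makes is administrative, not physical: node B is refused even though its own cables may be nearly empty, because the bookkeeping is done per shared area rather than per node.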
The master key is that demand facilities are now required to "ride through voltage dips," avoid introducing adverse oscillations, and maintain waveform quality. By forcing installations not to disconnect during small disturbances, the number of nodes affected in shared areas is reduced. This seemingly simple technical measure could unlock 50% more capacity at some 900 connection nodes on the high-voltage network.

The CNMC's "flexible permits". To end the binary model (either you get all the capacity or you get nothing), the CNMC has proposed four new types of permits, as we already broke down in Xataka. These range from allowing consumption only in certain time slots to "dynamic" permits under which the operator can remotely disconnect a data center if there is an emergency on the network.

The "technical amnesty" for data giants. In parallel, the Ministry of Industry has urgently removed the "off-peak" requirement. Previously, to receive aid you had to consume at night, an absurdity both for a data center (which operates 24/7) and for today's Spain, where solar energy has driven prices down at midday.

The citizen cost and the fine print. The Government's maneuver not only responds to a national emergency; it also positions Spain as a pioneer on the continent. The country is getting ahead of the update to the European network codes, deploying a battery of technical specifications simultaneously in a move already considered a milestone worldwide, as El País details. The new regulations also settle a historical debt with energy storage: batteries will finally have their own specific regulatory framework instead of being administratively treated as mere "generation by analogy" facilities. However, the deep digitalization required for the network to support such a complex mode of operation will not come for free, and the modernization bill will ultimately land in consumers' pockets.
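The contrast between the old all-or-nothing permit and the flexible ones described above can be sketched as a simple availability check. The permit classes and parameters below are invented for illustration; the CNMC's actual proposal defines four permit types, of which only the time-slot and remotely curtailable ("dynamic") varieties are modeled here.

```python
# Illustrative sketch of "flexible" grid-access permits. The class names
# and parameters are invented; only the general idea (time-slot-limited
# consumption, remote curtailment in emergencies) comes from the CNMC plan.
from dataclasses import dataclass


@dataclass
class TimeSlotPermit:
    granted_mw: float
    allowed_hours: set[int]  # hours of day when consumption is allowed

    def available_mw(self, hour: int, emergency: bool = False) -> float:
        return self.granted_mw if hour in self.allowed_hours else 0.0


@dataclass
class DynamicPermit:
    granted_mw: float

    def available_mw(self, hour: int, emergency: bool = False) -> float:
        # The grid operator may remotely curtail the load in an emergency.
        return 0.0 if emergency else self.granted_mw


solar_hours = TimeSlotPermit(granted_mw=80.0, allowed_hours=set(range(10, 18)))
print(solar_hours.available_mw(hour=12))  # 80.0 at midday
print(solar_hours.available_mw(hour=3))   # 0.0 at night

curtailable = DynamicPermit(granted_mw=120.0)
print(curtailable.available_mw(hour=12))                  # 120.0 normally
print(curtailable.available_mw(hour=12, emergency=True))  # 0.0 when curtailed
```

Either variant lets the operator say yes to more connections than the binary model, because the granted megawatts no longer have to be reserved around the clock.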
Forecasts for 2026 already point to direct increases in household bills, with a 4% rise in tolls and a not inconsiderable 10.5% rise in electricity system charges. And while citizens absorb the technical cost, the data giants, recipients of this regulatory red carpet, prefer to remain cautious in the face of the eternal Spanish bureaucratic obstacle. The technology sector warns that a key piece of the puzzle is still missing: if the Government does not expressly include the National Code of Economic Activity (CNAE) for "Data Processing" in the official list of sectors entitled to the multimillion-euro electro-intensive aid, all …

Data centers in space promise to save the planet. And also to ruin Earth's orbit

Wikipedia should update its page for the word "ambition" to include Elon Musk's photo. The tycoon has announced a megaproject under which his companies SpaceX and xAI will work together to launch a constellation of one million satellites functioning as data centers in orbit. The problem is that although the idea has its advantages, it also has a potentially terrible impact on the future of our planet.

Energy efficiency. That is the great advantage of the space data centers Musk proposes. In space, solar panels can perform optimally without the obstacles posed by Earth's atmosphere and climate. According to SpaceX, the falling cost of launching its rockets makes space a perfect alternative for AI data centers.

The plan. The project presented to the US Federal Communications Commission (FCC) consists of placing these satellites in sun-synchronous orbits between 500 and 2,000 km up. That would allow the satellites to act as interconnected nodes, linked both to each other and to the satellites of the Starlink network through optical laser links. The plan, of course, will have to overcome major challenges such as cooling: dissipating the heat generated by millions of chips in the vacuum of space is complex, since satellites act as "natural thermoses."

And what about radiation? The problem of cosmic radiation will also have to be solved. Advanced chips are very vulnerable to processing errors caused by energetic particles. AI processors seem surprisingly resistant to this type of problem, but deploying such chips at massive scale in space could introduce new conflicts.

On-site repair: not an option. In today's data centers, if a problem arises, a technician can physically travel to fix it if necessary. In space, physical repair is not feasible, which forces a strategy of assuming that chips that become functionally damaged will simply be lost for good.
SpaceX would have to continuously launch replacements to compensate for this "mortality" of components, which complicates logistics and costs. There are optimistic perspectives in this regard, and for some, the numbers do work out.

Kessler syndrome. Above all, there is a latent concern in the field of space safety. Launching a million new satellites into already congested orbits multiplies the probability of chain collisions, the scenario described by the Kessler syndrome. A single major collision could generate a cloud of debris that would take decades to clear, further threatening climate-monitoring missions or even global communications. There are already ideas for coordinating and "regulating orbital traffic," and SpaceX has its own "situational awareness" system, Stargaze, to avoid problems, but of course, no system is completely perfect.

Air pollution. Nor should we forget that the atmospheric impact is equally worrying. Some estimates point to 25,000 Starship flights, and the re-entry of satellites that reach the end of their life cycle, or die prematurely, would release metals and particles into the upper atmosphere. According to experts, these chemical residues could damage the ozone layer and cause uncertain climate consequences.

You can't see anything. Astronomers, who had already protested about Starlink, will have an even bigger problem with this new idea. The threat to astronomy is clear: given the altitude and size of these satellites, they are likely to form a bright band visible even to the naked eye, hindering scientific observation and even changing the way we see the sunset. Orbital computing may have advantages, but before launching it we should remember that space, especially the space we can see, is a shared and finite resource.

In Xataka | Starlink's dominance in space begins to move: another company already has permission for a constellation of 4,000 satellites

Aragón produces so much energy that it no longer knows what to do with it. And that’s great news for data centers

Aragon has always served as a great battery for the rest of the country, sending gigawatts to the industrial centers of Catalonia and the Basque Country, but now the script has flipped. The community has a "problem" many would envy: it produces so much energy that it has attracted those who need it most. Like a magnet, the technology giants have landed in the Ebro valley, turning the region into what El País already calls the "Spanish Virginia," in reference to the US state with the highest concentration of data centers in the world.

The x-ray of a bittersweet record. To understand the magnitude of the change, you have to look at the meter. According to data compiled by El Periódico de Aragón, the community once again broke its historical record for electricity production in 2025, reaching 22,365 gigawatt-hours (GWh), 2.1% more than the previous year. However, this milestone hides important fine print: the record was not achieved thanks to wind or sun, since those sources fell by 4.8% due to the drought (which sank hydropower by 19.1%) and a less windy year. Here comes the bittersweet part: to compensate for the green decline and cover the gap left after the great April blackout, combined-cycle gas plants increased their activity by 112.2%. But the data point that really confirms the change of era is not how much is produced, but how much is consumed. While electricity demand in Spain grew by a modest 2.7%, internal consumption in Aragon shot up by 7.1%, a figure the regional newspaper describes as a "true structural change" and attributes directly to the takeoff of the Amazon Web Services (AWS) complexes in Villanueva de Gállego, El Burgo and Huesca.

The rain of millions (and megawatts). This energy appetite is no coincidence; it is the fuel for an unprecedented investment.
As we have explained in Xataka, the regional government has given the green light to the expansion of AWS, which contemplates an investment of 15.7 billion euros under a ten-year plan. This is not about building isolated warehouses, but about creating an "AWS Region" (Europe Spain): a system of eight campuses interconnected by fiber optics that functions as a single operational unit protected against failures. But it's not all servers and algorithms in the cloud. Heraldo de Aragón has reported that Amazon will not only store data but will also build a server-recycling factory in Aragon. With an additional investment of 200 million euros, this circular-economy plant promises to create up to 1,100 direct jobs, a breath of oxygen for a labor market that goes beyond highly qualified technical profiles.

Traffic jam in the network and flight to Teruel. The Aragonese paradox is that, although there is plenty of energy, there are no "roads" to transport it. The electrical distribution network in the community is at its limit, with an occupancy of 94.3%, well above the national average. There is electricity, but there are no free outlets for so much industry. This saturation in the Zaragoza logistics hub has triggered an unexpected move toward "emptied Spain." As my colleague reported in Xataka, given the impossibility of connecting in the capital, AWS has decided to take one of its new centers to La Puebla de Híjar, a town in Teruel with barely 900 inhabitants. The choice is strategic: the N-232 highway acts as the backbone and, there, the electrical grid has the capacity (100 MW guaranteed) to feed the beast.

Side B: water and territory. Every revolution has a cost, and in this case it is measured in natural resources. Digital euphoria collides with the physical reality of a dry land. The alarm bells rang, as El País reported, when Amazon requested a 48% expansion of its water concession to cool its servers.
The conflict is palpable on the ground: the Gaén irrigation community in Teruel is keeping negotiations blocked, refusing to give up water from the Ebro if doing so compromises the agricultural future of the area. The most critical view comes from Ecologistas en Acción. Its renewables map warns that the deployment is not harmless: there are more than 12,000 hectares of authorized solar plants and thousands of wind turbines in the pipeline. The organization warns that, if all the data center projects in the pipeline are approved, their electrical consumption could reach five times the current demand of the entire community, turning the Aragonese landscape into one continuous industrial estate and drying up its water resources.

The new balance. Aragon closed 2025 at a fascinating crossroads. As El Periódico de Aragón concludes, the community remains a net exporter, but less and less so: electricity exports have fallen from 56% to 52% of production in just one year. The region has achieved what seemed impossible, going from a mere service station to an engine of the digital economy. But the question that hangs in the air, between multimillion-euro investment figures and environmental warnings, is whether the electricity grid and water resources will withstand the weight of being Europe's hard drive.

Image | Freepik

In Xataka | Aragón is not afraid of AI: it has just approved three more new mega data centers in full commitment to renewables

Electric car battery makers are retooling to make batteries… for AI data centers

In the United States there is a slowdown in the electric vehicle industry, and it has pushed more and more manufacturers in the sector to repurpose their business. As the Financial Times reports, ten North American factories that produced batteries for electric cars are now devoting a good part of their production to energy storage systems for AI data centers. It is the latest industry to reorganize itself around artificial intelligence.

The change of course. The outlet shares data from the consulting firm CRU, which states that these ten plants have canceled enough capacity to produce batteries for 2 million electric vehicles. Of these, seven will focus primarily on the energy storage system (ESS) market. Among the names involved are Ford, which is converting a factory in Kentucky, and Stellantis, which together with its partner Samsung SDI is converting production lines at its Indiana plant. General Motors is also considering producing its own energy storage batteries, as its head of batteries, Kurt Kelty, told the Financial Times.

Why data centers need batteries. Data centers that run AI models require an uninterrupted power supply to protect against blackouts and voltage fluctuations. With the construction boom of these centers in the United States, storage batteries have become a critical piece of infrastructure. This opens an alternative revenue stream for automotive companies struggling with electric vehicles.

The Tesla example. It is worth looking at the numbers of Elon Musk's company, which in addition to producing vehicles also manufactures energy storage systems such as the Megapack and Powerwall. Its battery business is turning out to be tremendously profitable: the company reported energy and storage revenue of $12.8 billion in its last fiscal year, year-on-year growth of 27%. In 2021, that figure barely reached $2.8 billion. Meanwhile, its revenue from electric vehicle sales has fallen 9% to $64 billion.
A difficult political context. As the FT recounts, since the Trump administration eliminated the Biden-era tax incentives for electric vehicle buyers and lowered emissions standards, the US electric vehicle market has slowed. This has led BloombergNEF to revise its forecast downwards: where it once expected electric vehicles to represent 48% of total car sales in 2030, it now projects only 27%. Electric vehicles currently account for about 8% of new car sales in the United States.

The aid that remains. As the outlet also mentions, although those subsidies have been eliminated, the administration retains generous incentives for battery manufacturers: a production credit of $35 per kilowatt-hour and a 30% tax credit for investments in energy storage. In addition, tariffs on Chinese storage batteries are around 60%, allowing manufacturers to produce in the United States at prices close to parity with Asian imports.

Between the lines. There are important nuances worth highlighting. CRU's Sam Adham told the FT that battery manufacturers will not necessarily pass their cost savings on to customers (they may simply widen their margins). In addition, according to the FT, the Korean companies that lead storage battery production in the United States have less experience than their Chinese rivals with the lithium iron phosphate technology these systems use.

It is not a total reconversion, for now. Wood Mackenzie's data suggest that electric vehicles will continue to absorb a greater share of battery installations than energy storage through the end of 2030. "If there is a rebound in demand for electric vehicles, companies that have switched to storage systems could be left behind," said Milan Thakore, an analyst at the consultancy.

More sectors pivoting towards AI.
The Semafor newsletter also mentions another very interesting sector that is beginning to pivot toward AI: cryptocurrency miners. According to Morgan Stanley, facilities dedicated to cryptocurrency mining see a more profitable business in building AI data centers. The economics of cryptocurrency mining have worsened steadily as the reward has shrunk, and converting these facilities into artificial intelligence infrastructure is far more profitable. According to Morgan Stanley's calculations, converting all bitcoin mining facilities in the United States could reduce the electrical capacity deficit for data centers by 10 to 15 gigawatts.

Cover image | CHUTTERSNAP and İsmail Enes Ayhan

In Xataka | If AI is the "weapon" of the future, the US is already investing 25% of all world military spending in it

This year, more will be invested in data centers than the US spent to reach the Moon

We are witnessing, live, a technological race that is no longer measured only in announcements or demos, but in tangible investments growing at a speed that is hard to ignore. In the United States, and also in other regions, large companies are allocating ever larger sums of money to build and expand the infrastructure that supports the current deployment of artificial intelligence services and the expansion of computing capacity these companies pursue. Some speak of excessive enthusiasm and even a possible bubble, but the money already invested is part of the economic reality of the sector, and the projected figures point to an even larger scale. The question, therefore, is not whether the bet exists, but how big it really is.

The numbers. If the first step is to accept that the investment exists, the second is to quantify it precisely. Data collected by The Wall Street Journal suggest that Meta, Amazon, Microsoft and Alphabet (Google) could concentrate a combined outlay of up to $670 billion in 2026 on artificial intelligence infrastructure. We are talking about capital expenditure associated with data centers, hardware and capacity expansion, not just "bricks." When a single year's outlay reaches that order of magnitude, the conversation shifts from expectations to measurable economic consequences.

Don't compare raw dollars. What the analysis proposes is not a direct equivalence between amounts spent in different eras, but a way of measuring the economic weight of each effort in its own historical context. Instead of adjusting old figures to current prices for inflation, the article uses the percentage of gross domestic product (GDP) as a common reference for projects separated in time. That shift moves the conversation from absolute money to relative magnitude within the U.S. economy.
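As a back-of-the-envelope check of that metric: the $670 billion figure comes from the WSJ data cited above, while the roughly $31 trillion of nominal 2026 US GDP is an assumed round number on my part, so treat the result as approximate.

```python
# Back-of-the-envelope GDP-share calculation. The AI capex figure comes
# from the WSJ data cited in the article; the 2026 US nominal GDP
# (~$31 trillion) is an ASSUMED round number, so the result is approximate.

ai_capex_billion = 670    # Meta + Amazon + Microsoft + Alphabet, 2026
us_gdp_billion = 31_000   # assumed nominal US GDP, 2026

ai_share = ai_capex_billion / us_gdp_billion * 100
print(f"AI infrastructure: {ai_share:.1f}% of GDP")
```

This lands at about 2.2%, close to the roughly 2.1% figure the article quotes; the exact value depends on which GDP estimate is used.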
And it is precisely there that the investment associated with artificial intelligence acquires a historical dimension that is difficult to ignore.

The investments. Among the great economic milestones often used as historical references in the United States are episodes as different as the Louisiana Purchase, the railroad expansion of the 19th century, and the construction of the interstate highway system, each with a different relative weight within the economy of its time. Using that same metric, each effort has been estimated at roughly the following magnitudes:

Louisiana Purchase: 3% of GDP
Railroad expansion: 2% of GDP
Interstate highways: 0.4% of GDP
Apollo Program: 0.2% of GDP

As we can see, the planned investment in artificial intelligence infrastructure, at around 2.1% of GDP, sits squarely among them.

It's not the same, but. Historical parallelism works as a scaling tool, not as an institutional equivalence. The large projects with which the current moment is compared were, in many cases, public initiatives financed directly or indirectly by the federal government, while investment in AI infrastructure is mainly corporate spending. That distinction matters; however, from a strictly economic perspective, the relative size of the effort remains comparable.

The State does not pay the main bill. That the bulk of investment is private does not mean the public sector stays on the sidelines. It's no secret that the U.S. government influences the pace and shape of deployment through regulatory decisions, permitting, energy planning, and the use of federal land for new data center infrastructure. This set of levers is no substitute for corporate capital, but it fits a broader strategy aimed at preserving American leadership in the global AI race.

Historical comparison.
This ends up pointing at something deeper than a simple number: it indicates the kind of priority a society decides to give certain technologies at a specific moment. When investment in AI infrastructure reaches a relative weight comparable to that of the great American economic milestones, the reading transcends the technology sector and enters the strategic realm.

Images | NASA | Freepik

In Xataka | Daniela Amodei, co-founder of Anthropic: "studying humanities will be more important than ever"

Spain wants to become a “bunker” for data centers with a very clear attraction: cheap energy

Spain finds itself facing a historic opportunity. In the offices of the big technology companies, from Amazon (AWS) to Microsoft and Google, the map of the Iberian Peninsula shines with its own light. Its geographical location and fiber-optic deployment have made the country the ideal candidate to become the great "cloud" of southern Europe. There is, however, a toll: these data centers consume electricity at an industrial pace. In the Community of Madrid alone, investments worth 23.4 billion euros linked to these projects are at stake, while regions like Aragon watch as demand from these centers threatens to absorb half of all the energy produced in the community. But until now, Spain had a barrier to entry: electrical regulation designed for steel foundries, not for servers. So as not to miss the investment train, the Government has decided to make its move and change the rules of the game.

A rule change in the BOE. The Ministry of Industry and Tourism has set the legislative machinery in motion. The goal is to allow data centers to access the Electro-intensive Consumers Statute, a category until now reserved for heavy industry that entitles holders to multimillion-euro compensation on their electricity bills. The first step is now official: through a resolution of the Secretary of State for Industry published last January, the Government has eliminated, with a stroke of the pen and as a matter of urgency, the main technical obstacle for the 2026 campaign: the "off-peak" requirement. The previous regulations required companies to consume at least 46% of their electricity during the cheapest hours (generally at night) to receive the aid. That works for a factory that can run night shifts, but it is impossible for a data center operating 24/7. The new resolution deems the requirement fulfilled for all applicants this year, a "technical amnesty" designed to ease the entry of new players.
It is not an isolated patch, however. In parallel, the Ministry has submitted to public consultation a draft Royal Decree to reform the Statute structurally. The text, whose hearing process has already gathered the sector's comments, explicitly recognizes that the current regulations have become "misaligned" and need to be adapted to strengthen companies' competitiveness in the face of high energy prices.

The end of the tyranny of the night. To understand the importance of this measure, you have to look at the sky. The old rule required nighttime consumption because, historically, that was when electricity was cheap. But the explosion of solar energy in Spain has changed the paradigm: now the cheapest hours tend to occur at midday, when the sun shines brightest, generating what experts call the "duck curve" in prices. Keeping the obligation to consume at night was not only a bureaucratic barrier for data centers, but economic and ecological nonsense in the Spain of 2026. By eliminating it, the Government not only helps technology companies, it also adapts the law to the reality of an electrical system dominated by renewables.

Less bureaucracy and more compensation. The Government's plan to woo data centers does not consist of paying their electricity bills directly, but of shielding them from indirect costs. The reform proposes two lines of action: money and simplification.

Compensation of hidden charges: the new Statute will allow the subsidizing of costs that inflate the bill without being energy consumption as such, like contributions to the National Energy Efficiency Fund (FNEE). According to industry sources, this charge is around 2 euros per megawatt-hour and is trending upward. Easing this burden is vital for technology companies' numbers to add up.

Administrative relief: the entrance exam has been made easier.
Along with the elimination of the off-peak requirement, the BOE has set a new technical ratio (consumption relative to added value) of 0.61 kWh/€ for 2026. Cumbersome requirements are also removed, such as the demand for very specific long-term renewables contracts, which generated a disproportionate administrative burden.

The missing piece of the puzzle. Despite the red carpet rolled out by the Ministry, the sector remains cautious. SpainDC, the association that brings together data centers in Spain, values the elimination of the off-peak requirement as a "relevant advance," but warns that the party has only just begun and that it still does not have the official invitation in hand. The problem is bureaucratic, but lethal: the CNAE (National Code of Economic Activity). To be an electro-intensive consumer, your activity must appear on a closed list of eligible sectors. If the Government reforms the technical requirements but does not expressly include the "Data Processing" code (6311) on that list, the reform will be a dead letter for them. "For data centers, the inclusion of the CNAE is a premise. Without it, certification is still not within our reach," the association warns in El Periódico de la Energía. Added to this is the underlying tension over network capacity: it is not enough for energy to be cheap; there must be "plugs" available. The grid is saturated at key points, and the sector demands urgent investment so that the promised megawatts actually reach the servers.

A seduction in the testing phase. Spain has sent a clear message to international markets: it wants to be Europe's great data warehouse and is willing to modify its sacred industrial laws to achieve it. The 2026 BOE resolution is the test of faith, a temporary safe-conduct to prevent the flight of investments. However, the ultimate success of the strategy depends on the fine print written in the coming months.
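To illustrate how a threshold like the 0.61 kWh/€ ratio works in practice, here is a minimal sketch. The consumption and added-value figures are invented, and the direction of the test (consumption per euro of added value must reach the threshold) is my reading of "electro-intensive"; the real Statute involves more criteria than this single ratio.

```python
# Sketch of the 2026 electro-intensity ratio test: annual consumption
# (kWh) divided by gross added value (EUR) compared against 0.61 kWh/EUR.
# The example figures are invented, and "must meet or exceed the
# threshold" is an assumption; the real certification has more criteria.

THRESHOLD_KWH_PER_EUR = 0.61

def is_electro_intensive(annual_kwh: float, added_value_eur: float) -> bool:
    return annual_kwh / added_value_eur >= THRESHOLD_KWH_PER_EUR

# Hypothetical data center: 200 GWh/year against 250 M EUR of added value.
print(is_electro_intensive(200e6, 250e6))  # ratio 0.8 kWh/EUR -> True
# Hypothetical office campus: 5 GWh/year against 300 M EUR of added value.
print(is_electro_intensive(5e6, 300e6))    # ratio ~0.017 kWh/EUR -> False
```

The point of the ratio is that eligibility hinges on energy intensity relative to economic output, not on absolute consumption alone, which is why the CNAE listing remains the decisive missing piece.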
If the structural reform of the Royal Decree ends up including data centers in the official list of beneficiary sectors, Spain will have completed its transformation: from a country of sun and sand to a country of sun and data.

Image | Freepik

In Xataka | Meta is spending millions and millions of dollars convincing us of one thing: that data …

Data centers are so important that Meta has spent millions on advertising to change our perception of them

Meta spent 6.4 million dollars on an advertising campaign between November and December of last year to convince the American public of the benefits of its data centers, according to the New York Times. The ads, aired in eight state capitals and Washington, DC, featured idealized images of American towns revitalized by these facilities. There is increasingly significant social rejection of the installation of AI data centers, especially because of their outsized consumption of basic resources like electricity and water. And of course, the public first has to be convinced that these facilities are key if Meta and the rest of big tech are to continue their operations.

The Meta campaign. According to the newspaper, the ads featured emotional stories about Altoona (Iowa) and Los Lunas (New Mexico), two locations where Meta operates data centers. With guitar music and shots of farms and football fields, the videos promised jobs and prosperity. "We are bringing jobs here, for ourselves and for our next generation," the voiceover said. According to Michael Beach, CEO of Cross Screen Media, Meta "could have purchased these ads with the goal of influencing political decisions and reaching legislators." Ryan Daniels, a spokesperson for Meta, limited himself to telling the NYT that the company pays the full cost of the energy its data centers use, without commenting on the advertising campaign.

Meta is not alone. As the NYT also reports, Amazon is funding a similar campaign in Virginia through Virginia Connects, a nonprofit created by the Data Center Coalition. The Financial Times adds that other operators such as Digital Realty, QTS and NTT Data are also working harder to defend the construction of new facilities.

Resistance. In the United States, social rejection has led to the cancellation of multimillion-dollar projects in Oregon, Arizona, Missouri, Indiana and Virginia.
Democratic Senator Chris Van Hollen told the NYT that the issue has become "a priority on Capitol Hill" since his voters began to complain en masse about electricity bills. As the outlet reports, this month Van Hollen introduced a bill to regulate the energy consumption of data centers. Even President Donald Trump weighed in: "The big tech companies that build them must pay their own way," he wrote a few weeks ago on Truth Social.

The electricity bill. Data centers have become critical infrastructure for the development of artificial intelligence, but there is growing social tension over their installation. In October, Bloomberg reported that over the last five years the wholesale price of electricity in areas near large concentrations of data centers in the United States had increased by up to 267%. In Baltimore, residents paid $17 per megawatt-hour in 2020; in 2025 that figure reached $38. The outlet's research also showed that 70% of the points where electricity price increases were recorded were less than 80 kilometers from data centers with significant activity. Bloomberg estimates that the energy demand of these facilities in the United States will double by 2035, the largest increase since the 1960s.

The situation in Spain. Our country is also experiencing a boom in data center construction. The Community of Madrid, paradoxically the region with the greatest energy deficit in Spain, concentrates a good part of these projects and is expected to reach 1.7 gigawatts of power by 2030.
The consulting firm CBRE pointed out in a report that "there is no investor, operator or large technology company that does not have in its strategic plans to establish its data center project in the Iberian market." Madrid, together with Barcelona, already competes with cities such as Milan, Zurich and Berlin, although it is still far from the leading European group in terms of power capacity, formed by Frankfurt, London, Amsterdam, Paris and Dublin.

What awaits us. According to Bloomberg, forecasts indicate that data centers will consume more than 4% of the world's electricity in 2035. If these facilities were a country, they would rank fourth in energy consumption, behind only China, the United States and India. Meanwhile, big technology companies are already exploring solutions such as small modular nuclear reactors (SMRs) to power their facilities, or sending data centers into space.

Cover image | Mark Zuckerberg, Meta

In Xataka | "The assemblies are not going to be done by AI": we talk to the kids who have become carpenters, truck drivers and tinkerers

Who can put the most data centers into orbit

The map of the world's data centers shows that there is no decentralized internet and that they are proliferating like mushrooms. In fact, planet Earth has fallen short, and big tech companies already have their eyes set on the sky, aiming to plant a data center in space for reasons such as energy demand, environmental impact and, why not say it, avoiding regulation.

The "panacea" of space. Faced with the threat of energy consumption similar to that of Japan by 2030, according to data from the International Energy Agency, or the brutal density of Data Center Alley in Loudoun, in northern Virginia, with nearly 250 operational facilities, space offers the prospect of satellites equipped with solar panels that capture energy directly from the sun, thermal dissipation in space and the absence of terrain limitations.

Not long now. For this to be viable will take at least a decade, as estimated by Phil Metzger, research professor at the University of Central Florida and former NASA member. However, it is one thing for the numbers to add up economically and another to have to wait that long for the technology. According to Josep Jornet, professor of computer and electrical engineering at Northeastern University and satellite researcher, in just a couple of years we will begin to see evidence. And he is clear that space is the next frontier to conquer: "There was a gold rush in the West. Now there is a space race and everyone wants to place their technology in space."

Money galore. The Catalan scientist is clear that companies have incentives to move quickly and invest heavily to get ahead in the AI race in general and in space in particular: "Everyone wants to say they have the first platform to reach this milestone (…) So companies are spending money like there is no tomorrow." In fact, Google, SpaceX and Blue Origin are already working on developing technologies for this purpose, and they are not the only ones:

SpaceX.
At the end of the year, the Wall Street Journal uncovered Elon Musk's company's plan to build data centers in space. Its CEO explained in a tweet how he would do it: "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links." More specifically, they are working on modifying and improving their rockets to make them capable of carrying computing payloads for AI.

Blue Origin. The American outlet also put Jeff Bezos' project on the table; Bezos said at Italian Tech Week that it is a matter of time before we see "giant training clusters" of AI in orbit within the next 10 to 20 years. The company has a team dedicated to developing the technology required for data centers in space.

Google. Last November, the Mountain View company announced its experimental Project Suncatcher: in 2027, in collaboration with Planet Labs, it will launch two test satellites carrying its own AI processing chips.

Others. There are other, smaller companies working in this area. The most notable is StarCloud, an NVIDIA-backed startup that a few weeks ago launched a satellite carrying an NVIDIA H100. This GPU is being used to run a version of Gemma, Google's open language model.

Energy is needed (and knowing how to use it). Although the foundations have been laid, the road is not exactly downhill. Jornet explains that one of the big obstacles will be having enough energy for these orbital data centers to function: "The Sun can be a great source of energy, but to properly harness it, orbiting data centers would need huge solar panels kilometers long or a constellation of smaller panels that could number in the tens of thousands."

Life in space is hard. There are more thorny issues to tackle, such as how AI chips will withstand harmful space radiation, as well as heat dissipation and cooling. On Earth, thousands of liters of water are used. In space there is no such option, and although temperatures are low, there is no air to cool the chips naturally.
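Jornet's point about panel size can be checked with a back-of-envelope calculation. This sketch is illustrative only: the solar constant is a known physical figure, but the efficiency and power targets are assumptions, not numbers from the article.

```python
# Back-of-envelope sketch: solar array area needed for an orbital data center.
# SOLAR_CONSTANT_W_M2 is the approximate irradiance above the atmosphere;
# PANEL_EFFICIENCY and the 1 GW target are illustrative assumptions.
SOLAR_CONSTANT_W_M2 = 1361.0
PANEL_EFFICIENCY = 0.20

def array_area_km2(power_gw: float) -> float:
    """Collecting area (km^2) required to deliver power_gw gigawatts."""
    watts = power_gw * 1e9
    area_m2 = watts / (SOLAR_CONSTANT_W_M2 * PANEL_EFFICIENCY)
    return area_m2 / 1e6

print(f"{array_area_km2(1.0):.1f} km2 of panels for 1 GW")
```

Even under these generous assumptions, a single gigawatt-scale cluster needs panels measured in square kilometers, which is consistent with Jornet's "kilometers long or tens of thousands of smaller panels."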
The bill to Earth. Even setting aside the environmental impact in space, this race also leaves its mark on Earth, at least in the short term: rocket launches not only consume fossil fuels, they also damage ecosystems and wildlife in the surrounding area, as happens at Cape Canaveral, which now hosts about 80 launches a year.

In Xataka | The real reason why Musk, Bezos and Pichai want to build data centers in space: bypassing regulation

In Xataka | The problem with data centers is not that they are running out of water or energy: it is that they are running out of copper

Cover | Pixabay

The US electrical grid cannot support so many data centers, so operators have had an idea: disconnect them to avoid blackouts

A third of all the world's data centers are in the US, and that is putting a huge burden on the electrical grid. One consequence consumers are already noticing is price increases on their bills, but electricity operators foresee another problem: blackouts.

What is happening. The WSJ reports that the US power grid is beginning to strain, with grid operators expecting blackouts during periods of high demand. The solution they propose is to make data centers temporarily disconnect from the grid and run on their own energy reserves. The technology companies are not amused and talk about "discriminatory measures."

Why it matters. In 2023, data centers already consumed 4% of all the country's electricity, and forecasts say that by 2028 that percentage will rise to 12%. The grid is not prepared to support so much demand and, although it is already being expanded, new data centers are being built faster. Grid operators face a difficult dilemma: powering data centers while maintaining supply to consumers.

'Kill switch'. PJM Interconnection is the organization that oversees the energy market in the Midwest, where the problem of price increases has already hit. The concern that blackouts will occur is on the table, and PJM has proposed that technology companies create their own energy sources or accept that their supply will be cut off if the grid becomes too saturated. They are not the only ones to raise something like this: with demand expected to double by 2035, Texas passed a law last year that contemplates a 'kill switch' allowing large consumers, such as data centers, to be disconnected at times when the grid is under "extreme stress."

What the tech companies say. As we said, the companies that own these data centers are not happy with the proposal.
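The 'kill switch' idea described above boils down to simple curtailment logic. The sketch below is a hypothetical illustration, not PJM's or Texas's actual mechanism; all names and thresholds are invented for the example.

```python
# Hypothetical sketch of 'kill switch' curtailment: under extreme grid
# stress, large flexible loads are told to drop off and run on their own
# reserves. Names, fields and the 0.9 threshold are illustrative only.
from dataclasses import dataclass

@dataclass
class LargeLoad:
    name: str
    flexible: bool   # signed a curtailment agreement with the operator

def loads_to_curtail(loads: list[LargeLoad], grid_stress: float,
                     threshold: float = 0.9) -> list[str]:
    """Return the loads to disconnect when stress crosses the threshold."""
    if grid_stress < threshold:
        return []
    return [load.name for load in loads if load.flexible]

fleet = [LargeLoad("dc-east", flexible=True),
         LargeLoad("dc-west", flexible=False)]
print(loads_to_curtail(fleet, grid_stress=0.95))  # → ['dc-east']
```

The point of contention in the article is exactly who ends up in that `flexible` category, and what they get in exchange.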
The Data Center Coalition, which includes companies such as Google, Microsoft and AWS, has stated that the proposal is discriminatory, since data centers need a reliable and stable grid. It also warns that depending on their own energy reserves could have a negative environmental impact, by forcing them to use solutions such as diesel generators.

Waiting times. There is an intermediate scenario in which technology companies can benefit if they accept these conditions. Because the electrical infrastructure cannot support so much demand, data centers have to wait several years to be connected to the grid, normally between 3 and 5 years, although there have been cases of up to 8. Southwest Power Pool, a grid operator serving the central US, has offered data centers a deal: access to the grid sooner in exchange for agreeing to be disconnected during times of high demand. According to a recent study funded by Google, data centers with more flexible connections (i.e., those that build their own power sources and accept temporary disconnections) typically connect to the grid several years faster than those that do not.

Bring your own energy. Despite the reluctance towards that off switch, generating your own energy is the most realistic solution and the one the industry seems to be moving towards. Google recently bought an electrical company in order to secure its own energy. Other big tech companies such as Amazon, Microsoft, Oracle and xAI are also exploring their own energy solutions, such as natural gas and solar panels.

Image | Google

In Xataka | Drastically reducing data center consumption is crucial for AI. And China has had an idea: submerge them in the sea

A third of the world's data centers are in a single country

There are currently more than 11,000 data centers operating worldwide, which is no small number. Given the huge investment by technology companies, the figure is set to grow exponentially in the coming years. Now, thanks to the interactive map from Data Center Map, we know where they are. An overwhelming majority are in the northern hemisphere, with one country accounting for almost a third of the total.

The United States rules. To no one's surprise, the country with the largest number of data centers is the United States. Considering that the major cloud infrastructure companies are American, this is hardly surprising. In total it has 4,303 data centers spread throughout the territory, but not evenly: there are regions where the concentration is brutal. The state of Virginia alone has a whopping 668 data centers, more than Germany, the second country on the list with 494.

The climate, too. We already know that data centers consume a lot of energy, and much of it goes into cooling their components. The hotter it is outside, the more it costs to cool them, and therefore the more energy, and water, is consumed. According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers, the ideal temperature for a data center is between 18 and 27 degrees Celsius. Location has a notable impact on electricity and water costs, which is why technology companies usually choose places with lower temperatures for their infrastructure.

The south also wants its piece of the pie

Indonesia. It is striking that, despite the temperature recommendation, there are many data centers in countries where heat is a problem. Rest of World has done an extensive analysis of this phenomenon and estimates that at least 600 facilities are operating in areas outside the optimal range.
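That 18-27 °C recommendation amounts to a simple band check. The sketch below classifies a few sites against it; the band comes from the ASHRAE figure cited above, while the function name and the example temperatures are illustrative assumptions, not data from the analysis.

```python
# Sketch: check a site's average temperature against the ASHRAE-recommended
# 18-27 degrees C band mentioned above. Site temperatures are illustrative.
ASHRAE_MIN_C = 18.0
ASHRAE_MAX_C = 27.0

def in_recommended_band(avg_temp_c: float) -> bool:
    """True if the temperature falls inside the recommended operating band."""
    return ASHRAE_MIN_C <= avg_temp_c <= ASHRAE_MAX_C

sites = {"Northern Virginia": 14.0, "Jakarta": 26.5, "Singapore": 28.0}
for name, temp in sites.items():
    status = "inside" if in_recommended_band(temp) else "outside"
    print(f"{name}: {temp} C -> {status} the recommended band")
```

Sites outside the band are not unbuildable; they simply pay more in energy and water to pull inlet air down into it, which is the trade-off the next section describes.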
In fact, looking at the list of countries with the most data centers, Indonesia is in third place with 184 facilities, while Brazil counts 196. Both have an average temperature of more than 26 degrees, which means that for much of the year temperatures exceed that threshold.

Singapore. A striking case is Singapore, where the average temperature is more than 28 degrees. It has 78 data centers, a low figure compared to those mentioned above, but they are concentrated in a very small area, making it one of the countries with the highest data center density. Other countries where demand for data centers is growing are India, Vietnam and the Philippines, all with quite hot climates.

The heat challenge. Why build in such hot areas? For many countries, keeping data within their own borders matters more than the optimal operating temperature. The risk is that, with temperatures rising year after year, what is now a manageable situation could become a problem that is difficult to solve, especially in areas such as Southeast Asia and the Middle East. Rest of World says that precisely in Singapore there is an initiative in which more than 20 technology companies and universities are participating with one objective: to develop a cooling system designed specifically for hot and humid climates. The most common cooling system is air, but in these areas it is more effective to use a hybrid system that relies on air when possible and water when it is hotter. In some areas with extreme temperatures, such as the United Arab Emirates, they are even considering building data centers underground. In China they are testing an even more radical solution: building a data center under the sea.

Image | ChatGPT, with data from Data Center Map

In Xataka | Aragón is not afraid of AI: it has just approved three more new mega data centers in full commitment to renewables
