Connecting to unknown networks can be risky for your personal data. Protecting yourself costs less than 2 euros a month

It is a fact: cyberattacks have become increasingly common. There are plenty of undesirables out there looking to get hold of sensitive information, whether from large companies or from users like us. If we work from home or usually connect through our own network, we have a layer of protection, but what if we use a network over which we have no control at all? The good news is that there are several ways to protect our Internet traffic wherever we are. The simplest, most useful and most effective is to use a VPN, and if we can afford it, it is better to bet on a paid one. In fact, some are very cheap: Surfshark's costs barely 1.99 euros a month.

Protect your traffic and IP with a good VPN

As we say, there is a good variety of free VPNs, perfect if we need one for a one-off situation. The problem is that, besides being less secure, they are limited in traffic volume or speed. For that reason, the ideal is to bet on a paid one like Surfshark, which also comes at a great price. One of its advantages is that we can install it on an unlimited number of devices, ideal for carrying it on a laptop, a phone or a tablet (or on all of them at once). With it we can protect our Internet traffic, gaining an extra dose of privacy. Moreover, it also helps us hide our IP, information best kept away from undesirables.

Surfshark's VPN is included in its Starter plan, which also comes with another tool called Alternative ID. With it we can create a set of fictitious data to use on web pages where we do not want to enter our real information. That way, we keep our personal data at a safe distance. As mentioned before, for 1.99 euros a month we get a quality VPN. That means its two-year plan comes to a total of 47.76 euros, a fairly affordable price for keeping this tool with us for a long while (the quick check below runs the numbers). And that is not all: we will also receive three extra months, so we will have Surfshark for 27 months instead of 24.

* Prices may have changed since the last review. Some of the links in this article are affiliate links and may earn Xataka a commission. Offers may vary depending on availability.

Images | Chase Chappell on Unsplash
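A trivial sketch of the plan math, using only the figures quoted in the piece:

# Surfshark Starter plan math, from the article's own figures.
monthly_price = 1.99   # euros per month on the two-year plan
months_billed = 24     # billed up front for two years
months_received = 27   # 24 months plus the 3 promotional extras

total = monthly_price * months_billed
print(f"Total billed: {total:.2f} EUR")                         # 47.76 EUR, as quoted
print(f"Effective monthly: {total / months_received:.2f} EUR")  # ~1.77 EUR over 27 months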

Spain is becoming a genuine mecca for data centers. Uruguay has some lessons about that

Spain is in fashion among the Big Tech. Practically all of them have chosen our country to create new data centers. Investments are notable across several regions, but Aragon is undoubtedly one of those that has bet hardest on these facilities. There is (at least) one problem, though: water. That is the thrust of a report in El País on the risks these new data centers pose, not only in Spain but also in other countries such as Mexico or Chile, where there are likewise strong investments of this kind.

Aragon rolls out a red carpet for Amazon

In the case of Spain, the report dwells on what has happened in recent months in Huesca, where Amazon already had three data centers for its AWS platform (in El Burgo de Ebro, Villanueva de Gállego and the PLHUS business park in Huesca capital) but is planning a new one in Walqa. The company announced last year an investment of 15.7 billion euros in the region between 2024 and 2033.

The project stirred up considerable controversy at the beginning of the year. It was then that residents of the rural district of Cuarte began to receive letters notifying them of an expropriation of land next to the Walqa technology park. Among their concerns were the route of a new high-voltage power line that would cross the town, as well as the heavy consumption of water resources. The residents met with Amazon representatives in February and finally got the company to reroute the high-voltage line away from the town. Amazon also reached an agreement to finance infrastructure to supply water to Cuarte and other towns through new piping works from the San Julián de Banzo spring.

The energy problem is still striking. These data centers, together with the one projected at La Cartuja, in Zaragoza, will consume 10,800 GWh, a huge figure that in fact exceeds the electricity consumed by the entire province in 2024, which was around 10,540 GWh. To address that problem the company has paid 1.5 million euros to extend the electricity grid to all its data centers.

But water consumption is even more remarkable. Carlos López, a member of Ecologists in Action in Aragon, explained to El País how Amazon will drill several wells inside its plots to extract water from the subsoil and cool its equipment. It is estimated that these data centers will consume more than 755,000 cubic meters of water a year for cooling, but according to López there will be no oversight and "it will not be possible to demonstrate how much water they extract." An Amazon spokesman clarified in that report that the wells "are subject to regulatory supervision" and are planned as a backup water source. The company already indicated this year that it is using 48% more water than expected, for a simple reason: the heat.

It remains to be seen, of course, what happens once these centers are operational: only then can their real energy and water consumption and their actual impact on Aragon be assessed, both on the consumption of its citizens and the rest of industry (especially irrigation) and on the environment. That makes it very difficult to gauge the true return of this type of project for countries such as Spain. Although it is true that jobs are created during construction, operation does not usually require as many positions.
Take the recent data center project Meta is building in Talavera de la Reina (Toledo): some 5,000 jobs are expected to be created during its construction. Once it is operational, however, Meta will employ about 250 professionals for its management and maintenance. Documents obtained by El País show that in October 2021, across the three data centers that existed in Aragon, "the total direct employees in each of the three centers did not exceed twenty at that time." The red carpet with which some autonomous communities are welcoming these investments may end up causing plenty of headaches.

A similar case: Uruguay

Everything seemed promising about the new data center Google wanted to install in the Science Park, in the Uruguayan department of Canelones, next to Montevideo.

Google data centers in Storey, Nevada. Source: Google.

This data center, the company's second in Latin America, began construction in August 2024 with an investment of more than 850 million dollars. However, the project has been surrounded by major controversy since its inception. Something did not add up for Daniel Peña, a researcher at the Faculty of Social Sciences of the University of the Republic (Uruguay), in this project by the search giant. In July 2022 this expert analyzed the proposal Google had presented and noticed something important: at no point did it detail the water or energy consumption the data center would impose. The Uruguayan Ministry of Environment denied access to that data, and in December he filed a lawsuit with the help of lawyer Carolina Neme.

Months later Peña gained access to the information and discovered that in a first phase the data center would need 3.8 million liters of water per day (3,800 cubic meters). In the second phase that requirement doubled: it would need 7.6 million liters (7,600 cubic meters). And not just any water: drinking water. Peña said the data center's water needs were "considerable." The average monthly consumption of a household of three or four people is 15 cubic meters, which means the data center's daily consumption is equivalent to that of roughly 55,000-60,000 people (a check below reproduces the math).

Google ended up modifying several aspects of the project, among them that use of drinking water. The company finally obtained permission to build it after indicating, among other things, that instead of drinking water it would use a cooling system based on so-called chillers, closed circuits that recirculate…
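That household equivalence is easy to reproduce from the figures just cited. A minimal sketch, where the 30-day month and the 3-to-4-person household are the only assumptions:

# Phase-two consumption vs. household consumption, figures from the article.
daily_use_m3 = 7_600        # m3 of drinking water per day, second phase
household_monthly_m3 = 15   # m3 per month for a 3-4 person household
days_per_month = 30         # assumption for the conversion

per_person_daily = [household_monthly_m3 / people / days_per_month for people in (3, 4)]
equivalents = [daily_use_m3 / rate for rate in per_person_daily]
print([round(e) for e in equivalents])  # [45600, 60800] people, bracketing the ~55,000-60,000 quoted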

How to store data for 100 years

Think for a moment: if you could send a file to the future for someone to open in 100 years, how would you make sure it is still readable? That is the real challenge: the devices and formats at our disposal were not designed to survive beyond a few years. Today's technology tends to plan for the immediate, ignoring that physical media degrade and that formats evolve. For something to last a century, it takes more than choosing a storage medium.

Neither hard drives nor SSDs are built for long-term preservation. According to Backblaze, the annualized failure rate (AFR) for 2024 was 1.57%, and some models reached around 4.5% over short periods (a toy calculation at the end of this piece shows what such rates imply over a century). These values, of course, assume daily operation, not the hundred-year horizon we are aiming at. With SSDs, the ability to retain data while powered off falls with temperature and wear. The JEDEC standard JESD218A helps frame the scope: at end of life it requires retention of at least 1 year at 30 °C for consumer drives and 3 months at 40 °C for drives aimed at business environments. Once again, these are useful as operational copies, not as a century-scale passive archive.

Far from being obsolete, magnetic tape remains a pillar of large-scale archiving. LTO offers estimated durability of decades in good conditions and a competitive cost per terabyte for "cold" storage. Organizations such as CERN document tape libraries with planned migrations. The market backs this up: 176.5 EB of compressed capacity shipped in 2024, 15.4% more year over year, according to the LTO consortium. The trade-off is clear: it requires compatible hardware and constant supervision.

Among optical media, WORM (write once, read many) discs still have a place. The M-Disc held up well in accelerated aging tests, but the projections of centuries or millennia are laboratory extrapolations with no independent consensus. In practice we are talking about decades and modest capacities: useful as a readable copy alongside a compatible reader, poorly suited to large volumes because of slow write speeds.

Sometimes the most advanced option is not digital. Permanent paper, governed by ISO 9706, and microfilm with an LE-500 rating under ISO 18901 remain in force for critical documents. Their advantage is clear: they are read with light, without depending on software or power, and they have defined regulatory frameworks and storage requirements. Several regulators demand LE-500 for certain uses, which keeps these media in the preservation "mix." They will not hold petabytes, but they remain reliable for the essentials.

One of the great enemies is obsolescence. That is why many institutions plan format migrations before they run out of compatible software or hardware. Often, however, they must resort to techniques such as emulation to recreate old systems. It is not so different from what we see in the world of video games, where access to certain platforms fades over time.

Beyond the present, there are technologies in development with century-scale ambitions. Project Silica, from Microsoft, writes data into quartz glass with lasers. The company speaks of "tens to hundreds of thousands of years" of potential life and, as we saw a while ago, it ran a test storing 'Superman' (1978) on a 75.6 GB piece of glass. DNA storage is another option being tested in research settings. These are promising lines, still far from general use.
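To make the failure-rate point concrete, here is a toy calculation (illustrative only: it assumes the AFR stays constant, while real drives wear out long before a century):

# What a constant annualized failure rate implies over 100 years.
afr = 0.0157  # Backblaze's 2024 fleet-wide AFR, from the article

survival_100y = (1 - afr) ** 100
print(f"P(drive still working after 100 years): {survival_100y:.1%}")  # ~20.5%

# With the ~4.5% AFR some models showed:
print(f"Worst-case models: {(1 - 0.045) ** 100:.1%}")  # ~1.0%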
Extending the life of data does not depend only on the medium: it needs budget, owners and measurable objectives. Governments and organizations already have plans to preserve information over the long term, which implies well-defined strategies and periodic reviews. Today, when we talk about digital information, the great challenge is saving it on non-experimental media that endure the passage of time without continuous supervision… and that can still be read in a century.

Images | Xataka with Gemini 2.5 Pro | Microsoft

In Xataka | Online games have made their preservation complicated. The solution may be what this video game has done

In Xataka | SSD, NAS, cold storage units: all the physical and digital options for storing terabytes upon terabytes

Threads is eating X's lunch. It is doing it quietly, and with numbers that are starting to carry weight

Not everything is plain sailing, and not every text-based social network is X. After Elon Musk's arrival at X, then Twitter, more than a few users fled the bird's social network in search of greener pastures. It was then that Mark Zuckerberg, Meta's CEO, opened the doors of his own: Threads. Two years have passed since then, and the platform has grown into a powerful rival to X, one that is already beginning to stand up to it.

The data. As Zuckerberg and Adam Mosseri (head of Instagram) confirmed yesterday on their official profiles, Threads has reached 400 million monthly active users. It is a significant figure, especially considering that it closed 2024 with just over 275 million users and that in April the count stood at 350 million. Monthly active users (MAU) are a key indicator for quantifying the performance, state and health of, in this case, a social network. The metric measures the number of unique users who visited the platform at least once during the past month. Growing MAU is a good sign.

And what about X? The only related information we have on X comes from some of the latest statements by Linda Yaccarino, who until a few weeks ago was X's CEO. According to Yaccarino, as of April 2025 X had 600 million monthly active users. Growth was, however, softer: in September 2024 X had 570 million MAU, which by January had become 586 million. In general terms, then, X has more monthly active users than Threads, but Threads is growing much faster. It also seems clear that Threads has more room to grow than X, a social network that, Elon Musk aside, already showed signs of having hit its ceiling back when it was Twitter. In any case, these numbers must be taken with a pinch of salt: in both cases the source is the platform's CEO and therefore an interested party.

A matter of use. There is, however, third-party data backing Threads' growth. According to Similarweb, Threads and X have a very similar number of daily active users (DAU) in their mobile apps: as of June 2025, X's mobile apps logged 132 million DAU (15.2% less than the previous year), while Threads reached 115.1 million (127.8% more than the previous year). The picture is very different on the web, with X logging 145.8 million daily visits and Threads just 6.9 million. On the other hand, according to Apptopia data shared by Sheel Mohnot, Threads beats X in another key metric: engagement. Threads users open the app 6.6 times a day with sessions lasting 190 seconds, while X users open it 7.3 times with sessions of 150 seconds. In other words, Threads users spend about 21 minutes a day in the app versus 18 minutes for X (see the quick check below). Threads has fewer users, but it holds their attention longer.

The trick up its sleeve. Threads has a powerful user magnet: its integration with Instagram. It is no secret that Meta has grown Threads by hooking it into one of the great titans of the doom-scrolling industry. Instagram users see some Threads posts in their feeds, can interconnect both profiles and even receive Instagram push notifications with Threads content. Meta has its weapons and is using them quite intelligently.
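Those Apptopia engagement figures convert to daily minutes like this (a quick check of the article's own arithmetic):

# Daily in-app time = sessions per day x seconds per session.
def daily_minutes(sessions_per_day: float, seconds_per_session: float) -> float:
    return sessions_per_day * seconds_per_session / 60

print(f"Threads: {daily_minutes(6.6, 190):.0f} min/day")  # ~21
print(f"X:       {daily_minutes(7.3, 150):.0f} min/day")  # ~18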
And Bluesky? Another of the most popular alternatives is Bluesky. After growing strongly at the end of last year, the platform currently counts 38.2 million registered users, although its real activity is far below that of X and Threads. According to its own API data, it has between 600,000 and 700,000 users publishing content and around 1.2 million interacting with posts. For now, it does not yet overshadow what Meta and X offer.

Cover image | Xataka

In Xataka | Bluesky and the Fediverse want to stand up to Meta. These are the alternatives to Instagram already on the table

The region with the largest energy deficit in Spain is the one landing the data centers

Spain is filling up with data centers. A report from the real-estate consultancy CBRE reveals the interest of large technology companies in the Iberian Peninsula. The fact is striking, but even more striking is that the great focus of these companies is a region that a priori would not seem ideal for such facilities: Madrid.

Hyperscalers. The CBRE study, cited by Cinco Días, points out this singular concentration in Spain of data center projects by the so-called hyperscalers. A hyperscaler is a mass provider of cloud services operating a gigantic network of data centers distributed across the planet. Amazon is a good example of this type of company, but there are more, and they all seem to be turning their attention to the Iberian Peninsula.

Big Tech bets on Spain… Elliot Zounon, responsible for the report, explained that "there is no investor, large operator or technology company that does not have in its strategic plans establishing its data center project in the Iberian market."

…but especially on Madrid. Especially striking is the deployment of projects underpinning the current and expected future capacity in the Community of Madrid, which amounts to a total of 203 MW. Some of the most important companies in the sector, such as Microsoft, Google, Oracle, IBM, Kyndryl and OVHcloud, have data centers in the region. Various projects, with an investment of 23.4 billion euros through 2028, promise notable growth in this area, and Madrid's capacity is expected to rise to 222 MW by 2026.

Madrid, close to the "FLAP-D." In the European Union this market has been dominated by the group known as FLAP-D, an acronym for Frankfurt, London, Amsterdam and Paris, joined in recent times by Dublin, with a capacity of 328 MW. Madrid belongs to the so-called Tier 2, a kind of "second division" of cities with significant data center capacity. The capital is ahead of Milan, Zurich, Berlin and Oslo; Barcelona is also in this group, occupying tenth position in the Tier 2 with 42 MW installed.

And the energy? This proliferation of data centers in the Community of Madrid is paradoxical, especially since it is the region that produces the least energy in all of Spain and depends almost completely on external supply. In 2024 Madrid produced 1,334 GWh, more or less the same as in 2021, while its annual electricity consumption in 2024 was 27,487 GWh. The community thus concentrates 11% of national electricity demand. Of course, Spain is becoming a genuine net electricity exporter, something that favors Madrid's role as a focus for future data centers.

Depopulated Spain produces, the big cities consume. The truth is that Madrid's energy deficit is logical given its density of population and industry. Here, as in other large Spanish capitals, the energy inequality is clear: energy is generated in far more depopulated regions (Aragon's wind power is a notable example) and ends up being used in the big cities. Our country has bet very heavily on renewables, but Madrid is a case apart: tellingly, there are no wind farms in Madrid.
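Putting the two 2024 figures side by side shows how stark the dependency is:

# Madrid's 2024 self-sufficiency, using only the figures quoted above.
produced_gwh = 1_334
consumed_gwh = 27_487

print(f"Self-production covers {produced_gwh / consumed_gwh:.1%} of demand")  # ~4.9%
print(f"Imported from the rest of the grid: {consumed_gwh - produced_gwh:,} GWh")  # 26,153 GWh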
It's not all megawatts. The choice of Madrid does not depend only on gross megawatts, but on a combination of intangible advantages that tech companies take into account. The capital concentrates interconnection nodes and a dense network of operators that ease the exchange of data traffic (crucial for cloud services and AI applications). The presence of corporate headquarters also matters, as does the fact that logistics costs are lower than in remote locations, which may have cheaper energy but are more isolated in terms of network and services.

The human factor. There is also the labor market and its technical profiles. For companies, deploying infrastructure near the talent pays off, and professionals in the sector tend to settle in large cities like Madrid, precisely because that is where the job offers are concentrated. The same applies to that "first division" of large European data center capitals: Frankfurt, London, Amsterdam and Paris also concentrate that range of technical profiles.

The risk of being an energy black hole. Its practically zero self-production turns the Community of Madrid into a kind of "energy black hole": it absorbs resources generated far away and depends totally on the strength of the Spanish grid, which recently suffered a worrying, though unlikely to be repeated, general blackout.

But. Even with that energy deficit, hyperscalers arrive with long-term contracts (PPAs, power purchase agreements), prior agreements with grid operators and even investments in renewables. The idea is to decouple the siting decision from where energy is locally produced. Madrid must of course secure its interconnection and supply capacity, perhaps with grid reinforcement where necessary, but Spain's energy production, even when surplus energy goes to waste, is a guarantee for this type of facility.

Image | Kyndryl | Community of Madrid

In Xataka | Spain was supposed to have an "anti-blackout" plan. It has run into an insurmountable obstacle: politics

There are two companies suspected in the theft of critical TSMC data, and neither is Chinese: both are Japanese

TSMC's leadership has a price. The Taiwanese company is the largest semiconductor manufacturer on the planet and has built its success on fine-tuning extremely competitive integration technologies. Its most advanced photolithography node is currently 2 nm; in fact, it is about to start large-scale manufacturing of chips of this class. In all likelihood its competitors would love to know its most sophisticated processes, especially those linked to the 2 nm node. And, apparently, some of them are trying to get that information.

As we explained three days ago, the Taiwanese authorities have arrested three TSMC employees for allegedly stealing the company's trade secrets. As one might expect, TSMC itself is behind the arrests, as the Taiwan High Prosecutors Office has revealed in a statement. According to Nikkei Asia, those responsible at the company realized that two employees and a former employee had obtained critical information about its 2 nm photolithography. This information is very valuable; in fact, it could be used by a competitor to optimize its own semiconductor manufacturing processes.

Two unexpected suspects: Tokyo Electron and Rapidus Corporation

The investigation has not yet determined whether the stolen information reached another company, but United Daily News reports that investigators have searched the offices of the Japanese company Tokyo Electron. The latter specializes in the design and manufacture of wafer-processing equipment, and its most ambitious current project is perfecting plasma etching machines for wafers. This equipment is involved in defining the pattern that is later transferred to the wafer.

Rapidus is building a chip manufacturing plant in northern Japan where it plans to produce 2 nm semiconductors

According to SCMP, Tokyo Electron has confirmed that it fired an employee of its Taipei (Taiwan) subsidiary for involvement in the theft of TSMC's critical information. The Japanese company also says it is cooperating with the Taiwanese authorities conducting the investigation. "That Tokyo Electron finds itself in the spotlight over this incident is an unfortunate accident," declared Atsushi Osanai, a professor at Waseda University (Japan).

However, it is not the only Japanese company implicated in this conflict. Money.udn.com maintains that some of the arrested TSMC employees delivered hundreds of photographs and data linked to the company's most advanced process integration techniques to Rapidus Corporation. This company aims to compete head-to-head with TSMC, Intel and Samsung in the chip production market. Interestingly, it is very young: it was founded on August 10, 2022 by the Japanese government with an initial capital of 7,346 million yen (just under 46 million euros) contributed by, and here comes the interesting part, Sony, Toyota, NEC, SoftBank, Kioxia, Denso, Nippon Telegraph and Telephone (NTT) and MUFG Bank. Rapidus is currently building an integrated-circuit manufacturing plant in northern Japan, in the city of Chitose (Hokkaido), where it plans to produce 2 nm semiconductors. The first prototypes of these chips are already ready, but large-scale manufacturing will not arrive until 2027 at the earliest. In any case, as with Tokyo Electron, Rapidus's possible involvement in the theft of TSMC data has not been officially confirmed.
In fact, it is possible that the perpetrators acted on their own and offered the stolen information to Rapidus without the latter having requested or accepted it. That is something those responsible for the investigation will have to determine.

More information | Money.udn.com | SCMP

In Xataka | South Korea fears US reprisals. To avoid them, its old lithography equipment is gathering dust in a warehouse

The EU will invest 30 billion euros in data centers for AI

Europe cannot miss the artificial intelligence (AI) train. It cannot afford to. This technology already has a very deep impact on a country's economy, scientific and technological capacity, and military development, and right now the US and China lead decisively in this field. Until now Europe seemed to settle for trailing in the wake of the two great powers disputing world supremacy, but its strategy is about to change.

According to CNBC, the European Union plans to invest 10 billion euros in the construction of thirteen data centers for AI, as well as 20 billion euros in a network of "gigawatt-class" facilities. The latter are the largest and most ambitious, and the name indicates that, given their size, they will consume a great deal of electricity: a gigawatt is one billion watts, roughly what a small city can consume. So far sixteen European countries have expressed interest in hosting these facilities and, according to CNBC, the first of the large data centers will reside in Munich (Germany). Each gigawatt-class facility will cost between 3 and 5 billion euros and will bring together no fewer than 100,000 cutting-edge GPUs for AI (possibly NVIDIA H100 chips). All this sounds very good, but it raises a doubt we cannot ignore: it is not clear how the countries involved will resolve the electricity supply for such demanding facilities (a rough sketch at the end of this piece puts numbers on that tension).

It will cost Europe a lot to keep pace with the US and China

The US government led by Donald Trump is determined to lead in AI whatever it costs. This initiative, christened the 'Stargate' project by the new administration, will in principle cost 500 billion dollars. The money will come from the coffers of the Japanese investment group SoftBank; from OpenAI, the creators of ChatGPT; from Oracle; and, finally, from the Emirati investment firm MGX. Over the next four years these companies will fund the construction of an advanced network of data centers housing the high-performance computing infrastructure needed to sustain US leadership in AI. The spearhead of these facilities is already being built in Texas (USA), in a town called Abilene. And it is colossal: according to OpenAI, this first data center of the 'Stargate' project will bring together more than two million AI chips.

The 'Stargate' infrastructure should be fully ready before President Trump's current term expires

When the US government announced this plan with great fanfare, it left a big question open: how did it plan to solve the electricity supply required by the new facilities? Large AI data centers consume a great deal of electricity, which has led some tech companies to invest in nuclear power plants to guarantee the supply these facilities require. For the moment that question is not fully resolved. And it is not, because the 'Stargate' infrastructure should be completely ready before President Trump's current term expires, and a new nuclear power plant can hardly come online in four years. Even so, OpenAI and Oracle have formalized an agreement to build the infrastructure needed to deliver an additional 4.5 GW to their data centers. Interestingly, SoftBank is not participating in the financing of this expansion, although, as mentioned a few lines above, it does participate in the 'Stargate' project.
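About that electricity doubt: a crude estimate of what 100,000 GPUs alone would draw. Every parameter below is our assumption (H100 TDP, server overhead, PUE), not a figure from the EU plan:

# Back-of-the-envelope power estimate for a 100,000-GPU site.
gpus = 100_000
watts_per_gpu = 700     # approximate TDP of an NVIDIA H100 SXM module (assumption)
overhead_factor = 1.8   # assumed extra draw per GPU: CPUs, networking, storage
pue = 1.3               # assumed facility power usage effectiveness

it_load_mw = gpus * watts_per_gpu * overhead_factor / 1e6
facility_mw = it_load_mw * pue
print(f"IT load: ~{it_load_mw:.0f} MW, facility: ~{facility_mw:.0f} MW")
# ~126 MW IT, ~164 MW total: "gigawatt class" implies far more than the GPUs alone.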
Anyway, there is another unknown in this equation with a lot to say: China. "We expect China to significantly increase its investments in AI and semiconductors in response to US dominance in AI," foresee analysts at the consultancy CBM. It makes sense: these two great powers are disputing world supremacy, so it is understandable that every significant step by one of them receives a more or less forceful answer from the other. We can be sure that 2025 will be an even more agitated year than 2024 in the geopolitical and technological arenas, so we will be watching the moves the US and China will surely make. And Europe. Europe too.

Image | Christina Morillo

More information | CNBC

In Xataka | Huawei attacks Nvidia's positions in China: it wants to have the dominant hardware in AI inference processes

The most monstrous data centers on the planet, gathered in one chart

The development of AI has fueled a new "arms race" on a global scale. The goal is not to dominate another territory, but to amass as much computing power as possible. The main technology companies are deploying data centers around the world with one aim in mind: training artificial intelligence. Some data centers are genuine behemoths, and in this chart we can see the most powerful data clusters in the world, with one standout protagonist: Elon Musk.

Cluster. Before getting into numbers, a nuance. When we talk about computing power, we can talk about a computer cluster or a supercomputer. The latter is an extremely powerful system that can be built with processors specially designed to reach extreme calculation power or, more commonly, from thousands of high-performance servers. Supercomputers are used for scientific simulations and tasks requiring enormous computation, and their cost is brutal. On the other hand, we have the "affordable" version of a supercomputer: the computer cluster, a series of interconnected workstations working in parallel to solve problems. It is similar to a supercomputer, but with the advantage of being a more flexible system: as you need more machines, you can expand the cluster. The components are also more standard, which keeps the cost lower. That said, the distinction between the two concepts has blurred in recent years.

The 100,000 club. With that settled, back to the chart produced by Visual Capitalist with data from Epoch AI. In it we can see the most powerful clusters today, but with a caveat: it mixes planned and operational systems. X, Elon Musk's company, lit up xAI Colossus Memphis Phase 1 last year, a huge data center with 100,000 NVIDIA H100 GPUs aimed at training Grok, its AI model. It was something that surprised even Jensen Huang, Nvidia's CEO. It is a computing monster with enormous calculation power, and the figure is expected to grow to 200,000 GPUs; we will see the energy consequences of this later. Behind Musk's company we have Meta, which states it has a cluster "bigger than 100,000 H100 GPUs" for its Llama 4 model. Then there are those who keep up a bit more mystery: Microsoft, for example, with its cluster for Azure, Copilot and OpenAI's models, estimated at 100,000 GPUs split between H100 and H200.

Two worlds. Outside the 100,000 club we have Oracle with its 65,536 NVIDIA H200s, another Musk company, Tesla, with Cortex Phase 1 and its 50,000 GPUs, and the United States Department of Energy with El Capitan, the most powerful supercomputer in the world. Official or estimated, what this chart makes clear is that one country has taken AI computing seriously: the United States. Of the 10 clusters, the first nine are in the US and the last in China, and US companies are not only building inside their borders but also abroad. One example is the plan to build data centers in Spain, or the one practically the size of Manhattan.

European expansion. In the chart we can see two European clusters. On one hand, JUPITER at the Jülich Supercomputing Center in Germany, with its confirmed GPUs. On the other, NexGen in Norway, with about 16,300 GPUs. Europe has launched several financing initiatives to boost its competitiveness through programs such as GenAI4EU, with a budget of 700 million euros between 2024 and 2026.
The objective is to build large data centers, and for the 2025 call, 76 proposals from 16 different countries were submitted. Now, that development of European AI must be aligned with the AI Act, the regulation in force since February 2025 that mandates transparency and an ethical AI.

Numbers vs. efficiency in China. Beyond US companies, it is China that has gotten its act together on AI. Following a roadmap very different from the West's, China is focusing on having (supposedly) fewer GPUs working, operating with greater efficiency, at much lower costs than American companies and with equivalent results. DeepSeek, and more recently Kimi, are two examples.

Nvidia rubs its hands. And in all this battle for AI there is one clear winner: Nvidia. Like it or not, and regardless of who has more or fewer GPUs to do the work, the main data centers in the world run on Nvidia's architecture with its H100 and H200 chips (in China this is less clear because of the trade veto). And that is just counting the "standard" AI cards, since the B200 offers four times the performance of the H100. In fact, the company seems so focused on the AI race that it appears to have neglected what kept it ahead of AMD for years: its gaming cards.

(Pictured: Lenovo data center servers. Companies are trying to shrink their footprint by reusing the hot water left after heat dissipation to, for example, fill pools or showers. Image | Xataka)

The planet, not so much. The consequence of this AI expansion is that data centers need not only huge amounts of energy to run, but also water to dissipate the heat from the equipment. There is a notable absentee from the chart, Google, which also operates AI data centers and which, together with others such as Meta and Microsoft, needs nuclear plants to feed its facilities. Consumption is so extreme that renewables fall short during demand peaks, forcing the use of fossil fuels like coal or gas (estimates suggest Colossus's 200,000 GPUs consume 300 MW, enough to power 300,000 homes; sanity-checked below), and, as we said, water use has become a matter of debate in the territories that are candidates to host new data centers. So much heat dissipation is needed that China is already building at the bottom of the ocean.

In Xataka | China wants to become the world engine of AI…
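Those Colossus numbers are easy to sanity-check; note that the household figure implicitly assumes an average load of about 1 kW per home:

# Checking the Colossus estimate quoted above.
total_mw = 300
gpus = 200_000
homes = 300_000

print(f"Per GPU (incl. overhead): {total_mw * 1e6 / gpus:.0f} W")   # 1500 W
print(f"Implied draw per home:    {total_mw * 1e6 / homes:.0f} W")  # 1000 W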

OpenAI and Oracle are preparing a monster data center with 2 million AI chips

The 'Stargate' plan is the big bet by the government of Donald Trump to keep the US at the head of artificial intelligence (AI) development. When the project was made public on January 22, what stood out was the budget that would make it possible: no less than 500 billion dollars. The money will come from the coffers of the Japanese investment group SoftBank; from OpenAI, the creators of ChatGPT; from Oracle; and, finally, from the Emirati investment firm MGX. Over the next four years these companies will fund the construction of an advanced network of data centers housing the high-performance computing infrastructure needed to sustain US leadership in AI. The spearhead of these facilities is already being built in Texas (USA), in a town called Abilene. And it is colossal: according to OpenAI, this first data center of the 'Stargate' project will bring together more than two million AI chips.

4.5 GW goes a long way

When the US government announced this plan with great fanfare, it left a big question open: how did it plan to solve the electricity supply required by the new facilities? Large AI data centers consume a great deal of electricity, which has led some tech companies to invest in nuclear power plants to guarantee the supply these facilities require.

Oracle began installing the first racks with NVIDIA B200 platform servers in June

For the moment that question is not fully resolved. And it is not, because the 'Stargate' infrastructure should be completely ready before President Trump's current term expires, and a new nuclear power plant can hardly come online in four years. Even so, OpenAI and Oracle formalized an agreement a few hours ago to build the infrastructure needed to deliver an additional 4.5 GW to their data centers (see the scale note below). Interestingly, SoftBank is not participating in the financing of this expansion, although, as mentioned a few lines above, it does participate in the 'Stargate' project.

According to OpenAI, part of the gigantic Abilene data center is already in operation. In fact, Oracle began installing the first racks with NVIDIA's B200 platform servers in June. It is surprising that part of the facility is active after just six months, but we should not overlook that the Abilene data center seeks to demonstrate that the 'Stargate' plan is viable within the deadline its backers have promised.

Anyway, there is another unknown in this equation with a lot to say: China. "We expect China to significantly increase its investments in AI and semiconductors in response to US dominance in AI," foresee analysts at the consultancy CBM. It makes sense: these two great powers are disputing world supremacy, so it is understandable that every significant step by one of them receives a more or less forceful answer from the other. We can be sure that 2025 will be an even more agitated year than 2024 in the geopolitical and technological arenas, so we will be watching the moves the US and China will surely make.

Image | Christina Morillo

More information | OpenAI

In Xataka | Huawei attacks Nvidia's positions in China: it wants to have the dominant hardware in AI inference processes
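For scale, the extra 4.5 GW translates into household terms like this (the ~1 kW average household load is our assumption):

# What 4.5 GW means in homes, under an assumed average load.
extra_gw = 4.5
avg_home_load_kw = 1.0  # assumption

homes_equivalent = extra_gw * 1e6 / avg_home_load_kw  # GW -> kW, divided by kW per home
print(f"~{homes_equivalent:,.0f} homes")  # ~4,500,000 homes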

Meta is already using a concrete mix created by algorithms in one of its data centers

Meta has used a concrete mix designed by algorithms in one of its data centers. According to the company, the formula promises to be more sustainable and faster to apply, and it was developed with open-source tools. The approach is not only about moving toward zero emissions: it is also about accelerating the construction of infrastructure that has to grow on schedule, as the data center the company is erecting under temporary structures demonstrates.

Concrete's invisible weight. Few materials are as omnipresent as concrete. It is used in roads, bridges, homes… and also in the data centers that house a good part of our digital lives. The problem is that manufacturing its components, especially cement, generates an enormous amount of CO2: the World Economic Forum indicates it accounts for about 8% of global emissions. Meta has set out to reduce that footprint without compromising strength or speed of work. And that is where its new model comes in.

An AI that does not create chatbots, but mixes. To develop this system, Meta teamed up with Amrize, one of the world's largest cement manufacturers, and with the University of Illinois Urbana-Champaign. Together they created an AI model that proposes concrete compositions. The model is based on Bayesian optimization and is built with BoTorch and Ax, two open-source tools developed by Meta itself.

(Pictured: a slab test at the Rosemount data center.)

The challenge was not minor: each mix involves combining different types of cement, aggregates, water, admixtures and supplementary materials such as slag or fly ash. The exact proportions, their origin, or even the time of year can alter the result. Traditionally, they explain, validating a new formula has taken weeks. With AI the process accelerates, because the system learns from previous data, proposes promising new combinations and refines its predictions after each test.

(Pictured: pouring the AI-generated concrete formulation at the data center.)

From the laboratory to the field. One of the first large-scale validations took place at the data center Meta is building in Rosemount, Minnesota. There, the contractor Mortenson applied the new mix in one of the building's support slabs. The objective was not only to check its strength but also its workability and final finish: these slabs must be perfectly smooth and durable. The result, according to the firm, exceeded every technical standard. The AI-designed formula not only met the strength and curing requirements, it also handled well on site: it poured without problems and produced an adequate surface. After two iterations, and with minimal human adjustment, the model had generated a recipe that improved on the usual industrial formulas in speed, strength and emission-reduction potential.

An open model. The system Meta developed is not a commercial product or a closed tool. The company has published the code, data and technical approach in an open GitHub repository called SustainableConcrete. The idea is not to guard the formula but to share the method: a way of applying artificial intelligence to concrete design that can be adapted to other projects, suppliers or materials. We will have to wait to see whether more initiatives like this appear; it could ease the adoption of alternative mixes in all kinds of construction. As we have seen, Meta has not invented a new material. What it has done is use AI to find new concrete formulas.
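Meta's real pipeline is built on BoTorch and Ax with lab data; purely as an illustration of the underlying idea, Bayesian optimization over mixture proportions, here is a minimal self-contained sketch that swaps in scikit-learn's Gaussian process and a made-up stand-in for the strength test. Every name and number in it is hypothetical, not Meta's code:

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def fake_strength(x):
    """Stand-in for a lab test: x = (cement_frac, fly_ash_frac, water_frac)."""
    cement, fly_ash, water = x
    return 60 * cement + 25 * fly_ash - 80 * (water - 0.18) ** 2

def expected_improvement(gp, candidates, best_y):
    # How much each candidate mix is expected to beat the best result so far.
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best_y) / sigma
    return (mu - best_y) * norm.cdf(z) + sigma * norm.pdf(z)

# A few initial "lab tests", then iterate: fit surrogate -> pick promising mix.
low, high = [0.1, 0.0, 0.12], [0.4, 0.3, 0.25]
X = rng.uniform(low, high, size=(5, 3))
y = np.array([fake_strength(x) for x in X])

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    candidates = rng.uniform(low, high, size=(2000, 3))
    best = candidates[np.argmax(expected_improvement(gp, candidates, y.max()))]
    X = np.vstack([X, best])
    y = np.append(y, fake_strength(best))  # in reality: pour, cure, test for weeks

print("Best mix found:", X[np.argmax(y)], "score:", round(y.max(), 2))

The loop mirrors the workflow described in the piece: fit a surrogate model to past tests, pick the mix with the highest expected improvement, "test" it, and repeat. In Meta's case the testing step is weeks of lab curing, which is exactly why choosing fewer, better experiments matters.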
Images | Xataka with Gemini 2.5 Flash | Mark Zuckerberg | Meta (1, 2)

In Xataka | Nvidia says China has the best open-source AI in the world. That praise has a very clear intention
