We already know which will be the most expensive data center in the world. If Bill Gates paid for it, he would be left with almost nothing

Already in 2024 we saw that infrastructure spending on AI was insane. The trend has not let up, quite the opposite: Big Tech continues to burn money as if there were no tomorrow (literally), and most of that spending goes to the most valuable asset in the AI race: data centers. How much do they really cost?

Data centers in numbers. Epoch AI has published Frontier Data Centers, a comprehensive database of the data centers being built in the United States. Using satellite images, public documents and permits, they have estimated construction costs, as well as energy consumption and computing power.

The award for the most expensive data center goes to Microsoft Fairwater, whose total cost could reach $106 billion when completed in 2028. To put it in context, Bill Gates' fortune is estimated at $107 billion: he could just barely pay for it. The forecast for Microsoft Fairwater even surpasses Meta Hyperion, the data center that will be as big as the island of Manhattan and is expected to cost $72 billion. Next on the list is Colossus 2, by xAI, with an estimated cost of $44 billion. It is closely followed by Meta Prometheus at $43 billion and the Amazon-Anthropic data center in New Carlisle at $39 billion.

Epoch AI has collected more data, such as how much computing power each facility will have, measured in NVIDIA H100 GPU equivalents for reference. They have also calculated the energy demand and who will be the main user of each one. Here is a table with the key information:

Data center | Estimated date | Estimated cost ($) | Computing (in H100 GPUs) | Energy demand | Intended primary user
Microsoft Fairwater | September 2027 | 106 billion | 5.2 million | 3,328 MW | OpenAI
Meta Hyperion | January 2028 | 72 billion | 4.2 million | 2,262 MW | Meta
xAI Colossus 2 | February 2026 | 44 billion | 1.4 million | 1,379 MW | xAI
Meta Prometheus | October 2026 | 43 billion | 1.2 million | 1,360 MW | Meta
Amazon New Carlisle | June 2026 | 39 billion | 770,000 | 1,229 MW | Anthropic
Oracle Stargate | July 2026 | 32 billion | 1 million | 1,180 MW | OpenAI
Microsoft Fayetteville | March 2026 | 29 billion | 920,000 | 1,065 MW | OpenAI/Microsoft
Amazon Ridgeland | September 2027 | 32 billion | 630,000 | 1,008 MW | Anthropic

Dizzying climb. Looking at the case of Microsoft Fairwater, and always according to Epoch AI's forecast, by March 2026 the investment will be $18 billion. A year later, in February 2027, it rises to $35 billion; just four months after that it shoots up to $71 billion, reaching $106 billion in 2028.

The price increase is dizzying and responds to several factors. The first is that the computational cost of training models keeps rising. For example, GPT-4 cost OpenAI over $100 million, and rumors before the release of GPT-5 pointed to training runs of $500 million each. Epoch AI also analyzed this and estimated that training cost has multiplied by 2.6 year over year.

On the other hand, there is the demand for GPUs, necessary for training the models and the most expensive component of all. An NVIDIA H100 GPU costs $25,000, and its successor, the NVIDIA B200, also known as Blackwell, could cost between $30,000 and $40,000. And that is just the GPUs: many more components are needed to get a data center up and running, such as power generators, high-speed networking and cooling, among others. The initial bottleneck was the shortage of GPUs, but it has given way to a more fundamental constraint: there is not enough power for so many chips.
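To make that 2.6x annual growth concrete, here is a minimal back-of-envelope sketch in Python. The roughly $100 million starting point (the GPT-4 ballpark mentioned above) and the five-year horizon are illustrative assumptions, not figures taken from Epoch AI's database.

```python
# Back-of-envelope projection of frontier training-run cost, assuming the
# ~2.6x year-over-year growth estimated by Epoch AI. The base cost and the
# horizon are illustrative assumptions, not data from the report.
BASE_COST_USD = 100e6    # assumed starting run cost (~GPT-4 ballpark)
ANNUAL_GROWTH = 2.6      # Epoch AI's estimated year-over-year multiplier

for years_out in range(6):
    cost = BASE_COST_USD * ANNUAL_GROWTH ** years_out
    print(f"year +{years_out}: ~${cost / 1e9:,.1f}B per training run")
```

At that rate a $100 million run crosses the billion-dollar mark in under three years, which goes a long way toward explaining the budgets in the table above.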
Data centers consume a lot of energy. Seriously, a lot. To put it in context, in 2024 data centers already accounted for 4% of United States electricity consumption, and demand is expected to double over the next five years. Nobody wants to live near a data center for one reason: their massive consumption is pushing energy prices up by as much as 267% in nearby areas. Power supply has become a new choke point for the industry. Microsoft is already considering producing its own energy by building nuclear power plants, and others like Google and Amazon are considering taking data centers into space.

Image | Microsoft

In Xataka | AI data centers are an energy hole. Jeff Bezos' solution: build them in space

AI data centers consume too much energy. Google’s ‘moonshot’ plan is to take them to space

Training models like ChatGPT, Gemini or Claude requires more and more electricity and water, to the point that the energy consumption of AI threatens to exceed that of entire countries. Data centers have become real resource sinks. According to estimates by the International Energy Agency, the electricity consumption of data centers could double before 2030, driven by the explosion of generative AI. Faced with this outlook, the technology giants are desperately looking for alternatives. And Google believes it has found something that seems straight out of science fiction: sending its artificial intelligence chips into space.

Conquering space. The company has unveiled Project Suncatcher, an ambitious experiment: placing its TPUs, the chips that power its artificial intelligence, on satellites powered by solar energy. The chosen orbit, sun-synchronous, guarantees almost constant light. In theory, the panels could work 24 hours a day and be up to eight times more productive than the ones we have on Earth. Google plans to test its technology with two prototype satellites before 2027, in a joint mission with the company Planet. The objective will be to check whether its chips and communication systems can survive the space environment and, above all, whether it is feasible to perform AI calculations in orbit.

The engineering behind the idea. Although it sounds like science fiction, the project has a solid scientific basis. Google proposes to build constellations of small satellites, dozens or even hundreds, orbiting in compact formation at an altitude of about 650 kilometers. Each would carry Trillium TPU chips on board, connected to each other by optical laser links. These light beams would allow the satellites to "talk" to each other at speeds of up to tens of terabits per second, an essential capability for processing AI tasks in a distributed manner, as a terrestrial data center would. The technical challenge is enormous: at these distances, the optical signal weakens quickly. To compensate, the satellites would have to fly just a few hundred meters apart. According to Google's own study, keeping them so close will require precise maneuvering, but the calculations suggest that small orbit adjustments would be enough to keep the formation stable. In addition, engineers have already tested the radiation resistance of their chips. In an experiment with a 67 MeV proton beam, Trillium TPUs safely withstood a dose three times higher than they would receive during a five-year mission in low orbit. "They are surprisingly robust for space applications," the company concludes in its preliminary report.

The great challenge: making it profitable. Beyond the technical problems, the economic challenge is what is in focus. According to calculations cited by The Guardian and Ars Technica, if the launch price falls below $200 per kilogram by the mid-2030s, an orbital data center could be economically comparable to a terrestrial one. The comparison is made in terms of energy cost per kilowatt per year. "Our analysis shows that space data centers are not limited by physics or insurmountable economic barriers," says the Google team. In space, solar energy is practically unlimited. A panel can produce up to eight times more than on the Earth's surface and generate almost continuous electricity. That would eliminate the need for huge batteries or water-based cooling systems, one of the biggest environmental problems of today's data centers. However, not everything shines in a vacuum.
As The Guardian recalls, each launch emits hundreds of tons of CO₂, and astronomers warn that the growing number of satellites "is like looking at the universe through a windshield full of insects." Furthermore, flying such compact constellations increases the risk of collisions and space debris, an already worrying threat in low orbit.

A race to conquer the sky. Google's announcement comes in the midst of a fever for space data centers. It is not the only company looking up. Elon Musk recently claimed that SpaceX plans to scale its Starlink satellite network, already with more than 10,000 units, to create its own data centers in orbit. "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links. SpaceX is going to do it," Musk wrote on X. For his part, Jeff Bezos, founder of Amazon and Blue Origin, predicted during Italian Tech Week that we will see "giant AI training clusters" in space in the next 10 to 20 years. In his vision, these centers would be more efficient and sustainable than terrestrial ones: "We will take advantage of solar energy 24 hours a day, without clouds or night cycles." Another unexpected actor is Eric Schmidt, former CEO of Google, who bought the rocket company Relativity Space precisely to move in that direction. "Data centers will require tens of additional gigawatts in a few years. Taking them off the Earth may be a necessity, not an option," Schmidt warned in a hearing before the US Congress. And Nvidia, the AI chip giant, also wants to try its luck: the startup Starcloud, backed by its Inception program, will launch the first H100 GPU into space this month to test a small orbital cluster. Its ultimate goal: a 5-gigawatt data center orbiting the Earth.

The new battlefield. The Google project is still in the research phase. There are no prototypes in orbit and no guarantees that there will be any soon. But the mere fact that a company of this caliber has published orbital models, radiation calculations and optical communication tests shows that the concept has already moved from the realm of speculation to that of applied engineering. The project inherits the philosophy of the company's other moonshots, like Waymo's self-driving cars or its quantum computers: explore impossible ideas until they stop being impossible. The future of computing may not be underground or in huge industrial warehouses, but in swarms of satellites shining in the permanent sunlight of space.

Image | Google

In Xataka | While Silicon Valley seeks electricity, China subsidizes it: this is how it wants to win the AI war
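To see why that $200 per kilogram figure matters, here is a minimal sketch of the launch-cost side of the comparison. Only the $/kg threshold and the five-year mission life echo figures cited above; the satellite's specific power and the terrestrial electricity price are loose assumptions for illustration.

```python
# Rough launch-cost amortization for an orbital data center, expressed per
# kW of usable power and per year. Specific power and the terrestrial
# electricity price are assumptions; $/kg and mission life come from the text.
LAUNCH_COST_PER_KG = 200.0        # $/kg, the mid-2030s threshold cited above
SPECIFIC_POWER_W_PER_KG = 100.0   # assumed watts of usable power per kg launched
MISSION_YEARS = 5                 # matches the five-year low-orbit scenario
TERRESTRIAL_PRICE_KWH = 0.08      # assumed $/kWh paid by a terrestrial data center

launch_cost_per_kw_year = (LAUNCH_COST_PER_KG * 1000 / SPECIFIC_POWER_W_PER_KG) / MISSION_YEARS
terrestrial_cost_per_kw_year = TERRESTRIAL_PRICE_KWH * 24 * 365

print(f"launch cost:      ~${launch_cost_per_kw_year:,.0f} per kW per year")
print(f"terrestrial grid: ~${terrestrial_cost_per_kw_year:,.0f} per kW per year")
```

Under those assumptions, getting the hardware into orbit costs the same order of magnitude per kilowatt-year as simply buying grid electricity on Earth, which is exactly the kind of energy-cost-per-kilowatt-per-year comparison described above.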

Making cell towers mini data centers for AI

A few days ago we heard the news that NVIDIA had invested $1 billion in Nokia, taking a 2.9% stake in the Finnish company. Although the check itself is striking news, since for many people Nokia had dropped off the map years ago, the move makes perfect sense: it is the Western response to the Chinese technology companies that have spent years investing in the deployment of 6G. And of course, with NVIDIA behind them, telephony base stations can serve for much more than just providing coverage to millions of devices: they can become small distributed data centers for AI.

The plan behind the investment. NVIDIA and Nokia are not just designing equipment for mobile networks. They are redefining what a cell tower is. The idea is for each base station (the towers and small installations we see on buildings and streets) to become a computing node capable of executing AI workloads in real time. "An AI data center in everyone's pocket," according to Justin Hotard, CEO of Nokia. The key here is to bring processing closer to the user in order to eliminate latency, which is usually one of the most frequent problems in AI applications that require real-time processing, such as instant translation, augmented reality or autonomous vehicles.

Without latency, everything changes. When we ask an AI to translate a conversation or analyze live images, every millisecond counts. Sending that data to a distant server, processing it and returning it introduces a significant delay that mars the final experience. The most logical solution is to decentralize: have the AI live close to the user, in the telecommunications infrastructure itself. Here, NVIDIA will contribute chips and specialized software, while Nokia will adapt its 5G and 6G equipment to integrate that computing capacity. As announced, the first commercial tests will begin in 2027 with T-Mobile in the United States.

The Nokia effect on the stock market. Nokia shares shot up 21% after the news broke, reaching highs not seen since 2016. NVIDIA and OpenAI have become the King Midas of technology: everything they touch goes up. The investment is also a boost to the strategy of Hotard, who since his arrival in April has accelerated Nokia's shift towards data centers and AI. The company, which already acquired Infinera for $2.3 billion to strengthen its position in data center networks, is now positioned as the only Western supplier capable of competing with Huawei in the complete supply of telecommunications infrastructure.

The other space race. While Europe and the United States accelerate their 6G plans, China has been investing aggressively in this technology for years. This alliance between NVIDIA and Nokia is a somewhat late response, but a necessary one. Jensen Huang, CEO of NVIDIA, explained in his speech in Washington that the goal is "to help the United States bring telecommunications technology back to America." It is not just about infrastructure, but about strategic control. Whoever dominates this network of brains distributed throughout cities and roads will control the AI applications of the future.

And now what. The consulting firm McKinsey estimates that investment in data center infrastructure will exceed $1.7 trillion by 2030, driven by the expansion of AI. Nokia and NVIDIA want their piece of the pie, but they are also betting on a structural change: that mobile networks stop being mere data pipes and become intelligent computing platforms.
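As a rough illustration of why proximity matters, here is a small latency sketch. The distances, the fiber routing and the inference time are all assumed values; none of them come from NVIDIA or Nokia.

```python
# Very rough round-trip latency estimate: a nearby base station versus a
# distant cloud data center. All distances and processing times are assumptions.
SPEED_IN_FIBER_KM_PER_S = 200_000   # light travels at roughly 2/3 c in fiber

def round_trip_ms(distance_km: float, inference_ms: float) -> float:
    """Propagation there and back, plus the time spent running the model."""
    propagation_ms = 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000
    return propagation_ms + inference_ms

# Assumed scenarios: a tower 5 km away versus a data center 2,000 km away,
# with the same 20 ms of model inference in both cases.
print(f"edge node (5 km):          ~{round_trip_ms(5, 20):.1f} ms")
print(f"remote region (2,000 km):  ~{round_trip_ms(2000, 20):.1f} ms")
```

The raw propagation saving looks modest, but real traffic also crosses multiple hops, queues and peering points, so in practice the gap between an on-tower node and a distant region is considerably larger than light-speed math alone suggests.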
It remains to be seen whether this model works commercially and whether operators are willing to upgrade their infrastructure.

Cover image | NVIDIA

In Xataka | Xi Jinping wants two things: first, to create a global center that regulates AI. The second, that it be based in Shanghai

It’s called ‘data poisoning’ and it’s poisoning them from within.

AI is everywhere and keeps adding users. The logical next step is for it to become the target of malicious attacks. We have already talked about the dangers of 'prompt injection', an attack that is surprisingly easy to execute. It is not the only one: AI companies are also fighting data poisoning.

Poisoned data. It consists of introducing manipulated data into resources that will later be used for AI training. According to recent research, it does not take as many malicious documents to compromise a language model as previously believed. The researchers found that with only 250 "poisoned" documents, models with up to 13 billion parameters were compromised. The result is that the model can be biased or reach erroneous conclusions.

Prompt injection. It is one of the problems AI browsers like ChatGPT Atlas or Comet face. By simply placing an invisible prompt in an email or on a website, you can get the AI to hand over private information, because it cannot distinguish between a user instruction and a malicious one. In the case of AI agents it is especially dangerous, since they can execute actions on our behalf.

AI to do evil. According to a CrowdStrike report, AI has become the weapon of choice for cybercriminals, who use it to automate and refine their attacks, especially ransomware. MIT analyzed more than 2,800 ransomware attacks and found that 80% used AI. The figure is overwhelming.

Collaboration. The Financial Times reports that leading AI companies such as DeepMind, OpenAI, Microsoft and Anthropic are working together to analyze the most common attack methods and collaboratively design defensive strategies. They are turning to ethical hackers and other independent experts to try to breach their systems so they can strengthen them.

Urgency. AI browsers and agents are already here, but we are still in time because mass adoption has not yet arrived. It is urgent to harden these systems, especially against the prompt injections that can so easily steal our data.

Image | Shayna "Bepple" Take on Unsplash

In Xataka | "The safety of our children is not for sale": the first law that regulates 'AI friends' is here
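One detail worth spelling out from that research is just how small 250 documents is relative to a training corpus. The corpus sizes below are arbitrary assumptions chosen purely to show the proportion; the figure of 250 is the one reported in the study cited above.

```python
# How tiny a fixed batch of 250 poisoned documents is compared with typical
# training-corpus scales. The corpus sizes are arbitrary assumptions.
POISONED_DOCS = 250  # figure reported by the research cited above

for corpus_size in (1_000_000, 100_000_000, 10_000_000_000):
    share = POISONED_DOCS / corpus_size
    print(f"corpus of {corpus_size:>14,} docs -> poisoned share: {share:.8%}")
```

If the finding holds, the attack barely needs to scale with corpus size, so simply training on more data does not automatically dilute it away.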

Building data centers in space was the new hot business. Elon Musk just broke it with a tweet

The debate over the feasibility of building gigantic data centers in orbit had been heating up for months. It is Silicon Valley's new big idea to solve the insatiable energy appetite of artificial intelligence. Until, as usual, Elon Musk entered the conversation with the subtlety of a hammer.

Elon Musk has joined the chat. After weeks of debate about the feasibility of building servers in space, Eric Berger, editor at Ars Technica, argued that it will end up being a more plausible option once the technology exists to assemble satellites in orbit autonomously. That was the moment Elon Musk chose to enter the conversation. "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links," wrote the CEO of SpaceX. "SpaceX is going to do it," he said. A phrase that has probably landed like a blow on the startups riding the momentum of AI to go out in search of financing.

Why the hell do we want servers in space? The idea of moving computing to Earth orbit responds to a very real crisis: AI is an energy monster, and demand for data centers keeps growing. Against this backdrop, space offers two advantages that are impossible on Earth:

Almost unlimited energy: in a sun-synchronous orbit, solar panels receive sunlight almost continuously (more than 95% of the time).

Free cooling: land-based data centers consume millions of liters of fresh water to cool down. With a large enough radiator, the vacuum can act as "an infinite heatsink at -270°C." The heat would be radiated into the void without wasting a single drop of water.

The new titans of space AI. Musk is not the first to see the business. In fact, he arrives at a party where the first contracts are already being handed out. Jeff Bezos predicted during Italian Tech Week that we will see "giant training clusters" for AI in orbit in the next 10 or 20 years. Eric Schmidt, the former CEO of Google, bought the rocket company Relativity Space precisely for this purpose. And Nvidia, the undisputed king of AI hardware, has actively backed the startup Starcloud, which plans to launch the first NVIDIA H100 GPU into space this November, with the goal of eventually building a monster 5-gigawatt orbital data center.

Why Musk would win. The vision of Bezos, Schmidt and Starcloud faces two colossal obstacles: the cost of launch and the construction of the servers themselves. By some calculations, a 1 GW data center would require more than 150 launches with current technology. And Starcloud's plan for a 4-kilometer-wide array is a logistical nightmare. Elon Musk has Starship, the giant rocket on which all of his competitors' business models depend to be profitable. And he doesn't need to build a new orbital data center; he just has to adapt and scale the one he already has.

10,000 satellites and counting. SpaceX's Starlink constellation no longer competes against satellite internet; it is going after terrestrial fiber. Musk's company has already launched 10,000 satellites and is preparing the deployment of the new V3 satellites, designed for Starship and equipped with high-speed laser links. According to SpaceX itself, each Starship launch will add 60 terabits per second of capacity to a network that is already, in practice, a global computing and data mesh. While Starcloud needs to hire a rocket and assemble 4 km-wide solar and cooling panels, Musk simply needs Starship to finish development so he can keep launching satellites.

In Xataka | Starlink stopped competing with satellite Internet companies a long time ago: now it is going for something much bigger
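As a quick sanity check on those numbers, this sketch works backwards from the two public figures quoted above: if a 1 GW data center needs more than 150 launches, how much usable power must each launch put into orbit, and how fast does network capacity grow at 60 Tbps per Starship flight? Everything beyond those two figures is simple arithmetic, not information about the actual designs.

```python
# Back-of-envelope arithmetic from the figures quoted in the text: a 1 GW
# orbital data center needing 150+ launches, and 60 Tbps of network capacity
# added per Starship launch of V3 satellites.
TARGET_POWER_GW = 1.0
LAUNCHES_NEEDED = 150
CAPACITY_PER_LAUNCH_TBPS = 60

power_per_launch_mw = TARGET_POWER_GW * 1000 / LAUNCHES_NEEDED
print(f"usable power to orbit per launch: ~{power_per_launch_mw:.1f} MW")
print(f"network capacity after 10 launches: ~{10 * CAPACITY_PER_LAUNCH_TBPS} Tbps")
```

Roughly 6-7 MW of usable power per launch would be far beyond any satellite flying today, which is why the whole plan hinges on Starship reaching full operational capability.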

NVIDIA has risen to the top on the back of its AI data centers. Its next big leap: cars

NVIDIA has unveiled its Drive AGX Hyperion 10 platform, a computing and sensor system designed so that any manufacturer can produce Level 4 autonomous vehicles. Uber has already signed an agreement to deploy 100,000 units across its global network starting in 2027, and Stellantis, Lucid and Mercedes-Benz have also joined the project.

Why it matters. For years, autonomous driving has been a persistent promise often wrapped in marketing. NVIDIA has turned that promise into an industrial offering with standardized architecture, certified chips and out-of-the-box simulations. It does not sell autonomous cars, but it does sell the operating system that will make them possible.

The contrast. Tesla has been selling autonomy as a leap of faith for a decade, with permanent updates, its own fleet and promises of "millions of autonomous Teslas" every year. NVIDIA, on the other hand, offers an open platform where any manufacturer can plug in its hardware. Tesla wants to be the Apple of cars. NVIDIA prefers to be something closer to Windows.

Between the lines. Automotive accounts for only 1.3% of NVIDIA's revenue, but that segment is growing faster than the rest. In any case, Uber's announcement has no real timetable for those 100,000 units, at least none that has been made public. Waymo, which has been developing its robotaxis for years, is already on its sixth generation and has the financial muscle of Alphabet behind it, yet it barely operates 2,000 of them. There is a considerable gap between ambition and reality.

The backdrop. Drive Hyperion 10 is based on two Thor chips (2,000 teraflops each), fourteen cameras, nine radars, one LiDAR and twelve ultrasonic sensors. NVIDIA has designed it with full redundancy: if a component fails, the vehicle stops safely to avoid chain errors that multiply the potential damage. Lucid will be one of the first to offer Level 4 autonomous driving to individual customers and not just fleets. Its interim CEO has admitted that so far they have disappointed in terms of driving assistance. Its bet on NVIDIA is the classic implicit admission: it is better to buy the brain than to build it.

The money trail. NVIDIA will not build robotaxis for now; what it sells is infrastructure: chips, simulation software, synthetic data… And it charges for each vehicle that uses its platform. It is a more predictable revenue model than depending on full autonomy arriving one day. Huang, in any case, has said that that moment is near. The interesting thing is not whether he is right, but that his definition no longer depends on blind faith. It depends on regulators, certifications and industrial testing. Autonomy has ceased to be science fiction and has become an engineering problem. And those problems are solved with processes, not with promises.

In Xataka | China has turned the electric car market into a crazy race. And Porsche pays for it with billion-dollar losses

Featured image | Xataka
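The redundancy behavior described above can be illustrated with a minimal sketch: a supervisor that requests a controlled stop as soon as a critical sensor or compute unit drops out. This is purely illustrative Python with invented names and thresholds; it does not reflect NVIDIA's actual DriveOS software.

```python
# Minimal illustration of a fail-safe policy: if any critical component is
# lost, the vehicle requests a controlled safe stop instead of continuing on
# degraded data. Names, counts and thresholds are invented for the example.
from dataclasses import dataclass

@dataclass
class SensorSuite:
    cameras: int = 14
    radars: int = 9
    lidars: int = 1
    ultrasonics: int = 12
    compute_units: int = 2   # the two Thor chips

    def safe_to_continue(self) -> bool:
        # Assumed minimums: losing a compute unit, the LiDAR or too many
        # cameras triggers a safe stop rather than degraded driving.
        return self.compute_units >= 2 and self.lidars >= 1 and self.cameras >= 12

suite = SensorSuite()
suite.compute_units = 1          # simulate one Thor chip failing
if not suite.safe_to_continue():
    print("degraded state detected -> initiating minimal-risk safe stop")
```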

The Internet has made data the new digital gold. And that’s why we are more fragile than ever: Crossover 1×27

The Internet is wonderful until it isn't. We have made it such an integral part of our lives that we are doing something dangerous: telling it too much about who we are and what we do at any given moment. And that has its risks. To talk about all this in Crossover 1×27 we have invited José, better known as Hackaviss, an expert in cybersecurity and digital forensics. He explains how hackers can end up stealing our data and what dangers exist on the deep web. And there are many, both on the deep web of old (Tor) and today, because Telegram is a digital underworld in itself. And they are not alone, because dangers also lurk on the Internet we all see and use every day. Hackaviss tells us about (or rather scares us with) disturbing cases in which, for example, hackers can request a loan in your name with just a photo of your ID. There are many more, but all of them share the same focus: data. Because, as this expert says, data is the new digital gold. Especially for cybercriminals, who obtain that personal information and then use it in all kinds of ways, both to impersonate identities and to exploit them to defraud people or entire organizations. The types of scams, as Hackaviss explains, are almost unlimited; in fact he reminds us of famous cases like Silk Road, Snowden or Pegasus, and then links them to modern cryptocurrency scams. There is a little bit of everything for everyone, and the conclusion is always the same: be careful how you use the Internet, because we are increasingly fragile in the network of networks.

On YouTube | Crossover

The message is clear: if you use my data without paying, prepare your lawyers

The Internet's business model has rested on a tacit agreement for decades: if something is free, the product is probably us. For years this logic was accepted without major upheaval, but the emergence of artificial intelligence is changing the rules. Platforms that store human conversations have become gold mines for training models, and that has reopened old questions about the value of data. In the midst of this new scenario, Reddit has taken a firm stand. Although its millions of users receive no compensation for the content they generate, the company has made it clear that it will not tolerate others using it without paying. Reddit's firmness has materialized in a new lawsuit filed in US courts. The company accuses Perplexity AI and three data scraping service providers of having circumvented its protection mechanisms to access copyrighted content. In its complaint, Reddit describes "scraping on an industrial scale" and maintains that the objective of these companies is to illicitly obtain the material that feeds artificial intelligence engines. It is a new chapter in a strategy to control the use of its content.

A rather particular case. At the center of the complaint are Perplexity AI and three mass data scraping intermediaries: SerpApi, Oxylabs and AWMProxy. Reddit describes them as "wannabe bank robbers," a metaphor with which the company illustrates the attempt to access its content through indirect means. Instead of signing a licensing agreement, the lawsuit claims, these companies chose to use third-party services to collect posts, comments and copyright-protected data. The conversational search engine is listed as a customer of "at least one" of those providers. The court document details a pattern of behavior that, according to Reddit, has been repeated for months. The accused companies allegedly used automated methods to extract information from the platform despite the restrictions set out in its public file. The result, the company claims, was a constant flow of posts that ended up integrated into the defendant's artificial intelligence engine. For Reddit, it is scraping "on an industrial scale" and for clearly commercial purposes.

The test that sparked it all. One of the most relevant episodes in the complaint is an experiment that Reddit considers key. In May 2024, the company ordered the defendant to stop collecting its data. However, shortly afterwards it saw an increase in Reddit mentions within the Perplexity answer engine. To verify this, it published a post designed to be visible only to Google. According to the complaint, a few hours later the full text of that post already appeared in the results generated by the accused company's system.

Perplexity is not hiding. Perplexity responded on Reddit's own platform. In that message, it explained that it is an "application layer" company and that "it does not train artificial intelligence models with Reddit content." "It never has," the text added. According to the company, this difference makes it impossible to sign a licensing agreement like those Reddit has reached with other companies. "A year ago, after explaining this, Reddit insisted that we pay anyway. Giving in to these types of tactics is not the way we do business," the statement concluded.

When there is an agreement, there is money. Reddit's position against Perplexity contrasts with the agreements it has signed with other technology companies.
In February 2024 it expanded its collaboration with Google to allow access to its content through the data API, in a structured and licensed manner. Three months later, it announced a similar alliance with OpenAI: ChatGPT and other company products can display recent Reddit posts in their responses.

What we accept (so often) without reading. Behind all this debate there is an element that many users overlook: Reddit's Terms of Service. By creating an account, each person grants the platform a worldwide, perpetual, irrevocable and sublicensable license to use their content. This license allows Reddit to copy, modify, distribute or publish any contribution, including making it available to associated companies. The text also specifies that Reddit can use this material to "train artificial intelligence and machine learning models." In other words, permission is already granted.

What we have already seen, and what remains to be seen. Reddit has been drawing a clear pattern of action for some time. In 2023 it toughened its conditions for API access, which led to widespread protests and the temporary closure of thousands of communities. A year later, in May 2024, it sent a cease-and-desist letter to Perplexity for unauthorized use of its data and subsequently filed a lawsuit against Anthropic for similar reasons. The current litigation fits that same logic: protecting the value of its content and tightening its control over who can use it. The case between Reddit and Perplexity is still in its initial phase, but its implications are evident. What the courts decide could set a precedent for future disputes between platforms and artificial intelligence developers. On one side is the defense of free access to information; on the other, the right of companies to protect the content generated in their communities. The outcome will define the extent to which platforms control the material that users share every day.

Images | Reddit | Xataka with Gemini 2.5 | Perplexity

In Xataka | The race to put a humanoid robot in our house has begun. It's an absurd race

An electric car is 54% cheaper to maintain than a combustion car. And it still may not pay off, because the data has a catch

The cost of a car is not just what you pay for it; it is the sum of many other factors. It is what it costs you to fill the tank, what it costs you to repair it and, why not, what you get back once you decide to get rid of it. Are there reasons to go electric? Yes, many. There are also reasons to stick with combustion. It depends on what you value.

The data. An electric car saves up to 54% in maintenance compared with an equivalent gasoline car. Those are Auto Bild's figures, which have been spreading through the media in recent days. Their comparison pitted a Volkswagen ID.3 against a 2016 Volkswagen Golf VII 1.6 TDI and a 2016 Volkswagen e-Golf. The result is that maintaining the electric car was between 40 and 54% cheaper than the combustion-engined versions. According to their calculations, scheduled services for the diesel version ranged from 393 to 547 euros. The plug-in hybrid's services cost between 161 and 275 euros. The electric car's service schedule required maintenance of between 200 and 300 euros. Of course, the stops were less frequent and, according to their calculations, as the kilometers piled up the pure electric ended up between 40 and 54% cheaper than its combustion "siblings."

Why? They give several reasons. First of all, as we have seen, because the services are cheaper and less frequent. Fewer components have to be replaced, so less money needs to be spent. Among their figures are oil changes (almost non-existent in electric cars), the total absence of possible combustion engine breakdowns and the replacement of wear items: timing belts, spark plugs, particulate filter… In addition, they point out that some components suffer less wear over the years and kilometers. For example, they predict a longer useful life for brake discs because, especially in the city, most braking is absorbed by regenerative braking.

And the day to day. There is another constant: on a day-to-day basis, an electric car is almost always cheaper than a gasoline car. In the city, the electric car consumes less than a gasoline or diesel car, and the latter is also exposed to more breakdowns from starting and stopping every few kilometers.

But if you want to do the math. An electric car in the city can easily run at 10-15 kWh/100 kilometers. That means that, with home charging at 10 cents/kWh, we are talking about between one euro and one and a half euros per 100 kilometers. In the city, compared with a hybrid that consumes 4 liters/100 km, we are talking about a difference of more than five euros a day. If it is a gasoline car that uses around 7 l/100 km in the urban environment, the difference rises to nine euros. It is on long trips that things even out. If an electric car consumes 18-20 kWh/100 km and recharges at 0.50 euros/kWh, we are talking about between 9 and 10 euros to travel 100 kilometers, figures very similar to gasoline. Charged at an ultra-fast charger at about 0.80 euros/kWh, gasoline or diesel wins by a wide margin.

Yes, but. In other words, the electric car is cheaper. Almost always, but not always. First, because the comparison that has gone viral has something of a catch: the data is from 2021. The electricity figures above, for example, are current and less favorable to the electric car.
However, as we have seen, those who use the car mainly in the urban environment are very likely to find it worth opting for this technology. That said, the latest data collected by ADAC (the German equivalent of the RACE) is not so optimistic. In that case they talk about savings of between 20 and 30% in favor of the electric car. That is, electric still wins, but the margin is narrowing.

And if…? Calculating what one saves with an electric car is not entirely simple. For example, right now you can calculate how much money you would save in regulated parking areas in those cities that offer parking discounts. And you can do the math counting on the MOVES III Plan, although in some autonomous communities that is not entirely guaranteed. But not only that: when calculating what a car costs we can keep in mind its selling price, whether the electric version carries a premium, the expected savings with our type of use and the kilometers to be traveled… but what happens if we want to sell the car? In that case, the electric car seems to lose out. Right now it is a technology that depreciates quickly, because batteries degrade over time (range is reduced) and innovations are making cars obsolete very quickly while new models keep cutting their prices. In other words, on the second-hand market the electric car has every reason to keep losing value.

So what do I do? The first thing we recommend at Xataka is that you be very clear about the kind of use you are going to give the vehicle. Be as rational as possible or, at least, be very clear about what you value above all else. If you want a car that stirs passion and money is no object, get the vehicle you like the most. Here, however, we are here to talk about money. If you want a tightly budgeted car, calculate the daily kilometers you travel and the kinds of trips you make, and work out the battery size you need. Of course, if a small car with a 50-60 kWh battery is enough, keep in mind that you will have to make concessions when you travel. In that case, only you can put a price on your time. With all this in mind, do the following math: Cost …
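Since the article stops short of the final arithmetic, here is a minimal sketch of the per-100 km comparison using the consumption and price figures quoted above. The gasoline price per litre is an assumption (the text does not give one); the other numbers mirror the article.

```python
# Cost per 100 km under the consumption and price figures quoted above.
# The fuel price per litre is an assumption; the rest follows the article.
def cost_per_100km(consumption_per_100km: float, unit_price: float) -> float:
    return consumption_per_100km * unit_price

scenarios = {
    "EV, city, home charging (12.5 kWh @ 0.10 EUR/kWh)": cost_per_100km(12.5, 0.10),
    "EV, highway, fast charger (19 kWh @ 0.50 EUR/kWh)": cost_per_100km(19, 0.50),
    "EV, highway, ultra-fast (19 kWh @ 0.80 EUR/kWh)":   cost_per_100km(19, 0.80),
    "Gasoline, city (7 l @ 1.60 EUR/l, assumed price)":  cost_per_100km(7, 1.60),
    "Hybrid, city (4 l @ 1.60 EUR/l, assumed price)":    cost_per_100km(4, 1.60),
}
for label, cost in scenarios.items():
    print(f"{label}: {cost:.2f} EUR / 100 km")
```

Under these assumptions the urban gap is large and the long-distance gap all but disappears (or reverses at ultra-fast chargers), which is exactly the pattern described above.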

Data centers do not want to depend on the conventional electrical grid. Their solution: build their own power plants

AI data centers have sparked a new fever: the so-called "bring your own power." The demand and consumption pressure these facilities impose is so enormous that they do not want to depend on external sources. The solution is theoretically simple, and we are already seeing how, when a new data center is built, it is becoming normal for some type of power plant to be built next to it.

We are seeing it now. The data centers that OpenAI and Oracle are building in West Texas are accompanied by the construction of a natural gas power plant. Both xAI's Colossus 1 and Colossus 2 in Memphis run on gas turbines. And as The Wall Street Journal also notes, more than a dozen Equinix data centers across the US are powered by stand-alone fuel cells. If the conventional electrical grid cannot be used, no problem: you build a power plant and that's it.

The US has an electricity problem. The technology giants would prefer to connect to the conventional grid, but supply chain bottlenecks, bureaucracy (permits, licenses) and the slowness of building the necessary transmission infrastructure prevent it. According to the firm ICV, the United States would need to add about 80 GW of new generation capacity per year to keep pace with AI, but right now less than 65 GW per year is being built. There is another direct consequence of this problem: rising electricity bills.

Data centers that look like cities. The needs and ambition of AI companies have turned data centers into compute and resource consumption monsters. A single one can consume as much electricity as 10,000 stores of the Walmart chain, the WSJ estimates. Before 2020, data centers represented less than 2% of US energy consumption. By 2028 they are expected to represent up to 12%. A 1.5 GW data center, for example, would have consumption similar to that of the city of San Francisco, with about 800,000 inhabitants.

China has a big advantage over the US here. While the US deals with that lack of power, China keeps investing in new energy generation. According to data from the National Energy Administration, the Asian country added 429 GW of new generation capacity in 2024, while the US added only 50 GW. It is true that China has four times the population, but its centralized planning is helping it avoid the problems that afflict the US electrical grid.

The white knight to the rescue. Faced with this shortage, natural gas has become the preferred resource for on-site energy generation. Although large turbines have long delivery times, smaller turbines and natural gas fuel cells are being used because of how quickly they can be obtained and installed.

Renewables lose steam. Meanwhile, things are not promising for renewables (solar and wind, especially). There are about 214 GW of new generation theoretically planned, but spending on these technologies could decline due to the potential loss of tax credits: the Trump administration argues that those clean energies do not provide the constant flow AI needs.

The nuclear alternative. Against this apparent decline, there is growing interest in small modular reactors (SMRs), which offer the advantages of nuclear power with a flexibility that can be very attractive for AI data centers. Amazon, Google, Meta and Microsoft are betting part of their future on nuclear power, but that doesn't mean there aren't challenges to overcome.
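Two of the figures above are easier to grasp with a quick calculation: the annual generation shortfall and what a 1.5 GW data center draws over a year. The utilization factor is an assumption; the GW figures are the ones quoted in the text.

```python
# Quick arithmetic on the figures quoted above: the US generation shortfall
# and the annual consumption of a 1.5 GW data center. The utilization factor
# is an assumption for illustration.
NEEDED_GW_PER_YEAR = 80
BUILT_GW_PER_YEAR = 65
DATACENTER_GW = 1.5
UTILIZATION = 0.9            # assumed average load factor
HOURS_PER_YEAR = 8760

shortfall_gw = NEEDED_GW_PER_YEAR - BUILT_GW_PER_YEAR
annual_twh = DATACENTER_GW * UTILIZATION * HOURS_PER_YEAR / 1000

print(f"annual capacity shortfall: ~{shortfall_gw} GW per year")
print(f"1.5 GW data center:        ~{annual_twh:.1f} TWh per year")
```

A shortfall of roughly 15 GW compounds every year, which helps explain why on-site generation has gone from a workaround to the default plan.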
Image | Wolfgang Weiser

In Xataka | World record in nuclear fusion: the German Wendelstein 7-X reactor has broken all records
