Telefónica has achieved its best portability data in 25 years. It’s a sign that something is changing.

Between July and September, Telefónica has achieved 80,000 net additions due to portability – mobile and landline combined –, the highest figure since this mechanism was implemented in 2000, according to the latest data reported by Expansion. The data continues to go bankrupt for a quarter of a century, losing customers almost uninterruptedly. Since May 2024, the operator has had 17 consecutive months of positive results in mobile, a streak that it only shares with Digi. Why is it important. Portability measures who best understands what the user wants and who executes it. It’s not statistical noise: it’s money, market share and retention capacity. Telefónica had been the big natural loser of the system for decades—it came from a monopoly so it had the largest base as well as the highest prices—but now it reverses the equation. Something has changed, either in its proposal or in the market. Or both. The figures: In mobile, Telefónica has added 64,000 net lines in the quarter, compared to 45,000 in the same period of 2024. So far this year, it has accumulated 135,000 new lines, almost ten times the 14,000 in the first nine months of last year. In fixed terms, it achieved 16,000 quarterly registrations, its best historical record, and has had a positive six months. It is the first time that it has achieved two consecutive quarters of winning in both markets at the same time. The contrast. If Telefónica and Digi grow, MasOrange and Vodafone sink: MasOrange has lost 138,000 mobile lines in the quarter – 438,000 so far this year, 50% more than in 2024. Vodafone gave up 91,000 lines in the third quarter and 272,000 in the accumulated annual period. Digi, for its part, adds 177,000 quarterly registrations, 21% more than a year ago, and leads the acquisition with 605,000 lines gained between January and September. Between the lines. The market is polarizing: Telefónica retains and attracts the premium customer, who values ​​service, network and stability over price. Digi sweeps the segment low cost pure, where only the cheapest rate matters. The operators in the middle—MasOrange with its cheap legacy brands, Zegona’s Vodafone dragging problems from the past—they lose on both sides. Yes, buteither. MasOrange faces a structural problem: many of its brands—MásMóvil, Yoigo, Pepephone, Simyo—have customers who are hypersensitive to price, willing to jump at the first cent difference. Vodafone, for its part, still bears the consequences of quit football in 2018a decision that caused a mass exodus and from which it has never fully recovered. Now add the uncertainty of Finetworkin pre-contest and losing 48,000 lines in the quarter. The backdrop. To find a quarter similar to Telefónica’s current one, you have to go back to 2018, when Vodafone left football and the historic operator gained 66,000 net lines. But that was temporary, a gift from the competition. This is different: Telefónica has been winning in mobile for 17 months without any rival having made a catastrophic mistake. It is sustained improvement. Small virtual operators are also beginning to disappear from the map. In the third quarter they have lost 11,000 net lines, compared to the 9,000 they gained a year ago. Digi is sweeping them away. The market is simplified: the big ones with the muscle to invest in the network remain (Telefónica, MasOrange, Vodafone) and the disruptor low cost (Digi). The rest, adrift. In Xataka | Telefónica is about to surprise itself: its future is no longer in communications Featured image | Telephone

We’ve been obsessed with strong passwords and public Wi-Fi for years. It turns out that the data sink was in the satellites

While we worry about choose strong passwords and Don’t let the neighbor steal our WiFiit turns out that anyone can capture private data simply by pointing a dish at a satellite. It is not a government conspiracy, it is what some Californian researchers have discovered using a piece of equipment that only costs $800. What has happened? They count in Wired that several researchers from the universities of California and Maryland have been capturing communications from various satellites for three years. During this time they have collected a huge amount of private data. Among the information collected there is data on calls and messages from users of various operators, the pages visited by airplane passengers who used WiFi on board, communications between different critical infrastructures such as oil platforms or electrical companies and even police and military communications that revealed the position of their equipment. Why it is important. According to the study’s conclusions, it is estimated that around half of the signal from geostationary satellites carries sensitive information of consumers, companies and also governments. We strive to protect our WiFi networks, our online accounts or mobile devices, but the results of the research make it clear that satellites are a critical element through which data can also be leaked. A basic equipment. What is striking is that the researchers did not use super complex technology to obtain these findings. They simply placed a satellite dish on the roof of a university building and started pointing it at the satellites. They only invested $800 in the entire equipment. The data they obtained is only from the satellites that they could capture from their position in southern California, which according to their calculations is 15% of the total, so logic leads one to think that the amount of sensitive data will be much larger. In addition, it also shows that anyone could do it from another part of the world. Operators. The most significant data came from telephone providers, mainly T-Mobile, but also Telmex and AT&T México. In just nine hours of communications logging, researchers were able to collect the phone numbers of more than 2,700 T-Mobile users, as well as text messages and phone calls. After contacting T-Mobile to alert them, the company took steps to encrypt the data. AT&T also fixed this and claimed it was due to a satellite provider failing to configure some towers in a region of Mexico. Telmex has not said anything about it. Military and police data. That anyone’s data is exposed is already problematic, but that it is data from the army and security forces adds another layer of seriousness. Investigators were able to intercept communications between US military ships and the names of those ships. Since they were in Southern California, they also obtained data from Mexican authorities, including transmissions of confidential information about ongoing operations. “When we started looking at military helicopters, it wasn’t the sheer volume of data that worried us, but rather the extreme sensitivity of that data,” says Aaron Schulman, co-director of the research. Cybersecurity in space. In August of this same year, researchers found several vulnerabilities which, under certain conditions, could allow remote control of satellites. At the beginning of the Ukrainian war, Russia carried out a cyber attack against ViaSat which affected thousands of users. Cases like these highlight the need to bring the cybersecurity debate to space systems as well and not just terrestrial systems. Image | SpaceX on Pexels In Xataka | There are so many satellites orbiting the Earth that Starlink has a new concern: avoiding colliding with them

The Police have arrested two minors as alleged perpetrators of a leak. Data from Pedro Sánchez and ministers were included

Two minors have been arrested by the National Police for their alleged involvement in a massive leak of personal data that included information from President Pedro Sánchez, ministers such as Margarita Robles and José Manuel Albares and members of the CNI. Police sources confirmed the arrests The Vanguard and The Countryin an operation linked to the publication of that data on the Internet. The arrests were carried out on September 26 in two different points: Catalonia and Castilla-La Mancha, as detailed by La Vanguardia. Both are part of the open investigation to clarify the origin and scope of the leak, which affected Government and CNI authorities. The investigations focused on reconstructing the path of the data and determining how its dissemination occurred. How the arrests were made. The General Information Commissariat of the National Police was in charge of the arrests. The details of the operation are not known, but it was probably kept under wraps until the agents managed to identify the suspects and arrest them in Catalonia and Castilla-La Mancha. Minors and degree of involvement. Those arrested are two minors, according to the aforementioned media. One of them appears as the main person involved in the leak, while the other would have had less responsibility. Researchers are now working to define the participation of both and clarify how they accessed the data that ended up published on the Internet. Screenshot of the message posted by N4t0x The message and the tool. The investigation focuses on a message spread by the user who called himself N4t0x, who claimed responsibility for the leak on a cybercrime forum. In it he claimed to have achieved, together with other people, “a mega leak of personal data of the vast majority of Spanish politicians” using a tool called SpainData. In the publication, the group detailed that this tool allowed information on the entire population to be consulted and announced that the leak would be “free and public.” Scope and verification. Although the N4t0x group claimed to have obtained information from the entire Spanish population, these claims have not been officially verified. The police sources cited by the media have limited themselves to confirming the investigation and the arrests, without specifying the content or magnitude of the files. Nor has it been proven that the SpainData tool works as the author described in his message. Investigation under secret. On September 22, 2025, RTVE.es reported that the National Court was secretly investigating a new leak of personal data attributed to N4t0x, which affected President Pedro Sánchez, several ministers and members of the CNI. Judge Antonio Piña, of central court number 6, assumed the first proceedings and decreed the secrecy of the proceedings. Days later, on September 28, Europa Press pointed out that The Information Services and the Spanish justice system had intensified the search for alleged actors involved such as N4t0x. Images | National Police (1, 2) | Screenshot In Xataka | If your home is robbed and you have the recording, this security camera company will pay you for it

Data centers shooting the light of light in the houses

Spain is betting very strong for the development and creation of new data centers. The AI ​​boom has infected our country, and although that attracts investment and economic capital, it can also lead to serious problems for consumers. Especially a very clear one: that we pay more for the light. Spain, care with data centers. In recent months we have seen how Data centers construction projects grow In our country. It is estimated that in the Community of Madrid there will be a power of 1.7 GW in 2030, which is paradoxical, because it is the region with greater energy deficit in Spain and the one that is staying with a good part of these projects. Aragon, in another league. Aragon has so many projects of data centers that He showed his disappointment When he knew that the reinforcement of the electricity network for those facilities throughout Spain will be 3.8 GW. Those responsible for the Aragonese government described the figure of “scarce”, especially considering that this region has projects that projects that projects that projects that They would exhaust that capacity alone. The US teaches us (worrying) future. A Bloomberg investigation It reveals how in the last five years the creation of new data centers is making the light invoice rise remarkably. Those centers, previously dedicated to expanding the cloud infrastructure and now totally focused on the AI ​​boom, are behind that increase in light invoice. Energy consumption is triggered in these facilities, and ends up affecting electricity prices in surrounding regions. Prices that almost quadruplic. In 2020 Baltimore residents paid average $ 17 per MW/h. In 2025 that price is $ 38 per MW/h. In Buffalo the thing is even worse, and prices have tripled in five years: they have gone from $ 11 to 33 per MW/h. In the areas of the United States close to large concentrations of data centers, the wholesale price of electricity has risen up to 267% In the last five years. LMP are nodes of the electricity grid that determine the wholesale price of electricity. Almost three out of four have seen price increases when they are close to data centers. Those who are in farther areas have come to see their reduced prices. Source: Bloomberg. Unequal climb. The study reveals how wholesale electricity prices in the US have increased significantly in recent years, although it is true that these increases have been applied unequally at the geographical level: certain areas have seen modest increases, but others have seen how the light of the light was fired and grew up to that aforementioned 267%, close to quadruple. The condemnation of data centers. 70% of the points at which price increases were recorded are less than 80 kilometers of data centers with significant activity. It is a fact that makes it clear that the impact of these data centers on the light of the residents is clear. And this goes to more. Current estimates, Indicates Bnefprevent the energy demand for data centers in the US will double for 2035 and will be the largest increase in energy demand since the 60s. Thus, in ten years that demand will represent 9% of the total. At the global level, the data centers are expected to consume more than 4% of the electricity consumed in 2035. If those facilities were a country, they would be the room in energy consumption, only behind China, USA and India. Perfect storm. The demand is also linked to the rise of cryptodivisas, the impulse of manufacturing in the US and the “electrification of the economy”, which includes areas such as electric vehicles or domestic heating systems. The withdrawal of traditional mining facilities in areas such as Baltimore has only aggravated the economic problem: there is less energy supply and more demand, which makes prices again increase. The world already knows what comes over and is reacting. What is happening in the US is already causing reactions in other countries. Holland: Water and energy needs made the Amsterdam City Council in 2019 imposed A moratorium for the construction of new data centers. Singapore: also established a pause for the creation of this type of facilities between 2019 and 2022, although the government made clear which would be more selective in future projects. Ireland: In 2024 the country reached a worrying milestone. Data centers They already consumed more than households. 5% of the country’s total consumption went through in 2015 to 18% in 2022 and 21% in 2023. Household consumption represented that year 18%. The solution: that the Big Tech pay that invoice. Public service companies in the US such as Dominion Power are clear that “data centers should pay the full cost of their energy consumption.” Large technological ones know very well that these facilities raise an extraordinary energy demand, and They are investigating solutions like him Use of SMR reactors for their AI data centers. The idea is interesting, But complex. Supply and demand. Spain faces a future in which energy supply and demand could be unbalanced as it is already happening in the United States. If data centers begin to impose more and more load on the network, it is reasonable to think that the cost of electricity increases and causes the least desirable effect for users: the rise in the light invoice. Renewables could help mitigate the problem, but only If the network is capable of absorbing Both the new generation and the new mass demand of the data centers. Image | Microsoft

Data centers for AI are an energy hole. Jeff Bezos’s solution: Build them in space

In the next two decades we will see data centers at Gigavatio scale orbiting the Earth. Or at least that is the prediction that has launched The founder of Amazon and Blue Origin, Jeff Bezos. He said it during his speech at the Italian Tech Week in Turin, where he was able to establish conversation with John Elkann, president of Ferrari and Stellantis. Bezos’s proposal. Space data centers would take advantage of solar energy 24 hours a day, cloudless, rain or night cycles that interrupt the supply. According to Bezosthese “giant training clusters” of artificial intelligence would be more efficient and, eventually, more economical than terrestrial facilities. “We can exceed the cost of land data centers in space in the coming decades,” he said. Why now talks about this. The infrastructure demand for AI is becoming a large hole for the planet. Current data centers consume massive amounts of electricity and water to cool its servers, a problem that is aggravated with each new artificial intelligence model. Given this pressure, large technology explore alternatives: from Locate them in ships o Nordic countries until sink into the ocean. And of course, if we have capacity problems on Earth, some technological ones already think about taking the letter to send them to space. The technical advantages. In space, temperatures range between -120 ° C under direct sunlight and -270 ° C in shadow, which would greatly simplify equipment cooling. Constant solar energy would eliminate dependence on land electrical networks. Bezos places this development as’Natural evolution‘of a process that has already begun with weather and communications satellites. “The next step will be the data centers and then other types of manufacturing,” he explained. The real challenges. As they point out from Tom’s hardwarebuilding a spatial data center of a Gigavatio would require solar panels that would cover between 2.4 and 3.3 million square meters, with an estimated weight of 9,000 to 11,250 metric tons only in photovoltaic material. Transporting all that equipment to space would cost between $ 13,700 and 25,000 million with current technology, needing more than 150 launches. To this is added the difficulty of maintenance, updates and the inherent risk of space releases. Parallelism with AI. Bezos compared The current moment of artificial intelligence With the bubble Puntocom of the early 2000s. “We should be extremely optimistic about the social and beneficial consequences of AI,” he said, although he warned of the possibility of speculative bubbles. His message: Do not confuse possible excesses of the market with the reality of technological advances, whose benefits consider that “they will spread widely and reach everywhere.” When It will be done reality?. Bezos places the temporary horizon “in more than 10 years, but no more than 20”. Today, the project is commercially unfeasible, but its vision starts from the premise that the launch costs will continue to go down and the technology will mature. It remains to be seen, after two decades, part of our digital infrastructure is in orbit, beyond the existing one. In Xataka | Nvidia has control of the most powerful chips of AI: OpenAi, Broadcom and TSMC want to end their XPUS

There is a perfect storm with AI and data centers. And it will cause the DRAM and Nand memories to become a luxury

A remarkable rebound to the prices of NAND memories (used in SSD units) and the drams of our PCs and laptops is coming. For two years users We have benefited From a time of bonanza in these components, but that ends. And the AI ​​and fever has the fault to create more and more data centers. When the memoirs were cheap. In March 2023 The prices of the NAND and dram memories were falling to lead. The pandemia had caused an extraordinary demand, but once the confinement ended, the situation was invested. The manufacturers had produced too much – waiting for the demand to remain – and found an exaggerated inventory. People no longer wanted so many memory modules or so many SSD units, and prices collapsed. AI changes everything (and this, too). The effects of that imbalance have been extended for two years, but the arrival of Chatgpt caused a Fever by the AI ​​that has ended up causing another fever: that of the data centers. These facilities use thousands of GPUS and these GPUS make use of huge amounts of memory. Above all, HBM memories that Since its creation They were oriented to business applications: they were much more expensive, but also much more powerful. Price evolution of SSD Samsung 980 Pro of 1 TB. In mid -2023 the units raised their lowest price. From there, the price began to rise. Source: Camelcamelcamel. Price increases will go to more. SSD units such as Samsung 980 Pro of 1 TB are a good example of what is happening. In Camelcamelcamel We can see that evolution of prices that marks minimal in mid -2023 and then rise. These units have been replaced by the 990 pro of 1 TB with an evolution less pronounced in the increasestrue (in fact, it is around 100 euros, an interesting price), but everything indicates that this curve will soon follow the tendency of its predecessor. The forecasts of the Trendforce consultant are clear: the DRAM and NAND memories are going to climb a lot and very fast. And the DRAM memories will also go up. The prediction is the same for the DRAM memories market that, for example, are used in the DDR4 and DDR5 memory modules of our PCs and laptops. According to Trendforce In the third quarter of 2025 – which has just begun – we will see a rise of more than 40% in DDR4 memory modules. In the case of GDDR5 memories we will have a break and the climb will reach 8%. Most expensive pcs and gaming. This type of increases especially affects the end users who buy PCs and laptops to work, but also to play. Memory modules for graphics cards will also notice this quarter notably according to Trendforce. The GDDR6 memories will do it up to 33% and the GDDR7 up to 10% according to their estimates. HBM memories to power. Data centers that now all large technological ones are rushing to build need huge amounts of memory, and that is conditioning the balance between supply and demand both in the business market and in the market for end users. In fact, memory manufacturers are increasingly focusing on focusing production on HBM memories – used in AI accelerators – and leaving traditional DRAM and Nand memories. Micron points out that its production of HBM modules for all 2026 It is already soldand SK Hynix seems to be in a similar situation: the demand for these modules is extraordinary. The Raspberry Pi as an example. We are already seeing the consequences of this type of movements. The Raspberry Pi, who had gathered memory modules during the bonanza season, were forced to raise the prices of the new models a few days ago for the shortage of memory. Thus, the Raspberry Pi Compute Module 4 and 5 in their 4 GB variants rose five dollars, and those of 8 GB rose 10 dollars. The company’s own CEO, Eben Upton, explained that “memory costs about 120% more than it cost a year ago.” Why not create more memoirs? The solution seems obvious: if more memories are needed, more factories should be created. However, manufacturers are reluctant to this for several reasons. The first, the enormous cost of these plants, which amounts to tens of billions of dollars. The second, that these factories take years in come to produce. And the third, who do not want there is a “AI bubble” and this explodes would make them meet again with an exaggerated inventory and some factories that they no longer need. Bad matter. Image | Samsung In Xataka | Samsung has its greatest competitor at home. His future with the chips depends on his rivalry with SK Hynix

Drastically reduce the consumption of data centers is crucial for AI. And China has had an idea: to submerge them in the sea

China is About to submerge a data center In the sea, near Shanghai, as a solution to a problem that we will gradually begin to see more: Great energy consumption of the AI. The installation, which will come into operation in mid -October, is one of the first commercial projects of this type in the world and points to a new way of cooling servers without depending on traditional cooling systems that devour electricity. The background problem. Data centers are the backbone of the Internet and AI, but They generate huge amounts of heat. Keeping them refrigerated by air conditioning or evaporation of water consumes a brutal amount of energy, and with the rise of artificial intelligence, the demand of these facilities has shot. China seeks to reduce the carbon footprint of this critical infrastructure, and its commitment It goes through sinking it underwater. How it works. The yellow capsule that They have built Near Shanghai houses servers that remain cold thanks to the ocean currents, without the need for active cooling systems. According to Yang Ye, vice president of Highlander, the maritime company that develops the project with state companies, “underwater operations have inherent advantages” and can save approximately 90% of the energy for refrigeration. The installation will extract almost all its electricity from nearby marine wind farms, with more than 95% renewable energy. The technical challenges. Putting servers under the sea is not easy. They must be protected from the corrosion of salt water, for which they use a special coating with glass scales on the steel capsule. Also They have installed An elevator that connects the main structure with a section that remains on the water, allowing the access of maintenance equipment. Another challenge is to build the Internet connection between the Submarine and Tierra Firme Center, a more complex process than with conventional facilities. Universities researchers in Florida and Japan They have warned In addition to these centers could be vulnerable to attacks by sound waves driven by water. Environmental doubts. Although the project promises to reduce emissions, questions remain about its ecological impact. The heat emitted by servers could alter the surrounding marine ecosystem, attracting some species and driving others. Andrew Want, marine ecologist from Hull University, Point out That “these are unknown aspects at this time, sufficient research is not yet being carried out.” Highlander says that an independent 2020 evaluation on its test project in Zhuhai indicated that the water remained well below the acceptable temperature thresholds, but Shaolei Ren, an expert from the University of California in Riverside, warns That climbing these centers will also climb the heat emitted. There are few precedents. Microsoft tested this technology off the coast of Scotland in 2018, recovering the capsule in 2020 after declaring that The project had been completed successfully. However, he never marketed it. The Chinese project advances with the support of government subsidies: Highlander received 40 million yuan for A similar project in the province of Hainan in 2022, which is still operational. The installation of Shanghai will serve clients such as China Telecom and a state computing company of AI. What comes now. Experts agree that these underwater centers will probably not replace the traditional ones, but will complement the existing infrastructure in specific niches. According to Rencurrent projects seek to demonstrate “technological viability”, but much remains to be resolved before a massive deployment. What is clear is that, if these types of projects face all technological challenges and manage to greatly reduce the energy consumed of the data centers, it will be a great point in favor for the company that manages to provide its solution in the AI ​​race. Cover image | AFP In Xataka | China was the great pollut the planet: now it is emerging as the first “electrostate” in history

The mechanics of Spain are increasingly trusting Asian cars. And maintenance data are right

“If you want a car that does not fail, buy a Toyota or a Honda.” This phrase is a classic when it touches Buy a carwe want one that passes as little as possible through the workshop and we start asking acquaintances. It has an obvious problem: it is a bias limited affirmation and experience, but there are more and more voices that point that, perhaps, it does not go so disenchanted. Because there are countless brands, each with its reputation, but beyond the film that each company wants to tell us to close the embers to its flame, there are two elements of help when buying a car that does not fail: Reliability surveys and experience of mechanics. The mechanics. “I think the Asian market has eaten the European, when it was the other way around.” This is the opinion that the mechanic Kike Ferrer shared A few months ago, pointing out that “the best cars you can buy are Asians, they are the ones that are least repaired in the workshops” That is an important point because when they are under guarantee, it does not bother to have to visit the workshop, but the problems come when the Reliability outside that guarantee framework It begins to be compromised. There may be minor arrangements, but also many who bite in their pocket. The mechanic Carlos Pérez has also commented Recently that “Honda, Mazda, Toyota … are very reliable cars, well manufactured and that endure many kilometers without giving problems. Nor must we forget Kia and Hyundai, Korean and a quality guarantee.” The statistics. Kike’s is not an isolated opinion, and to check it, you have to go to statistics. This type of surveys and lists They are always controversial Because reliability is measured for both software and mechanics and, above all, because it is segmented to the North American market. Two of the most important are that of Consumer Reports and that of JD Power. Consumer Report Analyze 20 problematic areas of the car (ranging from the brakes to the engine, through broken beautifiers, potential problems outside battery, transmission and problems that affect the electric and hybrids as the battery and load) and compares with the historic since 2000, giving a score from 1 to 100 to each car. JD Power also handles reports sent by thousands of consumers. They are not simple surveys, but a thermometer that the automobile industry takes very seriously (so much that some of its findings They have forced brands to take action (such as the failure in the rotary engines of Mazda in 1973). That said, Asians such as Toyota, Lexus, Subaru, Honda, Hyundai or Mazda usually occupy the top positions. OCU table The OCU. Within each brand, obviously, there are better or worse models. For example, within Toyota, the BZ4Xlowered the score of the brand in one of the surveys. Interestingly, it is the basis of Subaru’s single, and also lowers the average of this company. But of course, when we say that it is a biased list, we mean that, although there are models that we share between markets, the European and the American are not the same. There the OCU comes into play and Whatar. Let’s go with the seconds. It is a British media that conducted a survey of almost 30,000 drivers to find the most reliable models of Great Britain. The same as before: the ones that have gone through the workshop. The result is that Lexus and Toyota (same group) led the list with the Lexus NX and the Toyota Aygo X. The following were the Mini Countryman, the Audi Q3 and the Kia Picanto. Regarding the OCU, with other almost 30,000 people surveyed in Europe and information of 276 models, In the 2024 table We see that the Top 10 is occupied by eight Asian brands and the only non -Japanese/South Korean are Cupra and Smart. And the Chinese? In the different surveysthere are names that usually exchange positions, but there are three elements that do not vary. The first thing is that Lexus is usually the first. The second is that the Japanese and South Korean are the ones who take the top positions and the third is that those brands that we see more and more through the streets are missing: the Chinese. In Kike Ferrer’s interview to Adrian.gmartin, the mechanic mentions To the Chinese cars among those reliable Asians, but with an asterisk: the spare parts logistics that, in their opinion, is not yet up to it, sharing an opinion with Pérez. Brands like mg (of the best selling in Spain) or byd are the ones that we begin to see the most in the streets and, although it will have to spend time for official guarantees to be exhausted and we start having data on how much they pass through the workshop, there are already authorized voices that approve Chinese cars. For example, Enroncap, the body responsible for granting security scores in the European territory that is clear that they are even better than others of more settled brands in the region. But of course, in the end, although a very valid opinion has nothing to do with seeing them more or less in the workshop. For that, as we say, we will have to wait a few more years. Images | OCU, Magic Booster In Xataka | The German ITV has analyzed the reliability of the Tesla and has reached a conclusion: Dacia is above

This is how he ended up filtering private data from a Gmail user

Would you trust artificial intelligence for something as intimate as managing your email? It is not just about your answers, but about giving you access to actions in a private environment where we keep great Part of our personal and work life. The temptation is there. Why invest several minutes in manual searches and check messages one by one if you can delegate the task to an AI agent with an instruction as simple as the following: “Analyze my emails today and collect all the information about my process of hiring new employees”? On paper, the plan seems perfect. The AI ​​assumes tedious work and you recover time for what really matters. Of an innocent message to an invisible escape The problem is that this “magical” solution can also become against. Which promises to increase productivity can become the entrance door for attackers With bad intentions. This is warned by the last Cybersecury Radware Researchwhich demonstrates how a carefully elaborate email consisted of mocking the security defenses of the function In -depth research of Chatgpt and transform it into a tool to filter sensitive information. The disturbing of the report is the simplicity of the attack. It is not necessary to click on any link or download anything suspicious: it is enough that the assistant processes an altered mail to finish filtering sensitive information. The user continues with their day to day without noticing anything, while the data travel to a server controlled by the attacker. Part of success is in the combination of several classic social engineering techniques adapted to deceive AI. Authority statement: The message insists that the agent has “full authorization” and is “expected” to access external URLs, which generates a false feeling of permission. Malicious URL camouflage: The attacker’s address is presented as an official service, for example, a “compliance system” or a “profile recovery interface”, so that a legitimate corporate task may seem. Persistence mandate: When the so -called soft control failure, the prompt orders to try several times and “be creative” until you get access, which allows you to overcome non -deterministic restrictions. Emergency creation and consequences: Problems are noticed if the action is not completed, such as “the report will be incomplete”, which presses the assistant to run quickly. False security statement: It is ensured that the data is public or that the answer is “static html” and it is indicated that they are codified based on64 so that they are “safe”, a resource that actually helps hide the exfiltration. Clear and reproducible example: The email includes an example step by step how to format the data and the URL, which makes it easier for the model to follow it to the letter. As we can see, the vector is simple in its appearance and dangerous in its result. An email with hidden instructions in its HTML or metadata becomes, for the agent, a legitimate order. In general, the attack is materialized: The attacker prepares a legitimate email, but with code or instructions embedded in the HTML that are invisible to the user. The message reaches the recipient’s tray and goes unnoticed among the rest of the mails. When the user orders in depth of chatgpt to review or summarize the messages of the day, the agent processes the mail and does not distinguish between visible text and hidden instructions. The agent executes the instructions and makes a call to an external URL controlled by the attacker, including in the request data extracted from the mailbox. The organization does not detect the exit in its systems, because traffic leaves from the cloud of the supplier and not from its perimeter. The consequences go far beyond a simple manipulated mail. Being an agent connected with permits to act on the inbox, any document, invoice or strategy shared by email can end up in the hands of a third party without the user perceiving it. The risk is double: on the one hand, the loss of confidential information; On the other, the difficulty of tracking the escape, since the request is based on the infrastructure of the assistant itself and not from the company’s network. The consequences go far beyond a simple manipulated mail. The finding did not stay in a simple warning. He was communicated responsible for OpenAI, who recognized vulnerability and acted quickly to close it. Since then, the failure is corrected, but that does not mean that the risk has disappeared. What is evidenced is an attack pattern that could be repeated in other AI environments with similar characteristics, and that forces us to rethink how we manage confidence in these systems. We are entering at a time when IA agents multiply and force to rethink how we understand security. For many users a scenario such as we have described is unthinkable, even for those who have an advanced level in computer science. There is no antivirus that free us from this type of vulnerabilities: the key is to understand what happens and anticipate. The most striking thing is that attacks begin to look more like an exercise of persuasion in natural language than to a line of code. Images | Xataka with Gemini 2.5 Pro In Xataka | China has the largest censorship system in the world. Now he has decided to export it and sell it to other countries

A Microsoft Data Center in Mexico collided with the reality of the electricity network. Your solution: use gas generators

Artificial intelligence has become daily, but behind each consultation to tools such as Chatgpt either COPILOT There are real buildings that consume a lot of energy and require reliable infrastructure. In that framework, Microsoft announced May 7, 2024 The beginning of operations of its “Central Mexico” data centers region, with several locations in the Querétaro Metropolitan Area. The deployment, however, coexists with very specific tensions: According to the companyat least one of those centers, that of Columbus, cannot benefit from the advantages of the electricity network until mid -2027 and obtained permission to temporarily operate with gas generators. It should be remembered that the proximity of these infrastructure to users is essential: it reduces latency, improves the quality of the service and allows to meet data residence requirements. But that technical advantage depends on something elementary: having an electricity grid capable of sustaining permanent operations and constant cooling. Microsoft stressed the magnitude of its project in the North American country. The new region aims to offer local access to Azure, Microsoft 365Dynamics 365, among other services. The firm also presented the initiative as an “avant -garde” infrastructure aimed at accelerating innovation in the region. The Achilles heel of deployment: energy In a request to the Ministry of Environment delivered in 2023Microsoft acknowledged that, although the data center would connect within the planned deadlines, due to the construction deadlines included in its contract with the Federal Electricity Commission, the energization of the connection would not be ready until the Second quarter of 2027. To save that void, The use of seven generators was approved capable of covering 70% of the demand of the center of Columbus for 12 hours a day, for at least four months. According to Rest of World, Mexico already has about a hundred data centers, with investments that exceed 7,000 million dollars from 2020 by Microsoft, Aws and Google. Querétaro has established itself as the main attraction pole, with 15 facilities that concentrate about 80% of the sector’s energy demand, about 200 MW. The Mexican Institute for Competitiveness projects thatby 2030, the network will face a deficit of 48,000 MWh, more than half of what it produced in 2023. With more than 70 new centers planned in the next five years, the mismatch between installed capacity and electric transmission becomes an obvious threat. The American company has set ambitious environmental goals: Being negative carbon in 2030, eliminating all its historical emissions in 2050 and supplying 100% with renewable energy contracts in 2025. In contrast, in Columbus is the provisional measure of operating with gas generators until it can be fully connected to the network in 2027. What It is not clear is whether these equipment were usedif they remain in operation or what intermediate solution the company will apply in the coming years. Microsoft, for now, has not specified with which energy sources Opera Colón. The launch of the Central Mexico region was presented as a decisive step to accelerate the country’s digital transformation and attract foreign investment. But energy reality introduces a decisive nuance: the infrastructure necessary to sustain that deployment does not advance at the same rate as the technological ambition. The tension between promises of sustainability and limitations of the network is a reminder that the cloud, far from being ethereal, rests on concrete foundations, cables and megawatts that define, in a way, how far artificial intelligence and other services can go. Images | Microsoft (1, 2) In Xataka | This nuclear reactor is different from everyone else. It has been expressly designed for data centers

Log In

Forgot password?

Forgot password?

Enter your account data and we will send you a link to reset your password.

Your password reset link appears to be invalid or expired.

Log in

Privacy Policy

Add to Collection

No Collections

Here you'll find all collections you've created before.