The controversial measures with which we have shielded the grid a year after the collapse

Next April 28 will mark exactly one year since the day Spain and Portugal faded to black: an unprecedented "zero energy" event, the first in two decades, that left nearly 60 million citizens without electricity, internet or traffic lights, and with the banking system paralyzed for up to 16 hours. As the magazine Freen reflects, that day we suddenly discovered that something we take for granted, electricity, is the fragile foundation on which our entire modern life rests. One year on, the initial shock has given way to data. We no longer ask only whether such a blackout can happen again, but how much it is costing us to avoid one, and whether we have really learned our lesson.

D-day is about to arrive. Twelve months later, we finally have the "official autopsy." The European Network of Transmission System Operators for Electricity (ENTSO-E) has published a comprehensive 472-page report concluding that there was no single cause, but rather a "perfect cocktail" of factors. A sudden voltage surge originating in Spain triggered an instability that the system was unable to stop. As we have already explained in Xataka, the failure can be summed up as "operational blindness": the renewable plants operated with a fixed power factor, could not read the grid surge and, for safety reasons, disconnected abruptly, causing a rebound effect. Moreover, as the BBC adds, local generator voltage controls were not fully aligned with the operator's requirements. The crisis demanded millisecond reflexes, but voltage control was being done manually. In fact, if Europe did not fall like a house of cards, it was thanks to an almost miraculous technicality: a relay at the Hernani substation (Gipuzkoa) acted "like a gunshot," cutting the connection with France in milliseconds to shield the continent. Ironically, just ten minutes later, it was that same interconnection that served as the assisted breathing that resuscitated the system.

The big question: what has Spain done differently?
The fear of a new blackout has changed the rules of the game, but at a high price for the citizen. Red Eléctrica has imposed a "reinforced" operating model, prioritizing safety over cost by keeping more expensive but stable backup plants running, such as gas-fired combined cycles. The result? Spaniards have paid an extra 666 million euros over these eleven months on "adjustment services" alone, which have shot up 43%. On the legislative front, the Government has approved Royal Decree-Law 7/2026 to streamline bureaucracy through the "Renewable Acceleration Zones" (ZAR). However, experts warn that, since there is still no structured capacity market, investing in the necessary storage systems (batteries) remains a financial risk for developers.

There's more shielding going on. The collapse not only left us in the dark; it also cut us off, although very unevenly. While some completely lost their signal, others kept it thanks to the logistical efforts of certain operators. To avoid this coverage lottery, the CNMC has proposed that Telefónica, Vodafone and MásOrange offer "national roaming" in emergencies: if your operator's network goes down, your phone would automatically connect to a competitor's, following the Swedish model. Added to this is the request to make the alert system (ASA) mandatory in cars with digital radio (DAB+), so that warnings reach the population immediately even if the internet is down.

The false culprit and the new energy guzzler. After the collapse, many were quick to blame green energy, but the reality is different. As Freen explains, the problem is not that Spain has a lot of solar and wind power, but that the electrical grid is still stuck in the 20th century, designed for fossil-fuel plants rather than for a decentralized system. In fact, Spain is a fascinating laboratory.
According to EUObserver, the country has weathered the recent price crisis caused by the Third Gulf War far better than its European neighbors thanks to its enormous solar shield. However, the trauma of the blackout has produced an absurd side effect: operators are so afraid of overloading the grid that they force solar and wind farms to disconnect more often. Curtailment (clean energy generated and then thrown away) has gone from 2% to 7%. And as if that were not enough, the saturated grid now faces the imminent arrival of a new energy-hungry giant: massive data centers for artificial intelligence.

The exchange of accusations is served. In the offices, the short circuit has only just begun. As the Financial Times details, the National Markets and Competition Commission (CNMC) has opened formal investigations. Red Eléctrica (REE) faces proceedings for "very serious" infractions, while giants such as Iberdrola, Naturgy, Endesa and Repsol face possible fines of up to 60 million euros for "serious" ones. Moreover, as Público reports, there are up to twenty sanctioning files open. REE defends itself by insisting that the opening of a file does not prove its guilt. Meanwhile, a Senate report promoted by the PP directly blames the Government, REE and the CNMC for ignoring known vulnerabilities, according to Reuters. And the tension is reaching its limit: utilities such as Endesa and Iberdrola have asked a judge for access to more than 8,000 calls and emails from REE executives during the hours of the blackout, after the leak of audio recordings in which technicians warned of the danger 15 days beforehand.

An electric heart that remains at risk. Spain is "a gold mine without a road," as Patxi Calleja, a director at Iberdrola, puts it. We have the sun, the wind and the technical capacity.
But the great lesson of this past year is that true energy independence is no longer decided at the national level but at the local one, where factories and homes install their own batteries and hybrid panels so as not to depend on the fragile central system. We survived the blackout and avoided another one by reaching for our wallets and operating defensively. But as long as grid-connection procedures take a decade, mass storage … Read more

Five years ago, Venice spent more than 5 billion on a system of barriers against the sea. Now it is looking for a plan B

There was a time when Venice looked at the Adriatic with ambition. The sea not only shaped the city, permeating its DNA; it also propelled it into a naval power that fought for dominance of the Mediterranean. Today things are different. The Serenissima (now a tourist power) watches with growing concern the coming and going of the tides, the same tides that in 2019 submerged it under 187 cm of water, flooding 80% of the city. The reason is simple: everything indicates that the multibillion-euro system Venice equipped itself with a few years ago to protect against the threat of high water will not take long to become obsolete. And it is not at all clear what the alternative is.

One figure: 18. The threat of flooding is nothing new in Venice. In fact, one of the worst episodes in memory occurred six decades ago, in November 1966, when an intense storm pushed the water to 194 cm, flooding much of the city. For some time, however, experts have been detecting worrying signs. It is not just that Venice is sinking or that the sea level is rising (though both are happening); there are increasingly clear indications that floods will become more frequent. Recently, a group of researchers analyzed the "extreme" episodes suffered by the city, those in which 60% of its surface was flooded. Over the last century and a half they counted 28 incidents of that kind. The surprising thing is that the vast majority of them (18) were concentrated in the last 23 years.

One measurement: 0.42 m. Today more than half of Venice sits just 80 to 120 cm above mean sea level, and projections show that this scenario will soon worsen: in the best case, if we manage to drastically cut our polluting emissions, the sea will rise 0.42 m by 2100. In the worst case, it will rise 1.8 m, which would greatly complicate the outlook for the Serenissima. In fact, the high tide already leaves St.
Mark's Square only 30 cm above the water level.

One name: Mose. Aware of how much is at stake in Venice, the Italian Government has long sought a way to protect the city from floods. The result was Mose (Modulo Sperimentale Elettromeccanico, the experimental electromechanical module), a system made up of four barriers and 78 independent mobile gates that allow the authorities to protect the Venetian lagoon from what is known as acqua alta, the high tides that flood the city. The objective: to temporarily isolate the lagoon from the Adriatic and thus shield Venice from the most dangerous tides. To achieve this, the barriers were strategically installed at the inlets of Lido, Malamocco and Chioggia. Each gate measures 20 m wide and between 18.6 and 29.6 m long.

An investment: 5,000 million. The project is said to have mobilized an investment of more than 5.5 billion euros (its execution was marred by corruption). Work began in 2003 and, after several delays, a first test was carried out in October 2020, at an event led by the then Prime Minister Giuseppe Conte. A year earlier, Venice had suffered one of the worst floods in memory, during which the water reached 187 cm, flooding part of the entrance to St. Mark's Basilica.

An indicator: frequency. The problem is that the authorities are resorting to Mose far more often than expected. EuroWeekly reports that in less than a month, between January 28 and February 19, the system was activated 30 times. Other media report that since its inauguration at the end of 2020, the barriers have saved Venice from flooding on 154 occasions. The problem is that using Mose is not free for the region, either economically or in social and environmental terms. Raising the enormous Mose floodgates has a direct cost, but also an indirect one: by isolating the lagoon, the system disrupts, for example, the activity of the port sector and interrupts maritime traffic with the port of Marghera.
The Guardian points out that pressing Mose's button has an economic impact of more than 200,000 euros for Venice. For this year's Carnival alone, the total bill would come to around five million euros.

An extra concern: the lagoon. Not everything is measured in operating cost, maritime traffic and economic impact. Altering the tides in the area also affects its ecosystem, and that worries experts like Andrea Rinaldo, of the Lagoon Authority's scientific committee, especially given two fundamental facts: first, the frequency of use in recent years; second, the forecasts for sea level rise. "With one more meter, the Mose barriers would have to be closed an average of 200 times a year, which means they would be blocked practically all the time," Rinaldo explains. "When this happens, the lagoon loses its function as a transitional environment. It would become a pond."

A victim: the lagoon itself. As The Guardian explains, by blocking the flow of water the barriers encourage the growth of algae. The problem is that when these die and decompose, they directly affect the quality of the water and the rest of the flora and fauna. Does that mean Mose was a mistake? Rinaldo thinks not. The changes are simply happening much faster than the engineers expected, forcing authorities and technicians to think about the medium and long term. At the end of the day, if Mose taught us anything, it is that projects of this importance are not approved and executed overnight.

One question: what to do? That is the great unknown. Those responsible for Mose are looking for ways to reduce its impact, but it is not an easy decision, among other things because Venetians themselves have become accustomed to the barriers and gates swinging into action at the slightest risk, notes Giovanni Zaroti, one of the system's technicians. Rinaldo mentions the possibility of launching an international call … Read more
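Combining the two figures cited above gives a sense of the scale at play. This is our own back-of-the-envelope arithmetic, not a calculation from the article or from the Lagoon Authority:

```python
# Figures cited in the text: ~200,000 EUR per activation (The Guardian)
# and ~200 closures per year in Rinaldo's one-extra-meter scenario.
cost_per_closure_eur = 200_000
closures_per_year = 200

annual_operating_cost = cost_per_closure_eur * closures_per_year
print(f"{annual_operating_cost:,} EUR/year")  # 40,000,000 EUR/year

# How long until operating costs alone match the reported ~5.5 billion build cost:
construction_cost_eur = 5_500_000_000
years_to_match = construction_cost_eur / annual_operating_cost
print(round(years_to_match, 1))  # 137.5 years
```

Forty million euros a year in direct activation costs would still be a fraction of the construction bill, which suggests the real pressure point is the ecological one Rinaldo describes, not the operating budget.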

Japan has crossed a red line in the Pacific with the US. China just responded with warships closer than ever

When two Russian strategic bombers flew without warning through airspace near Japan in 2013, Tokyo was forced to scramble interceptor fighters within minutes, in one of the tensest responses in its recent history. The episode, almost forgotten outside military circles, made clear the extent to which there are movements in the Pacific that, even if they last only hours, can change the way countries look at each other for years.

A line crossed. Japan has taken a step it carefully avoided for decades: integrating combat troops for the first time into maneuvers led by the United States in the Indo-Pacific, breaking de facto a political and strategic barrier inherited from the postwar period. The move is not merely symbolic, because it involves deploying soldiers, ships, aircraft and missiles in a realistic conflict-simulation scenario, bringing Tokyo closer to a much more active role within the US military apparatus. The decision, moreover, comes amid growing concern over Taiwan and over the balance of power in the region, which makes the gesture more than mere cooperation: it is a clear sign of strategic alignment.

China's response: closer than ever. Beijing's reaction has been immediate and measured in kilometers: the deployment of warships on routes far closer to Japanese territory than usual, including transits through waters it rarely used to reach the Pacific. Although China insists these are routine exercises, the pattern reveals a willingness to apply pressure and demonstrate operational capacity in sensitive areas, bringing its military presence closer to points it previously avoided. And not only that: this movement fits into a wider trend of greater naval assertiveness around Japan, where each maneuver tests not only capabilities but also political limits.

Everything revolves around an island.
The background to this escalation is the Taiwan question, which has acted as the axis of tension between China and Japan since Tokyo left open the possibility of intervening if a conflict breaks out over the island. Beijing has interpreted those statements as a red line, and has since responded with diplomatic protests, economic pressure and military demonstrations. Every Japanese step in or around the strait is seen as a provocation, and every Chinese move seeks to recalibrate that balance without openly crossing the threshold of direct confrontation.

Balikatan: from exercise to message. This is another of the clearest readings. The Balikatan maneuvers have ceased to be a simple bilateral exercise and become a multinational show of force, with more than 17,000 troops and the participation of countries such as Australia, France and Canada. Japan's active incorporation changes their nature, because it introduces a key actor into the so-called "first island chain," a geographical and military barrier designed to contain Chinese expansion in the Pacific. The deployment of anti-ship missiles and live-fire exercises, including the destruction of naval targets, reinforces the idea that what is being rehearsed is a high-intensity maritime conflict.

The battle for the islands. We have also written on several occasions about this chain of territories (running from Japan to the Philippines by way of Taiwan) that has become the axis of the US strategy to limit Chinese naval projection. By integrating more deeply into this system, Japan contributes to a sort of distributed "fortress" intended to hinder any Chinese advance toward the open Pacific. For Beijing, however, breaking or circumventing that barrier is a strategic priority, which explains the increase in its activity beyond that line and its insistence on operating in waters ever farther from its coast.

An increasingly fragile balance.
The result is a scenario in which every movement has a double reading: what some present as routine training, others interpret as a sign of escalation. Japan has taken a step that redefines its role in regional security, and China has responded by bringing its naval power closer than it previously dared, creating an action-reaction dynamic that raises the risk of incidents. Thus, in a global context marked by many other conflicts that could divert American attention, the Indo-Pacific stands as the great board on which the balance of power of the 21st century is being played. Image | CCTV In Xataka | Japan has dozens of "forgotten" islands off the coast of China: it is now preparing for the worst scenario In Xataka | Satellite images leave no doubt: China has concentrated thousands of fishing boats off Japan, and its idea is not to fish

This huge TCL TV is at an outlet price at Carrefour

If you would like to set up a home theater and have plenty of space in the dining room, Carrefour now has this huge 98-inch TCL 98P8KX7 TV on sale. It has gone from 1,899 to 1,399 euros. Moreover, if you want to pay little by little (and you have the Carrefour Pass card), you can do so in 10 easy installments of 139.90 euros each. It is only sold online and shipping is free. TCL 98P8KX7 98″ (248.92 cm), QLED, 4K UHD TV The price could vary. We earn commission from these links

A gigantic TV to set up your own home theater. Until recently, the only way to exceed 85 inches at home was to buy a projector. Today, though, there are gigantic TVs proving that size does matter. A good example is this TCL 98P8, whose affordable price lets you set up your own cinema at home without emptying your wallet. Although the size of its QLED panel is its most striking feature, all the technology it packs deserves a mention. Its refresh rate is 144 Hz, so it can double as a giant monitor, letting you get the most out of a PS5, an Xbox Series X or even a high-end PC. Its brain is the AiPQ Pro processor, which analyzes each scene in real time to upscale non-4K content and improve contrast; this means that, for example, DTT looks quite decent even at this screen size. For movies, it supports all the major standards: Dolby Vision, HDR10+ and Dolby Atmos, which makes for a totally immersive viewing experience. It comes with Google TV, one of the most complete ecosystems, integrating Chromecast and Google Assistant, two very useful functions for casting content from your phone and for voice search, respectively. Audio is another of this TV's highlights, with speakers tuned by Onkyo.
It includes a small subwoofer on the back, so the TV does not sound as tinny as most flat screens on the market. ⚡ IN SUMMARY: offer for the TCL 98P8KX7 smart TV today ✅ THE BEST The scale: nothing compares to it. Watching a movie or a football match at this size will completely change your perception of watching TV; it is literally having a cinema in your living room. 144 Hz: for such a large panel, the response is surprisingly fast, making it a very good option for gaming. ❌ THE WORST The dimensions are also its downside: you will need to measure very carefully not only the living room but also the hallway and the elevator, since this is a TV with a 2.48-meter diagonal that is quite heavy. The sound: despite the Onkyo speakers and Dolby Atmos support, a screen this size cries out for a matching high-end sound bar. 💡 BUY IT IF… you want an immersive movie experience without the maintenance and dimness of a projector, or if you have a huge wall and want the TV to be the star of the room. ⛔ DON'T BUY IT IF… your apartment is small and you will sit less than three meters away: eye fatigue can be a big problem, and you will start to notice the pixels if the content is not native 4K. Some sound bars that may interest you for this TV LG DS60TR – Sound Bar, Bluetooth, 440W, 5.1 Channels The price could vary. We earn commission from these links TCL Q85H Pro Sound Bar 7.1.4 Channels for TV, 860 Watts The price could vary. We earn commission from these links Some of the links in this article are affiliated and may provide a benefit to Xataka. In case of non-availability, offers may vary. Images | Webedia and TCL In Xataka | Best televisions in quality price. Which one to buy and seven recommended 4K smart TVs In Xataka | Best sound bars in quality price. Which one to buy and seven recommended models from 140 euros
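The three-meter warning can be sanity-checked with basic geometry. Assuming a typical 20/20 visual acuity of about one arcminute (a standard rule of thumb, not a figure from the article), a 98-inch 4K panel's pixels stop being individually resolvable at roughly two meters:

```python
import math

DIAGONAL_INCHES = 98
H_PIXELS = 3840            # 4K UHD horizontal resolution

# A 16:9 panel's width is diagonal * 16 / sqrt(16^2 + 9^2).
width_m = DIAGONAL_INCHES * 0.0254 * 16 / math.hypot(16, 9)
pixel_pitch_mm = width_m * 1000 / H_PIXELS   # ~0.56 mm per pixel

# Distance at which one pixel subtends one arcminute of visual angle:
one_arcmin_rad = math.radians(1 / 60)
resolve_distance_m = (pixel_pitch_mm / 1000) / math.tan(one_arcmin_rad)

print(f"pixel pitch ~{pixel_pitch_mm:.2f} mm, "
      f"pixels resolvable inside ~{resolve_distance_m:.1f} m")
```

The result, just under two meters for native 4K content, is consistent with the article's warning: upscaled content has a larger effective pixel size, so its artifacts remain visible from even farther away.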

Iceland was one of the last places on the planet that mosquitoes had not reached. That’s now history

For centuries, Iceland held the 'privilege' of being one of the few habitable places on Earth without mosquitoes, something many might envy, especially with the arrival of summer. It owed this to its particular climate: constant cycles of freezing and thawing prevented the larvae from maturing, acting as an insurmountable biological shield. However, climate change and human action have just broken down that barrier, and this Icelandic 'exceptionality' has come to an end.

The discovery. The story of this biological invasion begins in October 2025, when Björn Hjaltason, a resident of the Kjós region, noticed some unusual insects in his garden. Wanting a closer look, he captured them with a fairly rudimentary method: ropes soaked in red wine. With this trap he obtained three specimens, which were immediately sent to the Icelandic Institute of Natural Sciences, where the entomologist Matthías Alfreðsson confirmed the unthinkable: they were two females and a male of Culiseta annulata, a mosquito species common in Europe that had never managed to establish itself on the island of ice and fire.

How did they arrive? They may have come in several ways, such as aboard ships arriving from Europe or even in the landing gear of commercial airplanes. All this, combined with ever-higher temperatures in Iceland, means we are facing a major ecosystem problem.

A hot Arctic. This rather rudimentary discovery has served as the starting point for a deeper analysis published in the journal Science, which notes that the appearance of mosquitoes in Iceland is only a symptom of a radical transformation across the whole Arctic. Global warming is hitting this region at breakneck speed: the Arctic is warming four times faster than the world average.
This thermal increase is not only allowing invasive species to survive the Icelandic winters; it is causing a serious biological imbalance throughout the boreal region.

A domino effect. The arrival of mosquitoes and the disruption of native arthropod populations is not just a matter of annoying bites for residents and tourists; it is a threat to birds. The early thaw is causing the peak of insect abundance to fall out of step with the breeding season of wading birds, leaving them without their main source of food when they need it most. Furthermore, the large swarms of mosquitoes in Arctic areas are already affecting the behavior of reindeer, which spend vital energy fleeing the swarms instead of feeding, compromising their winter survival. That is why experts point to the need to control the arthropods arriving in the region and, above all, for a system to track these highly relevant ecological changes. Images | Andreas M In Xataka | Mosquitoes attack me in summer and I tried these TikTok tricks to get rid of them

Turning plastic into fuel profitably was a pipe dream. A new process just made it possible

A team from Oak Ridge National Laboratory, in the United States, has managed to convert plastic bags and kitchen cutting boards into gasoline and diesel without resorting to high temperatures or expensive materials. The discovery, published in the Journal of the American Chemical Society, has raised some eyebrows, and below we give you all the details.

The problem they are trying to solve. Plastic is one of the most difficult materials to recycle profitably. In particular, polyethylene (the polymer that makes up supermarket bags, white plastic containers and kitchen cutting boards) piles up in landfills by the millions of tons each year. Until now, the only technically viable way to turn it into fuel was a process called pyrolysis, which requires heating the material to between 450 and 500 degrees Celsius: an expensive, energy-inefficient process that is difficult to scale industrially.

What does the new method consist of? Researchers at Oak Ridge National Laboratory (ORNL) have taken a different path: immersing the plastic in a mixture of molten salts with aluminum chloride, which acts as both solvent and catalyst. These salts are inorganic compounds that remain stable even under demanding reaction conditions. The key is that the aluminum atoms in the mixture bind to the polymer and generate areas of high acidity that break the plastic's long molecular chains into smaller fragments, which are transformed into molecules typical of gasoline or diesel. And all this at less than 200 degrees Celsius, a temperature comparable to that of a conventional domestic oven.

Why it represents a relevant technical leap. Beyond the lower temperature, the process dispenses with three elements that make traditional methods expensive and complicated: noble-metal catalysts (such as platinum), organic solvents and an external supply of hydrogen.
According to Zhenzhen Yang, a scientist at ORNL and one of the lead authors of the study, "this is the first time that molten salts have been used as a medium to produce high value-added chemicals from waste without any catalytic initiators or solvents, and at a temperature below 200 degrees Celsius." Gasoline yield reaches approximately 60% under moderate conditions, a result the researchers themselves describe as promising for future industrial application.

How they verified that it worked. To understand exactly what happens during the reaction, the team used a combination of advanced analysis techniques, including soft X-ray spectroscopy, nuclear magnetic resonance, neutron scattering and gas chromatography. Thanks to isotopic labeling, they were able to track how the carbon behaves during the process and confirm that the simpler polymer chains produce gasoline-like fuel, while the more complex ones yield diesel molecules. With this level of detail, the process could be tuned depending on the type of fuel desired.

What remains to be resolved. The system is not ready to scale immediately. The main obstacle is that the aluminum salts used are hygroscopic, meaning they absorb moisture from the environment, which compromises their long-term stability. The team is now working on ways to confine or protect these salts, possibly using halides or carbon materials, to make them more durable under real industrial conditions.

Beyond the laboratory. If the process scales successfully, the implications are considerable. Polyethylene is the most widely produced plastic in the world, abundant and cheap as a raw material. Aluminum salts, for their part, are low-cost commercial materials.
According to Liqi Qiu, a postdoctoral researcher at the University of Tennessee, "the starting material is abundant in consumer waste, and our catalyst system, molten aluminum salts, is very cheap." The result could be a cost-effective route for converting plastic waste into high-quality transportation and industrial fuels, while also clearing out our landfills. For now the patent is pending, so we will have to wait to see whether this remedy ends up bearing fruit. Cover image | Elbert Lora and Marek Studzinski In Xataka | An 11,000 km ring around the Moon: Japan's incredible plan to light up the Earth

24 years later they have found it

There are stories in motorsport that seem straight out of a mystery novel. In 1995, when Bugatti Automobili SpA and its owner Romano Artioli declared bankruptcy, one of its last newly completed EB110 Super Sports disappeared without a trace. When the banks began gathering the company's assets to pay off outstanding debts, that vehicle, identified as chassis number 021 and painted in the iconic Blu Bugatti color, did not appear in any registry. One of the most exuberant supercars of its time had vanished… until now.

A project broken before its time. The Bugatti EB110 was born from one of the most ambitious bets in the automobile industry at the beginning of the nineties. Romano Artioli bought the rights to the Bugatti brand and built a factory from scratch in Campogalliano, in Italy's Motor Valley near Modena. A total of 139 units of the EB110 were manufactured there, among them 30 examples of the limited-edition Super Sport, the most extreme in the range. Its most famous owners include Michael Schumacher, who celebrated his first Formula 1 championship by buying a bright yellow EB110 Super Sport in 1994. As it happens, the Kaiser's car was chassis number 020, manufactured just before the unit in question: Super Sport number 021. As has happened so many times in the automobile industry, building one of the most coveted supercars of the 90s is no guarantee of financial viability, and the Bugatti brand could not withstand the economic recession of the first half of the decade. The company declared bankruptcy in 1995, and the administrative chaos that followed left chassis 021 out of the official records. Having been sent to a supplier for homologation, and not yet having completed its certification process, the car vanished from the inventory and, with it, from the brand's official history. It was as if unit 021 of the EB110 Super Sport had never been built. But it did exist.
The reunion with a time capsule. The EB110 Super Sport with chassis 021 reappeared in 2019 in Munich (Germany), with just 674 kilometers on the odometer. After an exhaustive inspection by a team of specialists in Italy, the Bugatti went on public display again, now as part of the personal collection of the American collector JR Amantea. The license plate it arrived in the country with left no doubt about its history: "LOSTEBSS," a direct reference to its long period of unknown whereabouts. After its rediscovery, the EB110 Super Sport did not stay in storage. Unlike its previous owner, Amantea took it to the world's most exclusive motoring events: The Quail, held during the 2022 Monterey Car Week, and the 2023 Amelia Island Concours d'Elegance. At both it won the award for best in its category, cementing its reputation as one of the most extraordinary pieces in automotive collecting. The car retains its original Blu Bugatti paint and Grigio Scuro interior, and comes with the Bugatti Certificate of Conformity, the original manuals and tools. It also bears Romano Artioli's signature next to the side air intakes. For a supercar of this caliber to reach 2025 with fewer than 700 kilometers on the clock and in practically factory condition has few precedents in the collecting market. It is as if it had been in a time capsule. The EB110 Super Sport is considered one of the most technologically advanced supercars of its era. With its quad-turbo 3.5-liter V12 (predating the W16 later developed at Ferdinand Piëch's express wish), all-wheel drive and a carbon fiber monocoque chassis, it represented an enormous technical leap by the standards of the early 1990s. Recently, the Bugatti has taken a new turn in its eventful history, joining a lot put up for auction by the Mecum house in Indianapolis.
The auction house has confirmed that the lot will be offered without a reserve price, which means the collectors interested in this gem will decide its final price. As reported by Robb Report, although Mecum declines to offer an official estimate, recent sales of comparable supercars suggest that its price could land between $2.5 million and $3.5 million. In Xataka | For years no one knew who had bought the most expensive Bugatti in the world: until it became part of an inheritance Image | Mecum Auctions

Tim Cook will step down as CEO and John Ternus will be the new leader

Apple management is preparing for a replacement that marks a turning point in the company: Tim Cook will step down as CEO and John Ternus will be his successor. As reported by Apple, confirming the rumors, the change is part of a pre-planned transition process supported by the board of directors. The appointment will take effect on September 1, the date on which Ternus, currently Senior Vice President of Hardware Engineering, will assume the position after years at the head of hardware engineering. Tim Cook will continue to lead Apple over the coming months while working directly with John Ternus on the transfer of responsibilities. Once the replacement takes effect, Cook will assume the position of executive chairman of the board of directors, a role from which he will remain involved in the company's strategy and its relationship with governments and regulators. The change redefines his role, but keeps him linked to the company's strategy and certain institutional functions.

We thought that Voyager 1 had already given everything it could. NASA continues to turn off parts to keep it alive

Some 25 billion kilometers from Earth, Voyager 1 continues to send us data from interstellar space, farther than any other spacecraft built by humanity. The probe was launched in 1977 and, almost half a century later, it remains operational in an increasingly delicate condition: to keep it alive, the mission team is shutting down parts of the spacecraft itself. That is exactly what has just happened with one of its scientific instruments, in a maneuver that reveals the delicate moment the mission is going through. The maneuver. On April 17, engineers at the Jet Propulsion Laboratory in Southern California sent the order to turn off the Low-Energy Charged Particle experiment, better known as LECP. It is an instrument dedicated to measuring low-energy charged particles, including ions, electrons and cosmic rays, from both our solar system and the galaxy. The decision was not improvised. According to NASA, this instrument was next in the shutdown order agreed upon years ago by the scientific and engineering teams to cut consumption without ending the mission. No solar panels. To understand why NASA has reached this point, we have to look at how Voyager 1 is powered. The probe does not run on solar panels but on a radioisotope thermoelectric generator, which converts the heat released by the decay of plutonium into electricity. This system has sustained the mission for decades, but its capacity is not infinite. According to NASA, both Voyager 1 and Voyager 2 lose about 4 watts of power per year, a small loss on paper, but decisive when each watt has been managed with extreme care for almost half a century. The scare that accelerated the decision. Although the shutdown of the LECP was part of a previously defined roadmap, a recent episode forced the team to move more carefully. During a routine roll maneuver on February 27, Voyager 1's power levels dropped unexpectedly.
The US agency explains that any further drop could trigger the spacecraft's undervoltage protection system, designed to disconnect components on its own to protect the probe. A calculated "pruning". The shutdown sequence was decided long ago, in joint conversations between those who design the scientific side of the mission and those who keep it technically alive. Of the 10 instruments each Voyager carried, seven have already been turned off. Moreover, the LECP will not be completely disconnected: the small motor that rotates the sensor to scan in all directions will remain on, because it consumes barely 0.5 watts and keeps open the option of reactivating the instrument later. The plan that comes next. With this shutdown, NASA does not consider the matter closed; rather, it gains time to attempt a deeper intervention. According to the agency, switching off the LECP should give Voyager 1 about a year of respite. During that time, engineers want to complete a more ambitious power adjustment for the two probes, dubbed "Big Bang": the idea is to adjust several power-consuming devices at once, turning off some and replacing others with lower-consumption alternatives, in order to conserve the necessary heat and keep scientific instruments operating for as long as possible. When will the maneuver be attempted? NASA will first test this adjustment on Voyager 2, which is closer to Earth and has slightly more power. The tests are planned for May and June 2026 and, if they go well, the team will try to apply the same maneuver on Voyager 1 no earlier than July. Images | NASA In Xataka | The paradox of artificial gravity: Einstein told us how to do it, engineering tells us it is almost impossible
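The power arithmetic behind these shutdowns can be sketched in a few lines. The ~4 W/year loss comes from NASA's own figure cited above; the launch output of roughly 470 W is an illustrative assumption not given in the article, so treat the numbers as a rough linear model rather than mission data:

```python
# Back-of-the-envelope model of Voyager's shrinking power budget.
# LOSS_PER_YEAR_W is the approximate figure NASA cites; the launch
# output of ~470 W is an assumed value for illustration only.

LAUNCH_YEAR = 1977
INITIAL_POWER_W = 470.0   # assumption: approximate RTG output at launch
LOSS_PER_YEAR_W = 4.0     # NASA's cited loss rate, linearized

def available_power(year: int) -> float:
    """Rough linear estimate of RTG output (watts) in a given year."""
    return INITIAL_POWER_W - LOSS_PER_YEAR_W * (year - LAUNCH_YEAR)

if __name__ == "__main__":
    for year in (1977, 2000, 2026):
        print(f"{year}: ~{available_power(year):.0f} W available")
    # Why even the LECP's 0.5 W scan motor is a real line item today:
    motor_share = 0.5 / available_power(2026)
    print(f"Scan motor share of 2026 budget: {motor_share:.1%}")
```

Under these assumptions the budget falls from ~470 W to under 280 W by 2026, which is why a half-watt motor is worth a deliberate decision rather than a rounding error.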

Samsung has shown a new device with AI. It is not what we imagined and is reminiscent of an Apple idea

When we are told about a new device with artificial intelligence, the normal thing is to think of a mobile phone, a laptop or, at most, the disappointing Rabbit R1 or Humane AI Pin. That is why it is worth stopping when a company like Samsung shows something that doesn't quite fit into any of those boxes. What we have seen this time is not a common gadget, but a rather revealing clue as to how the South Korean giant could imagine a possible home interface of the future. What Samsung has shown in Milan is called Project Luna and, at least for now, it remains in the realm of concepts. It is a desktop device with a movable circular screen that acts as a head and can rotate to face the user. The company's materials also show that this head not only rotates but also changes orientation depending on the angle it needs. With that combination, Samsung sketches a home device that wants to look less like a conventional speaker and more like an object designed to interact with the user.

A concept that points beyond a speaker

One of the scenes Samsung has used to show Luna places it on a kitchen table, connected to the user's smartphone, playing music with an interface reminiscent of a record player and answering questions both by voice and on screen. In that same demonstration it also appears controlling the lighting in the room and suggesting food options for the day. Additionally, projectors scattered around the kitchen display data such as the calories in a recipe or a calendar reminder for a dinner party. And that is where Luna begins to tell us something more interesting than its own design. In an interview with Fast Company, Mauro Porcini, Samsung's chief design officer, explained that this concept represents more of "a vibe, a feeling of the type of design language we want to use." The phrase matters because it lowers any immediate commercial reading and forces us to look at it differently.
Rather than anticipating a launch, the firm seems to be using this project to show the type of design language and user relationship it wants to explore in future AI devices. And at that point it is difficult not to think of Apple. In August 2024, Mark Gurman reported in Bloomberg that the company was moving forward with the development of a home desktop device that would combine an iPad-like screen with a robotic arm. The proposal, according to that information, was conceived as a home control center, a tool for video calls and a remote surveillance system, with a screen capable of tilting and rotating 360 degrees using actuators. It has not materialized as a product, but there is an underlying parallel with what Samsung is now showing. The most interesting reading may lie not in looking for an exact equivalence between Samsung's concept and the rumors about Apple, but in the underlying trend. What we have seen suggests that home AI could end up taking a much more tangible form than the assistants or screens we already know. We are not yet talking about a consolidated category, far from it. But it does provide a fairly serious clue as to where the industry could move in the coming years. At this point, the temptation is to think: okay, that sounds good, but where exactly does something like this fit into our daily lives? We can imagine it on the kitchen counter, recommending a meal, answering a quick question or keeping us company while music plays, and the scene is even convincing. The problem is that the same house is already full of devices that cover a good part of all that. Images | Samsung In Xataka | Meta spent 2 billion on a Chinese AI startup. China is clear that it was a conspiracy
