When they told us all the advantages of intermittent fasting, they forgot one small detail: it could make us bald.

For years we were sold intermittent fasting as the weight-loss and metabolic-health strategy of the future. It makes sense: it was easy to implement, reasonable and very striking. It had everything it needed to become a fad. And so it did. Only now, as the first long-term studies conclude, are we really beginning to understand its pros and cons. The most striking of them, of course, is the one involving hair.

What exactly is intermittent fasting? Broadly speaking, "intermittent fasting" refers to a diet that alternates periods without food restrictions with brief periods of fasting. "Fasting" here is a deliberately elastic term: it can mean eating nothing at all or significantly reducing the number of calories consumed. The idea behind it sounds good. When we undergo prolonged calorie restriction, the body goes into "savings mode" and weight loss slows down (or even stalls). Intermittent fasting attempts to trick the body into not adapting to the restriction and therefore continuing to "spend" at a normal rate.

And does it work? That's the bad news. "Research does not consistently show that intermittent fasting is superior to continuous low-calorie diets" when it comes to weight loss, concludes the most comprehensive analysis on the subject, after reviewing almost fifty studies. The clinical trials carried out since then only insist on the same thing: in general terms, the results are identical to those of other standard diets, both in dropout rates and in the amount of weight lost or the improvement in health markers. Choosing one method over another, ultimately, has more to do with individual likes and dislikes than with any extra scientific evidence. After all, everyone has a peculiar relationship with food and, consequently, some strategies "fit" us better than others. In other words: there are people who use it.
Yes, and the truth is that there is nothing wrong with that. Little by little, researchers are discovering good things (it can help intestinal cells regenerate) and bad things (it could promote the formation of precancerous polyps). So, bit by bit, we are getting a better grasp of what intermittent fasting does, what it stops doing and what mechanisms lie behind it. That's when the surprises begin. Because, for example, a trial carried out in mice has found that intermittent fasting slows hair growth.

Researchers at Westlake University (in Zhejiang, China) took about 50 mice, shaved them and divided them into three groups with dietary restrictions (fed every 8, 16 or 48 hours) plus an unrestricted control group. After a month, the mice that could eat freely had regrown their hair. Those that fasted, on the other hand, had only partially recovered it after 96 days.

How? Why? What is happening here? The first thing is to make clear that the researchers "don't want to scare people away from intermittent fasting," but rather to highlight "the importance of being aware that it could have some unwanted effects." With that in mind (and remembering that the study is in mice), the answer is both simple and full of uncertainties: to begin with, hair growth is a process that requires constant, balanced nutrition. But the researchers believe the problem could go further: it is possible that "the body uses fat reserves instead of glucose and this could trigger the release of chemicals that damage hair cells." However (and this is important), the research is at a very early stage and much remains to be investigated. After all, fittingly enough, as the Spanish saying goes: opportunity is painted bald.

Image | Seika

In Xataka | The great promise of science to end baldness is not a transplant or a medicine: it is a vaccine

A version of this article was originally published in February 2025

"Left click, right click": this is how AI decides an attack in war. China, Russia and the US need fewer and fewer humans

A group of Google engineers signed an internal letter to protest a project in which the company's own software was being used by the Pentagon, sparking an unprecedented debate within the company about how far the technology they had created should go. Since then, almost 10 years have passed, an "eternity" in terms of AI deployment.

War accelerates... without humans. The New York Times reported last week that modern warfare is entering a phase in which human intervention is no longer the center of decision-making, but an almost symbolic step within processes dominated by algorithms, where artificial intelligence systems identify targets, recommend attacks and generate complete plans in a matter of seconds. Programs like Project Maven, today developed by Palantir and integrated with models like Anthropic's, show the extent to which the decision chain has been compressed: satellite images, drone data and intercepted signals are automatically processed to generate target lists and attack solutions, reducing human intervention to something as simple as selecting options on a screen. In the words of Pentagon officials, it is as simple as "left click, right click."

Powers in the same race. At the center of this transformation are the United States, China and Russia, competing to lead a new arms race based on autonomous systems capable of operating without direct intervention. In China, for example, the development of drone swarms coordinated by artificial intelligence, and of platforms capable of operating alongside manned fighters, reflects a commitment to scale and automation. Meanwhile, Russia is betting on systems like the Lancet drones, which are evolving toward autonomous target-selection capabilities. For its part, the United States is trying to close the gap by encouraging companies like Anduril to speed up production of autonomous drones, in a race where the speed of development is almost as important as the technology itself.
The Chinese WZ-8 drone

Ukraine as a turning point. As we have reported, the war in Ukraine has been the turning point that turned these technologies into real combat tools, demonstrating that relatively simple systems can evolve rapidly toward semi-autonomous capabilities and change the balance on the battlefield. Adapted commercial drones, unmanned vessels and data-analysis systems have allowed Ukraine to resist a superior adversary, while Russia has responded by progressively incorporating automation into its own systems. As analyst Michael Horowitz points out, "the battlefield in Ukraine has served as a laboratory for the world," accelerating a transition that is no longer experimental, but operational.

Silicon Valley at war. Unlike previous arms races, the Times recalled, the role does not fall solely on states, but also on the technology companies and start-ups that are redefining military development. Here are companies like Google, which initially participated in projects like Maven before withdrawing under internal pressure, while others like Palantir or Anduril have occupied that space with a vision more aligned with defense. In China, the "civil-military fusion" model directly integrates private companies into the development of military systems, while the West tries to replicate that dynamism with million-dollar investments and growing collaboration between Silicon Valley and the Pentagon.

Algorithms against algorithms. The result is a form of war in which the confrontation is no longer only between armies, but between automated systems operating at speeds impossible for humans: a scenario we have covered in which drones launch drones to take on other drones, and sensor networks connect globally to execute attacks in real time.
Projects like the Chinese attempt to replicate networks similar to the American Joint Fires Network reflect this trend toward an interconnected war, one in which a sensor at one point on the planet can trigger an attack at another without direct intervention. At this point, superiority no longer depends solely on the quality of weapons, but on the ability to integrate data, process it and act faster than the adversary.

Uncontrolled speed. There is no doubt that this acceleration carries risks that worry even those who pushed these systems, as automation can trigger military responses before humans can intervene or fully understand the situation. Studies such as those of the RAND Corporation have shown scenarios in which autonomous systems inadvertently escalate conflicts, while experts warn of a possible "escalation spiral" driven by the decision speed of machines. As General Jack Shanahan, promoter of Maven, has acknowledged, there is a real danger of deploying "untested, insecure and poorly understood" systems in a competitive context where each actor fears being left behind.

Fewer humans, more automation. Thus, the panorama taking shape is that of an increasingly automated war, where human intervention is progressively reduced and critical decisions are delegated to artificial intelligence systems capable of analyzing, deciding and acting in seconds, which is something very different from doing it "well." From autonomous drones to target-analysis platforms, through global combat networks, the trend seems clear: a war of the immediate future that will be decided less in offices and more in algorithms, in an unstable and certainly chilling balance, because technological speed is on track to surpass the human capacity to control it in the middle of a war.
Image | StockVault, Infinity 0

In Xataka | Russia is no longer surrendering to Ukrainian soldiers, but to machines: the rules of war are being redefined

In Xataka | China was the power that launched drones. Now it has realized their danger with one decision: closing the sky to them

The controversial measures with which we have shielded the grid, one year after the collapse

Next April 28 will mark exactly one year to the day since Spain and Portugal faded to black. An unprecedented "energy zero", the first in two decades, that left nearly 60 million citizens without electricity, internet or traffic lights, and with the banking system paralyzed for up to 16 hours. As the magazine freen reflects, that day we suddenly discovered that something we take for granted (electricity) is the fragile foundation on which our entire modern life rests. One year after the event, the initial shock has given way to data. We no longer ask only whether such a blackout can happen again, but how much avoiding it is costing us and whether we have really learned our lesson.

D-day is about to arrive. Twelve months later, we finally have the "official autopsy." The European Network of Transmission System Operators for Electricity (ENTSO-E) published a comprehensive 472-page report concluding that there was no single cause, but rather a "perfect cocktail" of multiple factors. A sudden voltage surge originating in Spain triggered instability that the system was unable to stop. As we have already explained in Xataka, the failure can be defined as "operational blindness." The renewable plants operated with a fixed power factor; they could not read the grid surge and, for safety reasons, they disconnected suddenly, causing a knock-on effect. Besides, as the BBC adds, local generator voltage controls were not fully aligned with the operator's requirements. The crisis demanded millisecond reflexes, but voltage control was done manually. In fact, if Europe did not fall like a house of cards, it was due to an almost miraculous technicality: a relay in the Hernani substation (Gipuzkoa) acted like a fuse, cutting the connection with France in milliseconds to shield the continent. Ironically, just ten minutes later, that same interconnection served as assisted breathing to resuscitate the system.

The big question: what has Spain done differently?
The fear of a new blackout has changed the rules of the game, but at a high price for citizens. Red Eléctrica has imposed a "reinforced" operating model. This means prioritizing safety over cost, keeping more expensive but stable backup plants running, such as gas combined cycles. The result? Spaniards have paid an extra cost of 666 million euros in these eleven months on "adjustment services" alone, which have shot up 43%. In the legislative sphere, the Government has approved Royal Decree-Law 7/2026 to streamline bureaucracy through the "Renewable Acceleration Zones" (ZAR). However, experts warn that, since there is still no structured capacity market, investing in the necessary storage systems (batteries) remains a financial risk for developers.

There is more shielding under way. The collapse not only left us in the dark; it also left us cut off, although very unevenly. While some completely lost their signal, others managed to keep it thanks to the logistical efforts of some operators. To avoid this coverage lottery, the CNMC has proposed that Telefónica, Vodafone and MásOrange offer "national roaming" in emergencies. If your operator's network goes down, your mobile phone would automatically connect to the competition, following the Swedish model. Added to this is the request to make the alert system (ASA) mandatory in cars with digital radio (DAB+), to send warnings to the population immediately even if the internet is down.

The false culprit and the new energy guzzler. After the collapse, many were quick to blame green energy, but the reality is different. As freen explains, the problem is not that Spain has a lot of solar and wind power, but that the electrical grid is still stuck in the 20th century, designed for fossil-fuel power plants and not for a decentralized system. In fact, Spain is a fascinating laboratory.
According to EUObserver, the country has weathered the recent price crisis caused by the Third Gulf War much better than its European neighbors thanks to its enormous solar shield. However, the trauma of the blackout has caused an absurd side effect: operators are so afraid of overloading the grid that they force solar and wind farms to disconnect more frequently. Curtailment (clean energy generated but thrown away) has gone from 2% to 7%. And as if that were not enough, the saturated grid must now absorb the imminent arrival of a new energy-guzzling giant: massive data centers for artificial intelligence.

The exchange of accusations is served. In the offices, the short circuit has only just begun. As the Financial Times details, the National Markets and Competition Commission (CNMC) has opened formal investigations. Red Eléctrica (REE) faces proceedings for "very serious" infractions, while giants such as Iberdrola, Naturgy, Endesa and Repsol face possible fines of up to 60 million euros for "serious" infractions. Besides, as Público reports, there are up to twenty open sanctioning files. REE defends itself by insisting that the opening of a file does not prove its guilt. Meanwhile, a Senate report promoted by the PP directly blames the Government, REE and the CNMC for ignoring known vulnerabilities, according to Reuters. And the tension is reaching its limit: electricity companies like Endesa and Iberdrola have asked a judge for access to more than 8,000 calls and emails from REE executives during the hours of the blackout, after the leak of audio recordings in which technicians warned of the danger 15 days beforehand.

An electric heart that remains at risk. Spain is "a gold mine without a road," in the words of Patxi Calleja, director at Iberdrola. We have the sun, the wind and the technical capacity.
But the great lesson of this last year is that true energy independence is no longer played out at the national level, but at the local level, where factories and homes install their own batteries and hybrid panels so as not to depend on a fragile centralized system. We survived the blackout and have avoided another one by reaching for our wallets and operating defensively. But as long as grid-connection procedures take a decade, mass storage ... Read more
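The curtailment jump mentioned above (from 2% to 7%) translates into a lot of discarded clean energy. A minimal back-of-envelope sketch; the annual renewable output figure is an illustrative assumption, not a number from the article:

```python
# Rough sketch: how much clean energy a curtailment jump from 2% to 7% wastes.
# ASSUMPTION: ~150 TWh of annual solar + wind generation is illustrative only.
renewable_twh = 150.0

curtailed_before = renewable_twh * 0.02   # 2% curtailment
curtailed_after = renewable_twh * 0.07    # 7% curtailment
extra_wasted = curtailed_after - curtailed_before

print(f"Curtailed before: {curtailed_before:.1f} TWh/year")
print(f"Curtailed now:    {curtailed_after:.1f} TWh/year")
print(f"Extra energy thrown away: {extra_wasted:.1f} TWh/year")
```

Under that assumption, the jump means several extra terawatt-hours of renewable generation discarded every year; the real figure scales linearly with Spain's actual renewable output.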

Five years ago, Venice spent more than 5 billion on a system of barriers against the sea. Now it is looking for a plan B

There was a time when Venice looked at the Adriatic with ambition. The sea not only shaped the city, permeating its DNA; it also propelled it into a naval power that fought for dominance of the Mediterranean. Today things are different. The Serenissima (now a tourist power) watches with growing concern the coming and going of the tides, the same ones that in 2019 submerged it under 187 cm of water, flooding 80% of the city. The reason is very simple. Everything indicates that the multimillion-euro system Venice equipped itself with a few years ago to protect against the threat of high water will not take long to become obsolete. And it is not very clear what the alternative is.

One figure: 18. The threat of flooding is not new in Venice. In fact, one of the worst on record struck six decades ago, in November 1966, when an intense storm pushed the water up to 194 cm, flooding much of the city. However, experts have been detecting worrying signs for some time. It is not just that Venice is sinking, or that the sea level is rising (although both are true). There are increasingly clear signs suggesting that floods will become more frequent in the future. Recently, a group of researchers analyzed the "extreme" episodes suffered by the city, those in which 60% of its surface was flooded. Over the last century and a half, they counted 28 incidents of that magnitude. The surprising thing is that the vast majority of them (18) were concentrated in the last 23 years.

One measurement: 0.42 m. Today more than half of Venice sits only between 80 and 120 cm above mean sea level, and projections show this scenario will soon worsen: in the best case, if we manage to drastically reduce our polluting emissions, the sea will rise 0.42 m by 2100. In the worst case, it will rise 1.8 m, which would greatly complicate the outlook for the Serenissima. In fact, high tide already leaves St.
Mark's Square only 30 cm above the water level.

One name: Mose. Aware of how much is at stake in Venice, the Italian government has long sought a way to protect it from floods. The result was Mose (Modulo Sperimentale Elettromeccanico, or Experimental Electromechanical Module), a system made up of four barriers and 78 independent mobile gates that allow the authorities to shield the Venetian lagoon from what is known as acqua alta, the high tides that flood the city. The objective: to temporarily isolate the lagoon from the Adriatic and thus protect Venice from the most dangerous tides. To achieve this, the barriers were strategically installed in the inlets of Lido, Malamocco and Chioggia. Each gate measures 20 m wide and between 18.6 and 29.6 m long.

An investment: 5 billion. The project is said to have mobilized an investment of more than 5.5 billion euros (its execution was marred by corruption). Work began in 2003 and, after several delays, a first test was carried out in October 2020, in an event led by the then prime minister Giuseppe Conte. A year earlier, Venice had suffered one of the worst floods on record, during which the water reached 187 cm, flooding part of the entrance to St. Mark's Basilica.

An indicator: frequency. The problem is that the authorities are resorting to Mose much more often than expected. EuroWeekly reports that in less than a month, between January 28 and February 19, the system was activated 30 times. Other media report that since its inauguration at the end of 2020, the barriers have saved Venice from flooding on 154 occasions. The problem is that using Mose is not free for the region, either economically or in social and environmental terms. Raising the enormous Mose floodgates has a direct cost, but also an indirect one: by isolating the lagoon, the system alters, for example, the activity of the port sector and interrupts maritime traffic with the port of Marghera.
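The winter burst reported by EuroWeekly (30 activations between January 28 and February 19) gives a sense of how far usage has drifted from the original design expectations. Naively annualizing that peak rate is our own back-of-envelope, not a forecast from the article:

```python
# Sketch: annualizing the Mose activation burst reported by EuroWeekly.
# 30 closures between January 28 and February 19 (~22 days). Extrapolating a
# winter peak over a full year overstates the real annual rate; it is only
# meant to show the order of magnitude of the trend.
activations = 30
days = 22  # Jan 28 to Feb 19

annualized = activations / days * 365
print(f"Annualized winter-peak rate: {annualized:.0f} activations/year")
```

Even discounting the seasonal bias, that figure dwarfs the 154 total activations recorded since late 2020, which is exactly why the system's operating cost and ecological impact have moved to the center of the debate.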
The Guardian points out that pressing Mose's button has an economic impact of more than 200,000 euros for Venice each time. For this year's Carnival alone, the total bill would come to around five million euros.

An extra concern: the lagoon. Not everything is measured in operating costs, maritime traffic and economic impact. Altering the tides in the area also affects its ecosystem, and that worries experts like Andrea Rinaldo, of the Lagoon Authority's scientific committee. Especially if two fundamental facts are taken into account: first, the frequency of use in recent years; second, the forecasts for sea-level rise. "With one more meter, the Mose barriers would have to be closed an average of 200 times a year, which means they would practically always be shut," explains Rinaldo. "When this happens, the lagoon loses its function as a transitional environment. It would become a pond."

A victim: the lagoon itself. As the Guardian explains, by blocking the flow of water the barriers encourage the growth of algae. The problem is that when these die and decompose they directly affect the quality of the water and the rest of the flora and fauna. Does that mean Mose was a mistake? Rinaldo thinks not. The changes are simply happening much faster than the engineers expected, forcing authorities and technicians to think about the medium and long term. At the end of the day, if Mose has taught anything, it is that projects of its importance are not approved and executed overnight.

One question: what to do? That is the great unknown. Those responsible for Mose are looking for ways to reduce its impact, but it is not an easy decision. Among other things because the Venetians themselves have become accustomed to the barriers and gates coming into operation at the slightest risk, points out Giovanni Zaroti, one of the system's technicians. Rinaldo mentions the possibility of launching an international call ... Read more
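Combining the two figures in the article (more than 200,000 euros per activation, and Rinaldo's projection of some 200 closures a year with one more meter of sea-level rise) gives a hedged estimate of the direct bill. Multiplying them is our own back-of-envelope, not a calculation the article performs:

```python
# Sketch: annual direct cost if Mose closes as often as Rinaldo projects.
# Figures from the article: >200,000 EUR per activation; ~200 closures/year
# under one extra meter of sea-level rise. The product is illustrative.
cost_per_closure_eur = 200_000
closures_per_year = 200

annual_cost_eur = cost_per_closure_eur * closures_per_year
print(f"Projected direct cost: {annual_cost_eur / 1e6:.0f} million EUR/year")
```

And that is only the direct operating cost: the indirect losses from port disruption and the ecological damage to the lagoon would come on top.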

Japan has crossed a red line in the Pacific with the US. China just responded with warships closer than ever

When, in 2013, two Russian strategic bombers flew without warning through airspace near Japan, Tokyo was forced to deploy interceptor fighters in a matter of minutes, in one of the most tense responses in its recent history. The episode, almost forgotten outside military circles, made clear the extent to which there are movements in the Pacific that, even if they last just hours, can change the way countries look at each other for years.

A line crossed. Japan has taken a step that for decades it carefully avoided: integrating combat troops for the first time into maneuvers led by the United States in the Indo-Pacific, de facto breaking a political and strategic barrier inherited from the postwar period. This move is not symbolic, because it involves deploying soldiers, ships, aircraft and missiles in a realistic conflict-simulation scenario, bringing Tokyo closer to a much more active role within the US military apparatus. The decision, moreover, comes amid growing concern about Taiwan and the balance of power in the region, which makes this gesture more than mere cooperation: it is a clear sign of strategic alignment.

China's response: closer than ever. Beijing's reaction has been immediate and measured in kilometers: the deployment of warships on routes much closer to Japanese territory than usual, including transit through waters it rarely used to access the Pacific. Although China insists these are routine exercises, the pattern reveals a willingness to apply pressure and demonstrate operational capacity in sensitive areas, bringing its military presence closer to points it previously avoided. Not only that: this movement fits a wider trend of greater naval assertiveness around Japan, where each maneuver tests not only capabilities but also political limits.

Everything revolves around an island.
The background to this escalation is the Taiwan issue, which has acted as the axis of tension between China and Japan since Tokyo left open the possibility of intervening if a conflict breaks out on the island. Beijing has interpreted these statements as a red line, and has since responded with diplomatic protests, economic pressure and military demonstrations. Every Japanese step in or around the strait is seen as a provocation, and every Chinese move seeks to recalibrate that balance without openly crossing the threshold of direct confrontation.

Balikatan: from exercise to message. It is another crystal-clear reading. The Balikatan maneuvers have ceased to be a simple bilateral exercise and become a multinational show of force, with more than 17,000 troops and the participation of countries such as Australia, France and Canada. Japan's active incorporation changes their nature, because it introduces a key actor into the so-called "first island chain," a geographical and military barrier designed to contain Chinese expansion in the Pacific. The deployment of anti-ship missiles and live-fire exercises, including the destruction of naval targets, reinforces the idea that what is being rehearsed is a high-intensity maritime conflict scenario.

The battle for the islands. We have also spoken on several occasions about this chain of territories (which runs from Japan to the Philippines, passing through Taiwan) that has become the axis of the US strategy to limit Chinese naval projection. Japan, by integrating more deeply into this system, contributes to the creation of a kind of distributed "fortress" that seeks to hinder any Chinese advance toward the open Pacific. For Beijing, however, breaking or circumventing that barrier is a strategic priority, which explains the increase in its activity beyond that line and its insistence on operating in waters ever farther from its coast.

An increasingly fragile balance.
The result of all this is a scenario where every movement has a double reading: what some present as routine training, others interpret as a sign of escalation. Japan has taken a step that redefines its role in regional security, and China has responded by bringing its naval power closer than it previously dared, creating an action-reaction dynamic that increases the risk of incidents. Thus, in a global context marked by many other conflicts that could divert American attention, the Indo-Pacific stands as the great board on which the balance of power of the 21st century is being played.

Image | CCTV

In Xataka | Japan has dozens of "forgotten" islands off the coast of China: it is now preparing for the worst-case scenario

In Xataka | Satellite images leave no doubt: China has concentrated thousands of fishing boats off Japan, and its idea is not to fish

This huge TCL TV is at an outlet price at Carrefour

If you would like to set up a home theater and have plenty of space in the living room, Carrefour now has this huge TV on sale: the 98-inch TCL 98P8KX7. It has gone from 1,899 to 1,399 euros. Furthermore, if you want to pay for it little by little (and you have the Carrefour Pass card), you can do so in 10 installments of 139.90 euros each. It is only sold online and shipping is free.

TCL 98P8KX7 98" (248.92 cm), QLED, 4K UHD TV

The price could vary. We earn commission from these links

A gigantic TV to set up your own home theater

Until recently, to exceed 85 inches at home the only option was to buy a projector. Today, however, there are gigantic TVs that prove size does matter. A good example is this TCL 98P8, whose affordable price lets you set up your own cinema at home without ruining your wallet. Although the size of its QLED panel is the most striking thing about this television, all the technology it incorporates deserves mention. Its refresh rate is 144 Hz, so it can double as a giant monitor, letting you make the most of your PS5, Xbox Series X or even a high-end PC. Its brain is the AiPQ Pro processor, which analyzes each scene in real time to upscale non-4K content and improve contrast. This means that, for example, DTT looks quite decent at this screen size. For movies, it is worth noting that it supports all the standards: Dolby Vision, HDR10+ and Dolby Atmos, which allows a totally immersive viewing experience. It comes with Google TV, one of the most complete ecosystems, integrating Chromecast and Google Assistant, two very useful functions for sending content straight from your phone and for searching with your voice, respectively. The audio section is another of this TV's highlights, with speakers tuned by Onkyo.
It includes a small subwoofer on the back, which keeps the TV from sounding tinny like most flat screens on the market.

⚡ IN SUMMARY: offer for the TCL 98P8KX7 smart TV today

✅ THE BEST

The scale: there is nothing that compares to it. Watching a movie or a football match at this size will completely change your perception of watching TV. It is literally having a cinema in your living room.

144 Hz: for such a large panel, the response is surprisingly fast, making it a very good option for gaming.

❌ THE WORST

The dimensions are also its downside... You will need to measure very carefully, not only the living room but also the hallway and the elevator, since this is a TV with a 2.48-meter diagonal and considerable weight.

The sound... Despite having Onkyo speakers and supporting Dolby Atmos, a screen this size cries out for a high-end sound bar to match the image.

💡 BUY IT IF... You want an immersive movie experience without the maintenance and dimness of a projector. Also, if you have a huge wall and want the TV to be the star of the room, this is a good way to do it.

⛔ DON'T BUY IT IF... Your apartment is small and you will sit less than three meters away; eye fatigue can be a big problem, since you will start to notice the pixels if the content is not native 4K.

Some sound bars that may interest you for this TV

LG DS60TR – Sound Bar, Bluetooth, 440W, 5.1 Channels

The price could vary. We earn commission from these links

TCL Q85H Pro Sound Bar 7.1.4 Channels for TV, 860 Watts

The price could vary. We earn commission from these links

Some of the links in this article are affiliate links and may provide a benefit to Xataka. In case of non-availability, offers may vary.

Images | Webedia and TCL

In Xataka | Best televisions for the money: which one to buy, and seven recommended 4K smart TVs

In Xataka | Best sound bars for the money: which one to buy, and seven recommended models from 140 euros

Iceland was one of the last places on the planet that mosquitoes had not reached. That’s now history

For centuries, Iceland held the 'privilege' of being one of the few habitable places on Earth without mosquitoes, something many might envy, especially with the arrival of summer. It owed it all to its particular climate: constant cycles of freezing and thawing prevented the larvae from maturing, acting as an insurmountable biological shield. However, climate change and human activity have just broken down this barrier, and Iceland's 'exceptionality' has come to an end.

The discovery. The story of this biological invasion begins in October 2025, when Björn Hjaltason, a resident of the Kjós region, noticed some unusual insects in his garden. Wanting a closer look, he captured them with a fairly rudimentary method: ropes soaked in red wine. With this trap he obtained three specimens, which were immediately sent to the Icelandic Institute of Natural Sciences, where entomologist Matthías Alfreðsson confirmed the unthinkable: they were two females and a male of Culiseta annulata, a mosquito species common in Europe but one that had never managed to establish itself on the island of ice and fire.

How did they arrive? Their landing in the country may have come about in several ways, such as traveling aboard ships from Europe or even in the landing gear of commercial airplanes. All this, added to ever-higher temperatures in Iceland, means we are facing a major ecosystem problem.

A hot Arctic. This very rudimentary discovery has served as the basis for a deeper analysis published in the journal Science, which notes that the appearance of mosquitoes in Iceland is only a symptom of a radical transformation across the entire Arctic. Global warming is hitting this region at breakneck speed: the Arctic is warming four times faster than the world average.
This thermal increase is not only allowing invasive species to survive the Icelandic winters; it is causing a serious biological imbalance throughout the boreal region.

A domino effect. The arrival of mosquitoes and the alteration of native arthropod populations is not just a matter of annoying bites for residents and tourists: it is a threat to birds. The early thaw means that the peak of insect abundance no longer coincides with the breeding season of wading birds, leaving them without their main source of food when they need it most. Furthermore, the large swarms of mosquitoes in Arctic areas are already affecting the behavior of reindeer, which spend vital energy fleeing from the swarms instead of feeding, compromising their winter survival. That is why experts point to the need to control the arthropods arriving in the region and, above all, to set up a tracking system for these highly relevant ecological changes.

Images | Andreas M

In Xataka | Mosquitoes attack me in summer and I tried these TikTok tricks to get rid of them

Turning plastic into fuel profitably was a pipe dream. A new process just made it possible

A team from Oak Ridge National Laboratory, in the United States, has managed to convert plastic bags and kitchen cutting boards into gasoline and diesel without resorting to high temperatures or expensive materials. The discovery, published in the Journal of the American Chemical Society, has raised some eyebrows, and below we tell you all the details.

The problem they are trying to solve. Plastic is one of the most difficult materials to recycle profitably. Specifically, polyethylene (the polymer that makes up supermarket bags, white plastic containers and kitchen cutting boards) accumulates in landfills by the millions of tons each year. Until now, the only technically viable way to turn it into fuel was a process called pyrolysis, which requires heating the material to between 450 and 500 degrees Celsius: an expensive, energy-inefficient process that is difficult to scale to an industrial level.

What does the new method consist of? Researchers at Oak Ridge National Laboratory (ORNL) have opted for a different path: introducing the plastic into a mixture of molten salts with aluminum chloride, which acts both as solvent and as catalyst. These salts are inorganic compounds that remain stable even under demanding reaction conditions. The key is that the aluminum atoms in the mixture bind to the polymer and generate areas of high acidity that break the long molecular chains of the plastic into smaller fragments, which are then transformed into molecules typical of gasoline or diesel. And all this at less than 200 degrees Celsius, a temperature comparable to that of a conventional domestic oven.

Why it represents a relevant technical leap. Beyond the reduction in temperature, the process dispenses with three elements that make traditional methods more expensive and complicated: noble metal catalysts (such as platinum), organic solvents, and an external supply of hydrogen.
According to Zhenzhen Yang, a scientist at ORNL and one of the lead authors of the study, “this is the first time that molten salts have been used as a means to produce high value-added chemicals from waste without any catalytic initiators or solvents, and at a temperature below 200 degrees Celsius.” The gasoline yield reaches approximately 60% under moderate conditions, a result that the researchers themselves describe as promising for future industrial application.

How they verified that it worked. To understand exactly what happens during the reaction, the team used a combination of advanced analysis techniques, including soft X-ray spectroscopy, nuclear magnetic resonance, neutron scattering and gas chromatography. Thanks to isotopic labeling, they were able to track how carbon behaves during the process and confirm that the simpler polymer chains produce gasoline-like fuel, while the more complex ones yield diesel molecules. With this level of detail, the process could be optimized depending on the type of fuel desired.

What remains to be resolved. The system is not ready to scale immediately. The main obstacle is that the aluminum salts used are hygroscopic, that is, they absorb moisture from the environment, which compromises their long-term stability. The team is now working on ways to confine or protect these salts, possibly using halides or carbon materials, to make them more durable under real industrial conditions.

Beyond the laboratory. If the process scales successfully, the implications are considerable. Polyethylene is the most widely produced plastic in the world, abundant and cheap to obtain as a raw material. Aluminum salts, for their part, are low-cost commercial materials.
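As a back-of-the-envelope illustration (the helper function and batch sizes below are our own hypothetical examples, not from the paper), the reported ~60% gasoline yield can be turned into rough per-batch output estimates:

```python
# Illustrative arithmetic only: the ~60% gasoline yield figure comes from the
# article; the function and batch sizes are hypothetical examples.

GASOLINE_YIELD = 0.60  # approximate mass fraction converted under moderate conditions


def estimated_gasoline_kg(polyethylene_kg: float,
                          yield_fraction: float = GASOLINE_YIELD) -> float:
    """Estimate the mass of gasoline-range product from a polyethylene feedstock."""
    return polyethylene_kg * yield_fraction


if __name__ == "__main__":
    for batch_kg in (1.0, 10.0, 1000.0):
        out = estimated_gasoline_kg(batch_kg)
        print(f"{batch_kg:>8.1f} kg polyethylene -> ~{out:.1f} kg gasoline-range product")
```

Under these assumptions, a tonne of polyethylene waste would translate into roughly 600 kg of gasoline-range product; real figures would of course depend on reaction conditions and feedstock purity.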
According to Liqi Qiu, a postdoctoral researcher at the University of Tennessee, “the starting material is abundant in consumer waste, and our catalyst system, molten aluminum salts, is very cheap.” The result could be a cost-effective route for converting plastic waste into high-quality transportation and industrial fuels, while also clearing out our landfills. For now the patent is pending, so we will have to wait to find out whether this remedy comes to fruition.

Cover image | Elbert Lora and Marek Studzinski

In Xataka | An 11,000 km ring around the Moon: Japan’s incredible plan to light up the Earth

24 years later, they have found it

There are stories in the world of motorsport that seem straight out of a mystery novel. In 1995, when Bugatti Automobili SpA and its owner Romano Artioli declared bankruptcy, one of their last newly completed EB110 Super Sports disappeared without a trace. When the banks began to gather the company’s assets to pay off outstanding debts, that vehicle, identified as chassis number 021 and painted in the iconic Blu Bugatti color, did not appear in any registry. One of the most exuberant supercars of its time had vanished… until now.

A project broken before its time. The Bugatti EB110 was born from one of the most ambitious bets in the automobile industry at the beginning of the nineties. Romano Artioli bought the rights to the Bugatti brand and built a factory from scratch in Campogalliano, in Italy’s Motor Valley near Modena. A total of 139 units of the EB110 were manufactured there, among them 30 examples of its limited-edition Super Sport, the most extreme version in the range. Its most famous owners include names such as Michael Schumacher, who celebrated his first Formula 1 championship by purchasing a bright yellow EB110 Super Sport in 1994. In fact, the Kaiser’s car was chassis number 020, manufactured just before the unit in question: Super Sport number 021.

As has happened so many times in the automobile industry, manufacturing one of the most desired supercars of the 90s is no guarantee of financial viability, and the Bugatti brand did not survive the economic recession of the first half of the nineties. The company declared bankruptcy in 1995, and in the administrative chaos that followed, chassis 021 was left out of the official records. Having been sent to a supplier for homologation, and not yet having completed its certification process, the car disappeared from the inventory and, with it, from the brand’s official history. It was as if unit 021 of the EB110 Super Sport had never been built. But it did exist.
The reunion with a time capsule. The EB110 Super Sport with chassis 021 reappeared in 2019 in Munich, Germany, with just 674 kilometers on the odometer. After an exhaustive inspection by a team of specialists in Italy, the Bugatti went back on public display, now as part of the personal collection of the American collector JR Amantea. The license plate with which it arrived in the country left no doubt about its history: “LOSTEBSS”, a direct reference to its long period of unknown whereabouts.

After its rediscovery, the EB110 Super Sport did not remain in storage. Unlike its previous owner, Amantea took it to the most exclusive motoring events in the world: The Quail, held during the 2022 Monterey Car Week, and the 2023 Amelia Island Concours d’Elegance. At both it won the award for the best in its category, consolidating its reputation as one of the most extraordinary pieces in automotive collecting. The car retains its original Blu Bugatti paint and Grigio Scuro interior, and comes with the Bugatti Certificate of Conformity, the original manuals and tools, and Romano Artioli’s signature next to the side air intakes. That a supercar of these characteristics should reach 2025 with fewer than 700 kilometers traveled and in practically factory condition has few precedents in the collector market. It is as if it had spent the years in a time capsule.

The EB110 Super Sport is considered one of the most technologically advanced supercars of its time. With its 3.5-liter V12 engine with four turbos (prior to the arrival of the W16 developed at the express wish of Ferdinand Piëch), all-wheel drive and a carbon fiber monocoque chassis, it represented an enormous technical leap by the standards of the early 1990s. Recently, the car has taken a new turn in its eventful history, becoming part of a lot put up for auction by the Mecum house in Indianapolis.
The auction house has confirmed that the lot will be offered without a reserve price, which means that it will be the collectors interested in this gem who really decide its final price. According to RobbReport, although Mecum declines to offer an official estimate, recent sales of equivalent supercars suggest that its price could land between $2.5 million and $3.5 million.

In Xataka | For years no one knew who had bought the most expensive Bugatti in the world: until it became part of an inheritance

Image | Mecum Auctions

Tim Cook will step down as CEO and John Ternus will be the new leader

Apple’s management is preparing for a replacement that marks a turning point for the company: Tim Cook will step down as CEO and John Ternus will be his successor. As reported by Apple, confirming the rumors, the change is part of a pre-planned transition process supported by the board of directors. The appointment will take effect on September 1, the date on which Ternus, after years at the head of hardware engineering, will assume the position.

Tim Cook will continue to lead Apple over the coming months while working directly with John Ternus, currently Senior Vice President of Hardware Engineering, on the transfer of responsibilities. Once the replacement takes effect, Cook will assume the role of executive chairman of the board of directors, from which he will continue to be involved in the company’s strategy and its relationship with governments and regulators. The change redefines his role but keeps him tied to the company’s strategy and certain institutional functions.
