We thought the iPhone 17 and the iPhone Air stood out for their cameras and design. It turns out they also hide an exclusive security feature

For a long time we have lived with the illusion that impenetrable computer systems exist. The reality is less flattering: in security, everything comes down to how much effort, time and resources it takes to force a lock. Just as opening the front door of a house is not the same as opening a bank vault, in the digital world there are barriers of varying strength, and unexpected shortcuts that bypass brute force. The goal of a defense is not perfection, but to raise the toll until breaking it is impractical. Risk never disappears; it is managed. With that practical outlook, Apple has been adding layers to make every step more expensive for an attacker and to shrink their room for maneuver.

According to the Cupertino company, the most sophisticated exploitation chains observed against iOS come from mercenary spyware and rely on memory vulnerabilities. Although they do not mention it explicitly, they are surely referring to threats such as Pegasus, from the company NSO. Their answer is a new piece in that wall: a reinforcement that integrates hardware and system software to monitor memory integrity and cut off overflows or improper accesses before they can thrive.

Memory Integrity Enforcement on iPhone 17 and iPhone Air

Apple has presented Memory Integrity Enforcement (MIE) as part of the new iPhone 17, iPhone 17 Pro and Pro Max and iPhone Air: a memory defense integrated directly into its hardware and operating system. This development is the result of five years of joint work between its chip and software engineering teams, with the aim of drastically raising the cost and complexity of attacks based on memory corruption. MIE promises to act continuously and transparently, covering critical areas such as the kernel and more than 70 user-space processes, all without compromising the device's power consumption or performance. The core of MIE combines several layers that work in a coordinated way to reinforce security.
Typed memory allocators are systems that organize data according to its type, as if each class of object had its own drawer. This organization makes it harder for a bug in a program to let one piece of data overwrite another. If a failure occurs, the system can detect it before it turns into an attack.

On top of this foundation acts the Enhanced Memory Tagging Extension (EMTE), a hardware technology that adds an extra layer of memory control. EMTE works by assigning a "secret tag" to each memory block. Every time an app or the system wants to access it, it must present the correct tag; if it does not match, the hardware blocks the attempt and the system can kill the process. This permanent, synchronous check makes it possible to detect and stop classic attacks such as buffer overflows or use-after-free, which are common techniques for taking control of a device. The allocators protect memory usage at a large scale, while EMTE brings precision to the smallest blocks, where software alone does not respond with the same effectiveness.

This bet responds to a threat landscape in which the highest-end attacks against iOS are expensive, complex and targeted, historically associated with state actors. These chains usually share a common denominator: they exploit interchangeable memory vulnerabilities that have been present across the whole industry. The intention of MIE is to cut off their progression in the early stages, when the attacker still has little room and depends on chaining multiple fragile steps to gain control. An Apple graphic shows real exploitation chains and the points where MIE blocks them. The scope of the protection includes the kernel and extends to key system processes that are usual entry targets.
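The tag check that EMTE performs in hardware can be illustrated with a toy model. This is a hedged sketch in Python, not Apple's implementation: every allocation gets a random tag, and any access must present the matching tag or the "process" is terminated, which is how a use-after-free with a stale pointer gets caught.

```python
import secrets

class TaggedHeap:
    """Toy model of EMTE-style memory tagging. Purely illustrative:
    real hardware stores tags alongside memory and in pointer bits."""

    def __init__(self):
        self.memory = {}  # address -> (tag, value)

    def allocate(self, address, value):
        tag = secrets.randbits(4)  # real MTE-style tags are also small (e.g. 4 bits)
        self.memory[address] = (tag, value)
        return tag  # the pointer carries the tag with it

    def load(self, address, tag):
        stored_tag, value = self.memory[address]
        if tag != stored_tag:
            # Synchronous check: a mismatched tag aborts immediately,
            # before an overflow or use-after-free can be exploited.
            raise MemoryError(f"tag mismatch at {address:#x}: process terminated")
        return value

heap = TaggedHeap()
tag = heap.allocate(0x1000, "secret data")
print(heap.load(0x1000, tag))  # correct tag: access succeeds

stale_tag = (tag + 1) % 16     # simulate a dangling or forged pointer
try:
    heap.load(0x1000, stale_tag)
except MemoryError as e:
    print("blocked:", e)
```

The key property the sketch captures is that the check happens on every access, not after the fact, which is why attacks that depend on chaining several memory-corruption steps fail early.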
In addition, Apple gives developers the chance to test and integrate these defenses through the Enhanced Security option in Xcode, including EMTE capabilities on compatible hardware. That is especially relevant for applications whose users may be direct targets, such as messaging or social networking apps, which often appear at the start of exploitation chains.

To sustain the tagging and the synchronous checks without perceptible impact, Apple redesigned the A19 and A19 Pro, allocating chip area, CPU speed and memory to tag storage. The company modeled precisely where and how to deploy EMTE so that the hardware can meet the demand for checks. The software, for its part, takes advantage of the typed allocators to raise the bar against memory corruption, while the hardware takes on the fine-grained verification. As noted above, this should preserve the expected experience in performance and battery life.

The project was evaluated with Apple's offensive research team from 2020 to 2025: first with conceptual exercises, then with practical attacks in simulated environments and, finally, on hardware prototypes. This prolonged collaboration made it possible to identify and close off complete exploitation strategies before launch. According to Apple, even when trying to rebuild known real-world chains, they failed to restore them reliably against MIE, because too many steps were neutralized at the base.

Even so, Apple reminds us that perfect security does not exist. Very rare cases could survive, such as certain overflows within the same allocation. For previous generations without EMTE support, the company promises to keep expanding software-based improvements and secure memory allocators, with the aim of bringing part of these benefits to older devices without affecting their stability. Ultimately, MIE does not eliminate risk, but it does redraw the rules of the game by raising the cost and difficulty of memory corruption techniques.
For those who buy an iPhone 17 or an iPhone Air, this translates into always-on protection that, according to Apple, is invisible to the user.

Images | Xataka with Gemini 2.5

In Xataka | Pay up or we will use your works to train AI: the hackers' threat to an artist's website

In Xataka | How to change all our passwords, according to three cybersecurity experts

We thought the end of Windows 10 support would be a nightmare for Microsoft. Some point out that it will be a gold mine

Windows 10 already has October 14, 2025 marked in red: the day it will stop receiving updates unless you sign up for the ESU plan. For Microsoft, what looked like a problem is also a business. An analysis firm estimates that the program could bring in billions of dollars in the corporate sector alone. Ahead of the imminent end of Windows 10 support, the company insists that the best option is to update to Windows 11, buy new equipment or use Windows 365 to access the system in the cloud. The change will affect both companies and individuals, who will have to decide how to continue.

ESU: the proposal for Windows 10 that also reinforces Microsoft accounts

ESU is Microsoft's official offer for those who cannot leave Windows 10 behind in 2025. In exchange for an annual subscription, machines receive only critical and important security updates. There is no standard technical support and there are no new features. The requirement is clear: to have version 22H2 installed. Microsoft describes it as a temporary continuity tool, not a substitute for migrating to Windows 11.

The commercial ESU scheme is designed as a price ladder: $61 per device the first year, $122 the second and $244 the third, always with a three-year limit. Microsoft clarifies that the subscription is cumulative: joining later does not reduce the cost. Access will be managed through volume licenses, and activation keys will only become operational after October 14, 2025, the end of free Windows 10 support.

For private users, Microsoft has taken a different approach: a one-time fee of $30 for access to security updates for 12 months, with alternative options for those who don't want to pay. It will be possible to activate ESU by redeeming 1,000 Microsoft Rewards points or by using the Windows Backup application to make a backup, which will unlock access at no cost. Microsoft indicates that this will be enabled soon.
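The price ladder and the cumulative rule are easier to see with a quick calculation. A minimal sketch in Python, assuming our reading of "joining later does not reduce the cost", namely that skipped years are still billed:

```python
# Commercial ESU list prices per device for years 1-3 (doubling each year)
ESU_PRICES = [61, 122, 244]

def esu_total(final_year: int) -> int:
    """Per-device cost of staying covered through `final_year` (1 to 3).
    Because the subscription is cumulative, enrolling late does not
    reduce the bill: any skipped years must be paid for as well."""
    return sum(ESU_PRICES[:final_year])

for year in (1, 2, 3):
    print(f"covered through year {year}: ${esu_total(year)}")
```

So a device kept on ESU for the full three years costs $427 in total, and a company that waits until year two still owes the same $183 as one that enrolled from the start and stayed two years.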
According to Windows Central, which cites a detailed analysis by Nexthink, Microsoft could take in up to $7.3 billion in the business segment alone thanks to ESU. The consultancy starts from Microsoft's official data, which in July 2025 put the worldwide Windows PC installed base at more than 1.4 billion machines. Of that total, it estimates that around 30% belongs to public and private organizations, equivalent to about 420 million devices. The report also projects that, even with the push toward Windows 11, about 121 million machines will still be running Windows 10 after October 14, 2025. With the program's cumulative cost, which doubles the price every year for a maximum of three periods, this user base would generate significant revenue. These are independent calculations that illustrate ESU's economic potential, and Microsoft has neither confirmed nor commented on the figures.

Nexthink's numbers sketch a billion-dollar business, but reality remains to be seen: we will have to wait and check how many companies and users actually pay to prolong Windows 10's life, whether until they install Linux, switch to a Mac or move to a Chromebook. There is also the option of renewing equipment and moving on to Windows 11.

Images | Windows | Arnav Singhal

In Xataka | Sending files between all my devices was a pain. Then I found this free, open source, multiplatform application

The Earth does not have as much storage space as we believed

For years, carbon capture and storage (CCS) has been one of the great technological promises in the fight against climate change. The idea is simple: if we can't stop emitting CO₂, we can capture it from the air and bury it safely in deep geological formations. But this "plan B" is starting to lose its meaning.

What was thought. We had always taken for granted that the "warehouse" we had in mind was practically infinite, able to store everything we would ever want. Estimates talked about a capacity of between 10,000 and 40,000 gigatons of CO₂, which would let us "live" calmly without having to cut our emissions overnight.

What the problem is. A new and devastating study by an international team of scientists has come to burst the bubble: the warehouse is much smaller and comes with very strict conditions of use. The new figure, which the authors define as a "prudent planetary limit", is 1,460 gigatons of CO₂. That is almost an order of magnitude below the most optimistic estimates on the table. It is like discovering that the hard drive you believed held 40 terabytes actually only has 1.5 terabytes of useful storage.

How they know. To reach this conclusion, the researchers did not limit themselves to calculating the total volume of the planet's sedimentary basins. Instead, they did what no one had done at this scale: apply a series of risk and exclusion filters based on prudence and damage prevention. They created the most detailed map to date of where CO₂ should not be stored.

The "buts" that reduce capacity. In the study, the experts pointed to different reasons to subtract storage capacity from our planet. They can be summarized in the following points:

Seismic risk: all areas with moderate or high seismic activity have been ruled out, since injecting gas at high pressure there can reactivate geological faults and cause earthquakes.
Protected and polar areas: in line with international agreements such as Kunming-Montreal, all natural parks, biosphere reserves and environmentally sensitive areas are excluded.

Proximity to cities: to protect human health and avoid contaminating aquifers, an exclusion zone of 25 km around urban areas was established, since a CO₂ leak could acidify drinking water.

Ocean depth: current offshore technology concentrates on relatively shallow waters. The study sets a practical limit of 300 meters of marine depth, since going deeper sends costs and risks soaring, as the Deepwater Horizon disaster reminded us.

International borders: storing carbon under another country's territory is a legal and political minefield. The study assumes that cross-border storage would be, in practice, very difficult to use without complex and, today, non-existent international agreements.

A finite and precious resource. The main conclusion of the study is that geological storage is not unlimited. It is a finite resource, like oil or lithium, and must be managed with an intergenerational vision. It is, as the authors say, a "savings account" that belongs to this and future generations. Right now it is being claimed both to mitigate current emissions while we keep burning fossil fuels, and to reverse global warming, since the point of storing this gas is to lower the planet's temperature overall. The conflict is evident: every ton we use today for the first objective is one ton less that future generations will have for the second.

There is a limit. Perhaps the most shocking figure in the study is this: if we devoted the entirety of this prudent limit of 1,460 gigatons exclusively to removing carbon from the atmosphere, we could only reduce the global temperature by a maximum of 0.7 ºC.
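The scale of the downgrade can be checked from the figures quoted in the piece. A back-of-the-envelope calculation (ours, not the study's methodology):

```python
# Figures quoted in the article
optimistic_low_gt, optimistic_high_gt = 10_000, 40_000  # earlier estimates (Gt CO2)
prudent_limit_gt = 1_460   # the study's new "prudent planetary limit"
max_cooling_c = 0.7        # max cooling if the whole budget goes to CO2 removal

# The new limit sits roughly 7x to 27x below the old range,
# i.e. close to an order of magnitude under even the low end
print(f"{optimistic_low_gt / prudent_limit_gt:.1f}x to "
      f"{optimistic_high_gt / prudent_limit_gt:.1f}x smaller")

# Implied cooling per 1,000 Gt stored, if dedicated entirely to removal
print(f"~{max_cooling_c / prudent_limit_gt * 1000:.2f} C per 1,000 Gt")
```

The second number is the sobering one: under the article's figures, each 1,000 gigatons stored buys well under half a degree of cooling.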
This puts a very real brake on the popular "overshoot" strategies, which trust that we can exceed the 1.5 °C limit and then "cool" the planet with mass capture technologies. This study tells us that our capacity to back down is, at best, very limited.

The urgency of cutting emissions multiplies. If we cannot rely on a massive cleanup, and the only safe path is to reduce emissions drastically and urgently, we have a problem. The study shows that, at the current rate, many climate scenarios would exhaust this storage budget before the year 2200, leaving future generations without tools to manage the climate.

Storage rich and storage poor. The analysis also reveals a new geopolitical panorama with clear "winners" and "losers" in this race. The winners are countries such as Russia, the United States, China, Brazil and Australia, which retain great storage potential even after applying all the risk filters. At the other extreme we have the "storage-poor" countries, such as the members of the European Union, India or Norway, which see their potential drastically reduced. This means that, to meet their objectives, they could have to depend on other countries to store the captured CO₂, creating new economic and logistical dependencies.

A reality check. This study does not mean that carbon capture is useless. It will continue to be a crucial technology for decarbonizing industries such as cement or steel. What it means is that it is not the panacea some expected, nor an excuse to delay climate action. It is a forceful reminder that there are no magical technological solutions that exempt us from the hardest and most urgent task: stopping greenhouse gas emissions.

Images | Peter Burdon

In Xataka | I have measured the CO2 in my office for weeks. Now I religiously ventilate every hour and a half

We thought the F-35's most complicated summer was over. Then its software sent one kamikaze over the Arctic

In mid-August it seemed frankly difficult for anything else to happen to Lockheed Martin's brand-new F-35. After a plane was stranded for a month in India, Spain's reversal on an order (to which other countries have since added themselves), and a second breakdown of a fighter, this time in Japan, the quota of misfortunes seemed complete. Then a report appeared that calls both the plane and its sophisticated software into question.

An accident and its causes. We now know that on January 28, 2025, a US Air Force F-35A assigned to the 354th Fighter Wing at Eielson Air Force Base (Alaska) crashed after taking off on a training mission as part of a four-aircraft flight. The official Pacific Air Forces report revealed that the main cause was the freezing of hydraulic fluid contaminated with water in the landing gear shock absorbers, which prevented the struts from fully extending and caused the weight-on-wheels sensors to erroneously interpret that the plane was on the ground while it was still flying.

Kamikaze mode. This false signal automatically activated the "on-ground" control mode in mid-flight, making the aircraft uncontrollable. Luckily, the pilot managed to eject and survived with minor injuries, but the plane, valued at $196.5 million, was completely lost.

Emergency in flight. The problem manifested immediately: the nose gear was misaligned by 17 degrees and could not retract. After radio consultations with Lockheed Martin engineers and a flight supervisor, the pilot spent almost an hour trying to reseat the wheel using two "touch-and-go" maneuvers. However, the ice had also blocked the main gear, and on the second attempt the sensors indicated that the aircraft had landed. What happened then? The system automatically switched to on-ground operating mode, drastically reducing control authority. The pilot, referred to in the report as "MP", managed to eject just before the fighter stalled and plunged nose-first.
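The chain described above, a frozen strut feeding a false "on ground" signal into the flight control software, can be sketched as a toy state machine. Everything below is purely illustrative: the real F-35 control laws (CLAW) are far more complex and not public, and every name in this sketch is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class GearStrut:
    frozen: bool  # hydraulic fluid contaminated with water and frozen

    def weight_on_wheels(self) -> bool:
        # A strut that cannot fully extend stays compressed as if the
        # jet had weight on it, so the WoW sensor reads "ground".
        return self.frozen

def select_control_mode(struts) -> str:
    """Toy version of the mode switch: if the sensors report 'ground',
    the control software drops into on-ground mode with drastically
    reduced control authority, even if the jet is actually airborne."""
    if all(s.weight_on_wheels() for s in struts):
        return "on-ground"
    return "in-flight"

healthy = [GearStrut(frozen=False) for _ in range(3)]
iced_up = [GearStrut(frozen=True) for _ in range(3)]
print(select_control_mode(healthy))  # in-flight
print(select_control_mode(iced_up))  # on-ground: the failure mode in the report
```

The point of the sketch is the cascade: nothing in the control logic is "wrong" in isolation, but a sensor fed bad physical input flips the whole system into the wrong mode.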
The aircraft climbed more than 1,000 meters after the ejection and then plunged vertically, in a sequence captured on a video that went viral.

Technical and maintenance factors. The investigation determined that the ice in the struts, added to the misalignment of the nose-gear locking hook, damaged metal components and prevented the system from engaging correctly. In addition, and very importantly, the WoW (weight-on-wheels) sensors, critical in the F-35's flight control logic, known as CLAW, proved vulnerable in extreme cold, something Lockheed Martin had already warned about in previous maintenance bulletins. In other words, the ice "fooled" the software. The report underlines that the water contamination of the hydraulic fluid stemmed from poor handling of hazardous materials and breaches of service protocols. These negligences, together with decision-making during the emergency, were considered contributing factors to the accident.

Implications and lessons. Without a doubt, the case has highlighted the complexity inherent in the F-35's high degree of automation, where a sensor failure can trigger cascade reactions in the control software. Although nine days later another F-35A managed to land without consequences despite a similar gear problem, the accident investigation board stressed that, with the information available, the safest option would have been to order an immediate landing or a controlled ejection instead of risking a second maneuver attempt. Although the report did not issue formal policy-change recommendations, it did highlight the need to reinforce compliance with maintenance protocols, supervision of fluid handling and preparation for operations in Arctic environments.

Strategic repercussions. In short, the accident, which caused no fatalities, highlights the challenges of operating fifth-generation fighters in extreme conditions such as Alaska's, where temperatures close to -17 ºC can aggravate technical vulnerabilities. And not just that.
It also offers a warning to future operators in cold climates, such as Canada and Finland, which must weigh the reliability of the sensors and the resilience of the control systems in hostile environments. Beyond the technical side, the event illustrates how the sophistication of the F-35, with its dependence on algorithms and automation, can become a risk factor in unforeseen emergencies, forcing a rethink of the balance between human control and software in new-generation military aircraft.

Image | US Air National Guard / Tech. Sgt. Adam Keele

In Xataka | It is being a complicated summer for the US F-35: after Spain's "no", Russia and China have appeared to do more damage

In Xataka | A group of countries is forming after Spain's decision: those closing the door on the US F-35

Physicists believed the neglecton was a useless particle. Now they suspect it is the key to universal quantum computers

The quantum computing experts we have had the opportunity to speak with, such as the Spanish physicists Ignacio Cirac or Juan José García Ripoll, argue that quantum computers will be able to make great contributions once they are capable of correcting their own errors. The main problem researchers face in this area is noise, understood as the disturbances that can alter the internal state of the qubits and introduce calculation errors. The strategy many of the groups involved in the development of quantum computers are opting for is to monitor the operations carried out by the qubits in order to identify errors in real time and correct them. The problem is that, from a practical point of view, this strategy is very challenging.

Logical qubits are a way to overcome the difficulty of working with hardware (physical) qubits, which are extremely sensitive to noise and therefore prone to errors. Each logical qubit is built abstractly on top of several physical qubits, so that a single logical qubit encodes a single qubit of quantum information, but with redundancy. It is precisely this redundancy that makes it possible to detect and correct the errors that appear in the physical qubits. In any case, researchers are about to have one more tool to deal with the errors of quantum computers. It may even be the most powerful resource currently within their reach: the neglecton.

Universal quantum computers are one step closer

One of the most promising research fields in this area is topological quantum computing. Its purpose is to protect the delicate quantum information the qubits work with by encoding it in the geometric properties of exotic particles known as Ising anyons. An important note before moving on: in condensed-matter physics an anyon is not the same thing as an anion in chemistry. In fact, Ising anyons are quasiparticles that, in theory, arise in some two-dimensional materials.
Their existence has not yet been demonstrated experimentally, so for the moment they are a theoretical result. It may seem a complicated concept, and it is, but in this article we do not need to go much deeper. What we do need to know is that Ising anyons are presumably much more robust, and therefore more resistant to errors, than traditional qubits. In practice this implies that moving some anyons around others in a specific way should allow researchers to carry out logical operations with them. This is why they are so attractive for quantum computing. And, in addition, they have another great advantage: this configuration is largely immune to external noise.

Currently, Ising anyons are being thoroughly investigated in condensed-matter laboratories across the planet because they are one of the main candidates for building universal quantum computers that are, by design, immune to errors. Aaron Lauda, professor of mathematics, physics and astronomy at the University of Southern California (USA), states the following: "By themselves, Ising anyons cannot perform all the operations needed for a general-purpose quantum computer. The calculations they support are based on braiding, and they require physically moving anyons around each other to carry out quantum logic. For Ising anyons, this braiding only allows a limited set of operations known as Clifford gates, which fall short of the full power required for universal quantum computing."

Fortunately, the research team led by Lauda has found a way to turn Ising anyons into universal structures capable of performing any quantum calculation through braiding. Their solution is for now only theoretical, but its potential is enormous.
The surprising thing is that what they propose is to resort to a new type of anyon, known as the neglecton, which was initially discarded when it was "discovered" within the theoretical framework. In fact, the neglecton has gone from being a mathematical castoff to being the new hope of quantum computers. In theory, by combining Ising anyons and neglectons, universal quantum computing will be possible through braiding. According to Aaron Lauda, only one neglecton is needed, because it remains in a stationary, static state while the calculations are carried out by braiding Ising anyons around it. It is a surprising conclusion.

One last note to conclude: the neglecton is not a fundamental particle, like the electron or the quark; it is a theoretical quasiparticle that arises from the collective behavior of many other particles in a two-dimensional system. Let us hope it consolidates as the definitive tool that allows researchers to take quantum computing from theory to practice in a robust and efficient way.

Image | IBM

More information | Science Daily

In Xataka | The encryption of Bitcoin and other cryptocurrencies will fall. And quantum computers will be responsible

We believed the Apollo astronauts left the Earth. In reality, they never completely left its atmosphere

The idea that space begins where the sky stops being blue is a children's story. Decades of scientific research show that the Earth's atmosphere is much bigger than was believed. Not even the 12 people who stepped on the Moon fully escaped its influence.

Where the Earth ends. As NASA heliophysics expert Doug Rowland explains, there is no clear border: "The atmosphere does not stop at Everest, or where the planes fly. It continues and continues, becoming less and less dense as you go up." The International Space Station, which orbits our planet at about 400 kilometers of altitude, experiences enough air resistance to need a periodic boost. Otherwise, it would fall back to Earth. But the real surprise came after decades of observations by SOHO (Solar and Heliospheric Observatory), a joint ESA and NASA mission.

To the Moon and beyond. A study based on data from the SOHO observatory revealed that the outermost layer of our atmosphere, a faint cloud of hydrogen atoms called the geocorona, extends up to 630,000 kilometers, almost twice the distance from the Earth to the Moon. When the astronauts of the Apollo 16 mission installed the first telescope on the Moon in 1972, they captured an image of the geocorona shining in ultraviolet light. What they didn't know was that they were still inside it. In the words of Igor Baliukin, lead author of the study: "The Moon flies through the atmosphere of the Earth."

Oxygen on the Moon. The Earth's presence on the Moon is not limited to hydrogen. Terrestrial oxygen also reaches our satellite. It happens for about five days a month, when the Moon passes through the Earth's magnetotail, the magnetic tail of our planet. Every time this happens, oxygen ions are accelerated toward the satellite and become embedded in the lunar soil. Researchers believe this process has been occurring for 2.4 billion years, which means the lunar regolith could keep a record of the evolution of our own atmosphere. The "official" border of space.
The atmosphere is divided into layers: troposphere, stratosphere, mesosphere, thermosphere and exosphere. The last of these, the exosphere, starts at about 700 kilometers of altitude and merges with the solar wind at about 10,000 kilometers. Its particles are so scarce and so scattered that they can escape into space. The "official" border of space is, by convention, the Kármán line, located 100 kilometers up. It is considered the point at which traditional aeronautics is no longer possible due to lack of air. However, the geocorona, the luminous part of the exosphere, is proof that our planet's atmospheric influence reaches much, much further.

Image | NASA

In Xataka | The most prolific astronomer in the world is a complete stranger. He has discovered half of the moons in the Solar System

We believed that by now the PS5 and Xbox would be cheaper than ever. The amazing thing is that just the opposite is happening

Sony has just announced that all models of its PlayStation 5 console will go up by $50 in the United States. The argument Sony used in its statement to explain the decision is that there is a "difficult economic environment". Or, what amounts to the same thing: they blame tariffs. But this is already a generalized trend... and a very unreasonable one.

PS5 prices rise around the world. It is the same thing that happened in April in Europe, the Middle East, Africa, Australia and New Zealand, when the company also announced price increases for some models. In Europe, for example, the PS5 Digital Edition went up to 499.99 euros. In the US, the increases affect all models, and the new prices take effect this Thursday in that country:

PlayStation 5: $549.99 (before $499.99)
PlayStation 5 Digital Edition: $499.99 (before $449.99)
PlayStation 5 Pro: $749.99 (before $699.99)

Accessory prices, for now, remain unchanged.

Microsoft already did the same. This move mirrors what Microsoft did with its Xbox Series S/X consoles, which also rose in price by around 50 euros in Europe and the US. Even the controllers and some accessories saw their prices increase.

And of course, Nintendo. The Japanese company took a long time to lower the price of the original Nintendo Switch, and the Switch 2 has given it the excuse to raise prices even further, even on games. Not only that: the economic situation has led Nintendo to raise the prices of the original Nintendo Switch, the Lite edition and the accessories. That traditional pattern of prices falling over time has not materialized for the latest Sony and Microsoft models either. In fact, quite the opposite is happening.

Damn tariffs. Most video game consoles are manufactured in China, and tariffs have hit the Asian giant especially hard. That certainly makes it more expensive for manufacturers to produce consoles and export them for sale in the US or Europe.
And if they want to keep their profit margins, they end up doing something that is logical for them and terrible for consumers, because we are the ones who end up footing the bill.

But some manufacturing costs have dropped. The amazing thing about the situation is that both the PS5 and the Xbox Series S/X are now cheaper to manufacture (not to mention the original Switch). After five years on the market (nine in the case of the Nintendo console), the components of these consoles have dropped in price: their cost was very high at launch, but after all this time those components have become almost obsolete compared with the new chips that have appeared since.

So what is happening? Not everything has come down. These consoles rely on GDDR6 memory chips and NAND chips for their solid-state drives, and those prices rose strongly in 2024 and 2025 due to the relative shortage of this type of chip and the focus manufacturers are putting on HBM memory for AI accelerators such as Nvidia's. On top of that come other costs not directly related to the bill of materials (BOM), such as logistics, manufacturing energy, or marketing.

The dollar loses steam. There is also the fact that the greenback has lost value against the euro and other currencies. After 80 years as the pillar of the world economy, US tariffs are starting to erode it, and that also translates into price increases for everything. Including consoles.

Fewer "boxes" are sold. Console manufacturers used to lose money on their models in the early years. The idea was always to more than recover those losses with the sale of physical games, but the rise of digital editions, free-to-play games and subscriptions has changed the distribution of value. For Sony or Microsoft the solution is simple: if we no longer recover as much value from the games, we raise the price of the hardware.

But all that had happened before.
The truth is that the situation is in many ways analogous to what happened with the PS4 and the Xbox One, for example. Those consoles ended up dropping notably in price over the years, but this time a perfect storm has formed, one in which inflation, dollar weakness, tariffs and new video game access models have combined to produce the worst outcome for users: more expensive consoles.

In Xataka | 60 euros for a camera, 90 for a game. The worst (and best) of Nintendo's prices for the Switch 2

Duolingo believed AI was its ally. GPT-5 has just shown it can be a mortal competitor

Duolingo is sinking on the stock market. In early June its shares traded around $525, but their value has since collapsed to $325, a 38% drop. It is not entirely clear what caused this debacle, but there is one clear suspect: AI.

Be careful what you say. Three months ago Luis von Ahn, CEO of Duolingo, made very controversial statements, indicating that he would replace part of his network of external (human) contributors with generative AI systems. Although he stressed that they would continue to be a company that took great care of its employees, he also emphasized that AI would take an increasingly prominent role across the company's entire operation, especially to "eliminate bottlenecks so we can do more" with the employees they already had. Duolingo's share price has seen significant ups and downs this year, but the latest trend is clearly negative. Source: Google Finance.

Boom and bust in the shares. Those statements came at the end of April. The initial impact on the shares seemed positive: they went from $400 to $530 (32.5% growth) in a few days. But shortly afterwards that optimism about AI's role in the company vanished among investors: the shares fell even below those initial levels, and are now around where they started the year.

It seemed Duolingo was recovering. The company presented financial results that corroborated the success of its business model. The feeling of progress while learning a language sold better than the learning itself; gamification is a powerful (and, as we will see, dangerous) tool. That momentarily halted the fall of the shares a few days ago, but then something happened.

GPT-5. During the presentation of OpenAI's new model there was a demonstration in which one of the company's engineers threw a poisoned dart at Duolingo. That demo consisted of creating a custom web application, in just three minutes, with which the user could learn French.
With a simple prompt came an app that competed directly with Duolingo, and that of course avoided paying for an application to learn languages. A single prompt was enough for GPT-5 to create an interactive website to teach you to speak French. Source: OpenAI.

Be careful about gamifying everything. Although that demo of GPT-5 becoming a personalized teacher is striking, the drop in the value of Duolingo's shares may also have other causes. Especially that clear focus on the gamification of the learning process. Turning learning into a game is attractive and encourages many users to approach the task in a more fun way, but criticism of Duolingo's excessive focus on gamification is frequent. As one user put it on Reddit, "for me the reward for learning a language is learning the language." Another explained that Duolingo is not a learning application, and should instead be taken as something else: a game.

The curse of advertising. Other criticism targets the excessive number of ads that appear when one uses Duolingo to learn a language. Advertising appears in the free version, since the premium version has the advantage of not showing it. The model is reasonable (Duolingo, after all, is a company and is there to make money), but as with streaming, the presence of ads keeps growing and is increasingly annoying for users of the free version.

From betting on AI to being threatened by it. The truth is that although all these factors may have influenced the valuation, the volatility may also reflect the expectations that constantly surround AI. The companies betting hardest on this technology are the ones rising on the stock market (witness the "trinity of AI"), although the real impact of this technology remains for the moment very modest.

AI as a private tutor. What is unquestionable is the potential of AI as a teacher of any discipline, not only languages.
It is something GPT-4o already hinted at, with demonstrations along the same lines. For example, the video of a boy learning to solve a math problem (included here) was especially striking, and foreshadowed a future in which anyone who wants one can get hold of these "private tutors" that we can create with a single prompt in ChatGPT (and other chatbots, of course). It is too early to know, of course, but Duolingo, like many others, seems to be suffering the consequences of that future potential.

In Xataka | We don't know if AI is going to eat your job, but the CEOs of some startups are determined to convince you it is

We believed the "yellow brick road" existed only in the movies. Then some divers descended into the depths

The idea of the yellow brick road was born with the novel The Wonderful Wizard of Oz by L. Frank Baum (1900), which describes a golden path that leads to the Emerald City and symbolizes the journey toward personal fulfillment and discovery. In the famous 1939 film adaptation, that path became a visual icon thanks to the pioneering use of Technicolor: the bright yellow contrasted with the intense green of the city and the blue sky, marking Dorothy's passage from the gray routine of Kansas to a fantasy world. It turns out that in the ocean depths we had another brick road.

A geological find. The story dates back to 2022. During the Luʻuaaahikikekumu expedition, a scientific team aboard the E/V Nautilus, while exploring the chain of ancient submarine volcanoes of the Liliʻuokalani Ridge, ran into a rock formation reminiscent of the mythical "yellow brick road" of the movies. The curious structure, located at the summit of the Nootka Seamount within the Papahānaumokuākea Marine National Monument, turned out to be an example of ancient volcanic geology: rock fragments generated by high-energy eruptions, known as hyaloclastite, that have fractured uniformly due to repeated cycles of heating and cooling from successive eruptions. This pattern, similar to the cracked surface of a brownie, has given the rock the appearance of perfectly aligned cobblestones.

Origins and characteristics. Hyaloclastite forms when hot magma comes into contact with water, fragmenting into glassy particles that accumulate on the seabed. Over time, these deposits are compacted and cemented and, in cases like this one, exposed to thermal changes that produce rectilinear fissures.
The sector they found showed a stretch of "baked crust," seemingly dry to the touch, an optical effect that surprised the team and prompted jokes about the "road to Atlantis." Inspection with the Nautilus's robotic arm allowed the collection of samples of ferromanganese crusts (rich in iron and manganese oxides) that covered the seabed, a resource of scientific and industrial interest.

Importance of the mission. This was the first systematic exploration of these submarine mountains, whose main objective is to understand the mysterious discontinuity in their alignment on the ocean floor. The finding of the "road" joined other unique observations from the expedition, such as the filming of a strange organism nicknamed the "headless chicken monster," reinforcing the idea that the area hosts poorly documented biological and geological phenomena. Beyond the visual anecdote, identifying and studying these formations provides key information about submarine eruptive processes and the tectonic evolution of the region, opening the door to new discoveries in one of the most remote and protected areas of the planet.

Scientific context. The find was also part of an international effort to map and understand the underwater structures that make up the hidden geography of the oceans. The formation of the "road" at Nootka Seamount not only illustrates how volcanic activity can generate visually striking patterns, but also offers clues about the behavior of submarine magmatic systems and their interaction with water in high-energy environments. What's more, these studies are essential to improve models of underwater volcanism, evaluate potential mineral resources and understand how these geological habitats influence deep-sea biodiversity, a field in which each expedition reveals more unknowns than certainties.

Image | E/V Nautilus

In Xataka | We know more about Mars than about the seabed. An expert helps us understand why it remains an enigma and what mysteries it holds

In Xataka | The Atlantic has a "lost city" with the key to life on other planets. Now it is in danger

We thought we knew what killed Napoleon's army in Russia. The discovery of a tooth has shown us something else

The year 1812 holds a moment destined to be written into the history books. Napoleon's invasion of Russia culminated in one of the greatest military tragedies: the Grande Armée, made up of more than half a million men, was forced into a devastating retreat marked by hunger, cold and disease, a combination that cost the lives of hundreds of thousands of soldiers. Or so we believed.

A health catastrophe. In the summer of 1812, Napoleon Bonaparte gathered up to 600,000 soldiers for his campaign against Russia, the greatest force he had ever deployed. However, the scorched-earth strategy of Tsar Alexander I, which involved evacuating Moscow and depriving the invader of supplies, forced the French army's retreat toward Poland during a brutal winter. Between October and December of that year, more than 300,000 men perished, victims of hunger, extreme cold and a wave of disease that ravaged an already weakened force. For a long time, survivors' testimonies and the first scientific analyses pointed to typhus and trench fever as the main culprits, reinforcing the idea that poor hygienic conditions had sealed the fate of the Grande Armée.

The new findings. Now, research carried out at the Institut Pasteur in Paris has provided a more precise picture thanks to metagenomic techniques, capable of identifying genetic material from any pathogen present in human remains. Nicolás Rascovan's team analyzed thirteen soldiers buried in Vilnius (in present-day Lithuania), the epicenter of mortality during the retreat. The results detected no traces of typhus or trench fever, but they did reveal the presence of Salmonella enterica, the cause of paratyphoid fever, and Borrelia recurrentis, transmitted by lice and responsible for relapsing fever. These diseases, although not always fatal, would have deeply weakened soldiers already exhausted by endless marches, lack of food and glacial temperatures.
In that context, even pathologies that under other circumstances could have been overcome became deadly. [Image: Napoleon's invasion of Russia]

A lethal combination. The new scenario suggests that the defeat is not explained by a single infectious agent, but by a devastating combination: physical exhaustion, starvation, extreme cold and a set of diseases that, together, undermined the resistance of tens of thousands of men. Paratyphoid fever would have caused diarrhea and dehydration, while relapsing fever progressively weakened the sick with cyclical episodes of high fever. All this, added to the lack of hygiene, the spread of lice and the impossibility of adequate medical care amid the chaos of the retreat, turned Napoleon's army into a breeding ground for disease. The magnitude of the health catastrophe even exceeded combat losses, and became one of the decisive factors that precipitated the collapse of the campaign.

Historical and scientific implications. Although some experts warn that the amount of recovered DNA is small and the results are not entirely conclusive, the study marks an important advance in the use of modern tools to reinterpret historical episodes. It demonstrates the potential of metagenomics to trace diseases in ancient human remains and offers new perspectives on how biology, and not only military strategy, can explain the collapse of entire armies and populations. The researchers note that these techniques could also be applied to the study of communities in the Americas and Australia after European contact, where the lack of reliable records and historical biases make it difficult to understand the true impact of epidemics.

The defeat that sealed the empire. The tragedy of 1812 remains one of the most studied turning points in military history. The collapse of the Grande Armée not only halted Napoleonic expansion, but triggered the offensive of his enemies and the beginning of the end of his empire.
While the epic of the campaign has traditionally been narrated in terms of battles and strategic decisions, the new evidence confirms that biology and disease played a central role in the debacle. The retreat from Russia was, ultimately, both a military disaster and an epidemiological catastrophe, and the DNA from a few teeth found in Vilnius has shed more precise light on the invisible, tiny executioners that decimated Napoleon's soldiers in one of the deadliest winters in history, starting with an unexpected "army" of lice.

Image | Jean-Louis-Ernest Meissonier, Blaue Max

In Xataka | "Even if I told you, you would not believe me": the mystery of what Napoleon saw when he slept in the Great Pyramid of Egypt

In Xataka | 'Napoleon' is Ridley Scott's most controversial film in years. Not among critics: among historians
