The shipwreck from 2,000 years ago that reveals the “luxuries” of the Roman legions in Switzerland

Few products of Mediterranean gastronomy are as iconic as wine and olive oil. In fact, a glance at Spain’s current exports confirms that both are still at the top. This is nothing new: two millennia ago, the Roman Empire had already turned the Iberian Peninsula into one of its great strategic pantries. One of the most compelling pieces of evidence is Monte Testaccio, a 50-meter-high artificial hill in the center of Rome made from the remains of ceramic amphorae, 80% of which came from Baetica (today’s Andalusia) and carried olive oil. This wasn’t just trade: it was logistics on an imperial scale, organized and sustained for centuries. That this network reached very far is something the archaeological record keeps confirming: one of the latest and most impressive finds lies in the depths of Lake Neuchâtel, in Switzerland.

The discovery. In Lake Neuchâtel, researchers have found the cargo of the “wreck of the Eagles”, a ship sunk between 17 and 50 AD, at the height of the Roman Empire. From 2024 to the present, the Octopus Foundation has recovered approximately 600 pieces: hundreds of almost intact plates, platters, bowls and glasses; two large fragments of amphorae for oil or wine; a wicker basket, preserved in the lake’s lime, containing the crew’s kitchen utensils; metal tools, harnesses and draft equipment; four cart wheels; and legionary weapons, among other elements.

Why is it important. The most interesting aspect of this discovery is that the Roman Empire practiced a primitive form of globalization, able to distribute goods across the length and breadth of its territory, which was no small thing: it spanned three continents, from Great Britain to the Carpathians in Europe, plus North Africa and Asia Minor. Roman soldiers in Switzerland did not eat only local products; they also had access to the flavors of their homeland.
The exceptional state of conservation is also worth highlighting, helped by the cold water and the lack of oxygen at the bottom. Furthermore, the archaeological context is intact, allowing the reconstruction of the organization on board and the combination of evidence: civilian tableware, land transport equipment and military weapons.

Context. The hypothesis the research team is working with points to the supply of a legion: maintaining a force of about 6,000 men required a constant flow of goods. The cargo would thus have traveled by cart to the Roman port of Yverdon, south of the lake, and from there crossed it northward. As the cause of the sinking, the team points to a gust of wind on the approach to the Thielle channel. The presence of swords suggests it was not a military ship but a merchant vessel under armed escort. Notably, no structural traces of the boat have been found, only its cargo, so the team does not rule out that the boat never actually sank, or that it sank somewhere else. The only certainty is that the cargo was lost at the bottom of the lake.

Oil or wine? For now, the Octopus Foundation describes the amphorae only as containers intended for transporting oil or wine, without further precision, so further analysis is pending to clear up the doubt. Today, olive oil and wine may be associated with more select consumption, but in ancient Rome they were essential goods: the “liquid gold” was used for almost everything, from cooking to lamp lighting, personal hygiene, and even sports, medicine and rituals. And wine, even diluted with water, was part of the daily diet of all social classes, including the troops.

How it is being excavated. The cargo was detected from the air, using a drone in winter, when the lake’s visibility is at its best.
Thanks to 3D photogrammetry, the team generated maps of the site, which they then divided into grids to determine the exact position of each object found. Each piece was photographed and recorded in situ before being extracted individually. The site was kept secret during the year between the two campaigns and monitored with underwater cameras developed expressly for the project. The urgency to act came from a real threat: the sediments that had protected the cargo for centuries had eroded as a consequence of the Jura water corrections of the 19th and 20th centuries, leaving the pieces exposed to currents, the anchoring of recreational boats, and looting.

What comes next. The extracted pieces are being analyzed in the Laténium laboratory with the aim of identifying pottery workshops, determining the content of the amphorae through residual organic chemistry, and reconstructing trade routes. Once those doubts are resolved, their final destination is a public exhibition at the Neuchâtel archaeology museum.

In Xataka | The Romans were thirsty for oil and we have just found in Tunisia the second largest press of the Empire

In Xataka | The most polarizing and divisive scientific debate of the moment has to do with wine. With one that is 1,700 years old

Cover | Octopus Foundation and Rahime Gül

It turns out there is a Soviet submarine at the bottom of the Norwegian Sea that has been releasing radiation for 40 years

On April 7, 1989, the Soviet nuclear submarine K-278 Komsomolets sank in the Norwegian Sea after an uncontrolled fire, probably the result of a short circuit in the electrical panels of compartment 7, which led to a massive, uncontrollable deflagration because the atmosphere was critically enriched with oxygen due to failures in the air regeneration system. Of the 69 people on board, only 27 survived.

It wasn’t just any submarine: it had a double titanium hull that allowed it to descend to depths unreachable for its rivals of the era. Its cutting-edge technology hid a dangerous core: a nuclear reactor and two plutonium warheads that have since lain at the bottom of the sea, 180 kilometers southwest of Bear Island, in the Svalbard archipelago. And according to the most complete study carried out to date, published a few days ago in the scientific journal PNAS, the Komsomolets remains an active source of radioactive contamination in the Arctic.

The discovery. In 2019, a Norwegian research team went down with the Ægir 6000 underwater robot to inspect the submarine thoroughly with cutting-edge technology. As they approached the ventilation tube they found a visibly distorted column of water, as if it were smoke, as you can see in the video immediately after this block. It is a leak with intermittent behavior. They took samples and the results were overwhelming: concentrations of cesium-137 at 800,000 times the normal level for seawater in the area, and strontium-90 at 400,000 times. Both isotopes are direct products of nuclear reactor fission. The analysis shows that the radiation comes from the propulsion system (the nuclear reactor) and that the reactor fuel is corroding in contact with the environment.

Why is it important. The good news is that this radioactive leak does not come from the nuclear warheads: two torpedoes tipped with atomic warheads.
For now, that threat is under control: the Soviets sealed the torpedo compartment with titanium plates in the early 1990s, and judging by the analyses, the seal is still working, because no weapons-grade plutonium has been detected in the marine environment.

The bad news is the reactor. It will not explode or vanish; rather, the zirconium cladding that protects the uranium and plutonium is corroding, releasing these isotopes into the sea in a slow, invisible leak that dilutes into the ocean. Fortunately, samples taken in relatively nearby areas show that dilution is rapid, as they return values close to normal. In fact, the hull is covered in sponges, corals and anemones, and samples of them contain low traces of cesium-137, but no detectable damage.

Context. Man-made radioactivity in the oceans has three main sources, according to the International Atomic Energy Agency: the atmospheric nuclear tests of the 1960s and 70s, the Chernobyl accident, and the authorized discharges from the Sellafield and La Hague reprocessing plants, in the United Kingdom and France respectively. Sunken nuclear submarines, among which the Komsomolets falls, make a marginal contribution. Their importance is more qualitative than quantitative: they are localized point sources that tend to worsen over time.

After the Chernobyl disaster in 1986, the Soviet Union came under great international pressure. When the Komsomolets sank three years later, Moscow organized inspection missions with MIR submersibles. When it confirmed that the warheads had been in contact with seawater, it acted: in 1994, with the economy in free fall and Western funds involved, Russian technicians sealed the cracks in the torpedo compartment with titanium plates. Since 2007, Norway has carried out regular monitoring of the wreck as part of its nuclear safety responsibilities in the Arctic.

Current risk status.
For now the nuclear warheads are contained, their seal is working, and there are no signs of weapons-grade plutonium in the water. The reactor is the active problem: the fuel is corroding, the emissions are real, and the research team does not yet understand why they are intermittent or what the rate of release is. Any attempt to recover or physically manipulate the submarine would probably be more dangerous than leaving it where it is: if the radioactive materials reached the atmosphere, the contamination could reach land, with worse consequences than today’s.

A nuclear laboratory under the sea. The research team has two goals ahead: to understand why the leak is intermittent, and to determine whether the corrosion rate is accelerating over time. Inadvertently, the Komsomolets has become a natural laboratory for studying what happens to submerged nuclear reactors in the long term. That information is not trivial, given the number of nuclear devices sleeping on the seabed.

In Xataka | Russia’s most advanced nuclear submarine was a secret. Until Ukraine revealed everything, including its failures

In Xataka | The Soviet Union needed to save millions of people from hunger so something was invented: the art of making sausages

Cover | Karina Victoria
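As rough context for the cesium-137 and strontium-90 measurements described above, here is a back-of-envelope sketch (ours, not from the PNAS study) of how much of the reactor’s 1989 inventory of those two isotopes would still exist today from radioactive decay alone, using their standard published half-lives:

```python
import math

# Standard reference half-lives in years; these figures are not taken
# from the PNAS study itself.
HALF_LIFE_YEARS = {"Cs-137": 30.1, "Sr-90": 28.8}

def remaining_fraction(isotope: str, years_elapsed: float) -> float:
    """Fraction of an isotope's original inventory left after pure radioactive decay."""
    t_half = HALF_LIFE_YEARS[isotope]
    return math.exp(-math.log(2) * years_elapsed / t_half)

# Years between the 1989 sinking and the 2019 sampling expedition.
elapsed = 2019 - 1989
for iso in HALF_LIFE_YEARS:
    print(f"{iso}: {remaining_fraction(iso, elapsed):.0%} of the 1989 inventory remains")
```

With half-lives of roughly three decades, about half of each isotope’s 1989 inventory is still radiologically active, which is part of why the wreck can remain a source for many decades to come.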

A puppy from 15,800 years ago rewrites the history of domestication

For many, the dogs they live with are another member of the family, because the bond that forms can surpass many friendships with other humans. And no wonder: we have lived alongside them for millennia, though the exact origin of that bond has always been the subject of scientific debate. Genetics has now finally settled it.

The study. Two monumental studies published in Nature have put the answer on the table, thanks to the analysis of the DNA of a puppy that lived 15,800 years ago at the Pınarbaşı site, in modern-day Turkey. The discovery not only sets back the biological clock of our canine companions by at least 5,000 years compared to previous genetic records, but also demonstrates that our alliance with wolves was forged long before we invented agriculture.

A puppy with honors. The discovery is undoubtedly a triumph of paleogenetics. For years, scientists depended on the shape of bones to distinguish a wolf from a primitive dog, an error-prone method; now science turns to the genetic material inside the cells to clear up any doubt. The remains of three puppies were found at the site, but what is fascinating is not only their antiquity but also how they lived. Chemical analysis reveals that these animals had a diet surprisingly similar to that of the humans they lived with, including a strong base of fish. Furthermore, they were buried following human rituals, a posthumous treatment that points to a deep emotional bond.

Its expansion. The Turkish puppy is not an isolated case: the first Nature study demonstrates that, by the Late Upper Paleolithic, dogs had already spread rapidly throughout western Eurasia. The team also analyzed remains found in Gough’s Cave, in the United Kingdom, where they identified another domesticated dog from 14,300 years ago whose jaw had perforations, again suggesting ritual practices.
The most interesting thing is that, despite the enormous geographical distance separating Turkey from England, the genomes of both animals show strong genetic similarities, confirming that they belonged to the same large population of Paleolithic dogs.

Another study. In parallel, a second team broadened the picture by examining the remains of 200 European dogs from more than 14,000 years ago, confirming the presence of another primitive dog at Kesslerloch (Switzerland), dated to 14,200 years ago. This team demonstrated that the lineages of these first Paleolithic dogs did not go extinct: their genetic signatures have survived and are present in the modern dogs sleeping on our sofas today.

Before agriculture. The classical account told us that animal domestication was a by-product of the Neolithic: we settled down, invented agriculture, and along the way domesticated animals. These studies change that completely, since the genomes analyzed confirm that these dogs descend from a lineage of ancient wolves that formed a close alliance with strictly hunter-gatherer humans.

Images | Road Ahead

In Xataka | Traveling with a dog is increasingly common, so the European Commission has decided something: mandatory passport

China has been boasting about its driverless robotaxis for years. Until more than 100 stopped at once in Wuhan

The screen inside read: “Driving system failure. Staff will arrive in five minutes.” But no one came. The passenger pressed the SOS button and was told help was on the way, but it took 30 minutes just to get someone to pick up the phone. Meanwhile, the robotaxi sat stopped in the middle of a lane in Wuhan, with traffic passing on both sides.

That is what happened on the night of Tuesday, April 1, in the Chinese city of Wuhan: more than a hundred autonomous cars from Apollo Go, Baidu’s robotaxi subsidiary, stopped working at the same time due to a system failure. It is the first collective robotaxi blackout in China, and it has exposed a concern the sector has been avoiding for some time.

Why is it important. Baidu is not a minor player. Apollo Go operates more than 1,000 robotaxis in Wuhan alone, its largest deployment, and has accumulated more than 20 million trips in its history. The company has just started operating in Abu Dhabi and Dubai, the two largest cities of the United Arab Emirates; it is negotiating its entry into the United Kingdom and Switzerland; and it has an agreement with Uber to operate through its app. An incident of this magnitude doesn’t come at just any time: it comes when the company, like its entire sector, is trying to convince the world that it is ready to scale.

Between the lines. Technically, the incident could be explained in many ways. Some Chinese media cited anonymous sources pointing to the safety self-verification systems, which would have detected some abnormal condition and stopped the vehicles preventively. If so, the system would have worked exactly as designed, yet the result was chaotic: cars stopped in the center lanes of expressways, some passengers trapped for more than 90 minutes, collisions caused by vehicles braking suddenly on highways… That no one was injured is almost a matter of luck.

The contrast. It is not the only precedent.
In December 2025, a power outage in San Francisco left Waymo robotaxis immobilized throughout the city, forcing Waymo to push software updates to its entire fleet. Months earlier, in August, an Apollo Go vehicle fell into a ditch in Chongqing; in May, a Pony.ai car caught fire in Beijing, without causing injuries. It’s easy to see a pattern: large-scale autonomous driving has not yet achieved the reliability needed to justify the trust being asked of the public.

And now what. Cars stopping is a problem, but an even bigger problem is that no one knows why. Baidu has not explained what caused the failure or how long it took to resolve. Wuhan police have confirmed the incident but given no further details about the cause. This opacity weighs as much as or more than the incident itself, especially in a sector that has argued for years that its cars are safer than those driven by humans. That may well be true, but fleet-wide failures like this do not invite optimism without questions.

Featured image | Baidu-Apollo

In Xataka | Waymo’s self-driving cars have started honking at each other. At 4 in the morning
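The trade-off raised in “Between the lines” (a self-check that fails safe by stopping the car can itself create a hazard) can be sketched in a few lines. Everything here is illustrative: Apollo Go’s real architecture is not public, and the health checks and actions below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class HealthReport:
    """Result of a hypothetical onboard self-verification pass."""
    sensors_ok: bool
    planner_ok: bool
    backend_link_ok: bool

def failsafe_action(report: HealthReport) -> str:
    """Choose a degraded-mode action from a self-verification report."""
    if not report.sensors_ok or not report.planner_ok:
        # Perception or planning cannot be trusted: the only defensible
        # option is an immediate controlled stop, even if it blocks the lane.
        return "stop_in_lane"
    if not report.backend_link_ok:
        # Local autonomy is intact; attempt a minimal-risk maneuver
        # (pull over) instead of stranding the car mid-traffic.
        return "pull_over"
    return "continue"

print(failsafe_action(HealthReport(True, True, False)))   # backend lost
print(failsafe_action(HealthReport(False, True, True)))   # sensors degraded
```

The design question the Wuhan incident raises is precisely which failures should map to an in-lane stop versus a minimal-risk pull-over: stopping immediately is the conservative choice for one vehicle, but at fleet scale it can block traffic across a city.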

Artemis II takes off successfully and humanity returns to the Moon after more than 50 years

Artemis II has taken off successfully, and this is not just any launch. What we have witnessed marks the return of human beings toward the Moon more than half a century after the last missions of the Apollo program, a milestone that for decades seemed reserved for the history books. This time, moreover, it is not just about returning: it is about taking a crew farther from Earth than any human has gone in more than half a century, in a mission designed to validate NASA’s deep-space exploration system under real conditions.

To grasp the dimension of this takeoff, it is worth pausing on what exactly Artemis II is. The mission is the first crewed flight of NASA’s new exploration system, which combines the Orion spacecraft, the SLS rocket and the Kennedy Space Center’s ground systems. For approximately ten days, the astronauts will evaluate the spacecraft’s behavior in real deep-space conditions, something that until now had only been tested without people on board. NASA itself frames it as an essential step toward future missions designed to return to the lunar surface.

The journey that returns humans to the lunar environment

Before reaching this moment, the countdown had its share of tension. In the preceding hours, the teams had to review an anomaly in a temperature sensor of an abort-system battery, which NASA attributed to an instrumentation problem and which, according to the agency, would not affect the launch. Added to this was another incident in the flight termination system, the safety mechanism that allows the rocket to be destroyed if it deviates from its trajectory and poses a threat, a problem that briefly placed the mission in “no go.” Both setbacks were resolved before takeoff and are now part of the backstory of a day that finally went ahead.

The planned flight path of Artemis II

Over the next few days, we will see a relatively short but very demanding mission.
After launch, the spacecraft will first enter a high orbit around the Earth for about 24 hours to check that all systems are working correctly before beginning the journey to the Moon. From there, the crew will perform various maneuvers, including a manual-control test and an approach to the upper stage of the SLS, to validate Orion’s behavior in real situations. The plan is to circle the Moon and return without setting foot on our satellite, on a journey of about ten days designed to rehearse each key phase of the trip.

The crew of Artemis II

Looking at the crew, what we find is a very measured mix of experience and symbolism. Reid Wiseman is the mission commander, accompanied by Victor Glover as pilot and Christina Koch and Jeremy Hansen as mission specialists: four profiles who have already experienced space first-hand. Together they have accumulated 660 days in orbit and 12 spacewalks, which fits a mission in which every decision counts. Added to that is something that also carries weight: Koch will be the first woman to travel to the Moon and Hansen the first non-American to do so, opening a new stage in who takes part in these trips.

There is a detail that touches us a little more closely and that we should not lose sight of. Part of this mission also passes through Spain, specifically through Tres Cantos, in Madrid, where Airbus Crisa has designed, manufactured and validated the Thermal Control Unit of the European Service Module integrated into Orion. This system is responsible for supplying air and water to the crew and keeping the temperature within appropriate levels for both the astronauts and the equipment. It is a discreet piece within the whole, but without it a mission like this could not be sustained safely.

In development.

Images | NASA

In Xataka | The Artemis II astronauts will carry out experiments in what will be their own study models

Until now Cantabria was only a transit area for flamingos. They have been refusing to fly south for three years

When we talk about climate change, we automatically think of melting glaciers or shifting crops, but we must also bear in mind that biodiversity maps are being redrawn in real time. That is what we are seeing in the north of Spain, where flamingos from the south have begun to consider Cantabria their new winter residence.

What the census says. To see how flamingos’ habits are changing, we have to turn to the census of wintering waterfowl in Cantabria, which indicates that there is now a stable population of about 25 flamingos that have settled in the region for three consecutive winters without migrating. These specimens, arriving from the Mediterranean, have decided that the climate and conditions of the Cantabrian coast are good enough to spare them the rest of the journey to warmer southern latitudes. And the culprit is climate change.

The problem of winter. In theory, species like this always seek places with optimal temperatures, which puts them in the north in summer and the south in winter. But this has changed radically, and these migratory birds are slowly becoming one of the best thermometers of global warming. The explanation lies in the fact that winters on the Cantabrian coast are increasingly mild, and this thermal increase removes the barrier of extreme cold that traditionally forced species such as the flamingo to flee to the south of the peninsula or to North Africa.

Adaptation of the species. With no severe frosts freezing the water and limiting access to their food, flamingos find the energy expenditure of a long migratory journey unnecessary. They simply stay put in an area that is now hospitable.

A perfect refuge. For a species to decide to stay, less cold is not enough: food and shelter are also needed. Here Cantabria has one of the richest and most important estuarine complexes in northern Spain.
The flamingos have concentrated their colony in two key points of Cantabrian geography: the Bay of Santander and the estuarine complex of the marshes of Santoña, Victoria and Joyel. The high quality of these wetlands guarantees an ecosystem rich in the small crustaceans and microorganisms these birds feed on.

Images | Jannes Jacobs

In Xataka | These birds travel more than 3,000 kilometers every year to reach Spain. The curious thing is that some arrive without toes

Three months ago Australia banned social media for those under 16 years of age. It is already investigating possible breaches

Just three months ago, Australia launched one of the most ambitious regulations yet proposed on social networks and minors. The measure came into force on December 10, 2025 with a clear message: force platforms to prevent under-16s from having accounts and give families back part of the control over young people’s digital lives. From the first moment it was presented as a pioneering initiative, but something important was also assumed from the start: applying it was not going to be easy.

The first doubts. The rule has now entered its most delicate phase: checking whether it is really being applied as planned. The eSafety regulator has opened the first formal review and put platforms such as Facebook, Instagram, Snapchat, TikTok and YouTube under scrutiny. The agency speaks of “significant concerns” and points to failures in control mechanisms. It also notes that current systems are not effectively preventing users below the threshold from continuing to open new accounts.

How minors are sneaking in. The report goes beyond a general warning and focuses on very specific failures in the control systems. It has detected that there are not enough safeguards to prevent under-age users from creating new accounts, but also something more striking: some platforms allow verification processes to be repeated until the user manages to pass them. In certain cases, profiles are even invited to demonstrate that they meet the age requirement after having indicated that they do not, which reveals inconsistencies in how the controls are applied.

A problem that was already anticipated. The difficulties in applying the rule have not arisen now; they were on the table from day one. When the law came into force, the Australian Government itself admitted that its implementation would not be perfect, and the first signs pointed in that direction.
According to ABC, some minors managed to bypass the verification systems with basic tricks, such as altering their appearance in facial checks. The outlet also warned that parents and older siblings could help some children get around the restrictions, an early sign that the challenge was not just passing the law, but making it really work.

What is at stake for the platforms. The investigation opened by eSafety is not just a diagnosis; it opens the door to possible sanctions if it is demonstrated that companies have not taken reasonable measures to prevent minors covered by the rule from having an account. Reuters notes that the fines can reach 49.5 million Australian dollars and affect the aforementioned services and platforms. The regulator has already begun collecting evidence and hopes to close at least part of its investigations by mid-year, placing technology companies in a scenario in which non-compliance is no longer just a reputational risk.

The Spanish mirror. What is happening in Australia helps put into context a debate that has also gained weight in Spain, although here it is at a different point. Pedro Sánchez announced in February that the Government wants to prohibit access to social networks for minors under 16 as part of a broader package of measures on age verification, traceability of hate speech and the responsibility of technology executives. The key difference is that that ban has not come into force and is not being enforced. Still, the Australian case offers a useful reference for anticipating the kind of challenges that may appear when such a measure moves from political announcement to actual implementation.

Images | cottonbro studio

In Xataka | “What the hell is happening with Lidl Spain?”: Germans are speechless at the chain’s comic surrealism

The biggest find in twelve years of GTA archeology came from an Edinburgh flea market and a used Xbox 360

It’s fascinating when we discover details about a game years (even decades) after its release that had never come to light before. Secret levels in classics that everyone had combed from cover to cover, unrevealed meanings, unsolved puzzles… and sometimes, versions of the games that should never have seen the light of day and that give clues about the ideas considered during development. The latest case of this kind: ‘GTA IV’.

What has happened? Last weekend, a GTAForums user known as janmatant paid £5 at a flea market in Edinburgh for an Xbox 360 in poor condition. At home he discovered that the console was running XShell, the operating system for Microsoft development kits. The 120 GB hard drive contained a single game: a beta version of ‘Grand Theft Auto IV’ dated November 2007, several months before its commercial release. The treasures he found were poured into the GTA IV Beta Hunt thread, which has been tracking unreleased content from the game since 2014 (and which has gained 14 new pages of comments since janmatant’s post).

On the trail of GTA IV. That the discovery occurred in Edinburgh is not at all coincidental. Rockstar North has been based in the Scottish capital since its days as DMA Design, in 1987, and that is how the console ended up in the hands of a scrap dealer, through a process that clearly should not have happened. Development kits are proprietary hardware that Microsoft distributes exclusively to studios (and, in those days, also to the press) to run games in conditions close to final hardware. In theory, at the end of a project cycle those units are returned or destroyed, but that was not the case here.

118 gigabytes of Liberty City. After confirming by the serial number that the devkit was authentic, janmatant uploaded the contents to the Internet Archive under the title “Great Stealing of Vehicles four XDK”. The 118 GB file is executable on a real Xbox 360 with debugging tools, although a fully playable version is not yet available.
The most immediate find was the Liberty City ferries. The boats appear in the game’s first trailer and in some cutscenes, but in the final game they are just set dressing. The realistic ‘GTA IV’ opted for a world focused on cars and taxis, and back in the day Obbe Vermeij, former technical director of Rockstar North, recounted that the ferries were removed late in development, with their models already finished.

Zombie mode. There had always been rumors about a zombie mode for which there was never solid evidence. In this build we find hospital beds with direct references to zombies, early models of infected characters and several animations associated with this variant. The Cutting Room Floor, the wiki dedicated to documenting cut content in video games, had already listed the project as “Z: Resurrection” based on code fragments found in the final version, but without visual material to support it. A former Rockstar developer has taken some of the epic out of the matter: according to him, the zombie mode was simply an “experiment” that artists and programmers toyed with in parallel, not a formal production line. That doesn’t make the discovery minor; rather, it shows that the creative leeway within Rockstar North in 2007 allowed a team to test survival-horror mechanics during development.

Other divergences. The build includes other substantial differences from the final game. The silenced pistol is in this version’s arsenal, along with other unfinished weapons and a notable number of incomplete animations and placeholder audio markers, as in any game mid-development. Some NPC models differ from the final ones, and the character of Michelle, the FIB informant who appears as Niko Bellic’s early romantic interest, has a look here that forum users describe as strangely disturbing. What may surprise any fan of the game most is that about half of the radio stations sound completely different.
‘GTA IV’ has one of the most elaborate soundtracks in the saga, with dozens of real music licenses distributed across themed stations. That half of that content changed between November 2007 and the April 2008 release says a lot about the licensing negotiations in the final phases of development.

What will Rockstar do? So far, Rockstar Games and Take-Two have not issued public statements. Although both companies have a reputation for relentlessly pursuing leaks, the author of this one purchased the console legally. In any case, he has put the devkit up for sale on eBay for £800. That is not much for material of such magnitude, but the truth is that, once on the Internet, access to these secrets is universal.

In Xataka | The best video games of 2026 and the most interesting ones to come

The US has invested 16 years and 8 billion dollars in renewing the software of its GPS network. Result: a failure of epic proportions

The Next Generation Operational Control System (OCX) project was going to modernize the United States’ constellation of more than 30 GPS satellites. RTX Corporation (formerly known as Raytheon) won the contract in 2010 with a budget of 3.7 billion dollars. The project was supposed to be completed in 2016, but in reality the US has spent 8 billion dollars, and 16 years later it has an absolute disaster on its hands. 16 years of broken promises. In 2010 the iPad had just appeared on the scene and cloud computing was still a somewhat diffuse concept. The US Government’s plan was reasonable: the OCX system would be operational by the time Lockheed Martin’s new GPS III satellites debuted. Instead, development became a chaos of bugs and requirement changes, and to this day it is unclear when, if ever, it will be completed. In Xataka | 90% of Iran’s oil industry depends on a tiny island. One that is already on the radar of the US and Israel. A fortune invested. The financial management of the project is the first big disaster. The initial budget was estimated at 1.5 billion dollars, but between the award and today that figure has risen to almost 7.7 billion in current dollars, to which another 400 million must be added to support an improved version of the satellites, the GPS IIIF. This increase is not, for the most part, because the project suddenly became much more ambitious or capable, but because of the cost of fixing everything that has gone wrong since work began. Software that costs more than the satellites. Every time the software fails an integration test, the bill runs into tens or hundreds of millions of dollars. That has made OCX one of the most expensive and least efficient software projects in recent US military history. In fact, it far exceeds the cost of the very satellites it was meant to control: the 22 GPS III satellites of the contract signed in 2018 have a budget of 7.2 billion dollars.
Satellites of the future controlled with a fairground rifle. The United States currently has a fleet of GPS III satellites in orbit capable of emitting much more powerful, interference-resistant “M-code” signals, something that, among other things, makes them especially suited to military applications. The problem is that, since the OCX software does not work, they are being managed with control systems inherited from the 1990s. It is as if we had a VHS player hooked up to watch movies on an 8K smart TV: the potential is there, but one of the components is an absolute bottleneck. The cybersecurity nightmare. One of the big problems of this project has been its cybersecurity requirements. OCX was designed to resist cyberattacks from powers such as Russia or China, but that requirement has become a spectacular technical burden. Pentagon standards have evolved so quickly that an architecture now becoming obsolete has not been able to keep up with them, and the succession of patches is locking the system into a vicious circle: the software is never finished because more and more vulnerabilities keep appearing. Failed tests. The latest report from the Government Accountability Office (GAO) has been the final straw. During testing the system once again showed instability, which has forced the final delivery to be delayed to the end of 2026 or even 2027. Frank Calvelli, of the Air Force, has expressed his dissatisfaction with private industry’s unacceptable management: the strategic advantage this project should offer at a time like this is out of reach because of its disastrous progress. It’s not that difficult.
For a long time the excuse used to justify the delays was that OCX was “the most complex software ever created for space,” but other players in the sector have shown that technical milestones of this kind are achievable. SpaceX has demonstrated it with technical “miracles” like its reusable Falcon 9 or the development of Starship, for example, so those arguments are now falling on deaf ears. Waiting for a better GPS. These problems also affect us end users, who will not be able to enjoy the L5 signals for now. This much more robust frequency will significantly improve accuracy in urban centers with many tall buildings. The irony is tragic: we cannot use an extraordinary space infrastructure because the ground stations cannot keep up with it. While we wait for the problems to be resolved, the lesson is clear: software cannot be a monster that takes 16 years to build. In Xataka | The GPS in the Baltic has been experiencing interference for months and the culprit is becoming increasingly clear: Russia. Meanwhile, as always, China. While the US crashes against its project to renew the GPS constellation, China has once again managed to become “independent” from Western technology. Its satellite navigation system, Beidou, does not replace GPS, true, but it already complements it in 140 countries. Once again, China’s long-term view has delivered its obvious result: it has taken 20 years to deploy its constellation, but it already surpasses the GPS system in metrics such as signal availability and integrated messaging services. Europe, by the way, also has its own alternative.
In Xataka | GPS “dead zones” are spreading around the world: jammers are to blame for confusing drones. The news “The US has invested 16 years and 8 billion dollars in renewing the software of its GPS network. Result: a failure of epic proportions” was originally published in Xataka by Javier Pastor.

150 years ago, Spain made a unique decision in the world. Ouigo and Iryo believe that Renfe uses it to get them out of the market

They have no rolling stock. And the worst part (for them) is that they are not going to get it. Ouigo, Iryo and a third company, a rolling stock manufacturer, have raised their voices before the National Markets and Competition Commission (CNMC) to make it clear that the current system, with two track gauges, reduces their competitiveness in Spain compared to Renfe. And it does not look like that will change in the short term. What has happened? The CNMC has published a document entitled “Report on technical barriers to the provision of railway services”. It sets out the challenges and interventions that Spain should carry out in the coming years, and it specifies that the Spanish railway system is obliged to improve interoperability with its neighboring countries, to facilitate the flow of both passengers and goods. But there is a drawback: the track gauges. And that drawback has a very significant economic impact. They complain. The document collects the positions of those involved. It states that “Ouigo, Iryo and a rolling stock manufacturer (which is not named) warn that the uncertainty regarding the schedule and details of the Gauge Migration Plan, as well as the unification of the electrification system and the implementation of the ERTMS signaling system, makes decision-making on strategic investments difficult, and they ask that the Gauge Migration Plan be prepared and published as soon as possible.” In short: the two operators and the rolling stock manufacturer complain that Adif has no clear plan for whether the Iberian high-speed gauge will be adapted to European standards, which use standard gauge. The same goes for the unification of the electrification system and the definitive rollout of ERTMS. And they defend themselves. The position of Adif and Renfe is set out in the same document.
Both companies “point out that incorporating gauge-change technology into the rolling stock and infrastructure is less expensive and entails fewer traffic interruptions than migrating the infrastructure. On the other hand, both the AESF and the DG of the Railway Sector indicate that, in addition to Talgo, there is a second manufacturer of variable-gauge high-speed rolling stock, CAF, although they admit that it is currently only approved to operate at 250 km/h.” In short: neither Renfe nor Adif believes that migrating to standard gauge would pay off, given its high economic impact. The bottleneck. What Ouigo and Iryo argue is that the current situation, and the commitment to trains with gauge-changing technology, leaves them behind. They have two reasons for saying so. CAF can supply trains with this technology, but they are only approved to travel at a maximum of 250 km/h. Talgo is the only company whose variable-gauge trains are approved to run at up to 350 km/h. They are known as the Talgo AVRIL, but their production is committed to Renfe, and their results have not been satisfactory either. Beyond these two manufacturers, no one seems willing to get involved in producing trains capable of switching between standard and Iberian gauge: building them means serving a demand that remains a niche, a rarity in the world railway system. Very juicy. The reluctance of Adif and Renfe is not strange either. For Adif, migration would mean a huge investment that it would have to recoup with the rest of the operators, when the vast majority of current corridors in Spain already operate on standard gauge. For its part, Renfe does not want to give up this card either. Right now, high-speed service to Galicia needs trains capable of moving between Iberian and standard gauge if passengers are not to change trains, and the Spanish company is the only one that has such trains.
The Galician corridor has also emerged as one of the most profitable. Travel has grown so much that airlines have retreated, and now that the line has to be liberalized, maintaining the current situation guarantees that Renfe will remain the only operator able to offer this trip without transfers, a clear competitive advantage. Photo | Falk2 In Xataka | “Whoever wants to come, should invest”: Ouigo wanted to enter the Madrid-Galicia AVE but now sees it as impossible before 2030
