When Albacete set the record for a provincial capital at -24 ºC

Europe has started 2026 with cold. Very cold. But even the icy winds that have hit part of the continent and the Iberian Peninsula in recent days, sinking thermometers below -15 ºC, pale in comparison with what Castilla-La Mancha experienced in the first days of 1971. That year left a meteorological curiosity in Albacete, a record that has remained unbeaten ever since in Aemet's historical records: the coldest temperature ever registered in a provincial capital, no less than -24 degrees. The most curious thing is that not even that value (more typical of Siberian latitudes) marks the cold record registered by the agency in Spain.

"Extreme values". Aemet does not only help us know the weather of the near future, to find out whether this week will bring rain or sun, whether we should dust off the winter scarves and gloves or can give the anoraks a break. The agency also lets us know what the weather was like in our cities 10, 30 or 50 years ago, and even further back, almost a century, something made possible by its series of "absolute extreme values". The service (available online) details the record measurements associated with each weather station since 1920. What does that mean? That we can look up the record values of rain, temperature, snowfall or wind gusts captured by each of the stations managed by Aemet in Spain's 50 provinces, as well as in the cities of Ceuta and Melilla.

The data must be handled with some caution (especially in comparisons) because they are subject to significant limitations. Aemet does not clarify, for example, whether all the sensors have been operational for the same amount of time, or how long each one has been working. Another key point is that within the same region (or even the same locality) there can be significant differences in temperature or rainfall. It all depends on where the sensor is installed.
A station located in a port area may collect very different values from another located in the same municipality but in the heart of the urban center or in a higher area, such as an airport. In fact, it is not unusual for Aemet to have sensors collecting data near airport terminals.

| Provincial capital | Lowest temperature | Date |
| --- | --- | --- |
| Madrid | -15.2 ºC | 01/16/1945 |
| Barcelona | -10 ºC | 02/11/1956 |
| Valencia | -7.2 ºC | 02/11/1956 |
| Zaragoza | -11.4 ºC | 02/05/1963 |
| Seville | -5.5 ºC | 02/12/1956 |
| Málaga | -3.8 ºC | 02/04/1954 |
| Murcia | -7.5 ºC | 01/16/1985 |
| Palma de Mallorca | -10 ºC | 02/12/1956 |
| Las Palmas de Gran Canaria | 6.5 ºC | 03/27/1954 |
| Alicante | -4.6 ºC | 02/12/1956 |
| Bilbao | -8.6 ºC | 02/03/1963 |
| Córdoba | -8.2 ºC | 01/28/2005 |
| Valladolid | -18.8 ºC | 01/03/1971 |
| Vitoria | -21 ºC | 12/25/1962 |
| A Coruña | -4.8 ºC | 01/07/1985 |
| Granada | -14.2 ºC | 01/16/1987 |
| Oviedo | -6 ºC | 01/07/1985 |
| Santa Cruz de Tenerife | 8.1 ºC | 02/22/1926 |
| Pamplona | -16.2 ºC | 01/12/1985 |
| Almería | 0.1 ºC | 01/27/2005 |
| San Sebastián | -12.1 ºC | 02/03/1956 |
| Burgos | -22 ºC | 01/03/1971 |
| Albacete | -24 ºC | 01/03/1971 |
| Santander | -5.4 ºC | 01/21/1957 |
| Castellón de la Plana | -7.3 ºC | 02/11/1956 |
| Logroño | -11.6 ºC | 12/25/1962 |
| Badajoz | -7.2 ºC | 01/28/2005 |
| Salamanca | -20 ºC | 02/05/1963 |
| Huelva | -5.8 ºC | 02/17/1938 |
| Lleida | -15.4 ºC | 01/02/1971 |
| Tarragona (Reus Airport) | -8 ºC | 02/11/1983 |
| León | -17.4 ºC | 01/13/1945 |
| Jaén | -8 ºC | 02/11/1956 |
| Cádiz | -1 ºC | 02/11/1956 |
| Ourense | -8.6 ºC | 12/25/2001 |
| Girona | -13 ºC | 01/09/1985 |
| Lugo | -10 ºC | 12/23/2005 |
| Cáceres | -5.8 ºC | 02/11/1956 |
| Guadalajara | -11 ºC | 01/28/1952 |
| Melilla | 0.4 ºC | 01/27/2005 |
| Toledo | -14.4 ºC | 01/18/1945 |
| Ceuta | -0.4 ºC | 01/05/1941 |
| Pontevedra | -5.5 ºC | 12/10/1922 |
| Palencia | -14.8 ºC | 01/04/1971 |
| Ciudad Real | -13.8 ºC | 01/03/1971 |
| Zamora | -13.4 ºC | 01/03/1972 |
| Ávila | -16 ºC | 01/15/1985 |
| Huesca | -13.2 ºC | 02/12/1956 |
| Cuenca | -17.8 ºC | 01/03/1971 |
| Segovia | -17 ºC | 01/06/1938 |
| Soria | -15 ºC | 12/17/1963 |
| Teruel | -21 ºC | 01/12/2021 |

One figure: -24 ºC.
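The record lows above are easy to query programmatically. A minimal sketch in plain Python, using a handful of entries copied from the list (the dict and variable names are my own, for illustration only):

```python
# A few entries from the record-low data above: city -> (ºC, date).
records = {
    "Albacete": (-24.0, "01/03/1971"),
    "Burgos": (-22.0, "01/03/1971"),
    "Vitoria": (-21.0, "12/25/1962"),
    "Teruel": (-21.0, "01/12/2021"),
    "Madrid": (-15.2, "01/16/1945"),
}

# Find the coldest record among the listed capitals.
coldest = min(records, key=lambda city: records[city][0])
print(coldest, records[coldest])  # Albacete (-24.0, '01/03/1971')
```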
Taking into account the above, Aemet's historical record leaves a curious fact, one that Vicente Aupí, science popularizer and astrophotographer, recently recalled on X: on January 3, 1971, thermometers in Albacete dropped to no less than -24 ºC. The reading was taken at the air base and is interesting for several reasons: not only is it the lowest value recorded in the city since records began, it is also the coldest confirmed in a provincial capital.

Freezer records. The next lowest value among the provincial capitals was experienced by Burgos that same day (January 3, 1971), when the mercury dropped to -22 ºC. Vitoria and Teruel follow in the ranking: the first recorded -21 ºC on Christmas Day 1962, the second endured the same temperature on January 12, 2021. These are surprisingly low figures, although in recent decades Aemet has reported a few readings below -15 ºC.

Meteorological bulletin of January 3, 1971, when the station located in Albacete recorded a minimum of -24 ºC, a record value among the provincial capitals of Spain. Extract from the meteorological bulletin of December 17, 1963, when a minimum of -30 ºC was recorded at the Calamocha-VOR observatory station, in the province of Teruel.

A day to stay at home. The most striking thing about January 3, 1971 is that the thermometer did not collapse only in Albacete. Another interesting resource that Aemet offers is the archive of the 'Meteorological Bulletin', a publication edited by the agency's predecessors between March 1893 and well into the 21st century. On its website today we can consult practically all of its digitized issues from 1894 to 2007. Among them is the issue of that Sunday, January 3, 1971. And what does it tell us? That day the people of Albacete were not the only ones facing a wave of polar cold. Although the city took the cake, Burgos registered -22 ºC, Valladolid, Teruel and Daroca -19 ºC, and Cuenca and La Molina -18 ºC.
Some of these values were also obtained at aerodromes, just as in Albacete, where that day the thermometers did not climb above -6 ºC. In none of the provincial capitals of Spain did the mercury rise above 10 ºC that day, the maximum recorded in Almería, Cádiz and Castellón.

The coldest day? Yes. And no. The figure for Albacete is a record among provincial capitals, but Spain has endured even colder days. At the end of 1963, the residents of a small town in Teruel saw the mercury drop to -30 ºC. That is the surprising minimum temperature recorded on December 17 of that year …

In 1845, John Franklin’s expedition set sail in search of the Northwest Passage. 180 years later, its loss remains a mystery

On the morning of May 19, 1845, Captain John Franklin and his expedition weighed anchor from Greenhithe harbor, almost at the mouth of the Thames. They were looking for the Northwest Passage, the (at that time theoretical) maritime route that would link the Atlantic and the Pacific through northern Canada. They never came home: 129 men who never returned and who, for 170 years, have been one of the great open questions of scientific and naval exploration. We now know why the men of John Franklin’s lost expedition died.

There are those who insinuate that the trip went wrong from the very beginning. That Franklin should never have led it in the first place.

John Franklin. The first choice was William Edward Parry, one of the great English explorers, but he had already traveled to the Arctic five times and “was tired”, so he declined the offer. Next they thought of James Clark Ross. Ross had just returned from Antarctica, where he had explored the Ross Sea and Ross Island. In fact, the ships on that expedition were the same ones that would be used on this mission (two of Ross Island’s volcanoes are called Erebus and Terror in honor of the ships). But upon returning to England he became engaged to his future wife and decided that great explorations were no longer for him. He was followed by James Fitzjames (discarded due to inexperience), George Back (considered too controversial) and Francis Crozier (who, well, was Irish, and that was reason enough to rule him out). Seeing how things stood, John Barrow, second secretary of the Admiralty, called John Franklin. To this day no one knows why Franklin, who was already a legend at the time and almost 60 years old, said yes. But the fact is that, as noted, they left the vicinity of London that day in 1845. They stopped in Orkney, and the convoy formed by the two main ships (HMS Erebus and HMS Terror), the HMS Rattler (the first English warship with steam propulsion) and a transport headed for Greenland.
There they slaughtered ten oxen and the expedition began its solo journey.

The search for the Northwest Passage. The Travels of Marco Polo is a peculiar book. Not only does it remain a very interesting precedent for modern anthropology, it also served as an inspiration for many during the era of great exploration. The image you can see above is precisely the annotated copy of ‘The Travels’ that Christopher Columbus owned. In one of its versions, the Italian edition of 1559, a Chinese province called Anian appears. We assume that it was from there that the geographers and explorers who argued over whether America was a new continent or, on the contrary, an Asian peninsula took the name of the Strait of Anian, the separation between Asia and America that would give access to the Northwest Passage. It is what we know today as the Bering Strait, and for years it was pure mythology. But, first, Ferdinand Magellan and his crew rounded Cape Espíritu Santo and found themselves face to face with the southwest passage; and, second, a Dane in the service of Russia, Vitus Bering, rediscovered for the West the strait through which Semyon Dezhniov had already sailed sixty years before. The rest was geopolitics: a quick passage to the Pacific without having to pass near the Spanish territories in America was too juicy. In 1745, an English law promised 20,000 pounds to whoever discovered the passage, and the boom began. I have tried to convert the amount into a current currency and have not been able to do it accurately, but I have drawn one conclusion: it was a lot of money.

Favorable weather. In early August 1845, two whalers, the Prince of Wales and the Enterprise, encountered Franklin’s ships in Baffin Bay. They were waiting for favorable weather to enter Lancaster Sound. That was the last time they were seen. Two years passed.
And, little by little, Lady Jane Franklin, some members of Parliament and the fledgling British press began to ask the Admiralty to send someone to search for the heroes of Franklin’s expedition. The Government sent three expeditions: one by land and two by sea, one through the Atlantic and another through the Pacific. They failed. Fearing that the men would be forgotten, Lady Jane Franklin composed her lament, the song you can hear just above. And, although I don’t know if it was for that reason, the truth is that they were not forgotten. In fact, the search for the lost expedition “became nothing less than a crusade”. In 1850 alone, eleven British and two American ships tried to locate them. It was then that the first graves were found.

Over the years, the different expeditions found fragments, Inuit accounts and objects from the expedition. In 1855, following the indications of some Inuit tribes, pieces of wood were found bearing the name of the Erebus. In 1859 two messages were found. The first, dated May 28, 1847, was from Franklin himself and read “Sir John Franklin, Commander of the Expedition: All Well”. It is the document on the right. It was a common practice at the time: documents were left in different places so that, in case of problems, the details of the trip could be reconstructed. But in this case something curious happened: in the margins there was another message, dated April 25, 1848, explaining that the ships had been trapped in the ice. Franklin and twenty-three other crew members were dead, and the rest, the survivors, had abandoned the ships in search of an exit to the south. In the following years a few objects, a few rumors and a few graves appeared. Nothing else. The ships never appeared and we never, in 150 years, discovered what had really happened to Captain John Franklin’s lost expedition.

One hundred and fifty years without news. In the 1980s, the University of Alberta launched a project to track the expedition.
The different possible routes were traveled …

Setting up a smart home is a nightmare. Huawei’s solution is to set it up for you

The promise of the smart home where everything works automatically, without a hitch, sounded great, but the reality is that it is still a real chaos of incompatibilities and highly annoying bugs. Even if all our devices are from the same brand, there is still the work of assembling them, hiding cables… Huawei has a solution, although it doesn’t exactly come cheap.

The complete pack. Panda Daily reports that Huawei has launched a range of smart-home solutions sold as packages with different devices at various prices. The packages are designed both for installation in newly built homes and in existing ones. With these options, Huawei seeks to offer a comprehensive solution under the umbrella of its HarmonyOS system. In total there are six packs, three for new construction and three for existing homes. The cheapest is the ‘starter pack’ for already built houses: it costs the equivalent of about 1,200 euros and includes the control hub and some essential functions such as lighting, air conditioning and curtain control. The most expensive packages are those installed in newly built homes. The most basic costs the equivalent of more than 3,500 euros and includes WiFi 7 connectivity throughout the house, control of lights, curtains and air conditioning, a smoke sensor for the kitchen and a smart lock. The premium package goes up to almost 12,000 euros and adds features such as AI cameras, ambient lighting strips and speakers throughout the house. All packs include installation, and Huawei commits to completing it in just 24 hours in the case of existing homes. The announcement applies only to China, where Huawei had already launched similar solutions in the past.

The chaos of home automation. In Spain there are solutions offered by installation companies, but we do not find similar proposals from brands with smart-home devices such as Samsung or Xiaomi. Typically, we users buy the devices and install them at home ourselves.
Mounting cameras and lights is quite simple, but if we want deeper automation, for example controlling blinds or awnings, things get complicated and we often have to call an installer. Then there is the issue of compatibility. In my house I have two cameras, several lights, a robot vacuum cleaner and an automatic cat feeder. It’s not much; the problem is that each device works with a different app and, although I can bring everything together in Google Home, the reality is that there are devices it does not recognize, others that lose their configuration if the WiFi goes down, and in general it is quite cumbersome. Standards like Matter promised to unify this chaos, but to this day it still hasn’t taken off. This year XDA Developers analyzed the topic, criticizing that there are still many devices that do not support it, and that those that do sometimes lose functions compared to native integrations, as happens with Philips Hue. Returning to Huawei’s proposal, I don’t think the solution should be to buy a package worth several thousand euros and tie ourselves to one brand forever. However, the fact that it sounds like a much more convenient option than the alternatives says a lot about the state of the connected-home landscape.

Image | Huawei

In Xataka | Home automation and leaving for a month: Ana Boria has put all her efforts to the test just before the expected trip

In 1969, humans set foot on the Moon for the first time. They did it thanks to a computer less powerful than your cell phone

The Moon landing was one of the most notable scientific and technological milestones of the 20th century, and something that stayed with those who lived through it (and with those who did not) thanks to the images and audio. It happened more than 40 years ago, when many technological revolutions, such as personal computers or mobile phones, were still to come. What technologies made it possible for humans to reach the Moon? The question is fascinating in itself, and even more so once you know the details of the computers, cameras and other devices used in the mission, given their characteristics. What technology made it possible for three human beings to reach the Moon, walk on it, and tell us about it as it happened? We travel in time and space to review it.

Like matryoshkas. The Apollo 11 mission was the eleventh of a NASA program that comprised a total of 22 missions (19 of them successful), running from the 1960s until 1972. The launches before mission 7 were unmanned, and mission 8 was the first to orbit the Moon, but all of them used a Saturn launch vehicle. The one for Apollo 11 was the Saturn V, a rocket 110.64 meters tall and weighing 2,700 tons with full fuel tanks (the largest NASA has ever built). Depending on the stage (there were three: S-IC, S-II and S-IVB) the number of engines varied, as did the fuel, which was a mixture of oxygen with kerosene or liquid hydrogen. But the Saturn V was not what reached the Moon; rather, it was what got out into space and sent the modules toward it. These modules were the command and service module (CM) and the lunar module (LEM). The CM contained the propulsion-system engine responsible for entering and leaving lunar orbit and had room for three astronauts, while the LEM was the first craft designed to fly in a vacuum, with no aerodynamic capability.
(NASA)

The LEM separated from the CM upon entering lunar orbit and descended to the surface. It was designed to land only on the Moon, since its legs were so weak that they would not have supported the LEM’s weight under Earth’s gravity (9.8 m/s² versus 1.6 m/s² on the Moon). There was room here for only two astronauts. The speeds reached (increasing on entering the Moon’s gravitational field) ranged from 3,700 kilometers per hour up to 9,000 km/h due to lunar gravity. And here a question arises: how do you brake at those speeds? To enter lunar orbit, hypergolic braking was used (with hydrazine, dimethylhydrazine and nitrogen tetroxide, hypergolic compounds that ignite spontaneously on contact, without an ignition source) along with engine shutdown.

The computers of the Apollo 11 mission. To review the computing involved in the Apollo 11 mission, we must consider both ends, transmission and reception: what was on the ground and what the spacecraft carried. It is also worth remembering that at the time a computer was far from being something domestic or common, let alone something that fit on a desk. On Earth, at the Goddard Space Flight Center and the Manned Spacecraft Center in Houston, they worked with the IBM System/360 Model 75 mainframe, which (along with the Models 44, 91, 95 and 195) was implemented with hardwired logic instead of the microcode used in all other IBM S/360 models. For curious techies, there is a configuration diagram and explanation of the equipment. Aboard the spacecraft, however, was the Apollo Guidance Computer (AGC), manufactured by Raytheon and designed by the MIT Instrumentation Laboratory. This machine stood out as one of the first to use integrated circuits. There was one in the LEM and another in the CM.
The specifications of these machines are surprising not because the numbers are small compared to today’s, but because, even making the effort to place our minds in the 1960s, it is impressive that equipment like this managed to pull off something as complex as a round trip to the Moon. The AGC had a fixed storage of 36,864 15-bit words and a RAM of 2,048 words.

(NASA)

Comparing it with later equipment: the two AGCs together had roughly the same memory as a Commodore 64 (from 1982), but the AGC was about eight times less powerful than an IBM XT (4.77 MHz compared to 0.043 MHz for the AGC). In fact, a computer with half a GB of RAM has about 100,000 times more memory than the AGC. But computers do not live on hardware alone, and the software here carries considerable weight. Three hundred people participated in its creation over seven years, at an approximate cost of 46 million dollars (at the time). Among them was Allan Klumpp, a mechanical engineer at MIT whose proposal for landing on the Moon laid out all the calculations as well as diagrams and drawings of the instrument-panel situation. The program was called LUMINARY and was written in the MAC programming language (MIT Algebraic Compiler), but there were no terminals or compilation programs: everything was done with punched cards prepared on a kind of typewriter (and if a hole was punched wrong, a new card had to be made). On the occasion of the 40th anniversary of the famous achievement, the code of both modules was transcribed, and in it we read Klumpp saying that it was never free of bugs. What is notable here is the multitasking: the fact that the software supported it at all was already an achievement, and it was not easy to pull off.
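The “100,000 times more memory” comparison above can be checked with back-of-the-envelope arithmetic. A minimal sketch, assuming each AGC word is counted as 2 bytes (the word and RAM figures are the ones cited above):

```python
# AGC erasable memory (RAM): 2,048 words, counted here as 2 bytes each.
agc_ram_bytes = 2_048 * 2          # ~4 KB of erasable memory
half_gb_bytes = 512 * 1024**2      # half a GB, in bytes

ratio = half_gb_bytes // agc_ram_bytes
print(ratio)  # 131072 -> on the order of 100,000 times more memory
```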
In fact, there were alarms caused by the heavy demand on the computer at the moment of the Moon landing, which resulted in slow responses and not all the calculations being completed, so there was one minute of the eleven that the …

30 years ago a young Chinese man set up an ice cream stand. Now he leads an empire with more stores than McDonald’s

It’s hard to believe in a world dominated by big brands and multinationals, but there is a hospitality chain with more stores than McDonald’s and Starbucks that you’ve probably never heard of. Its name is Mixue (Mìxuě Bīngchéng); it was founded in the late 90s by a university student from Zhengzhou, China, and today it is considered the largest food and beverage chain in the world. It is recognized as such, for example, by TIME magazine, which included it in its list of the 100 most influential companies of 2025. It is estimated to have more than 46,000 stores spread across Asia, Australia, the Middle East and South America, a vast network offering a menu based on ice creams, smoothies, coffees, traditional teas and bubble teas.

Bigger than McDonald’s? Yes, if we are talking about the number of establishments. Profits are another matter. While McDonald’s boasts more than 43,000 restaurants spread across more than a hundred countries, and Starbucks managed 40,576 stores at the end of the first quarter of fiscal year 2025, Mixue surpasses both figures, and quite comfortably. A few months ago TIME magazine stated that the chain has more than 45,000 stores, mainly across mainland China, although it also operates in other regions.

Does it have that many stores? Yes. Fortune calculates that it exceeds 46,000 points of sale throughout Asia, Australia, the Middle East and South America. Other sources speak of even more stores, putting the total network at 53,000 points of sale. Beyond these dancing numbers, one thing is clear: Mixue is normally considered the food and beverage chain with the largest store footprint in the world. In addition, its branch network continues to expand at a good pace. If in the West its brand is less familiar than McDonald’s or Starbucks, it is because (despite the international leap it has made in recent years) most Mixue stores remain concentrated in China.
The firm also has another peculiarity that helps explain its global expansion: while in the case of Starbucks more than 50% of the stores are in the hands of the company itself, in Mixue practically all of them operate as franchises.

What is its story? Mixue’s is the typical story of self-improvement and accelerated growth that gives shine to business-coaching classes. The father of the company is Zhang Hongchao, who laid its foundations almost 30 years ago, starting from scratch. His story begins in 1997 in Zhengzhou, when Zhang, then a university student, got his grandmother to lend him 3,000 yuan ($420) to set up a small slushie and soft-drink stand. Despite the challenges he encountered along the way (and the occasional business failure), Zhang pushed forward, adapted to the changes in Zhengzhou, reinvested in machinery and found the key to creating a million-dollar business. Sam Tang recounts that his first success came in 2006, when he launched ice creams for one yuan. By 2014, his brand already had 1,000 stores. By 2020 there were 10,000.

And how has it succeeded? The big question. Mixue’s business model has several clear characteristics. The first is its commercial focus: the chain basically sells soft-serve ice cream, smoothies, tea drinks and bubble teas, although its menu also includes coffee, and Fortune reports that in the future it plans to expand its offering with beer. The other great feature of its menu is its affordable prices, with ice creams for less than one euro. Other peculiarities of the company are its commitment to controlling the supply chain, a clearly identifiable brand thanks to symbols such as its mascot (Snow King) and, above all, expansion through franchises. In a report from a few months ago, the company itself acknowledged that almost all of its stores (99%) are opened and operated as franchises.
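A quick sketch of what that growth implies, using only the two store counts mentioned above (1,000 stores in 2014, 10,000 in 2020) to derive a compound annual growth rate:

```python
# Compound annual growth rate (CAGR) from the two data points above.
stores_2014, stores_2020 = 1_000, 10_000
years = 2020 - 2014

cagr = (stores_2020 / stores_2014) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~46.8% per year, sustained over six years
```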
Mixue is responsible for supervising the businesses, choosing locations, handling decoration and assessing staff capabilities. For the company, the business lies not so much in the fees those stores then pay as in the equipment, merchandise and packaging it sells to them.

And the future? It doesn’t look bad. In spring the company went public in Hong Kong and managed to raise nearly 450 million dollars, starring in one of the exchange’s best debuts of the first half of 2025. The company also seems willing to enter the powerful (and contested) US market. According to Fortune, during the first half of the year the company reached a revenue volume of 2 billion dollars (40% more than in 2024) with profits of 370 million. Despite its humble origins, its founder and his brother now manage a fortune of billions of dollars.

Images | Choo Yut Shing (Flickr) 1 and 2 and Jeremy Thompson (Flickr)

In Xataka | One of the biggest wine critics is French and has toured China. There is no good news for French wine

We have been talking theoretically about data centers in space for months. A company already has a plan to set one up in 2027

The Californian startup Aetherflux has announced that it will launch its first data-center satellite in the first quarter of 2027. It is the initial node of a constellation the company has named “Galactic Brain”, designed to offer in-orbit computing capacity powered by continuous solar energy.

The underlying promise. Aetherflux presents an alternative to the years of construction that terrestrial data centers require. According to Baiju Bhatt, company founder and co-founder of the financial firm Robinhood, “the race toward artificial general intelligence is fundamentally a race for computing power and, by extension, energy.” The company’s bet is to place sunlight next to silicon and bypass the electrical grid entirely.

How the project works. The Galactic Brain satellites will operate in low Earth orbit, taking advantage of solar radiation 24 hours a day, something impossible on the ground. Advanced thermal systems would eliminate the limitations faced by terrestrial data centers, which require large amounts of water and electricity for cooling. In addition, the constellation fits within Aetherflux’s original plans: transmitting energy from space to Earth using infrared lasers.

The competition is already underway. Aetherflux is not alone in this bet. In November, Google presented its Suncatcher project, a plan to launch AI chips into space on solar-powered satellites. Jeff Bezos has also expressed optimism about large data centers operating in space within the next decade or two, a goal Blue Origin has been working on for more than a year. SpaceX is likewise working on using Starlink satellites for AI computing loads, something Musk himself has written about.

The real obstacles. Although launch costs have decreased considerably, they remain prohibitive. According to recent estimates, launching a kilogram with SpaceX’s Falcon Heavy costs around $1,400.
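To get a feel for that figure, a minimal cost sketch. The per-kilogram rate is the estimate cited above; the satellite mass is a hypothetical number chosen purely for illustration:

```python
# Launch-cost arithmetic at the cited ~$1,400/kg Falcon Heavy estimate.
cost_per_kg = 1_400        # dollars per kilogram (estimate cited above)
sat_mass_kg = 5_000        # hypothetical data-center satellite mass

launch_cost = sat_mass_kg * cost_per_kg
print(f"${launch_cost:,}")  # $7,000,000 for the launch alone
```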
Google calculates that if these costs drop to about $200 per kilogram by 2030, as projected, the expense of building and operating space data centers would be comparable to that of terrestrial facilities. In addition, the chips will have to withstand more intense radiation and avoid collisions in an increasingly congested orbit.

The urgency. Big tech is colliding with physical limits on Earth. Since 2023, dozens of data-center projects have been blocked or delayed in the United States due to local opposition over electricity consumption, water use and associated pollution. According to the consulting firm CBRE, limitations in electricity generation have become the main inhibitor of data-center growth around the world.

The Aetherflux calendar. The company, founded in 2024 and having raised $60 million in financing, plans first to demonstrate the feasibility of transmitting energy from space with a satellite it will launch in 2026. If all goes according to plan, the first Galactic Brain node will arrive in 2027. The company anticipates launching about 30 satellites at a time on a SpaceX Falcon 9 or equivalent, although if Starship becomes an option, it could orbit more than 100 data-center satellites in a single launch.

The long-term strategy. Aetherflux hasn’t revealed pricing yet, but it promises multi-gigabit bandwidth with near-constant uptime. Its approach is to continually launch new hardware and quickly integrate the latest architectures. Older systems would run lower-priority tasks until the high-end GPUs reached the end of their life, which under high utilization and radiation might not be more than a few years.

Cover image | İsmail Enes Ayhan and NASA

In Xataka | OpenAI launches GPT-5.2 weeks after GPT-5.1: a maneuver that aims to gain ground on Google’s Gemini 3

An “invisible” Russian submarine has set off alarms in the Arctic. Europe’s response: Atlantic Bastion

The launch of the Khabarovsk, Russia’s new, ultra-quiet submarine capable of deploying Poseidon nuclear torpedoes, has reactivated a fear that had lain latent for decades in cities like London: the possibility that the naval balance of the Atlantic is once again tilting in Moscow’s favor. The United Kingdom’s response has been forceful, and it is called Atlantic Bastion.

Submarine warfare. Although the public image of the Russian threat usually revolves around research vessels like the Yantar, suspected of mapping and potentially tampering with underwater cables and pipelines, European specialists know that what is truly disturbing lies much further down. Russia has spent decades reducing the acoustic signature of its submarines to levels that border on invisibility, combining new propulsion systems, composite coatings and virtually undetectable cooling pumps. In this environment, where silence is power, a ghost submarine with nuclear capability alters not only the sea lanes but the very heart of the strategic infrastructure that connects Europe with the world.

The UK reinvents itself. Faced with the resurgent threat of the Khabarovsk, the Royal Navy has launched what it calls Atlantic Bastion, a plan designed to restore British strategic advantage in its own and allied waters. Its origin is not new, and we have covered it before: the United Kingdom has been monitoring the Greenland-Iceland-United Kingdom gap (GIUK gap) since before the creation of NATO, and the Second World War had already demonstrated that controlling that maritime corridor was essential to prevent enemy forces from slipping into the North Atlantic. But what used to be destroyers and acoustic sweeps is becoming a hybrid framework that combines Type 26 frigates equipped with new-generation sonar, P-8 Poseidon aircraft capable of patrolling thousands of kilometers and, above all, swarms of underwater drones equipped with artificial intelligence.
According to the Ministry of Defence, this architecture aims to detect, classify and track any enemy submarine that tries to penetrate British or Irish waters, and to do so constantly, autonomously and with unprecedented range.

The algorithms arrive. The core of the project will be Atlantic Net, a distributed network of autonomous underwater gliders fitted with acoustic sensors and guided by artificial intelligence systems capable of recognizing sound signatures with a level of precision that until a few years ago was little short of science fiction. Unlike the Cold War-era SOSUS, based on gigantic fixed hydrophones placed on the seabed, the new generation will be mobile, expandable and adaptable to the routes and behaviors of increasingly soundproofed submarines. The ultimate ambition is to deploy hundreds of cheap, persistent units that together create a surveillance mesh far harder to evade. The metaphor is revealing: if finding a silent submarine is like searching for a needle in an oceanic haystack, modern technology makes it possible to multiply the number of searching hands exponentially.

The technological challenge of hunting shadows. However, even with this technological revolution, experts warn that detecting new Russian submarines such as the Khabarovsk will remain an extremely complex undertaking. Since the 1980s, Moscow has drastically reduced the acoustic emissions of its fleet, which forces hunters to combine passive and active sensors in complex configurations such as bistatic sonar, where one vessel emits a pulse and another collects the echo. These techniques require coordination, multiple platforms and significant sensor density, something Atlantic Bastion aims to provide but is still far from deploying at full scale.
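The article gives no technical detail on how Atlantic Net's classifiers would actually work, but the basic idea behind passive acoustic signature recognition (reducing a recorded sound to a compact spectral fingerprint and comparing it against known contacts) can be illustrated with a toy sketch. Everything below is hypothetical and greatly simplified; real systems use far richer features and trained models, not a plain cosine match.

```python
import numpy as np

def spectral_signature(samples: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Collapse an audio clip into a coarse, unit-norm band-energy 'signature'."""
    spectrum = np.abs(np.fft.rfft(samples))           # magnitude spectrum
    bands = np.array_split(spectrum, n_bands)         # group bins into coarse bands
    energy = np.array([band.mean() for band in bands])
    return energy / (np.linalg.norm(energy) + 1e-12)  # normalise for cosine matching

def match_score(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Cosine similarity between two signatures: near 1.0 means the same spectral shape."""
    return float(np.dot(sig_a, sig_b))

# Synthetic demo: a 50 Hz tonal (machinery-like hum) vs. broadband sea noise
rate = 1_000                                   # samples per second (illustrative)
t = np.arange(rate) / rate                     # one second of "audio"
rng = np.random.default_rng(0)
tonal = np.sin(2 * np.pi * 50 * t)             # the contact we want to recognise
noisy_tonal = tonal + 0.1 * rng.standard_normal(rate)  # same contact, heard again
sea_noise = rng.standard_normal(rate)          # unrelated broadband noise

ref = spectral_signature(tonal)
score_same = match_score(ref, spectral_signature(noisy_tonal))  # high: same tonal
score_diff = match_score(ref, spectral_signature(sea_noise))    # low: no match
print(f"same contact: {score_same:.2f}  |  noise: {score_diff:.2f}")
```

The reason quieting works so well against this kind of matching is visible even in the toy: as the tonal component shrinks toward the noise floor, the signature flattens out and the two scores converge, which is exactly why the article stresses sensor density and active techniques like bistatic sonar.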
The arrival of the Type 26 frigates, designed to be the flagship of British anti-submarine warfare, is fundamental to this purpose, as is cooperation with Norway and other allies that are also strengthening their capabilities in the North Atlantic.

The Russian bastion puzzle. Even if Atlantic Bastion managed to limit the presence of Russian attack submarines in the Atlantic, there is one dimension no Western system can solve: Russian strategic submarines no longer need to leave their own bastion in the Arctic to threaten Europe or the United States. Their intercontinental ballistic missiles can hit targets thousands of kilometers away without moving from the Barents Sea or the White Sea, protected by layers of defenses and favorable geography. There they play a lethal game of hide-and-seek that the West cannot join without significantly escalating the conflict. The paradox is clear: the United Kingdom can reinforce its waters and monitor every meter of the GIUK gap, but it cannot negate the Russian nuclear capacity deployed in its natural refuge, a reality that frames the entire British effort within a logic of containment rather than domination.

Underwater chess. In a sense, Atlantic Bastion ultimately represents the recognition that underwater competition has returned with a vengeance, now fueled by digital capabilities, distributed sensors and autonomous platforms that transform the nature of ocean surveillance. The North Atlantic once again becomes a stage for silent maneuvers where Russia and the United Kingdom measure their technological strength in an environment reminiscent of the Cold War, but with algorithms and autonomy as the new weapons. It is a race decided not by great battles but by the ability to listen better, process faster and anticipate invisible movements. In this theater of shadows, the advantage goes not to whoever shoots the most but to whoever detects first (as already happens in Ukraine).
Thus, Atlantic Bastion aspires to return that capacity to the British, although the contest now opening does not look like it will be brief or simple: in the depths of the Atlantic, the prelude to the next era of strategic rivalry between Russia and the West is underway.

Image | SEVMASH/VKONTAKTE

In Xataka | A Russian submarine has appeared off the coast of France. And Europe's reaction has been surprising: have a laugh

In Xataka | Russia's most advanced nuclear submarine was a secret. Until Ukraine revealed everything, including its failures

Something big is coming for European money. The ECB has set a date for a key step towards the digital euro

The European Central Bank has made a move in one of the most sensitive projects of its recent history. After two years of preparation, the institution has decided to move on to the next phase of the digital euro, the initiative with which it seeks to adapt public money to the era of electronic payments. It is not a launch, nor a final decision: if the European regulation is approved in 2026, there will be a pilot starting in 2027, and the Eurosystem wants to be ready for a possible first issuance in 2029.

The decision comes after a preparation stage that started in November 2023, in which the ECB and the national central banks defined the technical and operational pillars of the project. Over these two years, progress was made on the draft operating rulebook, the selection of technology suppliers and tests with market participants. Political momentum has also been key: at the October 2025 summit, euro-area leaders called for accelerating the work to ensure that Europe retains its own capacity in digital payments.

A pilot to get off paper. The announced step opens a phase aimed at validating that the system can work in practice, both technically and in real use. The ECB speaks of a pilot in which banks, technology providers, businesses and consumers would participate, with tests of payments in everyday situations and security controls. The objective is to verify that the digital euro, if it comes to exist, can operate reliably and offer a simple experience for the user.

Despite the progress, this does not mean the digital euro is ready for launch or that it will replace paper money. The institution emphasizes that cash will continue to exist and that the project requires legislative backing before any final decision. Furthermore, it is neither a decentralized token nor an experiment to displace the banking sector. The proposed architecture, they assure, keeps banks as the main channel of access and operation for citizens and businesses.
Three conditions before starting. The digital euro roadmap rests on three conditions: legislative progress, technical validation and a later formal decision by the ECB. The European regulation will establish the rights, limits and obligations of the system, including how financial institutions participate. In parallel, the architecture will be deployed in modules so development can be adjusted as results come in. Nothing in this phase implies committing unlimited resources or guarantees the final issuance.

A project that still needs to convince. Initial support for the digital euro is not homogeneous across Europe. In Germany, a survey prepared for the Bundesbank in April 2024 showed that half of citizens "could imagine using it" and that 41% already knew about the project. In Spain, a 2024 study by Monitor Deloitte indicated that 61% would not adopt it for now, largely due to lack of familiarity and satisfaction with current methods. At the European level, a survey published by BEUC in 2025 indicated that privacy is a priority for 81% of respondents, along with security and the absence of fees as essential elements.

From now on, progress will be as technical as it is political. As we say, the ECB wants to have the pieces ready for a pilot in 2027 and to weigh a possible initial issuance in 2029, provided the European regulation is approved and the tests confirm its viability. The process will be gradual and reviewable, and therein lies its importance: Europe is preparing an option that could expand its autonomy in payments.

Images | ECB | omid armin

In Xataka | The world seemed unprepared for the end of cash. The digital euro makes it clear that it is

A Japanese city has had enough of its residents spending all day on their phones. So it has set a limit: two hours

“When you get on a train in Japan, most passengers are looking at their phones. They don’t do anything else.” So says Masafumi Kouki, mayor of Toyoake (Japan) and probably the country’s most recognizable face in the fight against smartphone addiction and for sleep hygiene and life away from screens. The reason is very simple: despite his residents’ misgivings, Kouki has promoted an ordinance that limits the use of cell phones and tablets to two hours a day. The measure took effect on October 1, and for now it has served one of the objectives Kouki was pursuing: to stir consciences and generate debate.

What has happened? October arrived with a curious legislative novelty in Toyoake, a city of almost 70,000 inhabitants in Aichi Prefecture that in practice functions as a commuter town for Nagoya. On Wednesday the 1st, a new rule came into force that restricts the time residents can spend in front of a screen for recreational purposes: a maximum of two hours. 120 minutes. Not one more. The measure was announced months earlier, in August, when it was still a proposal, and despite the huge stir it generated it managed to move forward: in September it received the endorsement of the municipal assembly, with 12 votes in favor and seven against.

What exactly does the rule say? Roughly speaking, the 2,400-character ordinance establishes a limit on the recreational use of smartphones, tablets, consoles and computers. The rule applies to Toyoake residents and sets that limit at two hours a day, not counting time spent studying or working. There is an important nuance, of course: although it is an ordinance endorsed by the municipal assembly, in reality what it offers is a guideline, not a mandatory rule. No one will check whether the residents of Toyoake comply with it, and no sanctions are foreseen. It is just a recommendation.

Is it a dead letter then? Not at all.
To begin with, because Japanese culture exerts strong social pressure to follow official guidelines. Beyond its real impact, the rule has also served to open a debate on the excessive use of screens and its influence on things like sleep. In fact, the ordinance itself advises that younger children stop using their devices at 9:00 p.m., and that secondary-school students and those under 18 not use them after 10:00 p.m. The objective: to guarantee proper rest.

Is that all? No. On October 1, coinciding with the rule’s entry into force, the Toyoake government sent emails to young people and parents in the city to insist on the same message. Primary and secondary school students were in fact urged to “take care of their rest and health” and to agree with their families on how much time they would dedicate to their devices. “The main objective of the ordinance is to guarantee sufficient hours of sleep,” the authority underlines. The city council has also carried out a survey among 250 residents registered in its monitoring system and wants to find out the guideline’s real reach: whether smartphone use during free time, sleep duration or hours of family conversation change. At the beginning of next year, in fact, the authorities want to run a new survey among students.

Why have they done it? To change habits. “It’s very sad to end the day looking at your phone all the time at home,” Kouki explained a few days ago to The New York Times. “I hope citizens change their behavior.” Rather than strictly limiting recreational screen use to 120 minutes a day, his purpose is to invite “reflection and debate” and make people think about how much time they spend on screens and until what hour. In 2024, a government study revealed that, on average, younger Japanese (those in primary or secondary school) spend about five hours a day on their mobile phones. And not only that.
More than 80% of Japanese people between 15 and 24 consider themselves “dependent” on their smartphones, and 14% already show symptoms of addiction.

How have people responded? It depends. Not everyone has reacted equally well to Kouki’s attempts to restrict screen use. Although it is not a mandatory rule and there are no fines for breaking it, there are those who believe that the mere existence of the ordinance amounts to an intrusion into the lives of the people of Toyoake. “In one sentence: it’s none of your business,” says Mariko Fujie, one of the local politicians who voted against it. In her opinion, there is no “scientific evidence” to support a rule that, she warns, also fails to take young people’s perspective into account. “Many of my supporters find it condescending. This ordinance is complete nonsense.”

Is Toyoake a unique case? Yes. And no. The city council says its rule is the first of its kind in Japan, and media such as The Japan Times and The Mainichi have presented it the same way, highlighting its pioneering nature. Whether or not that is the case, the truth is that it is not the first attempt by a Japanese public institution to put limits on screen use among the population, especially among the young. A few years ago, Kagawa promoted another ordinance aimed at restricting young people’s access to video games. Its objective: that minors spend no more than one hour a day on them during the week, a margin the authorities were willing to extend to 90 minutes on holidays. In Yamato, another town, pedestrians were also banned from using their phones while walking.

Images | Yifei Wong (Unsplash) and Launde Morel (Unsplash)

In Xataka | In Europe we have a problem: we are becoming the Japan of the 21st century

Many are still convinced that Instagram listens through the microphone. The head of the app has tried to settle the matter

At a get-together with friends you talk about a weekend getaway, rural houses and whether it is worth renting a car. Hours later, when you open Instagram, ads appear for travel agencies, car rental portals and route recommendations. The feeling that our phone “hears us” sets in easily. In this context, Adam Mosseri, the head of Instagram, published a video to dismantle that myth and explain why we see ads that seem to guess our conversations.

Meta has spent almost a decade flatly denying that its applications access the microphone without permission, but the doubts never quite go away. Now the discussion has returned to the front line because the company has announced that, as of December, conversations with its artificial intelligence assistant will also feed the personalization of ads and recommendations (a change that will not apply, at least for now, in the European Union). The denial and this novelty were communicated almost at the same time, which adds a striking nuance to the message.

Mosseri says they don’t listen to us, while rolling out another key novelty. The closeness of the two announcements did not go unnoticed. Instagram’s head chose a personal, even light tone, but at the same time wanted to cut off the suspicion that the application listens to its users. He presented that denial as the starting point on which to build his explanation. His words were clear, and they are worth reproducing in full: “We do not listen to you. We do not use the phone’s microphone to spy on you. If we did, it would be a serious violation of privacy. Besides, we would drain the device’s battery and you would notice it, and you would even see a small light at the top of the screen telling you that the microphone is on.” For Mosseri, what seems like eavesdropping is really the effect of four quite common situations. One possibility is that we searched for the topic before talking about it and don’t remember doing so.
Another is that someone around us did, and the platform takes it as a signal. It can also happen that the ad had appeared before and we paid no attention, but it later slips into the conversation. And the simplest explanation of all remains: mere coincidence. “I want to reiterate that we do not listen through your microphone. I know some of you are not going to believe me no matter how much I try to explain, but I wanted to make things clear. I am sure the comments on this post will be a bit intense. See you soon. Peace.”

It was not the first time the company had tried to settle the suspicion. In 2016, Facebook said it did not use the phone’s microphone to target ads or change what appears in the Feed. Two years later, in an appearance before the United States Senate, Mark Zuckerberg was asked directly about the subject and responded with a “no” just as sharp. Years later, Meta’s own support website included a document along the same lines.

Doubts about possible eavesdropping also led the academic community to test it. In 2017, a group of researchers from Northeastern University analyzed more than 17,000 Android applications, including Facebook’s, to check whether they activated the microphone without the user knowing. After months of testing, they found no evidence that this was happening. They did observe other data-collection behaviors, but no covert listening. The authors themselves clarified, though, that some scenarios fell outside their study.

Beyond the studies, there is a technical aspect worth remembering. On current phones, both iOS and Android, any application that wants to use the microphone needs the user’s explicit permission. In addition, when the microphone is active, an indicator appears on the screen (a colored dot or a warning in the top bar) that flags it immediately.
These notices, together with the extra battery drain that permanent listening would involve, make it difficult to hide covert use of the microphone without the user noticing.

The persistence of this myth is better understood in context. Targeted advertising is so precise, and so difficult for the average user to decipher, that it is easier to attribute it to a hot microphone than to an invisible data network. Our memory also plays a part. Add to that Meta’s accumulated history of privacy controversies.

Images | Brett Jordan | Xataka with Gemini 2.5 | Screen capture

In Xataka | I’ve been hooked on Sora 2 for two days: I’m generating absurd memes where I’m the protagonist and I can’t stop
