The more we calculate the size of the Universe, the less sense it all makes

We have known for a long time that the Universe is expanding. However, the speed at which it does so is a headache: depending on which method is used to measure it, a different result is obtained. Now a much more precise way to measure it has finally been found, but it does not unravel much of the mess. It tangles it even further.

A superposition of techniques. Through a superposition of different techniques, an international team of scientists has made the most precise calculation so far of the expansion rate of the Universe: 73.5 ± 0.81 kilometers per second per megaparsec. The figure agrees quite well with those calculated in the past using data from the nearby Universe. However, it is quite far from what is obtained when data from the dawn of the Universe is used. This indicates that there is something in the physics of that furthest reach of the cosmos that we know nothing about. Far from being solved, the mystery has become more complicated.

A balloon that inflates. When we talk about the Universe expanding, we mean that galaxies are getting farther and farther from each other. Not because the galaxies themselves move, but because the space between them stretches. We can picture it as a balloon with a series of dots painted on it: as the balloon inflates, the dots move apart, even though they have not moved from their place on the surface.

Hubble tension. Traditionally, the expansion rate of the Universe is calculated in two ways: either by measuring the distances between stars and galaxies in the nearby Universe, or by measuring the cosmic microwave background. The latter is the electromagnetic radiation left over from the Big Bang, the oldest light we can find in the Universe, since it formed in the very explosion in which the Universe itself was born. In that case, the data is taken not from the nearby Universe, but from the most distant and ancient one, the one closest to the Big Bang.
The figure obtained with both types of calculation should be the same. However, the nearby Universe yields a rate of 73 kilometers per second per megaparsec, while the most distant Universe yields 67 kilometers per second per megaparsec. This discrepancy is known as the Hubble tension, and it suggests that the Universe may be expanding faster and faster, which is why the nearby Universe appears to expand the fastest. A graph accompanying the study represents the tension between measurements of the expansion rate of the late, nearby Universe and what would be expected from measurements of the early Universe, specifically the cosmic microwave background.

It could be a mistake. One of the hypotheses that seeks to explain the Hubble tension is that there is, in reality, some error in measuring the expansion rate in the nearby Universe. There are many methods for calculating the distance between stars and galaxies, and any of them could harbor an error. That is why an international team of scientists decided to use a superposition of techniques to make a more precise calculation.

Different types of stars. This method consists of simultaneously analyzing a large amount of data obtained from ground and space telescopes. These focus primarily on the brightness of Cepheid stars, red giants, supernovae, and galaxies of known brightness. The three types of stars mentioned have a characteristic brightness, which is used to map the Universe and, therefore, also to calculate distances. With this superposition of techniques, the figure of 73.5 ± 0.81 kilometers per second per megaparsec was obtained.

There is no mistake. When any one of the superposed methods was removed, the change in the calculated expansion rate was minimal: the figure stayed practically the same. This indicates that the number has been measured correctly; there is no mistake. So if the Hubble tension is not due to error, why does it occur? The mystery continues.
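To make the units concrete, here is a minimal Python sketch (not from the study itself) of Hubble's law, v = H0 · d, using the two competing values from the article; the 100-megaparsec example distance is arbitrary:

```python
# Hubble's law: recession velocity v = H0 * d, with H0 in km/s/Mpc
# and d in megaparsecs.

def recession_velocity(h0_km_s_mpc: float, distance_mpc: float) -> float:
    """Velocity (km/s) at which expanding space carries a galaxy away."""
    return h0_km_s_mpc * distance_mpc

# The two competing measurements of H0 discussed in the article:
H0_LATE = 73.5   # km/s/Mpc, from the nearby (late) Universe
H0_EARLY = 67.0  # km/s/Mpc, from the cosmic microwave background

d = 100.0  # an illustrative galaxy 100 megaparsecs away
v_late = recession_velocity(H0_LATE, d)    # 7350.0 km/s
v_early = recession_velocity(H0_EARLY, d)  # 6700.0 km/s

# The ~650 km/s discrepancy at just 100 Mpc is the Hubble tension.
print(v_late, v_early, v_late - v_early)
```

The point of the sketch is that the disagreement is not a rounding issue: the two values of H0 predict measurably different recession velocities for the same galaxy.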
After these results, the Hubble tension remains the prelude to a mystery. That said, there are some hypotheses. For example, it is believed that the different figures in the distant and nearby Universe may be due to the intervention of dark matter. There is a lot we do not know about it, so perhaps it could explain what is going on.

On the other hand, there is the hypothesis that the Earth sits in a region of space with special characteristics: an area with relatively little matter, comparable to an air bubble in a cake. As Indranil Banik, one of the scientists who supports this hypothesis, explained in 2023: "the density of matter is greater around the bubble, so gravitational forces emanate from this surrounding matter, attracting the galaxies in the bubble toward the edges of the cavity. That's why they're moving away from us faster than you would really expect."

Now that part of the mystery will have to be solved. At least we know that there is no error in the calculations and that the Hubble tension is real.

Image | CTIO/NOIRLab/DOE/NSF/AURA/J. Pollard

In Xataka | Refuting Einstein is one of the great challenges of physics. We couldn't even achieve it by changing the scale.

China has just launched its first undersea data center with total energy autonomy. The idea makes more sense than it seems

In the AI race, having a robust data center infrastructure is essential, but first you need energy to power it all. The United States may lead the chip industry (at least the strategic part of it), but China follows closely at an unstoppable pace and, furthermore, has the energy. And it is already beginning to connect the dots, showing off its technical power and ingenuity: it already has the largest data center in the world and is also a pioneer in submerging them under the sea. Now it has added a twist with the first underwater data center that 'drinks' directly from the wind, which has just opened.

This project represents the perfect union of two of China's strategic priorities: digital sovereignty and carbon neutrality. By placing computing infrastructure on the seabed and powering it directly with clean energy on site, China is tackling one of today's great technological problems: the insatiable energy consumption of AI and Big Data.

The project. About 10 kilometers off the coast of Shanghai, at the bottom of the East China Sea, a steel cylinder receives electricity directly from wind turbines and is cooled with seawater. It is the Lingang Subsea Data Center, an ambitious project promoted by Shanghai Hailan Cloud Technology (HiCloud) and built by CCCC Third Harbor Engineering. It consists of a series of data storage and processing modules encapsulated in watertight, submerged containers, connected via two 35 kV submarine cables to offshore wind turbines operating off the coast of Shanghai. With a planned capacity of 24 MW across two phases, the first is already operational: it has a capacity of 2.3 megawatts and includes a ground control center, a vertical data module installed under the sea and two main 35-kilovolt submarine cables.

Why it is important.
Beyond the fact that it occupies no land, a valuable saving in cities as crowded as Shanghai, and that it can be installed close to where it is needed (provided there is a coast, obviously), it solves three structural problems of the sector at once:

Cooling. Seawater acts as a constant, free heat sink, eliminating the need for the industrial air-conditioning systems that can consume 40 to 50% of a data center's electricity. The metric that measures the energy efficiency of a data center, comparing total energy consumed against the energy used purely by the servers, is the PUE; for a standard data center on land it averages slightly above 1.5. This project promises to lower it to no more than 1.15.

No fresh-water consumption. Traditional data centers evaporate millions of liters of water to cool their servers, but this one relies on thermal exchange with the ocean, so it consumes no water resources.

It absorbs surplus wind power. One of the handicaps of wind energy is that generation depends on the wind rather than on demand, so without a battery, energy that is not consumed is wasted. Thanks to its direct connection, the data center absorbs wind production in real time, functioning as a constant consumer that reduces the renewable energy wasted for lack of a destination.

In figures. The magnitude of the project, with some official numbers: the budget is 1.6 billion yuan, about 200 million euros. Total planned operational capacity of 24 MW (2.3 MW in the first phase). The design PUE is below 1.15. More than 95 percent of the electricity comes from renewable sources.

Context. The name HiCloud is not new; in fact, it is an old acquaintance: it is the company behind the underwater prototype off Hainan, which it began installing in 2021.
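The PUE figures quoted above can be made concrete with a trivial calculation; the absolute kilowatt-hour values below are hypothetical, chosen only to reproduce the 1.5 and 1.15 ratios from the article:

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# A PUE of exactly 1.0 would mean every watt goes to the servers themselves.

def pue(total_kwh: float, it_kwh: float) -> float:
    """Energy-efficiency ratio of a data center; lower is better."""
    return total_kwh / it_kwh

# Hypothetical consumption figures matching the article's ratios:
land_dc = pue(total_kwh=1500.0, it_kwh=1000.0)    # 1.5: typical land-based center
subsea_dc = pue(total_kwh=1150.0, it_kwh=1000.0)  # 1.15: Lingang's design target

# Overhead (cooling, power conversion, lighting) as a share of IT load:
overhead_land = land_dc - 1.0     # 0.5 -> 50% extra energy on top of the servers
overhead_subsea = subsea_dc - 1.0 # 0.15 -> 15% extra

print(land_dc, subsea_dc)
```

Seen this way, dropping from 1.5 to 1.15 cuts the non-IT overhead from half the server load to roughly a seventh of it, which is the entire business case for seawater cooling.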
However, the international reference is Microsoft's Project Natick (2013–2024), which demonstrated the potential of underwater centers: only 8 of its 864 servers failed, a much lower failure rate than that of any conventional data center over the same period, and it also achieved a very low PUE of just 1.07. Despite this, Microsoft shelved the matter: viability in terms of costs and maintenance is another story. The Lingang project, however, has top-level institutional support: it appears on the List of Green and Low Carbon Technology Demonstration Projects of the NDRC, China's top economic planning body.

How they have done it. Servers are placed in pressurized steel cabins filled with inert gases to prevent corrosion and fire, with a design that maximizes interior space and minimizes the impact of waves. Heat is dissipated by pumping seawater through radiators located behind the racks. The most complicated operation was hoisting the cabin in the open sea: the gap between the legs of the support structure and the steel piles on the seabed was only 0.18 meters and the maximum allowable deviation was 10 centimeters, so the operation relied on GPS and the crane vessel Sanhang Fengfan.

Roadmap. The project follows a staggered progression that leaves certain unknowns. First came the prototype in Hainan (2021-2024). In 2025 the Shanghai project began; its phase 1 concluded in October of that year and went live just a few weeks ago. The key phase that will take capacity up to 24 MW has no official public date. That said, the consortium made up of HiCloud, Shenergy Group, China Telecom Shanghai, INESA and CCCC Third Harbor Engineering signed a cooperation agreement in October 2025 to scale to 500 MW linked to offshore wind, although where and when is unknown.

Yes, but. The 2.3 MW of phase 1 is practically a demonstration, not commercial infrastructure: a large conventional data center operates at between 50 and 500 MW.
In addition, it has to resolve the issues that Microsoft's Project Natick left unresolved, such as underwater maintenance: HiCloud has not published protocols or long-term repair costs. And scalability to 500 MW is, for the moment, more an intention than a project.

In Xataka | Where you see a mountain, China sees a …

China spent $10 billion on oil it did not need. With Hormuz blocked, the puzzle finally makes sense

As the West panics over the possibility of the barrel breaking the $100 barrier, an eerie calm reigns in Beijing. The Asian giant observes the crisis with the coldness of someone who has already done its homework. In recent months the world has been debating an excess of oil supply, but the real winner of this war crisis is not firing missiles; it has spent years quietly filling its storage tanks.

World geopolitics has been blown up a few weeks before the expected summit between Donald Trump and Xi Jinping. As reported by Nikkei Asia, the coordinated airstrikes of the United States and Israel (dubbed "Operation Epic Fury") have culminated in the assassination of the Iranian supreme leader, Ayatollah Ali Khamenei. Tehran's response has been a rain of missiles and drones on American allies in the region.

The immediate impact has been felt on the water. The Strait of Hormuz, through which 20 million barrels a day flow (20% of the world's oil supply), is blocked de facto. As Bloomberg details, rates to charter a supertanker on the route from the Middle East to China have skyrocketed by 600%, reaching $200,000 a day (or 525 Worldscale points for a Suezmax). In addition, France 24 points out that insurers have increased war-risk premiums by between 25% and 50%. As reported by CNN, Brent crude jumped 6.5% in the early stages, touching $82, driven by fear of prolonged logistical disruptions. Bob McNally, president of Rapidan Energy Group, warned the US network that closing Hormuz would cause an immediate global energy crisis.

China's exposed vulnerability

On paper, the Donald Trump administration's offensive should be an absolute nightmare for Xi Jinping. As The Telegraph explains, American military adventurism is exposing the gigantic energy vulnerability of China, the largest oil importer in the world, which buys three-quarters of the crude oil it consumes abroad.
Washington's strategy seems clear: suffocate the "rebellious" suppliers that feed Chinese industrial machinery at bargain prices. Earlier this year, the military capture of Nicolás Maduro established what some analysts already call the "Donroe Doctrine". Trump has been explicit in his goal of controlling oil. If the United States manages to add Venezuelan production to that of Guyana and its own, it would de facto control 30% of the world's reserves, according to JP Morgan. This move cuts off supply to China at the root, evaporating imports that represented around 4% of its maritime purchases, according to Kpler data collected by The Financial Review. However, Washington's optimism collides with geology: the infrastructure is in such ruins that loading a supertanker today takes five days, and the crude arrives so "dirty" that Chinese and Indian refineries themselves have canceled orders, according to a Reuters investigation. Refloating this industry will cost 10 billion dollars annually for a decade, as calculated by Francisco Monaldi, director of energy policy at Rice University.

Then there is the current blow to Iran. Chosun Daily details that China bought 80% of Iranian maritime exports last year (about 1.38 million barrels per day), which represents 13.4% of Beijing's total maritime crude imports. As the US Institute for Energy Research (IER), cited by the same outlet, points out, China has used heavily sanctioned, cheap oil from these countries to cement its manufacturing competitiveness. Losing Iran and Venezuela would force Chinese refiners — especially the independent ones in Shandong, known as "teapots" — to look for much more expensive substitutes on the open market, threatening to import inflation and slow economic growth.

The master plan in execution

If Western analysts expected to see China cornered, they were wrong.
Beijing foresaw this scenario of isolation and has spent years executing a four-pronged master plan that today allows it to cushion the blow of Hormuz.

While in 2025 the world feared a global oversupply, China dedicated itself to massive purchasing. Last year it spent $10 billion buying an extra 150 million barrels it did not immediately need, absorbing more than 90% of measurable crude oil storage globally. Supported by a new Energy Law that obliges the public and private sectors to maintain reserves, Beijing today holds strategic reserves equivalent to at least 96 days of imports, according to The Telegraph.

Under the banner of national security, China is investing $80 billion annually in its state oil fields. In March 2025 they reached a production peak of 4.6 million barrels per day and completed the drilling of the deepest oil well in Asia (10,910 meters). The goal is not financial profitability, but pure autonomy.

With Iran and Venezuela under fire, China has simply turned its head toward Russia and Saudi Arabia. According to OilPrice, Chinese refineries are absorbing record amounts of Russian crude (more than 2 million barrels per day in February 2026), taking advantage of the fact that India has given in to US pressure to stop buying from Moscow. Simultaneously, Saudi Arabia has cut the official price of its Arab Light crude to five-year lows to gain market share in Asia, which has led China to order between 56 and 57 million Saudi barrels by March.

China's definitive move is to abandon the oil board altogether. As Professor Hussein Dia analyzes in The Conversation, China's massive commitment to electric vehicles (50% of new car sales last year) and renewable energy is a national security policy. As The Telegraph reports, the new five-year plan (2026-2030) seeks to bring oil consumption to a peak by accelerating the installation of solar and wind farms (430 gigawatts added last year alone).
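A quick back-of-the-envelope sketch helps sanity-check the stockpiling figures; the $10 billion and 150 million barrels come from the article, while the daily import rate used for the reserve calculation is an assumed round number, not an official figure:

```python
# Back-of-the-envelope check on the article's stockpiling figures.

spend_usd = 10e9        # $10 billion spent on extra crude (from the article)
extra_barrels = 150e6   # 150 million barrels beyond immediate need (from the article)

implied_price = spend_usd / extra_barrels  # ~ $66.7 per barrel
print(f"Implied average price: ${implied_price:.2f}/barrel")

# "Reserves equivalent to 96 days of imports" means:
#   stockpile = daily import volume * 96
daily_imports = 11e6    # barrels/day -- assumed round figure for illustration
reserve_days = 96
stockpile = daily_imports * reserve_days
print(f"Stockpile at that rate: {stockpile / 1e9:.2f} billion barrels")
```

The implied price of roughly $67 a barrel, well below a $100 crisis scenario, is what makes the "buy when the world fears oversupply" strategy coherent.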
Unlike the ships in Hormuz, sunlight cannot be blocked by the US Fifth Fleet.

The diplomacy of silence and the illusion of OPEC+

Faced with Khamenei's assassination, the response of the Chinese Foreign Ministry has been one of calculated coldness. It condemned the act as "unacceptable" and a "violation of sovereignty" but, as Chosun Daily points out, carefully avoided mentioning Donald Trump directly. Nikkei Asia explains this pragmatism: …

These ideas show that USB drives still make sense

There was a time when USB drives were almost essential. We always carried one with us to move files between computers or to store them for a while without depending on the Internet. That scenario has completely changed: today the focus is on mobile phones and cloud storage, which leaves many pendrives forgotten in a drawer. The question is inevitable: if we come across one again, is it worth giving it a second life?

In a new video published on the Xataka YouTube channel we try to answer precisely that question. Ana Boria reviews different ways to recover the usefulness of a peripheral seemingly made obsolete by the passage of time. And the interesting thing is that several of the proposals she raises go beyond the most obvious uses we all have in mind.

Tricks to take advantage of our USB drives

One of the first ideas points to something as everyday as it is necessary: freeing up space on phones with little internal memory. "Well, if you have a pendrive and an adapter like this, you can use it to transfer all the photos, videos and files you want, to free up space on your mobile without having to erase your memories," explains Ana, who also details what type of adapter to use to do it simply and practically.

The video also covers more familiar functions, such as using a USB drive to install Windows, with instructions on how to create the installation media, and other less common ones, such as running portable applications. "It is very useful if you have to use other people's equipment from time to time or you cannot freely install programs," our colleague points out from her own experience.

There is even room for proposals aimed at more technical profiles. Ana shows different ways to run operating systems in Live USB mode, a common practice with several Linux distributions. And on Windows? There are also alternatives, and the video mentions specific tools that make it possible without too much complication.
"I also want to tell you about a very interesting option, but one that has its nuances... We can use our USB memory as a physical key for two-step verification." With this idea, Ana enters the field of digital security and puts on the table a lesser-known, but especially useful, use for this type of device. The review continues with other tricks that show that that forgotten pendrive still has plenty of life left. The video is now available on Xataka's YouTube channel, and the invitation remains open: tell us in the comments if you knew of any of these uses or if they were already part of your daily routine.

Images | Xataka

In Xataka | We put Spotify, Apple Music and YouTube Music to the test: music streaming has changed and there is no longer an obvious winner

More and more car brands are fleeing from Android Auto and Apple CarPlay. And it makes all the sense in the world

My Volkswagen Polo is 10 years old and has a screen where I can see car statistics and play the radio or Spotify, and little else: if I want a GPS navigator, I have to place my phone on a mount clipped to the air vent. So yes, I get really excited when I drive my partner's Kona, with a screen bigger than a tablet on which I can visit Xataka from the web browser, watch videos or play a game. Android Auto is wonderful, and if I connect my iPhone, using apps like Waze on CarPlay is also another story. For someone with a dumb screen in their car and no intention of renewing it in the next five years, Android Auto and Apple CarPlay sound like a heavenly melody. However, Google's and Apple's infotainment systems are taking a step back: some manufacturers are deciding to pull out, leaving their new models without them. And it doesn't surprise me.

Goodbye to Android Auto and Apple CarPlay. Last summer, and despite the delays, Apple was still promising a bright future for its fledgling Apple CarPlay Ultra, until the brands slammed the door in its face: barely Aston Martin and Porsche remain. Land Rover, Mercedes-Benz, Nissan, Ford, Lincoln, Audi, Jaguar, Acura, Volvo, Honda, Renault, Infiniti and Polestar jumped ship. In the fall, the leadership of General Motors explained in a The Verge podcast that it intended to remove both infotainment systems from its newer vehicles and replace them with its own Gemini-infused system. Finally, German brands such as BMW, Mercedes-Benz and Volkswagen have joined forces to create an open source alternative called Safety Open Vehicle Core. S-Core, as it is abbreviated, is basically a base infrastructure with the essentials, on top of which each manufacturer will build its own customization layer.

It's a matter of control. Android Auto and Apple CarPlay provide a unified, mainstream experience within reach of the smartphone-owning majority, and implementing them is not expensive.
Though really, it is not so much about the money manufacturers spend installing Android Auto and Apple CarPlay as about what they stop earning.

Data collection and what can be done with it. Keep in mind that, through their respective infotainment systems, Google and Apple collect information such as your position and how it varies over time, which allows them to know your speed, schedules and frequent routes, to give some simple examples. They also know which apps you use and when.

An open door to the subscription vein. In recent years we have already seen large manufacturers launch subscription models to unlock certain premium hardware functions: Volkswagen to unlock full power, BMW's controversial heated seats (later withdrawn), Mercedes and its subscription-based acceleration upgrade, or Polestar offering similar performance packages. Access to detailed information on usage habits would allow a user profile to be built, and with it a more personalized experience offered in the form of a subscription.

Materializing it will not be easy or fast. The GM news detailed that the measure would be implemented over the coming years, and it does not even imply a complete break, since it does not remove Google from the equation entirely: it implements Gemini, the Mountain View company's big bet. And Google's AI is not exactly sparing in capturing information. Using an Android fork could also be an interesting option.

S-Core Eclipse release schedule

The German companies' route does seem more viable. In fact, their preview schedule is available on GitHub and, for now, they are fulfilling it to the letter. Of course, being able to create a platform is one thing; the experience it offers is another.

How cold it is outside of Android Auto and CarPlay.
One of the great assets of Android Auto is the quantity and quality of compatible apps: thinking about a platform without Google Maps, Waze or Spotify feels like a huge step backwards. So, down the road, they will have to convince the companies behind those apps to bring them to these systems. And even if they achieve that, there are other hot potatoes, such as updates and how frequently they arrive. Life without Android Auto or Apple CarPlay is an option; just ask Rivian or Tesla. But in the end it all comes down to user experience: it must not feel like taking a step back. Buying a car (especially a high-end one) and finding yourself worse off is not pleasant. Nor is being charged a premium to unlock functions or remove advertising. The scenario of paying a monthly fee to access maps and extras when there is a solid, free alternative on the market sounds absurd. In any case, the winds of change are blowing across car screens.

In Xataka | Android Auto is quietly preparing for us to drive with smart glasses. In Spain it won't be easy

In Xataka | This car was a pioneer with Android Automotive, but its users were crying out for Android Auto. Their wish has been granted

The latest from Lenovo is a gaming laptop with a rollable screen. It makes more sense than it seems

Playing on a laptop has historically been synonymous with playing in 16:9 or, at most, 4:3 in some more work-focused models. Playing ultrawide is, for the moment, relegated to desktop monitors for a fairly simple reason: space. That is, of course, assuming the panel cannot be rolled and unrolled, because if it can, concepts as curious and peculiar as the one Lenovo has shown at CES 2026 become possible.

Lenovo Legion Pro Rollable (Concept). That is the name of the laptop with a roll-up screen that Lenovo showed at CES in Las Vegas. It is, as its name indicates, a concept; that is, it is not for sale, but its proposal is striking. Lenovo is betting big on this technology, with folding laptops and the roll-up concept we tested a few weeks ago, but with this device the firm goes a little further.

Lenovo Legion Pro Rollable | Image: Lenovo

How it works. The laptop features a Lenovo PureSight OLED panel that, by default, measures 16 inches. Lenovo calls this size "Focus mode." Under the panel is a voltage-driven dual motor that allows the screen to expand and contract "with minimal vibration and noise," according to the firm. Lenovo also claims to have used low-friction materials and says the system maintains constant tension across the panel, which should translate into less abrasion during the rolling cycle.

From 16 to 24. The panel can be expanded to two sizes: from 16 to 21.5 inches ("Tactic mode") and from 21.5 to 24 inches in a more panoramic format Lenovo has dubbed "Arena mode." Depending on the player profile, this may make all the sense in the world, since it means having an ultrawide monitor available at all times.

Lenovo Legion Pro Rollable | Image: Lenovo

Let's elaborate. Competitive games, if we take them seriously, benefit greatly from being played on a 16:9 monitor.
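To put the three announced diagonals into physical terms, a short sketch can convert a diagonal plus an aspect ratio into width and height; the aspect ratios used below (16:10 for Focus, 21:9 for Arena) are assumptions for illustration, since Lenovo has only quoted diagonal sizes:

```python
# Physical width/height of a panel from its diagonal and aspect ratio.
import math

def panel_dimensions(diagonal_in: float, ratio_w: float, ratio_h: float):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    diag_units = math.hypot(ratio_w, ratio_h)  # diagonal in ratio units
    scale = diagonal_in / diag_units
    return ratio_w * scale, ratio_h * scale

# "Focus mode": 16-inch panel, assuming 16:10
w, h = panel_dimensions(16, 16, 10)   # ~ 13.6" x 8.5"

# "Arena mode": 24-inch panel, assuming a 21:9 ultrawide
W, H = panel_dimensions(24, 21, 9)    # ~ 22.1" x 9.5"

print(f"Focus: {w:.1f} x {h:.1f} in | Arena: {W:.1f} x {H:.1f} in")
```

With those rough numbers in mind, under these assumptions the unrolled panel adds most of its extra area as width, which is exactly what ultrawide gaming and productivity layouts exploit.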
Titles like 'Counter Strike', 'Valorant' or 'League of Legends' are played in 16:9 because this format lets you see the entire screen without moving your head. In some shooters, like 'Battlefield 6', an ultrawide monitor pushes the minimap and the game and weapon information away from the center, forcing us to take our eyes off the reticle even more. That is, in games where everything happens in and around the center, a 16:9 monitor is the most suitable, at least on paper. However, simulation games, open-world exploration games and more cinematic titles (think 'Clair Obscur: Expedition 33', 'Cyberpunk 2077' or 'God of War') appreciate the panoramic format and the immersion it provides. The same goes for editing and productivity apps, which benefit from ultrawide. This laptop offers us, in theory, the best of both worlds: a 16:9 panel for shooters and competitive games, and an ultrawide panel for when we want to relax and enjoy a good story. In theory, because the Lenovo Legion Pro Rollable is a concept and, as such, has yet to prove itself.

And inside? Lenovo hasn't left anything out. The laptop is based on the Legion Pro 7i, so it has a new-generation Intel Core Ultra, an NVIDIA RTX 5090 and the Lenovo AI Engine+. The latter uses Lenovo LA1+LA3 cores to optimize resources based on the gaming scenario which, on paper, should keep the FPS up to par.

Images | Lenovo

In Xataka | The new thing from NVIDIA is called DLSS 4.5 and it seems like witchcraft: it can multiply the performance of the GeForce RTX 50 by six

It turns out a longevity expert has said something that makes sense. And the reason is juice

Peter Diamandis is back. The famous doctor and engineer specializing in longevity has once again gone viral with a piece of simple dietary advice: "if you like oranges, eat them whole, not as juice." And, to the surprise of all of us who closely follow this world, it is a good idea.

Beyond the joke, longevity is becoming serious. Very serious indeed. Ever since an open microphone confirmed to us in September that longevity is becoming a crucial issue for today's oligarchs, it has been impossible not to look at this community of researchers, influencers and entrepreneurs in a "different way". However, the reality is obvious: most advice on how to live longer is a mix of cherry picking, scientific sensationalism and common sense. Ultimately, as society becomes increasingly obsessed with living longer, the 'market' for these kinds of ideas keeps growing (for better and for worse).

And Diamandis is a good example. As explained in El Confidencial, this entrepreneur and researcher has a very long list of dietary ideas: from dropping dairy products because of the body's inflammatory response to casein, to avoiding red meat for its saturated fats (basing his diet almost exclusively on vegetables and whole foods). As we saw a few days ago with other well-meaning advice, these kinds of ideas make some sense, yes. However, every heuristic has two sides: it illuminates a certain part of reality and helps us manage it more easily, but it hides other parts and makes it seriously difficult to be aware of them.

But let's get to the juices. Because the latest piece of advice to go viral is precisely that: the debate has never been "fruit, yes" or "fruit, no". Of course you have to eat fruit! The debate is over how we consume it, and juice is possibly the worst way. By squeezing the fruit, we not only lose the fiber, we end up consuming something completely different: satiety is worse and the sugar is absorbed more quickly.
When we say that fruit is good, what we are really saying is that we need the fiber it contains for its metabolic and satiating effects. Really? So much so that organizations like AESAN insist repeatedly that juice does not replace whole fruit. And yes, I know that for many this is a commonplace (and something very well known), but it never hurts to repeat it: the consumption of whole fruit has fallen 14% in recent years.

We already know it is good dietary advice, but is it also better for longevity? Here, the truth is that the evidence is less clear. Above all because it is never enough to 'stop recommending something'; we must go further and put better options on the table. And yes, water is always an option. But unfortunately, it is not always a substitute for the social consumption of juice.

Image | Zlatko Duric

In Xataka | One of the leading experts on aging has just explained what he himself does to live longer. It makes sense

There are dozens of influencers obsessed with helping us choose the perfect can of tuna. The problem is that what they say doesn’t make much sense.

There is a thin line connecting volcanic eruptions, oil combustion and waste incineration with our kitchens: mercury. Mercury produced by dozens of (mostly human) activities, which ends up deposited in the water, transformed into methylmercury by millions of microorganisms, stored in fish and, finally, in our stomachs. It was only a matter of time before it became the huge food scandal it is today. Methylmercury has also reached social networks. The problem is so big that there is no shortage of experts and influencers pushing messages such as choosing cans of “tuna” over cans of “light tuna.” The music is that of institutions like the European Food Safety Authority (EFSA), which recommends avoiding large fish; the lyrics hide many problems. At the end of the day, the viral message mixes correct intuitions with more than debatable scientific evidence (it relies, to begin with, on commercial classifications that have no direct Spanish equivalent). It is not the first time an idea that sounds good ends up giving us headaches. And why is that a problem? Because, like it or not, fish is a centerpiece of many diets. Not only for its protein, but as a priority source of certain fats that are very hard to replace by any other means (e.g. omega-3). The thing is, along with all that comes methylmercury. And exposure to methylmercury is a tricky thing: it can harm brain development and is toxic to the nervous system. It can cause symptoms such as tremors, memory loss and cognitive dysfunction. The most vulnerable groups are pregnant women, nursing mothers, babies and young children. Do all fish have the same amount of mercury? No, they don’t. According to the Spanish Agency for Food Safety and Nutrition (AESAN), there are four really dangerous groups: swordfish (emperor), bluefin tuna (Thunnus thynnus), sharks (tope, mako, spiny dogfish, catshark and blue shark) and pike.
These are problematic for women who are pregnant or planning to become pregnant, nursing mothers and children under 10; in fact, AESAN recommends avoiding them outright. The remaining species are not problematic as far as mercury is concerned: they are safe and healthy, and AESAN recommends three to four servings a week even for the at-risk population. And aren’t there more differences by level? That is, are there only dangerous and non-dangerous species? No. It is true that each species contains a different amount of mercury; in fact, each individual fish has different levels. That is where the problem comes from: we need simple ‘rules’ to help us deal with uncertainty. On a practical level, according to the available studies, we can only define as low-mercury species those on this list: pollock, anchovy, herring, cod, blue whiting, cockle, mackerel, squid, shrimp, crab, coquina, carp, clam, cuttlefish, lobster, sea bream, sprat, prawn, horse mackerel, European sole, dab, sea bass, mussel, whiting, hake, razor clam, oyster, pomfret, flounder, octopus, Atlantic salmon, Pacific salmon, sardine, sardinella, sardinops, plaice and trout. Everything else has medium levels, and making distinctions between them is impossible in practice. So does it make sense to follow these kinds of recommendations? In general, any attention we pay to food is good. The system is configured in such a way that, if we let ourselves go, our diet gets worse. However, we know that obsessing over diet is also full of problems. Using heuristics that complicate shopping without substantial improvements is not as good an idea as it seems. Image | Tobias Tullius In Xataka | The scientific reason why miracle diets don’t work is you

When asked what sense it makes to compete with Google, OpenAI or Anthropic in AI, Mistral has an answer: small, local models

French startup Mistral AI has launched Mistral 3, a family of 10 open-source artificial intelligence models that represents its most ambitious bet to date. The Parisian company, often considered Europe’s main hope in AI development, seeks to differentiate itself from the big American technology companies by betting on flexibility and deployment across all kinds of devices rather than raw power. Below we run through all the news. What Mistral has presented. The Mistral 3 family includes a flagship model called Mistral Large 3, with 675 billion parameters, and nine compact models grouped under the name Ministral 3 (in three sizes: 14, 8 and 3 billion parameters). All models are released under the Apache 2.0 license, allowing unrestricted commercial use. The large model is also multimodal, able to process text and images, and multilingual, with special emphasis on European languages. The small models, meanwhile, can run on devices with just 4 GB of memory, making them a good fit for modest laptops, mobile phones and embedded systems with no need for an internet connection. Why the strategy matters. While OpenAI, Google and Anthropic focus on increasingly powerful, closed systems with agentic capabilities, Mistral has focused on the breadth and reach of its models, on efficiency, and on what its co-founder Guillaume Lample calls “distributed intelligence.” As he told VentureBeat, the company believes the future of AI will be defined not by scale but by ubiquity: models small enough to run in drones, vehicles, robots and consumer devices. The economic and practical argument. Lample explained that in more than 90% of cases, a small, specifically tuned model can get the job done, especially if it is trained with synthetic data for specific tasks. According to Lample, this is not only cheaper and faster; it also eliminates concerns about privacy, latency and reliability.
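As a rough sanity check on those figures, here is a minimal back-of-the-envelope sketch (my own arithmetic, not Mistral’s numbers): the parameter counts are the article’s Ministral 3 sizes, and the bytes-per-parameter values are the standard ones for 16-bit weights and 4-bit quantization. It estimates only the memory the weights occupy, ignoring activations and KV cache.

```python
# Back-of-the-envelope memory footprint of a model's weights,
# by parameter count and numeric precision.
# Parameter counts (14B, 8B, 3B) come from the article; byte sizes
# per parameter are the usual ones for each precision.

def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GB needed just to hold the weights
    (excludes activations, KV cache and runtime overhead)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for params in (14, 8, 3):
    fp16 = weight_memory_gb(params, 2.0)   # 16-bit floats: 2 bytes/param
    q4 = weight_memory_gb(params, 0.5)     # 4-bit quantization: 0.5 bytes/param
    print(f"{params}B params: ~{fp16:.1f} GB at fp16, ~{q4:.1f} GB at 4-bit")
```

A 3-billion-parameter model at 4-bit precision needs roughly 1.5 GB for its weights, which is consistent with the article’s claim that the smallest models can run on devices with just 4 GB of memory.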
The company also has teams that work directly with customers to analyze specific problems and fine-tune small models for specific tasks. This could be especially attractive to companies that get frustrated trying to pick the best possible model for a given task and, when it does not perform adequately, end up giving up. Europe is lagging behind. When we talk about innovation and technology around AI, we do not hesitate to say that Europe is leagues behind what companies in the United States and China are offering. That is why Mistral AI advocates a different approach, prioritizing massive deployment on devices and the flexibility of its smaller models. The capabilities offered by open models can be a great asset for continuing to bet on these technologies. In China, for example, the open models of DeepSeek, Alibaba or Kimi are standing out, in certain tasks even beating competitors as large as ChatGPT. Lample explained that most leading Chinese models are text-only, with separate image-processing systems; that is another reason Mistral wants to pursue a multimodal approach. A complete ecosystem. Mistral no longer offers only language models. The company has built an entire ecosystem that includes the Mistral Agents API, with connectors for code execution, web search and image generation; Magistral, its reasoning model; Mistral Code, for programming assistance; and AI Studio, an application deployment platform that also provides analytics and logging. Its assistant Le Chat, meanwhile, has incorporated a deep-research mode, voice capabilities and a list of more than 20 enterprise integrations. Thus, beyond its model lineup, the company can offer other companies a whole layer of personalized products and services, intended to be its main source of funding. Digital sovereignty. Although Mistral is often characterized as Europe’s answer to OpenAI, the company prefers to consider itself ‘a transatlantic collaboration’.
Its CEO is, in fact, based in the United States; the company has teams on both continents and trains its models in collaboration with American teams and infrastructure. However, its positioning as a defender of European digital sovereignty has earned it strategic partnerships with the French army, the country’s employment agency, the Luxembourg government and various European public bodies. In October, the European Commission presented a strategy to promote European AI tools that provide security and resilience while boosting the continent’s industrial competitiveness. Offline capabilities as democratization. The use cases Mistral has designed for its small models are, above all, local applications: factory robots that use sensor data in real time without relying on the cloud, drones that operate offline in natural disasters or rescues, and smart cars with AI assistants that keep working in remote areas. Lample pointed out that there are billions of people without internet access but with laptops or phones capable of running these small models, which he considers potentially revolutionary. In addition, by running on-device, these applications preserve the privacy of user data. The real “open source” debate. Not everyone celebrates Mistral’s approach. Some critics question its decision to opt for ‘open-weight’ models, that is, models that are free to access but reveal less than truly “open source” models, which provide the code and training data needed to train a model from scratch. Andreas Liesenfeld, assistant professor at Radboud University and co-founder of the European Open Source AI Index, told the Financial Times that data at scale is the missing key in the European AI innovation ecosystem and that Mistral contributes nothing to that. The long-term strategic bet.
Lample acknowledged that its models are “a little behind” the most advanced closed systems, but argued that the important thing is that “they are catching up quickly.” Time will tell whether Mistral’s bet on low-cost, versatile models with local applications ends up positioning it as one of Europe’s great AI hopes. Cover image | Mistral AI In Xataka | China already has an army of 5.8 million engineers. Its new plan involves accelerating doctorates

There are a lot of people replacing the oil on ham toast with coffee and orange. And oddly enough, it makes sense.

“You insist on putting olive oil on our Iberian ham toast, and that is like putting sugar on top of a chocolate cake.” Víctor Sanchego did not know it, but with those words he was about to get thousands of people to prepare the strangest breakfast we have seen in a long time. Why shouldn’t you add oil to the ham? Sanchego’s argument is that “the fat of Iberian ham contains more than 60% oleic acid, the same component as extra virgin olive oil.” Therefore, as happens in a perfume shop when we have already tried several colognes, mixing oil and ham saturates our taste buds. “Instead of helping enhance the flavor, it subtracts from it,” says the ham expert. The reality, of course, is more complex. The general idea is true for Iberian ham: adding oil (especially an intense, complex one) blurs the flavor profile and can indeed oversaturate the bite. This, however, does not happen with the rest of the hams or the rest of the oils. It is, so to speak, a borderline case. And a well-known one, at that. The usual advice for Iberian ham, in fact, is to enjoy it alone or with an accompaniment that cleanses the palate, such as a piece of neutral bread. Nobody usually proposes eating a plate of ham with a glass of EVOO on the side. The striking thing about all this is not that. The striking thing is the coffee with orange zest. Because Víctor Sanchego does not propose eating ham with white bread, nothing like that. He suggests smearing the bread with a mixture of black coffee and orange peel, toasting it and, only then, putting the Iberian ham on top. It is a strange thing, yes; but we cannot call it madness either. We said before that the ideal is to eat Iberian ham with something that ‘cleanses the palate’, and Sanchego’s idea goes straight there: coffee, with its dry, intense qualities, lets us enhance the organoleptic properties of the ham. Is it the most interesting choice?
Well, the truth is that I couldn’t say. On a theoretical level, there could be dozens of similar combinations that fit better with our usual organoleptic repertoire; but it is undoubtedly bold, and many of those who try it (on social networks) are delighted with the result. And that, without a doubt, is good news. Not because of the ham, the coffee or the orange zest. It is good news because culinary Talibanism is a practice that greatly impoverishes our understanding of food, and limits us for no reason. Being open to ‘playing’ with products as iconic as Iberian ham is a sign of a gastronomic maturity that, used well, can help us tackle some of the century’s big food-security problems in a much simpler way. Image | Stephan Coudassot | Nathan Dumlao In Xataka | Why salads are the biggest source of food poisoning and what to do to avoid it
