Your refrigerator has a compartment designed for eggs in the door. It’s the worst possible place to keep them.

Almost all refrigerators on the market come with an accessory designed specifically for this purpose: an egg tray that sits on the shelves of the appliance door. It has become the first place many of us reach for when putting eggs away, but the truth is that it is the worst place in the refrigerator to store them.

The thermal trap. The reason lies not in the fragility of the food, but in a microscopic enemy that surrounds us and can be genuinely dangerous: Salmonella. The main problem with the refrigerator door is that it is the area most exposed to temperature swings: every time we open the refrigerator to get milk or water, or simply to think about what to eat, the temperature on the door shelves fluctuates drastically. US regulatory bodies are quite clear in pointing out that these constant rises and falls in temperature are the ideal breeding ground for bacterial growth. Furthermore, as the South Korean Ministry of Food points out, the door is prone to condensation, creating a humid environment that encourages pathogens to proliferate on the shell and end up in the food when we crack the eggs into the same bowl where we beat them (something that is also not recommended).

The ideal temperature. To keep Salmonella at bay, the temperature must be stable and below 4-10 °C, since under these conditions the growth of the bacteria is suppressed by more than 99%. But even on the refrigerator shelves this is not always achieved.

What the studies say. The science here is quite clear, with several studies pointing to the survival of strains such as Salmonella Typhimurium and Salmonella Enteritidis under very specific conditions. A 2021 study showed that at room temperature the bacterial load increases alarmingly in both the white and the yolk. By contrast, keeping eggs at 5 °C limits their multiplication and reduces virulence.
Closer to the present, a 2024 study found that under alternating temperature conditions (cycles from 25 °C to 5 °C, similar to taking food in and out of the refrigerator) Salmonella manages to migrate to the yolk in 64% of cases.

How to preserve them. Taking all this into account, the big question is: what should we do when we get home from the supermarket? Health authorities point to two strategies. The first is to put the eggs directly on the interior shelves, preferably the lower or middle ones. There the temperature stays stably below 4 °C, especially at the bottom of the refrigerator.

Do not throw away the carton. Although we usually take eggs out of their boxes and put them in plastic egg trays for convenience, the truth is that this is a mistake. That is why the second strategy is to keep them in their original packaging: the cardboard not only protects them from knocks, but also acts as a crucial barrier against moisture loss, prevents the shell from absorbing odors from other foods and protects the egg's natural cuticle.

Images | Onur Burak Akın, Katie Bernotsky

Iceland has solved it in the middle of the desert

Trapping carbon dioxide emissions and literally turning them into stone sounds like an invention straight out of Futurama, where in the future everything gets recycled. The problem is that this trick of underground alchemy hid some terrifying fine print: its exorbitant thirst. To get carbon to mineralize underground, the system needs to swallow absurd amounts of liquid, specifically between 20 and 50 times more water than the mass of CO₂ being stored. However, a new industrial-scale study published in the journal Nature has just rewritten the rules of the game. An international team, with researchers from Iceland, Saudi Arabia and Italy, has shown in the western Saudi desert that it is possible to petrify CO₂ without spending a single drop of external fresh water.

Salvation under the sands of Saudi Arabia. As the authors of the research explain, this area is a real challenge: it is full of large facilities that emit a lot of CO₂, such as refineries and desalination plants, but it lacks the underground saline aquifers or sedimentary traps traditionally used for carbon injection. Salvation was right under their feet. About 24 kilometers from the Jizan Economic Complex and Refinery, geologists took advantage of an immense bed of highly fractured volcanic rock (basalt) that has been there for between 21 and 30 million years. There they tested an ingenious system for recirculating subsoil fluids.

The gigantic "soda" trick. To carry out the experiment, the engineers used two main wells, separated by just 130 meters: one functions as a "production" well (it extracts water) and the other as an "injection" well. The process is a closed circuit, isolated from the atmosphere so that no oxygen enters and no gas escapes. They extract the water that already lives in the depths, circulate it through pipes and, 150 meters underground, inject pure CO₂ into it as bubbles until it completely dissolves.
According to the project scientists, dissolving the gas in water has two brutal chemical and mechanical advantages:

- It gets heavy: CO₂-laden water is denser than plain water, so it forms a non-buoyant fluid, greatly limiting the risk of the gas migrating to the surface and back into the atmosphere.
- It becomes acidic: the liquid's acidity greatly accelerates the dissolution of the silicate minerals in the basaltic rock. As the rock dissolves, it releases the metal cations needed to form stable minerals such as calcite.

A question of geopolitical survival. The data from this pilot is a resounding success. The team injected 131 tons of CO₂ into the subsoil. After monitoring the area with tracers, they found that approximately 70% of all the injected carbon had mineralized within ten months. Measurements showed that the concentration of dissolved inorganic carbon in the returning water was 90% lower than in what was initially injected. Reusing water from the reservoir itself offers substantial advantages. Not only does it remove the need to bring in external water, it also reduces the risk of underground fluid pressure rising dangerously. Furthermore, by injecting water with the same composition as the original underground reserve, the risk of compatibility problems, such as permeability losses in the reservoir, is reduced.

The current dimension. As we recently analyzed in Xataka in the wake of the military escalation in the region, the real Achilles heel of the Arabian Peninsula is not oil but thirst. Countries like Saudi Arabia depend on their desalination plants for 70% of their water supply. In a scenario where fresh water is a strategic vulnerability and a matter of biological survival, allocating massive volumes of it to bury emissions was simply unfeasible.
Therefore, this advance opens the door for the Middle East, where a large part of global oil production is also concentrated, to use its basalt rocks to store carbon without sacrificing a vital resource.

A providential accident. Sometimes setbacks are the best of tests. In September 2023, the submersible pump in the extraction well broke down. When the technicians brought it to the surface, they found its interior full of rock grains cemented by up to 14% calcite, along with other minerals such as siderite and ankerite. Isotope analyses made it clear: these solid cements had formed from the CO₂ injected during the pilot project. The gas had literally petrified in the very bowels of the machine.

An "energy bargain". As if that were not enough, there are energy savings to add. As the research details, injecting CO₂ with this method requires a surface pressure of only 12 to 14 bar. That is 8 to 16 times less pressure than conventional carbon capture plants require. Basically, the CO₂-laden water is drawn into the system by gravity. As for future potential, the engineers calculate that the underground pore space of this particular area (estimated at between 24,000 and 43,000 m³) would have room to house between 22,000 and 40,000 tons of mineralized CO₂.

Geology dictates: the limit of the stone. Every geological technology has its own physical limits. As the experts explain in Nature, as water, CO₂ and basalt interact, the total volume of solid minerals increases. This means the pore space shrinks and can end up blocking water flow paths in the long term. To get around this problem, the researchers suggest it may be necessary to resort to fracturing the rock (fracking), an option still little explored in basaltic systems. What is clear is that this technological innovation is proposed as a strong complement to conventional capture systems, not as an exclusive alternative, since in the end geological conditions rule.
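As a rough sanity check of the figures reported above, this is an illustrative back-of-the-envelope calculation (the arithmetic is ours, not the study's; only the input numbers come from the article):

```python
# Back-of-the-envelope check of the figures reported in the article.
# Inputs come from the text; the arithmetic itself is illustrative.

co2_injected_t = 131          # tons of CO2 injected in the pilot
mineralized_fraction = 0.70   # ~70% mineralized within ten months

# Conventional basalt mineralization needs 20-50 t of water per t of CO2.
water_needed_t = (20 * co2_injected_t, 50 * co2_injected_t)

# With recirculated reservoir water, external fresh water drops to zero.
mineralized_t = co2_injected_t * mineralized_fraction

print(f"External water a conventional scheme would need: "
      f"{water_needed_t[0]:,}-{water_needed_t[1]:,} t")
print(f"CO2 mineralized in ten months: ~{mineralized_t:.0f} t")
```

In other words, a conventional scheme would have needed on the order of 2,620 to 6,550 tons of external water for this single 131-ton pilot, which is exactly the cost the recirculation trick eliminates.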
But thanks to this pioneering experiment, there is something we can take for granted: the lack of rivers or fresh aquifers is no longer an excuse for not returning our emissions to the subsoil and turning them into stone.

Image | Eric Gaba and Nature

When China finishes, it will steal the last advantage the US had left

It is estimated that more than 80% of the planet's oceans remain to be mapped in detail, and in many areas we know less about the seabed than about the surface of the Moon. Still, that unknown environment is key to some of the world's most advanced technologies. Also to war.

The invisible map. A few days ago, an extensive Reuters report revealed that China has been mapping the planet's ocean floor for some time and that, when it is finished, it will hold the last tactical advantage the United States had left: knowing better than anyone the terrain where the quietest war of all would be fought. For decades, American superiority under the sea rested not just on more advanced submarines, but on something much more intangible: deep knowledge of the ocean environment. Now that balance is starting to shift, because Beijing is building, step by step, a detailed picture of the invisible world that conditions every underwater movement.

A global network. What at first glance looks like oceanographic research is actually a global-scale operation that combines dozens of vessels, hundreds of sensors and years of data accumulated across the Pacific, the Indian Ocean and the Arctic. These ships travel repeated routes, scanning the seabed and collecting key information on temperature, salinity and currents, the factors that determine how sound propagates underwater. That is no trivial detail; it is crucial because in underwater combat seeing matters little: what really counts is hearing better than your opponent and hiding from them.

The "transparent ocean". Here, possibly, lies the crux of the survey Beijing is carrying out. The heart of the strategy is the idea of creating a kind of "transparent ocean": a network of sensors capable of monitoring what happens beneath the surface with an unprecedented level of precision.
The reason: although not everything arrives in real time, even delayed data makes it possible to build models that anticipate, for example, where a submarine could hide or how to detect it. In other words, China does not just want to sail better; it wants to reduce the uncertainty that has always protected these vessels, turning the ocean into a far less opaque and far more controllable space.

Military power. Reuters recalled that one of the keys to the Chinese advance is how it is using universities, scientific institutes and civilian ships to build this knowledge base without openly resorting to military means. This fusion of civil and military allows it to operate more freely in international waters, accumulating strategic information without raising the level of alert that a direct naval presence would cause, although the result is the same: a database that can translate into operational advantages in the event of conflict.

The end of a historic advantage. There is no doubt that all this effort aims at a clear objective: eroding one of the greatest strategic advantages the United States has held, its dominance of the underwater environment. If China manages to match (or even surpass) that knowledge, it will, a priori, be able to deploy its submarines more effectively, detect those of the adversary and monitor critical routes such as the approaches to the Pacific or the Strait of Malacca. It is therefore not a race of ships but of information, and in this field whoever best understands the bottom of the ocean will have the initiative.

A new balance. Taken together, the Chinese strategy reveals a profound change in the nature of naval power: it is no longer enough to have more ships or better weapons; you must also dominate the environment in which they operate. By systematically mapping the seafloor and deploying sensors at key points, Beijing is preparing the ground for a competition in which the advantage will not be visible, but decisive nonetheless.
And if that process is completed, the United States could find itself, for the first time in decades, without its traditional superiority in the most difficult domain to control: the one that cannot be seen.

Image | RawPixel, Youth Daily News

SpaceX is now a company in the railway sector, and that is very bad news for its employees

Some will call it ingenuity, others sheer nerve, but the fact is that SpaceX has found a way to shield itself from lawsuits and strikes by its workers: obtaining designation as an air carrier. This places it under the Railway Labor Act, with all the benefits that entails within US legislation.

The news. On March 13, the official resolution was made public by which SpaceX, Elon Musk's space company, is now treated as a railway-sector company in the United States. This means its activity is no longer subject to the oversight of the National Labor Relations Board (NLRB), which is typically responsible for protecting the labor rights of private-sector workers.

The layoffs that started it all. In January 2024, the NLRB brought a complaint against SpaceX after the company fired eight employees, dismissals the board deemed illegal. The complaint requested reinstatement of the employees, back pay, and a letter of apology to each of them. SpaceX responded with a lawsuit against the NLRB, alleging that the procedure it was carrying out was unconstitutional. Rockets now receive the same legal treatment as cargo planes.

An ace up its sleeve. According to Elon Musk's company, the NLRB should not be able to act against a company dedicated to transportation. It added that one of its main missions is the transport of humans and goods to the International Space Station. In many cases, these jobs are carried out for NASA, so it would also be providing a service to the government. For all this, it requested to be covered under the Railway Labor Act.

A plan that suits many. In recent years, SpaceX, like other Elon Musk companies, has been the subject of complaints from a multitude of dissatisfied employees, whether over their personal situation or over bad practices within the company. In the case of Neuralink, for example, serious mistreatment of laboratory animals was reported.
But returning to SpaceX, the growing volume of complaints could put the company's pace of work at risk. That would logically harm its managers, but also the companies that benefit from its services. The entire US space program would probably collapse. For all this, although it seemed unlikely, Elon Musk's company has in the end obtained a resolution in favor of its new designation.

Immune to strikes. One of the peculiarities of railroad companies in the United States is that they benefit from special state protection. Since minimum transport services must be guaranteed, strikes and other activities that would normally slow the normal pace of work are closely controlled.

The NLRB no longer rules. Another of those special protections for railroad companies is that the NLRB no longer has power over them. Dismissed employees therefore cannot turn to it to report their situation. Instead, the company is governed by the rules of the National Mediation Board, which is far more lax in mediating labor disputes. Employees can still call strikes, but to do so they must go through a long and tedious process that often leads them to change their minds.

And now what? With this new designation, SpaceX has even more power and freedom than before. If measures amounting to malpractice toward employees are carried out, it will be difficult for their complaints to get anywhere legally. This gives the company a great deal of leeway and greatly speeds up its procedures.

Other curious legal victories. It is not the first time SpaceX has obtained an unexpected legal designation. Last year, for example, the Starbase site was incorporated as a city, so that all employees living nearby became its inhabitants. Far from being a cosmetic change, this also gave SpaceX more freedom to maneuver in the areas surrounding its base. As with the railway legislation, what may look like a small change of name can change everything.
Image | Gage Skidmore (Wikimedia Commons) | SpaceX

While Spain does everything possible to preserve the Iberian wolf, one group has very different ideas: ranchers

A few days ago, a six-year-old Iberian wolf named Raksha traveled from the Basabrere center in Lezaun (Navarra) to the Jerez de la Frontera Zoobotanical Center. The trip aims to enrich the captive breeding program started in 1995 in order to guarantee the conservation of the species. The problem is that it comes at a time when ranchers are fighting the wolf's presence because of the damage it is causing.

The wolf in numbers. To understand the conflict, you first have to look at the figures. According to the latest national census, Spain has 333 stable packs, which translates into some 1,600 to 1,700 individuals, and that is good news because it marks a 12% increase over the previous census. The vast majority are concentrated north of the Duero River, although a clear trend of expansion toward the south and east of the peninsula can be observed. The problem is that we are still quite far from the 500 packs that would guarantee the genetic variability the species needs to survive. That is why the government maintained, as of this March, the classification of the wolf's conservation status as "unfavorable."

The war in the countryside. If science is telling us that more wolves are needed, livestock farmers insist there are already too many, pointing to the increase in attacks on livestock that has forced the state to inject 20 million euros annually into prevention measures such as fences and mastiff dogs, as well as financial compensation. However, organizations such as WWF denounce that management by the autonomous communities is deficient, with a lack of transparency and little progress compared to what the 2022 National Strategy sets out.

Plenty of criticism. But these measures seem insufficient to some, such as the Popular Party, which points out that in the province of Lugo, where more than 1,400 affected animals were registered, much more still needs to be done.
The Xunta de Galicia itself also points out that right now ranchers have no state funds with which to face these attacks. The tension, though, centers right now on the wolf's temporary inclusion in the List of Wild Species under Special Protection Regime (LESPRE). Under this legal umbrella, any capture, disturbance, sale or destruction of the species' habitat is prohibited.

A legal tug of war. Looking back, a few months ago various amendments and regulatory changes allowed a partial removal of the wolf from LESPRE, authorizing hunting-based controls to mitigate economic damage. But in February 2026 a Supreme Court ruling turned the situation around 180 degrees, tightening the requirements for authorizing these culls, making non-lethal alternatives mandatory priorities and drastically limiting hunting. The ruling has acted like gasoline in regions of northern Spain where ranchers report significant attacks on their animals, which is why the autonomous communities are threatening to report the Spanish government to the European Union for failing to act on the regulation of the species. What is clear is that the Iberian wolf's crossroads in 2026 is the perfect reflection of a coexistence problem. While Raksha and other captive specimens secure the species' genetic lifeline, in the offices and meadows of northern Spain no one has yet found the formula that lets the wolf howl without the rural world starting to tremble.

Images | Arturo de Frias Marques

How a building can turn a neighborhood into an unbearable oven

In some cities around the world, the shape of a building has come to alter its surroundings far more than expected: from skyscrapers that generate dangerous winds at street level to facades capable of concentrating sunlight as if they were giant mirrors. Contemporary architecture, in its search for recognizable icons, has shown that even the most invisible details can have very real consequences.

From icon to problem. At the end of the 20th century, the city of Los Angeles decided to build one of the most ambitious concert halls in the world and commissioned the project from Frank Gehry, by then a global figure after the success of the Guggenheim Bilbao. The result was the Walt Disney Concert Hall, a shiny, curved steel building that promised to redefine contemporary cultural architecture. In this commitment to formal spectacle, however, a basic factor was overlooked: the actual behavior of materials in a dense urban environment. What should have been an icon became a source of risk, capable of reflecting sunlight with such intensity that it turned nearby streets and homes into veritable ovens.

The invisible failure. The problem was not simply aesthetic but physical. Some of the stainless steel surfaces, especially the more polished, concave ones, acted like parabolic mirrors capable of concentrating solar radiation on specific points in the surroundings. This effect, amplified by execution decisions that altered Gehry's originally intended finish, generated extreme glare and raised temperatures in nearby areas to dangerous levels. What on paper was a play of sculptural light became a real thermal phenomenon, demonstrating how small deviations between design and construction can trigger unforeseen consequences in large-scale projects.

Heat, complaints and public alarm. Shortly after its inauguration in 2003, complaints began pouring in from neighbors and workers in nearby buildings. The main problem?
The sun's reflection off the facade generated heat spots that exceeded 60 degrees Celsius, affecting homes, sidewalks and even traffic, where drivers reported dangerous glare. The building, intended as a cultural symbol, began to be perceived as an urban threat. The local press documented how some areas became practically uninhabitable during certain hours of the day, turning the work into a paradigmatic case of how iconic architecture can fail when it ignores its impact on the immediate environment.

An unusual solution. In the face of growing public pressure, the solution was as radical as it was symbolic: sand down the building. Specialized teams subjected part of the facade to a sandblasting process to remove the polished finish and reduce the steel's reflectivity. In practice, this meant physically altering one of the most distinctive features of the original design. And although Gehry argued that the problem stemmed more from execution than from conception, the episode made clear that even the most celebrated works can require drastic fixes when they come into contact with reality. As various media outlets reflected at the time, the icon had to be "domesticated" in order to coexist with the city.

Lessons from a partial failure in modern architecture. The case of the Walt Disney Concert Hall was neither a structural collapse nor a total failure, but it was a stark warning about the limits of spectacle architecture. It demonstrated that formal innovation, when not accompanied by a deep understanding of factors such as solar radiation, the urban environment or real materials, can generate problems as serious as they are unexpected. Not only that. It also highlighted the fragility of the balance between aesthetics, engineering and habitability in contemporary architecture.

The legacy. Today, without a doubt, the concert hall remains one of the most admired buildings in the world and a cultural reference in Los Angeles.
But its story carries an uncomfortable lesson: even the most prestigious architect and a client with unlimited resources can overlook the most essential thing. In their search for a global icon, they forgot for a time that architecture is not only looked at; it is also lived in. And in this case, for a few months, living near the work could mean something as simple and brutal as enduring unbearable heat generated by the building itself.

Image | Pexels, Wally Gobetz, Slices of Light

We had a perfect plan to decarbonize the electrical grid. The brutal consumption of data centers has dynamited it

The daily headlines trumpet multi-million-dollar investments in new language models and cutting-edge chips. Venture capital investors have pumped more than half a trillion dollars into AI startups over the last five years. But, as a revealing TechCrunch analysis warns, the smart money has begun to change sides: today, the best investment in artificial intelligence is no longer software. The reality on the ground has become extremely arid. Putting up walls and stacking servers in a giant data center has become the easy part of the equation. The real wall the tech sector is crashing into is finding the electrons needed to power it all. According to a report by the analysis firm Sightline Climate, up to 50% of data center projects announced for 2026 could face delays. Of the 190 gigawatts (GW) of capacity the firm tracks globally, just 5 GW are actually under construction today. The bottleneck is no longer the microchips. It is access to the electrical grid.

The tyranny of 24/7. Consumption has run amok at a pace that 20th-century infrastructure cannot process. A Goldman Sachs analysis projects that AI will drive data center energy consumption up 175% by 2030. The figures all point in the same direction: the Open Energy Outlook predicts that the combined electricity demand of data centers and crypto mining will grow by 350% this decade. As a result, the pristine image of the technological cloud is evaporating. Google's emissions have increased by 48% in the last five years, and Microsoft's by 31% since 2020. The reason? What is known in the industry as the "tyranny of 24/7". The algorithms do not sleep and require a continuous, steady power supply; they cannot be turned off simply because the wind stops blowing or the sun sets. Given the global lack of mass storage systems, the fuel covering this urgent gap is not green. It is natural gas, which has come out of retirement as the sector's great structural support.

A global collapse with two faces.
The pressure has already broken market balances. In the PJM region, which supplies 13 eastern US states and has the highest density of data centers in the world, capacity prices went from $30 to $270 in a single auction at the end of last year. As John Ketchum, CEO of NextEra Energy, noted, we are facing a "golden era of energy demand", but with an insurmountable physical limit: "the new electrons cannot reach the network quickly enough." This electrical asphyxiation is redrawing the global map, and Europe is the best example. Historically, the European market was dominated by the "FLAP-D" markets (Frankfurt, London, Amsterdam, Paris and Dublin). But these cities' grids can no longer keep up. According to data from Greenpeace, data centers accounted for almost 80% of electricity consumption in Dublin, forcing Ireland to impose a moratorium. The market share of these traditional capitals will fall sharply by 2035, driving a mass exodus to the Nordic countries (with uncongested grids and cold climates) and to southern Europe, such as Spain, Greece and Italy, in search of green megawatts.

The hardware and network problem. When we scratch beneath the surface of this collapse, we discover that the physical problem splits into two large gaps. First, the machines to generate the energy are missing. Since intermittent renewables are not enough, companies are turning to gas. However, gas turbines have become a rare commodity. Three years ago, Siemens Energy executives considered this market "dead"; today, the factories are so overwhelmed that delivery times for these turbines can stretch to seven years. Second, the "plumbing" is missing. Once the electricity is generated, the task of taming it inside the building falls to the transformers, an iron-and-copper technology that has barely changed in 140 years. As TechCrunch explains, as servers demand more power, traditional electrical equipment will come to take up twice as much space as the servers themselves.
It is mathematically unsustainable.

'Smart money' changes sides. Against this backdrop, venture capital is pivoting. Big tech companies (Amazon, Google, Oracle) are starting to behave like energy giants, devising alternatives to minimize their dependence on an outdated public grid through hybrid or on-site generation approaches. The solutions fall along several fronts:

- The nuclear resurgence: Google has signed a pioneering agreement with Kairos Power to develop seven small modular reactors (SMRs) by 2030, and Amazon tried (although regulators temporarily blocked it) to connect a data center directly to the Susquehanna nuclear power plant.
- Super batteries: Google is collaborating in Minnesota with the utility Xcel Energy and the startup Form Energy to install batteries capable of discharging energy for 100 hours, thus smoothing the peaks of renewables.
- Hardware innovation: dozens of startups (such as Amperesand or DG Matrix) backed by investment funds are developing silicon-based "solid state" transformers, seeking to finally retire old iron and copper and save vital space in facilities.
- Regulatory surgery: in southern Europe, bodies such as the CNMC in Spain are applying "flexible access permits", requiring centers to accept supply cuts in emergencies so as not to bring down the whole country's grid.

The paradox: AI as savior of the electrical system. However, the story has a fascinating twist. The same technology that today threatens to burn out half the world's cables could be the one that ends up saving the electrical system. According to estimates from the consulting firm Deloitte, applying artificial intelligence to optimize industrial systems and electrical networks will save more than 3,700 TWh globally by 2030. In other words, AI would save almost four times the energy consumed by all the data centers on the planet combined.
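As a rough sanity check of that ratio, here is an illustrative calculation built only from the figures quoted above; the implied data center consumption is a back-calculated inference, not a number reported in the article:

```python
# Illustrative check of the "almost four times" claim.
# Only the 3,700 TWh savings figure comes from the article; the
# implied data center consumption is back-calculated, not reported.

projected_ai_savings_twh = 3_700   # Deloitte estimate for 2030
savings_to_consumption_ratio = 4   # "almost four times" all data centers

# Global data center consumption consistent with the claim:
implied_dc_consumption_twh = projected_ai_savings_twh / savings_to_consumption_ratio

print(f"Implied data center consumption: ~{implied_dc_consumption_twh:.0f} TWh")
```

A ratio of "almost four" implies global data center consumption somewhat above 925 TWh per year, which gives a sense of the scale both figures describe.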
A report by Ember on Southeast Asia (ASEAN) supports this, calculating that integrating AI into the management of its grids will save more than 67 billion dollars and avoid the emission of almost 400 million tons of CO2. But to get to that future of efficiency, the machines first have to be switched on today. And what is at stake is the world economic map. Hosting these centers is …
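The Deloitte figure above can be sanity-checked with simple arithmetic. The only inputs are the two numbers the article itself gives (3,700 TWh of projected savings, and "almost four times" global data-center consumption); the implied consumption total is derived from them, not an independent statistic:

```python
# Back-of-the-envelope check of the "almost four times" claim.
# Both inputs come from the article (Deloitte estimate); the implied
# data-center total is an assumption derived from them.
projected_ai_savings_twh = 3_700        # Deloitte estimate, global, by 2030
savings_to_consumption_ratio = 4        # "almost four times"

implied_dc_consumption_twh = projected_ai_savings_twh / savings_to_consumption_ratio
print(f"Implied global data-center consumption: ~{implied_dc_consumption_twh:.0f} TWh")
```

This puts implied global data-center demand at roughly 925 TWh, which is at least the right order of magnitude for published 2030 projections, so the two figures in the article are internally consistent.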

Why "gifted time" tastes like glory

It's a pretty specific feeling: you are looking at your calendar, mentally preparing yourself for a string of endless video calls or a meeting that threatens to consume the entire morning. Then an email or a written message arrives with a phrase that brings relief: the meeting has been cancelled. The relief is instantaneous, but there is a curious phenomenon behind it: the hour just recovered in the day feels much longer and more useful than a free hour that had been scheduled for a long time. The big question. It may be something we feel subjectively, and now that you have read this paragraph you have probably recognized the sensation. But science has now arrived to answer why this happens. It is not magic; it is pure behavioral psychology. The answer lies in a recently published study in which the research team set out to understand what happens in the brain when the clock gives us an unexpected break. To do so, they carried out seven experiments involving more than 2,300 participants. Their conclusions. The first thing they found is that time "gained" unexpectedly is subjectively perceived as much longer. The researchers attribute this to a powerful contrast effect: the mind was rigidly prepared to have no free time and to undergo a cognitive load such as the meeting. When that obligation abruptly disappears, the empty space that remains contrasts sharply with our expectation of saturation. In short, faced with the sudden absence of scheduled stress, the brain stretches our perception of those minutes. What do we do with the time? This altered perception of time has direct and measurable consequences on our behavior: as the experiments detail, the feeling of "liberation" pushes us to make very specific decisions about how to invest that time.
And because we perceive that we have a lot of extra time, we are more likely to invest it in longer, leisure-oriented activities. This explains why, after a cancellation, we rarely launch into a harder, more tedious task we have pending. Instead, that false sense of temporal abundance invites us to have a long coffee, read a pending article, chat with a colleague or do low-intensity tasks. It is as if we were literally tasting freedom. The modern era. With all of this, science reminds us that there is a cost to living obsessed with the agenda. Previous research suggests that scheduling our leisure time drastically reduces how much we enjoy it and makes us perceive that time passes faster. This is why overscheduling contracts our perception of time, while unexpected cancellations expand it. Images | Campaign Creators In Xataka | It is possible that you have been studying poorly all your life: neuroscience is destroying the myth of "cramming" the night before

Houston, we have a problem with Outlook. Microsoft spends millions on AI, but Artemis II does not escape the failures of its email

On April 2 we experienced a historic event for humanity: the Artemis II mission successfully took off toward the Moon, more than 50 years after a crew last orbited near Earth's satellite. Although the takeoff was a success, the path to get here was not without problems: the first launch date had already been delayed, and then the second. Even on the official day there were problems. In the hours beforehand it was necessary to check an anomaly in a temperature sensor of a battery in the abort system, and another incident appeared in the flight termination system (the safety mechanism that makes it possible to destroy the rocket if it deviates from its trajectory and becomes a threat). When the Orion spacecraft was flying almost 150,000 kilometers from Earth, according to Fortune, Commander Reid Wiseman ran into a mundane problem familiar to any mortal with a computer and Microsoft email: an Outlook crash. The incident. The launch of Artemis II could be followed live, and approximately 13 hours and 15 minutes into the broadcast there is a fragment where the problem appears: "I see that I have two Microsoft Outlook accounts, and neither one works. If you could connect remotely and check Optimus and those two Outlook accounts, that would be great." At first, Wiseman had issues related to the Optimus software, but then he pointed out a more trivial concern: there were two instances of Outlook running on his personal computing device. As a curiosity, the live stream of the takeoff is still available on YouTube. Why it matters. The Artemis II mission is historic, and the stream has preserved for posterity its first hours of flight along with an anecdote that is probably the first Microsoft technical support ticket generated from space. Beyond the joke, the episode shows that today's space exploration and its cutting-edge technology coexist with commercial productivity software and its common failures.
When an agency standardizes its entire infrastructure on a single technological ecosystem, the problems of that ecosystem also become problems of the mission. There is a support ticket from the Moon. As with any standard corporate ticket, the user first reported the incident, the technical team took over remotely, and finally closed the case. Houston accepted the request for remote access to the commander's device, identified in the logs as PCD 1, and about an hour later Outlook was back up and running. After 14 hours and 20 minutes of broadcast, someone from mission control communications said: "We managed to open Outlook. It will appear as 'offline', as expected," as reported by Tom's Hardware. Why they use Outlook in space. That there is Microsoft software on board is neither casual nor improvised: Microsoft is a strategic partner of NASA that provides everything from productivity software to cloud data infrastructure, artificial intelligence (NASA Earth Copilot), hardware and mixed reality, while Minburn Technology Group is its partner for software support and maintenance. In fact, according to NASA, the astronauts' personal devices on the Orion spacecraft are Microsoft Surface Pros, and the software they run is Commercial Off-The-Shelf, that is, standard commercial software for everyday tasks like talking to your family or managing your photos and videos. The spacecraft and main flight systems are another matter: those run on specialized radiation-resistant hardware and specialized software with rigorous maintenance. The bathroom was also broken. The Outlook failure was not the only technical problem in the first hours of the flight, as can be seen in the broadcast. About two hours after launch, a malfunction light came on in the ship's waste management system: the urine extractor fan had jammed. This component is responsible for sucking urine into a collector, avoiding the uncomfortable and unhygienic effects of microgravity.
NASA confirmed shortly afterwards that the toilet problem had been solved. In Xataka | NASA had refused to allow its astronauts to carry iPhones for decades. For Artemis II it has made a historic decision In Xataka | The Artemis II astronauts will carry out experiments in which they themselves will be the study subjects Cover | NASA and Ed Hardie

Data centers are real “heaters”. And they are settling in regions as hot as Aragón

Data centers are a black hole in several senses. They are draining global NAND chip manufacturing capacity (which affects SSDs, RAM and SD cards), the companies that make batteries can't keep up, and they consume water, yes, but much more alarming is their energy consumption. In this sense they are insatiable and, in the end, the thousands of pieces of equipment that generate heat are causing another unexpected effect: they are turning the facilities into heat islands. And it is something that has the potential to affect 340 million people. What's happening. Andrea Marinoni is an associate professor in the Earth Observation group at the University of Cambridge. He is also the coordinator of a group of researchers from that center and from Nanyang Technological University who have published a study called "Heat Island Data: Measuring the Impact of Data Centers on Climate Change." In it, they present the results of measurements at more than 6,000 data centers located far from dense urban areas, with the aim of identifying whether these facilities, by themselves, are a notable heat source. The result? "An impact greater than expected," according to the researchers. They compared historical temperature measurements from the locations of those data centers over the last 20 years to see how things have changed recently and whether those data centers have had any influence. And, as we said, the impact seems to have been strong: an average of 2°C, with maximums of up to 9°C in some cases. The place doesn't matter. This generates a heat island effect, which occurs when a large amount of heat is concentrated in an area where it should not be. It usually happens in big cities, which is why the most efficient urban architecture seeks to combat the phenomenon. And it doesn't matter where the data center is.
In the study they present several examples. The Bajío region in Mexico, with a high data center density and a stable climate, shows a land surface temperature increase of 2°C in the last two decades, something not identified in nearby areas without data centers. The states of Ceará and Piauí in Brazil show an increasing trend of 2.8°C, with a projection of 3.5°C in the next five years, which is not observed in the rest of the area. And Aragón in Spain shows an anomalous increase of 2°C in surface temperature that stands out compared with neighboring provinces. Potential damage. Aragón is a worrying example because the region is consolidating itself as one of the 'lungs' of hyperscalers in Europe, as well as one of the regions of Spain key to the expansion of data centers and European technological sovereignty. And the problem is that, according to the study, the impact of this increase in surface temperature reaches up to 10 kilometers away from the hyperscalers. They detail that in surrounding areas about 4.5 kilometers from the data centers an increase of 1°C can be measured, which seems little, but when we talk about these climatic effects it is a lot. Furthermore, they estimate that the impact of increased temperatures due to this broad heat island effect has the potential to affect 340 million people. Yes, but. This research is not the only recent work on the effect of data centers on the land where they are installed. Researchers at Arizona State University installed sensors on cars driving near these centers to capture measurements and noticed the same thing as the Cambridge researchers. One thing to keep in mind, though: both studies show measurements, but they have not been peer-reviewed.
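The study's core comparison, a warming trend at a data-center site versus a flat trend at nearby control areas, can be sketched with a simple linear fit. This is an illustrative toy, not the authors' code: the temperature series below are synthetic, with the data-center site given a built-in warming of about 0.1 °C per year (roughly the 2 °C over two decades reported for Aragón):

```python
# Illustrative sketch of the trend-comparison approach (synthetic data,
# not the study's actual measurements or code).
import numpy as np

years = np.arange(2005, 2025)             # 20 annual observations
rng = np.random.default_rng(0)

# Data-center site: ~0.1 °C/year warming plus measurement noise.
site_lst = 28.0 + 0.10 * (years - years[0]) + rng.normal(0, 0.3, years.size)
# Control site a few km away: same baseline, no trend.
control_lst = 28.0 + rng.normal(0, 0.3, years.size)

site_slope = np.polyfit(years, site_lst, 1)[0]      # °C per year
control_slope = np.polyfit(years, control_lst, 1)[0]

# Excess warming attributable to the site over the 20-year span.
anomaly = (site_slope - control_slope) * (years[-1] - years[0])
print(f"Excess warming over 20 years: {anomaly:.1f} °C")
```

The difference-of-slopes step is what lets the authors argue the warming is local to the data centers rather than regional climate drift: any shared trend cancels out when the control site is subtracted.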
And there are experts, such as Ralph Hintemann, principal investigator at the Borderstep Institute for Innovation and Sustainability, who point out that, although the results are there and are interesting, some figures "seem very high." In fact, he focuses not so much on the heat concentrated around data centers as on the big problem: the amount of energy they need and the return to fossil fuels to meet peak demand. Image | Tedder In Xataka | Data centers in space promise to save the planet. And also to ruin Earth's orbit
