30 years later it is the glue that keeps the internet alive

Three decades ago, a joint release from Netscape and Sun Microsystems introduced the world to JavaScript, a scripting language designed for creating interactive web applications. Behind that press release hid a story of technological survival: the language had been born months earlier, the result of a frantic ten-day sprint led by engineer Brendan Eich. What began as a hurried prototype to bring the Netscape browser to life has today become the infrastructure that supports a huge percentage of the visible web.

The myth of the ten days. Legend has it that Eich wrote the core of JavaScript in just over a week. It is true, but the result was a hybrid of influences. Pressured by Netscape management to make the language look more like Java, Eich adopted a syntax of curly braces and semicolons. Under the hood, however, he injected the functional elegance of Scheme and the prototype-based object model of Self. This mix, born out of haste, left a legacy of technical inconsistencies that developers still suffer from (and love) today.

From Mocha to confusion. You may not know that the language was not always called that. It was born as Mocha, became LiveScript and was finally named JavaScript in a marketing maneuver to ride the popularity of Java. The confusion over names continues to this day among less knowledgeable users, but Java and JavaScript have about as much to do with each other as "car" and "carpet", as the usual quip goes when someone asks about their differences. The strategy worked, but it angered rivals like Microsoft, whose response was to create its own version, JScript. That caused notable fragmentation and led Bill Gates himself to complain about Netscape's constant changes. To bring order to the chaos, the language was finally standardized in 1997 under the name ECMAScript.

Image | Claudio Schwarz on Unsplash

Ajax and the conquest of the server. For years, JavaScript was seen as a toy for simple form validations, but that all changed in 2005 with the arrival of AJAX. This technique allowed websites like Gmail or Google Maps to update data without reloading the page: the web took the step from static sites to dynamic apps. The second leap came in 2009 with Node.js, which took JavaScript out of the browser and onto the server. It was key in letting developers use a single language for the entire stack, and the ecosystem now spans between two and three million packages in the npm registry.

Absolute dominance. Despite the emergence of modern rivals, the hegemony of JavaScript is indisputable. According to the 2025 Stack Overflow survey, it continues to be the most used language, employed by 62% of developers, ahead of others such as Python or SQL. Its ubiquity is such that it has transcended the web: it powers desktop apps via Electron, mobile development with React Native and even AI tools. It is the default language for learning to program, chosen by 60% of students. This mass success has brought complexity to the JavaScript ecosystem: frameworks like React, Angular and Vue dominate the market (used by 40% of web developers), and the weight of all those libraries is beginning to take its toll on web performance. That is why predictions for 2026 point to a resurgence of pure, or "vanilla", JavaScript.

Forced maturity. Despite its birth defects, JavaScript was able to evolve.
In 2015, the ES6 update radically transformed the syntax, but the real paradigm shift came from Microsoft: with the creation of TypeScript, a layer of static types and safety was added that solved much of the original chaos, and it has become an almost mandatory standard for professional development (a minimal sketch of what that type layer catches appears at the end of this piece). JavaScript is still the engine, but TypeScript is the precision flywheel.

A legal problem called Oracle. The paradox of JavaScript is that, despite being an open standard, its name is proprietary. Oracle inherited the "JavaScript" trademark after purchasing Sun Microsystems, although it has never released a product with that name. Recently, key figures such as Brendan Eich himself and the creator of Node.js have signed a petition asking the US Patent and Trademark Office to cancel the trademark for abandonment.

The legacy of a hack. It is ironic that the companies that sponsored its birth have disappeared or been absorbed, while their creation remains more alive than ever. Authoritative voices like Douglas Crockford (creator of JSON) have gone so far as to suggest it should be "retired" for its basic design flaws, but the reality is that the modern web would not exist without it. JavaScript is not just code; it is the lingua franca of the internet, the invisible glue that turns static documents into digital experiences. Without it, the web would be little more than a collection of motionless text and images, something like a PDF newspaper on our screen.

In Xataka | There is a shadow giant pulling all the technological strings that connect TikTok with AI: Oracle
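The sketch referenced above: a minimal, hypothetical example (function and variable names invented for illustration) of the kind of silent type coercion that plain JavaScript tolerates and TypeScript rejects before the code ever runs.

```typescript
// In untyped JavaScript, `+` silently concatenates when one operand is a string,
// so a call like addItem(10, "5") returns "105" instead of 15, and the bug only
// shows up at runtime, if at all.
function addItem(subtotal: number, price: number): number {
  return subtotal + price;
}

console.log(addItem(10, 5)); // 15

// With TypeScript annotations, the buggy call never compiles:
// addItem(10, "5");
// error TS2345: Argument of type 'string' is not assignable to parameter of type 'number'.
```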

There is an obsession with protein as the way to gain more and more muscle. Science has more and more doubts that it works

Until not so long ago, protein was a technical term, linked to clinical nutrition and sports. Today it has become a cultural symbol. In what some have called the era of "protein chic", protein is no longer just a nutrient but a promise: of health, body control and active aging. Eating well has come to mean, almost automatically, eating "with protein".

The market pushes. This shift has consolidated an idea as simple as it is deceptive: if protein is good, then the more of it, the better. However, while the market pushes this logic without nuance, the human body continues to operate within very specific limits. And that is where the question arises that rarely accompanies packaging and slogans: how much protein do we really need to age well, and at what point does it stop adding up?

What does science really say? This is where the noise of marketing collides with the evidence. In an extensive report published by The Washington Post, Professor Stuart Phillips, a leading researcher in protein metabolism, muscle health and aging at McMaster University (Canada), issues a clear warning: "Consuming more and more protein is not necessarily better. There are no infinite benefits associated with higher intake." Phillips is not a marginal voice in this debate. He has spent decades studying how nutrition and exercise interact to slow the age-related loss of muscle mass (sarcopenia), and he is one of the most cited scientists in the field. His message dismantles much of the dominant narrative.

So, let's get to the data. The classic recommendation of 0.8 grams of protein per kilo of body weight, the well-known recommended dietary allowance (RDA), is usually interpreted as a target to hit. In reality, it is designed as a minimum to avoid malnutrition. According to Stuart Phillips, when the focus is on aging healthily and preserving muscle mass, the evidence points to somewhat higher ranges, always combined with strength training. This fits with what Harvard and the Mayo Clinic have published: both point out that exceeding intakes of around 2 grams per kilo of body weight rarely provides clear advantages for the general population. Instead, they insist on adapting the amount of protein to age, physical activity and health status (a worked example with these figures appears at the end of this piece).

Protein: necessary, but not miraculous. It is worth remembering something basic that is often lost in the public conversation: the body does not store protein. Once needs are met, the excess is used as energy or turned into fat. Eating more protein, by itself, does not build muscle. As the Mayo Clinic reminds us: "Muscle is built by strength training, not by shakes." From the age of 40 or 50, the equation changes slightly. The progressive loss of muscle mass begins, and here protein takes on a strategic role, but always in combination with resistance exercise. Spreading protein throughout the day (between 15 and 30 grams per meal) rather than concentrating it at dinner seems more effective at stimulating muscle synthesis, a point the McMaster University researcher also underlines.

The word of the year: protein. At least in the nutritional field, because (for those who want to know) the overall word of the year has been "tariff", and no wonder. But back to the topic at hand: protein has crept into social networks, cafés and viral morning routines. The new wellness ritual involves protein coffees, clear protein drinks, functional supplements and smoothies that promise sculpted bodies.
This obsession coexists with other contemporary phenomena: the fear of aging, the cult of the "perfect" body and the popularization of weight-loss drugs like Ozempic. In this context, protein is sold almost as a talisman: it satiates, slims, tones and protects against aging. Nutritionists, however, are more cautious. Many agree that we are paying a premium for ultra-processed products that provide no more benefit than the real food we already have at home: eggs, legumes, fish or plain yogurt.

The origin of the protein. Another important turn in this debate: a meta-analysis shows that following patterns like the Planetary Health Diet, rich in plant proteins, is associated with both lower mortality and a lower climate footprint. It is not about eliminating animal protein, but about moving it away from the center of the plate and prioritizing legumes, nuts and whole grains. The experts introduce a key concept here, widely cited by Harvard: the "protein package". It is not just the protein that matters, but what comes with it. Getting it from an ultra-processed "high in protein" product is not the same as getting it from a plate of lentils, with its fiber, minerals and antioxidants. The nutritional context matters as much as the isolated macronutrient.

So who really needs more protein? Protein deficiencies are not common in the general population. They appear mainly in older people, patients with illnesses, very restrictive diets or chewing problems. In these cases, supplements can be a useful tool, never a universal shortcut. Alma Palau, dietitian-nutritionist and head of the General Council of Official Colleges of Dietitians-Nutritionists, warned in an interview in CuídatePlus that excess protein is not harmless. "Proteins that the body does not need are metabolized and eliminated, but this process means making organs such as the kidney or the liver work unnecessarily," she explained. Palau insists that consuming more protein than necessary does not translate into more muscle or more health if it is not accompanied by sufficient carbohydrates, a varied diet and physical activity. In other words: without context, protein loses its meaning. Along the same lines, Carlos Andrés Zapata, a nutritionist interviewed by La Vanguardia, warns that protein has been overstated in the current discourse and recalls that it is neither more important than other macronutrients such as carbohydrates or fats, nor a replacement for a balanced diet or strength training.

Less obsession, more balance. Protein matters, a lot. It is essential for maintaining muscle, autonomy and quality of life with age. But science does not support the idea that its benefits are infinite or magical.
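The worked example mentioned above, to make the figures concrete. A minimal sketch: the 80 kg body weight and the four-meal split are invented for illustration; the grams-per-kilo values are the ones cited in the article.

```typescript
// Worked example of the protein ranges discussed above, for a hypothetical 80 kg adult.
const weightKg = 80;            // illustrative body weight
const rdaPerKg = 0.8;           // RDA: a minimum to avoid malnutrition, not a target
const upperPerKg = 2.0;         // beyond ~2 g/kg, clear benefits are rare for the general population

const rdaGramsPerDay = weightKg * rdaPerKg;     // 64 g/day
const upperGramsPerDay = weightKg * upperPerKg; // 160 g/day

// Spreading intake across the day (15-30 g per meal) rather than loading it all at dinner:
const mealsPerDay = 4;                               // illustrative split
const gramsPerMeal = rdaGramsPerDay / mealsPerDay;   // 16 g per meal, inside the 15-30 g window

console.log({ rdaGramsPerDay, upperGramsPerDay, gramsPerMeal });
```

Even the minimum, spread over four meals, already lands in the per-meal window the article mentions; the point is that the useful range is bounded, not open-ended.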

The origin of December 25 lies in an obscure third-century antipope obsessed with the birth of Christ

For years we have repeated that Christmas is an invention. Not only does the Bible not specify that Jesus was born on December 25; it is implausible that it happened on that date. The gospels themselves describe shepherds tending flocks outdoors (unlikely in the Decembers of the time in Bethlehem), and the idea that the Romans would hold a census on those dates is almost absurd. For this reason, we have repeated over and over again that the most reasonable explanation is that during the 4th century the Church set the birth of Christ on December 25 to make it coincide with (and, in the process, "Christianize") the pagan festivities of Sol Invictus and the Saturnalia. The only problem is that the latest available evidence points in another direction: to an obscure third-century antipope who, obsessed with building a chronology of the scriptures, arrived at the 25th independently. This is the story of how Hippolytus of Rome invented Christmas.

The myth of the Christianization of Roman festivals

Hail, Caesar! Io, Saturnalia!, by Lawrence Alma-Tadema

But let's start by reviewing the best-known theory and seeing why some authors have started to doubt it. As is often read on the Internet, this theory tells us that there is nothing coincidental about the choice of December 25. On that date a birthday was already being celebrated, that of the "Unconquered Sun" (which for the Romans marked the winter solstice), and the Church, which during the 4th century was striving to become (and would eventually become) the official religion of the Empire, would have taken advantage of the pull of the pagan festival to place Christmas there. And the theory makes sense. However, it has a big problem: it does not really resolve the question at hand. Why the 25th? As Thomas C. Schmidt, a researcher at Princeton University, explains, the Roman Saturnalia did indeed fall around those dates, but not on that exact day. Certainly, it is difficult to be conclusive about that historical period, but everything seems to indicate that the main day of the Saturnalia fell closer to the 17th than to the 25th. In fact, if this approximation is correct, we could not even say that the 25th marked the end of the Sigillaria (the week-long celebrations that followed the birth of the Unconquered Sun). Other festivities, such as the Kalends (which were celebrated in January) or the Brumalia (the solstice festival), do not fit the date in question well either. That is to say, the idea that these Roman festivals are the origin of Christmas is, as I say, suggestive, but it still does not provide a convincing explanation of why the Church chose the 25th. To answer that question we have to dig a little deeper.

Since when has Christmas been celebrated at Christmas?

As Schmidt says, the first historical reference to December 25 as the day of the "birth of Christ in Bethlehem of Judea" is found in the Calendar of Philocalus, in a document dated to 336. It is a curious fact. And, although it does not explain the central issue of our question (the reason for December 25), it does give us a time frame: it tells us where to look for that explanation, because for practical purposes we can assume that during the 4th century the festival was already relatively consolidated. In other words, we have to look a little earlier. Specifically, at the year 222: that is the date assigned to a statue of Hippolytus of Rome found in 1551 near the Via Tiburtina.
The interesting thing about the statue is that, among its many inscriptions, it includes a lunar table; the statue is kept today in the Vatican Library.

Who is Hippolytus of Rome and what does he have to do with all this?

Adoration of the Shepherds, by Gerard van Honthorst

Hippolytus of Rome is a very multifaceted figure. Considered one of the great theologians and preachers of his time (in fact, Origen can be considered his disciple in some respects), he led a schism in 217 that distanced him from the Church for a decade. He is, at the same time, the first antipope in history and a saint who, according to tradition, died a martyr in 235: he is, in fact, the only antipope canonized to this day. We know that as early as AD 220, Hippolytus (in a commentary on the Book of Daniel) maintained that "the first coming of our Lord, in Bethlehem, was on Wednesday, December 25." However, we also know that this text was tampered with. There are several versions with altered dates, among them some stating that the birth was in March or April. And the truth is that if Jesus had been born in April, many of our problems would suddenly be solved. Looking only at the texts, however, it is not clear. That is where the statue comes in. In the lunar table of its inscriptions, all past and future Easters appear calculated and, along with them, two notes that are key for us: the original Good Friday (which fell on March 25) and the "genesis" of the Lord (in the year 2 AD), which fell on April 2. In the year 235, in a very ambitious work in which he traced the complete chronology of creation, Hippolytus moved that origin forward to March 25 for the simple (and, seen from today, absurd) reason that that was the date on which, according to his data, the world had been created.

The true "genesis" of Christmas

Image | Paolo de Matteis

But what does all this have to do with December 25? The answer is in the statue, although I passed over it: specifically, in the word "genesis". Because what is the "genesis" of a person, his birth or his conception? While it would be better for us if it were his birth (because it would fit with what the Bible says about Christmas), …

The real reason why Musk, Bezos and Pichai want to build data centers in space: to bypass regulation

Data centers are proliferating so much that, although the largest in the world are in Kolos (Norway), The Citadel (United States) and China, you can now find them even in Botorrita, in the province of Zaragoza. The limit is the sky. Or not even that: Silicon Valley has set its sights on putting data centers in space, and the main big tech companies are making moves to achieve it. Former Google CEO Eric Schmidt bought the rocket company Relativity Space with that objective. Nvidia has backed the startup Starcloud, which launched the first NVIDIA H100 GPU into space a few weeks ago, and Elon Musk has even condensed how he would do it into a tweet: "It will be enough to scale the Starlink V3 satellites, which have high-speed laser links." Then there is Jeff Bezos, who slipped it into a prediction at Italian Tech Week: we will see "giant training clusters" for AI in orbit in the next 10 to 20 years.

The moon is a gift from the universe

The next question would be: why? The reality is that there is no shortage of reasons. AI is a real energy guzzler and, as demand keeps growing, space offers a couple of differential advantages over Earth: almost unlimited energy and free cooling. On the one hand, a sun-synchronous orbit lets solar panels receive energy almost continuously. On the other, radiators can be deployed so that space works as a kind of "infinite heat sink at -270°C"; the enormous amounts of water essential for cooling on Earth would not be needed. Let's be honest: data centers in space are not a reality today. But they may not be too far off. University of Central Florida research professor Phil Metzger, formerly of NASA, estimates that it could become economically viable perhaps within a decade, and he considers its viability so clear that he sees taking AI servers into space as "the first real business case that will give way to many more" ahead of a future human migration beyond Earth.

So, for now, they are trying it on Earth. One consequence: Donald Trump has declared an energy emergency because of the enormous electricity demand expected in the coming years. While the power grid catches up (or tries to), AI companies have decided to move from a passive to a proactive position: Meta is going to become an electricity trader, Elon Musk's xAI is using gas turbines as a temporary energy source, and OpenAI is pressing the United States government to help electricity companies add 100 gigawatts per year. That figure may not mean much on its own, but it is astronomical: what OpenAI is asking is for the United States to build, in infrastructure terms, almost an entire Spain (around 145 GW, counting the 129 GW installed at the end of 2024 plus the solar and wind deployed in 2025) every year and a half.

AI is growing faster than electrical bureaucracy is advancing

How could the Trump Administration help? With the eternal bureaucracy, because on Earth these companies face great technical challenges, but they also face a legislative wall. To get more energy, the simplest and most immediate step is to build new power plants, but that means successfully navigating the tangle of procedures that slow the process down. There is just one small problem: in the United States, depending on the technology, it can take five to ten years… if you're lucky. Grid interconnection alone can take six years, after working through an interconnection queue with more than 2,000 GW of projects already waiting in line.
Then come up to four years of federal and environmental permits, topped off by another couple of years of state and local licenses that must all come through. The "permit stack", they call it. And the journey does not end there: projects must also get past the citizen movement known as NIMBY ("not in my backyard", roughly "yes, but not next to my house"), which has already brought down the Battle Born Solar Project in Nevada, which was going to be the largest solar plant in the United States, and the Danskammer gas plant in New York, among others. This can delay things even further, since rights of way must be negotiated with individual owners who may refuse, which means going through the courts again. The never-ending story. To avoid NIMBY processes that drag on for fifteen years or more, companies like OpenAI or Microsoft are buying plants that already exist, such as Three Mile Island, which is going to reopen exclusively for Microsoft, instead of trying to build new ones from scratch. Amazon has also signed up infrastructure that is already on the grid, like the Talen Energy campus, and has partnered with Dominion Energy and X-energy to develop small modular reactors (SMRs). SMRs are also Google's solution, in this case through an agreement with Kairos Power. All of it is to dodge that tangle of "permit stack" procedures which, in practice and according to some estimates, makes the space route faster than building a power plant on the old, familiar Earth. At the end of the day, for AI companies "the Moon is a gift from the universe", as Jeff Bezos already glimpsed.

In Xataka | Musk has created the perfect circle: Tesla's megabatteries power the AI that will define its next cars

In Xataka | Researchers have dismantled the batteries of Tesla and BYD. You already know which one performs better and is much cheaper.

Cover | İsmail Enes Ayhan and NASA

V16 beacons with an associated app. Five models to comply with DGT regulations as of January 1

There is hardly any time left before we will have to carry a mandatory V16 beacon in the car. Which one to buy? There are many models, so in this article we are going to focus on five V16 beacons that come with their own smartphone app:

- Help Flash IoT: an affordable beacon compatible with myIncidence.
- Help Flash IoT+: a beacon with a good candela count that is also compatible with myIncidence.
- LEDOne: a V16 beacon that stands out for its format and is compatible with its own app.
- LEDOne for trucks: includes the beacon and an arrow sign for commercial vehicles.
- FlashLED: comes with an anti-shock case and is compatible with its own app.

Help Flash IoT. The first V16 beacon on this list is from Netun Solutions: the Help Flash IoT. It has a design similar to the Help Flash IoT+, but different characteristics: it offers visibility up to 1 km, its autonomy is approximately two hours and it delivers more than 40 effective candelas. In addition, it connects to the myIncidence app so you can quickly contact your insurance company and the emergency services. The price may vary.

Help Flash IoT+. Second, the most complete version of the previous V16 beacon: the Help Flash IoT+. It has a more interesting spec sheet and its price is usually similar: it offers approximately 290 candelas, its autonomy is up to 2.5 hours and it also connects to the myIncidence app. The price may vary.

LEDOne. The LEDOne is a particularly interesting beacon because of its format, since it incorporates a support so it can be placed a little higher, improving visibility. It offers 120 effective candelas, its autonomy is approximately two hours, the brand states that it is suitable for all vehicles and it can be connected to the LEDOne app to notify insurance and emergency services. The price may vary.

LEDOne for trucks. An alternative to the previous V16 beacon (or rather a more complete option) is available at Leroy Merlin: the LEDOne in a truck pack that includes both the beacon mentioned above and a signaling arrow for industrial vehicles, thus improving visibility. Since it is the same beacon, it is compatible with the LEDOne app. The price may vary.

FlashLED. Finally, PcComponentes has the FlashLED V16 beacon, which comes with an anti-shock hard case. It runs on a single battery and is compatible with its own app, SOS Alert. However, the brand does not state either the theoretical autonomy or the candela figure. The price may vary.

Some of the links in this article are affiliated and may provide a benefit to Xataka. In case of non-availability, offers may vary.

Image | Netun Solutions, LEDOne, FlashLED

In Xataka | Safety, organization and entertainment gadgets and accessories for cars on long trips

In Xataka | Clarifying all the mess that the DGT has on its hands: the V-16 light, the V-27 signal and the emergency triangles

For decades we have believed that wet hair makes us sick in winter. Science knows perfectly well that it is a lie

"Don't go out with wet hair or you'll catch pneumonia" and "put on your coat or you'll catch a cold" are classic grandmotherly phrases that almost all of us heard in childhood and that have been burned into our brains. But the question we can ask ourselves is: are they true? The reality is that, not directly.

The culprit. Whether we catch a cold or the flu does not exactly depend on the cold. The culprits in this case are infectious agents such as viruses, the most common being rhinovirus. When this microscopic germ gets into our body and overcomes our defensive barriers, it begins to replicate and produce its effects, which in the end are genuinely annoying: fever, cough and a host of other symptoms. So the equation is quite simple: if there is no exposure to the virus, the outside temperature is irrelevant. To picture it: if we went out into Antarctica naked and with soaking-wet hair, we would surely die of hypothermia, but we would not catch a cold unless a penguin sneezed rhinovirus on us. The same applies if we were in an environment completely isolated from viruses and at a very low temperature: no infection would occur.

The experts. As experts from the Mayo Clinic and science-communicating pharmacists explain, cold on its own cannot spontaneously generate a pathogen. Cold is a physical condition, not a biological agent. And science has been trying to explain this for decades. One of the most cited and relevant studies was carried out at the University of Rochester, where volunteers were separated into two groups. One was exposed to low temperatures and cold conditions; the other was kept in a warm and comfortable environment. Both were subsequently exposed to the rhinovirus that causes colds.

The result. There was no significant difference between the two groups in how many caught the virus or in the symptoms they presented. The group subjected to the cold did not suffer a harsher cold, so the deciding factor in getting sick was solely and exclusively the virus.

Getting sick in winter. It is a reality that when winter arrives the rates of colds and flu increase sharply, as we are seeing in Spain these days. This makes us think the relationship really exists, whatever science says. And this is where we give a small point to "grandmother's advice". Science suggests that rhinoviruses replicate better at the temperatures we usually have in our noses, which range from 33 to 35 °C. In addition, cold temperatures also lower our defenses, making it much easier for the virus to enter our body and start to spread. That is why winter is when we see a higher rate of colds.

Other factors. But that is not the only one. The social factor is also a big culprit, because when it is cold the truth is that we would rather be shut in at home with Netflix. In those cases we are in an indoor space with little ventilation (because it is cold) and very close to other people, so if someone has the virus, the probability of contagion skyrockets in a heated indoor space far more than in an open-air park at 5 °C. Another point is the dry environment at this time of year, caused by the cold outside and the heating indoors. This dries out the nasal mucous membranes, which is a serious problem because mucus is our first line of defense at the entry point of viruses and bacteria.
If the mucosa is dry, its effectiveness decreases and the entry of pathogens becomes easier.

Wet hair. This myth deserves a special mention, since to date there is no evidence to justify a relationship between wet hair and an increase in viral infections. Going out with wet hair causes a great loss of body heat (the head has a large, highly vascularized surface), which generates notable thermal discomfort. That translates into feeling very cold, perhaps accompanied by a headache from cold-induced muscle tension, but the moisture on the scalp does not attract germs or facilitate infection.

Images | Dmitriy Kievskiy, Brittany Colette

In Xataka | H5N1 bird flu unleashes a massacre in Antarctica: half of the female seals have already disappeared

Something is going wrong with AI. The US is turning to energy solutions that it thought were buried to power data centers

The race to develop and operate increasingly powerful artificial intelligence models comes at a cost that is rarely at the center of the technological narrative. It is not in the chips or the software, but in the huge amount of electricity needed to keep data centers running around the clock. In the United States, this pressure is already translating into concrete decisions: polluting power plants that were headed for retirement are being restarted to cover growing demand peaks and strains on the grid. The paradox is evident: the most ambitious advance in the technology sector depends, for the moment, on energy solutions from another era. The problem is not so much an absolute shortage of electricity as a timing mismatch. Demand from AI-linked data centers is growing much faster than new generation, especially renewables, can be brought online in the short term. Building large energy infrastructure takes years, while these complexes can move forward on much shorter timescales. Faced with this gap, grid operators and utilities are turning to what already exists and can be activated immediately, even if it is more polluting.

PJM in context. The clash between electricity demand and supply is seen with particular clarity in the PJM region, the largest electricity market in the United States, which covers 13 states and concentrates a very significant share of the country's data centers. We can think of it as a large regional electricity exchange that coordinates generation, prices and grid stability in real time. There, the growth of AI-linked data centers is putting to the test a system designed for a very different consumption pattern, making PJM the first thermometer of a problem that is beginning to appear in other areas.

What is a peaker plant. So-called peaker plants are facilities designed to come online only during short periods of peak demand, such as heat waves or winter cold snaps, when the system needs immediate reinforcement. They are not designed to run continuously, but to react quickly. According to a report by the US Government Accountability Office, these facilities generate just 3% of the country's electricity but account for nearly 19% of installed capacity, a reserve that is now being used much more often than expected.

South view of the Fisk plant in Chicago

The case of the Fisk plant, in the working-class neighborhood of Pilsen in Chicago, illustrates well how this shift plays out on the ground. It is an oil-fueled facility, built decades ago and scheduled to be retired next year, that had been relegated to an almost token role. The arrival of new electrical demand associated with data centers changed that equation. Matt Pistner, senior vice president of generation at NRG Energy, explained to Reuters that the company saw an economic case for keeping the units and therefore withdrew the closure notice, a decision that brings activity back to a site many residents believed was on its way to permanent shutdown.

When the price rules. The change is not explained by technical needs alone, but also by very clear market signals. In PJM, the prices paid to generators to guarantee supply at times of maximum demand skyrocketed this summer, up more than 800% compared with the previous year.
An analysis by the aforementioned agency shows that about 60% of the oil, gas and coal plants scheduled for retirement in the region postponed or canceled those plans this year, and most of them were peaker units, precisely the ones that best fit this new scenario of relative scarcity. The bill for this energy shift is paid above all at the local level. Peaker plants tend to be older facilities, with shorter stacks and fewer pollution controls than other plants, which increases the impact on their immediate surroundings when they run more often.

Coal is also being postponed. The phenomenon is not limited to peaker plants fueled by oil or gas. Nationwide, several utilities have begun to delay the closure of coal plants that were part of their climate commitments. A DeSmog analysis identified at least 15 retirements postponed since January 2025 alone, facilities that together account for about 1.5% of US energy emissions. Dominion Energy offers a clear example: in 2020 it promised to generate all of its electricity from renewables by 2045, but after projecting that data center demand in Virginia will quadruple by 2038, it is now taking a step back.

Images | Xataka with Gemini 3 Pro | Theodore Kloba

In Xataka | A former NASA engineer is clear: data centers in space are a horrible idea

We have found the Achilles heel of the most feared fungus in hospitals, and that already gives us hope

In the hospital environment there is a fungus that is, without a doubt, a real nightmare for modern healthcare systems, since it can put an entire hospital ward in check. We are talking about Candida auris, first identified in 2009: a "superfungus" resistant to most common drugs that can spread quickly, a silent epidemic that kills more and more people.

Its weak point. Given its aggressiveness, science has a clear objective: find its weak point in order to develop a drug that can destroy it. Now a group of researchers has published a study in Communications Biology that changes the rules of the game: they have identified the exact genetic process the fungus uses to survive inside the human body. And knowing its inner workings gives us options to destroy it.

The iron problem. Like almost any living organism, this fungus needs iron to grow, replicate and cause damage. In the human body, iron is not freely available, precisely as a defense mechanism to prevent pathogens from using it against us. The new research shows that Candida auris has a strategy to get around this defensive barrier. The secret is in its genetics, specifically in genes called XTC that literally act as "suction pumps", allowing the fungus to capture iron even in the most hostile conditions. And this is the key: if iron is what feeds it, and we now know how it extracts the mineral from our own body… we have the key to preventing it from consuming our reserves.

An unexpected ally. One of the biggest challenges in studying this fungus is that it can reproduce at high temperatures, around 37 °C. That makes it hard to use the traditional study models, which until now were zebrafish, a species that prefers cold water. To overcome this drawback, the research team used a rather innovative model: the killifish, a small fish capable of living in desert environments and tolerating temperatures of up to 37 °C, making it a perfect "living laboratory" for observing how the fungus behaves in real time inside a vertebrate organism.

Its importance. We must keep in mind that we are dealing with a pathogen the WHO classifies as a "critical priority", which is why this research opens the door to drugs that attack the fungus's iron "suction" system in order to defeat it. Moreover, we already have something in our drug arsenal that might be used: iron chelators, an option that could "starve" the fungus, although it has yet to be tested for this. On top of that, pathogens will be easier to characterize, since some strains are much more aggressive precisely because they capture a much greater amount of iron.

The future. Although the spotlight is on superbugs that could doom humanity, research must also focus on fungi that are developing resistance to existing treatments. Finding a route the fungus "cannot avoid" gives us, for the first time, a strategic advantage we should not hesitate to use.

Images | masakazu sasaki

In Xataka | A viral video has "shown" all the bacteria in a drinks can. It's more complex than it seems

In 1844 there were already people playing chess online, although not in the way you are thinking

On November 18, 1844, the Washington Chess Club challenged its Baltimore counterpart to a game. Nothing out of the ordinary, except for one detail: the Baltimore players were still in Baltimore and the Washington players remained in their own city, separated by a distance of about 60 kilometers. The feat was achieved thanks to the internet of the time: the electric telegraph. And just six months after Samuel Morse had inaugurated the first telegraph line in the United States with the message "What hath God wrought?"

The origin of an idea. As IEEE Spectrum relates, it all started days earlier with a game of checkers. On November 15, Alfred Vail, Morse's associate in Washington, proposed to Henry Rogers in Baltimore that they play by telegraph. Rogers devised a system of numbered squares to communicate positions, and the idea soon evolved into chess, at which point both clubs challenged each other from their respective cities.

An ingenious system for transmitting moves. Vail and Rogers assigned a unique number to each of the 64 squares on the board, so each move boiled down to transmitting two numbers by telegraph (a small sketch of such an encoding appears at the end of this piece). In that sense, chess was ideal for testing the device, since it requires little information per move and does not need a complex communication channel. During the games, 686 moves were transmitted with almost no errors, as Vail recorded in his magnetic telegraph journal, which is now preserved in the Smithsonian.

More than just entertainment. Although it began simply as a test and a bit of private fun between two enthusiasts, telegraphic chess soon attracted public and political attention. Orrin S. Wood, a telegraph operator, wrote to his brother-in-law on December 5, 1844, about the "considerable excitement" generated by these games, adding that many congressmen seemed interested. Morse took advantage of the moment: in his letter to the Secretary of the Treasury seeking financing to expand the network to New York, he argued that the telegraph could transmit news from Congress or the whereabouts of wanted criminals, but he also noted that several games of chess had been played "with the same ease as if the players were sitting at the same table."

A coded information system. The organizers of these games considered that they had devised a pure information system that fit perfectly with the possibilities of the media beginning to emerge at the time. And if we think about it, each move was a precise, brief packet of data traveling over copper wires. The initiative generated controversy, however: on December 5, Rogers warned Vail that they were causing "an unfavorable impression on the religious part of the community", although it is not known today what the complaints were. What is known is that by December 17, 1844, chess was no longer being played over those lines.

A tradition that lasted. As the same outlet recounts, in 1845 a game was played between London and Gosport with the participation of the inventor Charles Wheatstone and the master Howard Staunton. Decades later, between 1890 and 1920, telegraph matches between clubs became common. As time went by and new technologies developed, playing chess from two different places became ever easier. In 1965, grandmaster Bobby Fischer played from New York against opponents in Havana by teletype, since the State Department prevented him from traveling to Cuba.
And going even further, in 1999 world champion Garry Kasparov faced a team representing "the world" through a Microsoft forum.

Chess as a test bed for inventions. Today, millions of games are played online every day around the world on platforms such as Chess.com. The truth is that chess has become a kind of natural companion for each new means of communication that has emerged throughout history. However difficult it is to master the intricacies of the game, the information needed for a game to flow is extremely simple. And perhaps that is why, 181 years after that first game via telegraph, chess continues to thrive in the digital age.

Cover image | Denis Volkov

In Xataka | In 1938 Spain was divided in two. So two "Gordos" of the Christmas Lottery were awarded
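The encoding sketch mentioned above: a minimal, hypothetical illustration of a square-numbering scheme like the one Vail and Rogers used. The article does not record their exact numbering, so the convention below (squares 1 to 64, files a-h within each rank) and the function names are invented for the example.

```typescript
// Number the 64 squares 1-64: file a-h within each rank, rank 1 first.
const FILES = "abcdefgh";

// "e2" -> 13, "e4" -> 29, etc.
function squareToNumber(square: string): number {
  const file = FILES.indexOf(square[0]);    // 0..7
  const rank = parseInt(square[1], 10) - 1; // 0..7
  return rank * 8 + file + 1;               // 1..64
}

// A move is just two numbers: origin square and destination square.
function encodeMove(from: string, to: string): [number, number] {
  return [squareToNumber(from), squareToNumber(to)];
}

console.log(encodeMove("e2", "e4")); // [ 13, 29 ]
```

Two small integers per move is all the channel has to carry, which is why chess suited a slow, noisy medium like the early telegraph so well.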

Millionaires from the US and Mexico invest their fortunes in Spain

In 2025, the luxury real estate market in Spain has experienced a quiet but constant movement. Madrid and Barcelona have become the main destinations for investment by large fortunes from the US and Mexico, which are buying luxury homes in some of the most exclusive areas of both capitals. Data from the General Council of Notaries confirm a clear increase in foreign buyers in high-value transactions, especially in neighborhoods where prices already move above 10,000 euros per square meter.

The new buyers. The statistics of the General Council of Notaries show that in 2025 purchases of luxury homes by foreigners retain considerable weight in Spain. According to figures published by Idealista, in Madrid, transactions by foreigners already represent around a fifth (21%) of sales in prime areas. In Barcelona, this percentage is somewhat higher, especially in districts where luxury housing accounts for a large part of the available supply. Within this group, buyers of American and Mexican nationality stand out for the average size of their transactions, well above the market average.

Specific neighborhoods and heart-stopping prices. The interest of these buyers is concentrated in very specific enclaves. In central Madrid, neighborhoods such as Salamanca, Recoletos, El Viso or certain areas of Chamberí account for a good share of the purchases made by large international fortunes. These are areas where the price per square meter easily exceeds 10,000 euros and where homes commonly cost more than one million euros. In Barcelona the pattern is similar: districts such as Sarrià-Sant Gervasi, Pedralbes or Ciutat Vella attract foreign buyers looking for unique or renovated properties, or ones with high heritage value.

Why the US and Mexico are looking at Spain. Behind this movement there are several factors that reinforce each other. On the one hand, Spain offers legal stability, secure property rights and a relatively predictable tax framework for large assets. On the other, Madrid and Barcelona function as international business hubs well connected to America, with frequent direct flights to Miami, Mexico or New York. In the case of Mexico, the cultural and linguistic link also plays a relevant role, while American buyers especially value the balance between price, quality of life and services compared with other large European cities. In this way, they use their home in Spain as a way to improve their quality of life or as a gateway to their businesses in Europe.

They can pay more, so prices skyrocket. The impact of this international demand shows up in prices. According to Idealista data, the average value of housing in Spain has risen around 7.9% year-on-year in 2025, with Madrid and Barcelona leading the increases. In the luxury segment, the pressure is even greater because of the scarcity of this type of property and the high demand for it. Although these purchases do not compete directly with affordable housing, they do help reinforce the dynamic of rising prices in the most sought-after areas. The result is a market with a crowding-out effect in which the local rich are displaced to other neighborhoods by even wealthier millionaires. In this way, Madrid and Barcelona are consolidating their position as attractive places for millionaires to keep a second residence, especially in a context of international uncertainty.
In Xataka | How much money do you need to be among the richest 1% in Spain

Image | Unsplash (Eddie Pipocas)
