For years we wondered why Chernobyl's wild boars were so radioactive. The answer was not in the accident

Four decades after the accident at the nuclear power plant near Pripyat, the animals of Chernobyl continue to fascinate. These survivors of one of the most contaminated regions in Europe surprise us in many ways, but if there is one enigmatic species in this place, it is the wild boar: one of the most radioactive species in Chernobyl.

Solving the mystery. In 2023 a new clue about these animals emerged, revealed by a team of researchers: we finally know why their radioactivity is greater than that of other species. The answer has less to do with the nuclear accident itself than with something that happened long before.

More radioactive? There is still much we do not know about the animals of Chernobyl, and one of the most curious enigmas was that of the wild boars. To understand why, we have to talk about one of the most polluting radioactive isotopes, caesium-137 (Cs-137). The half-life of this isotope (the time in which half of the atoms of the material will have disintegrated) is just over 30 years. The concentration of caesium in the food chain should in principle fall even faster, since the atoms tend to leach into the soil or be carried away by water into rivers.

Going down. That is why the level of radioactivity in animals such as deer or roe deer has decreased significantly in the area. Not only has this not happened in wild boar populations: their radiation levels have remained almost constant; that is, the decrease does not even match what the half-life of Cs-137 would imply. It is the "wild boar paradox".

Nuclear tests and radioactive truffles. The answer comes from caesium-135. The team that solved this mystery did so by focusing not on the radiation levels but on their origin. They verified that it was this other isotope of caesium that was behind the phenomenon. Cs-135 has a much longer half-life, which explains why the reduction had been smaller.
This also makes the presence of Cs-135 harder to detect. As the team responsible for the study explains, each type of nuclear incident has its own "signature". It is estimated that 90% of the Cs-137 present in Europe was released by the Chernobyl accident, but this is not the case for Cs-135: 68% of it originates in the nuclear tests carried out during the Cold War.

Just the right depth. The diet of wild boars has also been one of the keys to understanding their radiation levels. These animals feed on a type of truffle (Elaphomyces) that grows underground, at depths of between 20 and 40 centimeters. As we pointed out before, part of the radioactive caesium was seeping year after year into the soil of the area. At a rate of a few millimeters a year, the caesium (both from the nuclear tests and from the accident) has been advancing towards these depths, contaminating these fungi, a food source for wild boars.

From Chernobyl to Bavaria. The study that clarified this mystery was carried out by analyzing a population of 48 wild boars in the state of Bavaria, in southern Germany. The details of the analysis were published in the journal Environmental Science & Technology.

In the long term. The results of the study suggest that the situation will not change in the short term. That is, it is unlikely that the radioactivity levels of wild boars will begin to decline in the coming years to match those of similar animals such as deer or roe deer. The higher radiation in these animals has made hunters reluctant to capture them, which implies that these wild boar populations will keep growing. Perhaps their expansion through central Europe will cause the radiation levels of these animals to decline generation after generation but, from what we have seen, this process could still continue for decades.
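The half-life arithmetic behind the "wild boar paradox" is easy to check with a few lines of code. This is an illustrative sketch, not part of the study; the half-lives used (about 30.1 years for Cs-137 and roughly 2.3 million years for Cs-135) are the commonly cited values, and the function is ours:

```python
def remaining_fraction(years: float, half_life: float) -> float:
    """Fraction of a radioactive isotope left after `years`,
    given its half-life: N/N0 = 0.5 ** (t / t_half)."""
    return 0.5 ** (years / half_life)

# Roughly four decades after the 1986 accident:
t = 2026 - 1986  # 40 years

cs137 = remaining_fraction(t, 30.1)        # Cs-137 half-life ~30.1 years
cs135 = remaining_fraction(t, 2_300_000)   # Cs-135 half-life ~2.3 million years

print(f"Cs-137 remaining: {cs137:.0%}")  # about 40% left
print(f"Cs-135 remaining: {cs135:.4%}")  # essentially all of it
```

After four decades, roughly 40% of the original Cs-137 remains, while Cs-135 has barely decayed at all, which is why contamination dominated by Cs-135 seems not to fade.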
In Xataka | When Chernobyl exploded in 1986, Spain was spared the radioactive cloud. AEMET has now discovered it was a very close call

In Xataka | Spanish scientists are recreating the Chernobyl accident in Seville. The objective: to see how it affects biodiversity

Image | Joachim Reddemann / Кирилл Пурин

*An earlier version of this article was published in July 2024

1,800 years ago the Romans had an amulet against bad luck. It was literally a tiny penis.

It measures about three centimeters, is cast in bronze with great (anatomical) detail and, despite being around 1,800 years old, is surprisingly well preserved. We are talking about a phallus. A penis. An ancient figurine representing male genitalia that archaeologists have just unearthed at a Roman site in Cumbria, in the northwest of England. The most curious thing, however, is not the appearance of the penile statuette itself, but that it took researchers so long to find it. Let us explain.

Under a cricket pitch. The Carlisle Cricket Club is a large venue for cricket lovers located on the outskirts of the town of Carlisle, in Cumbria, England. That is today, of course. If we go back almost 20 centuries, that same land, on the banks of the River Eden, was home to baths where the Romans came to chat and relax. Years ago a group of archaeologists started investigating the area in search of remains of that remote Roman past. Among the many things they recovered at the site, in addition to ceramics, fragments of pillars and heads sculpted in stone, one has attracted particular attention: a penis.

What do you mean, a penis? The figurine in question was revealed a few weeks ago by the photographer Pete Savin, and archaeologists believe the piece is some 1,800 years old. It would be logical to think that Savin or the director of the dig, Frank Giecco, raised their eyebrows at such a discovery. However, the opposite happened: what had surprised them for some time was not finding any phallic figurines among the Roman ruins of Carlisle. "It is unusual that we have not found a phallus-shaped object at the site before, as it is very rich in other types of objects," Giecco admits to the BBC.

Don't say penis… Say amulet, which is the function fulfilled by the figurine found in Cumbria.
The researchers are convinced that its purpose was not simply to represent a penis, and that the piece did not have an obscene or sexual nature either. It was not even a symbol of fertility; at least that was not its main purpose. For the Romans the object most likely acted as a talisman, a protective tool designed to attract good luck and ward off the evil eye. The Romans were so convinced of the healing power of these phallic representations that they frequently resorted to them, either by capturing them in figurines that they would then hang from their belts or wear as jewelry, or by carving them on walls.

A phallus for the collection. The truth is that a quick look through the archives shows that discoveries like the one in Cumbria are relatively frequent, even in England, or in Cumbria itself. In 2019 a group of archaeologists from Newcastle University catalogued several inscriptions left by Roman soldiers in a quarry near Hadrian's Wall, a series of 'graffiti' carved into the rock in 207 AD that include (exactly!) the relief of a phallus. Last year another team focused on Vindolanda, one of the Roman forts that protected Hadrian's Wall, encountered a similar surprise. During their excavations they located a penis-shaped pendant hidden in the remains of a wall from the 4th century AD. Archaeologists speculate that the piece, made of jet, was lost at the beginning of that same century. And given how polished its surface is, they believe the owner of the amulet handled it frequently.

Small, big, huge. Carlisle's piece barely exceeds three centimeters, and Vindolanda's (at least judging by the photos shared by researchers) appears even smaller. However, not all representations were so minuscule. In 2022, while investigating a site in the province of Córdoba, archaeologists discovered a bas-relief showing a phallus 45 centimeters long.
The figure was carved directly into the cornerstone of a large building, another relatively common habit. "It was common to place them on the facades of houses, and soldiers wore small phallic amulets as symbols of virility," Andrés Rodlán, director of the project, explains to El País, although he also acknowledges that the Córdoba engraving breaks the mold: "This one is unusually large."

The list of phallic representations found in recent years goes on and on, with discoveries stretching from the distant lands of Britannia to Omrit, in Israel.

Why this obsession? Experts believe that phallic figures were so popular not because of their explicit nature, but because of their enormous load of meanings. Whoever carried a figurine of a penis or decided to sculpt one on their wall did not simply intend to show a male genital. They sought to protect themselves with an amulet capable of warding off the evil eye. In fact, the Romans not only surrounded themselves with images of more or less anatomically accurate penises; they also created figurines of winged phalluses, with animal shapes or with bells. "Phallic emblems are found on a wide variety of Roman objects, from amulets and frescoes to mosaics and lamps. They were symbols intended to attract good luck and ward off evil spirits. As the ancient author Pliny attests, even babies and soldiers wore such amulets to invoke divine protection," explains the MET Museum. The reality is that, if history has shown anything, it is that humanity has always had a fascinating inclination to depict penises everywhere.

Images | The MET Museum and Carole Radatto (Flickr)

In Xataka | Almost 2,000 years ago a Celtiberian soldier visited the most remote frontier of the Roman Empire. Then he returned to Soria with a souvenir

Google and Apple have been wanting to kill SMS for years. Now they have made peace between their messaging apps

Apple and Google have been betting for years on their own RCS messaging protocols: relevant solutions in territories like the United States, but ones that have not fully taken hold in the rest of the world. Despite this, both companies have reached an important agreement so that chats between an Android and an iPhone are encrypted.

The news. Google has announced an agreement with Apple to implement end-to-end encryption for RCS, ensuring that chats between Android and iOS are secure by default. Although both systems already encrypted same-platform communication (Android to Android and iPhone to iPhone), this security measure did not apply when we communicated across operating systems.

Why it matters. From now on, if you are looking for a secure way to communicate without going through applications like WhatsApp or Telegram, the native Messages app (whether you have an iPhone or an Android) is an excellent option. There is no need to download anything, files can be shared, and the information does not pass through the hands of anyone other than Apple or Google. It is not the perfect solution for those seeking absolute anonymity, but it is a good way to do without giants like Meta.

What is RCS? RCS stands for "Rich Communication Services". It is a protocol that came to succeed SMS and allows communication to be carried out in an encrypted and fast way. Being a protocol and not an app, developers need to build apps on top of it: in Google's case that is the Messages app, and the same applies on iOS. When you send a message via RCS, it goes through your carrier's server and from there to a server certified by the GSMA. It allows you to send images and videos of up to 10 MB and, importantly, it falls back to SMS when there is no internet connection. SMS vibes.

Why it fails. Apple and Google's efforts with RCS have to do with a phenomenon that has been happening for years in the US: the overwhelming success of the iPhone and iMessage.
In the United States iMessage is used more than WhatsApp, something unthinkable in Spain, the country of absolute WhatsApp dominance, where Apple represents just over 10% of market share, making it impossible for iMessage to rival the Meta app.

Why is it still alive? Google, despite controlling 70% of the mobile market with Android, needs a direct way for its users to communicate, and that way is RCS. Apple was forced to adopt it due to European pressure and, although it may not be a massive protocol, it is a key alternative to rival services. Be that as it may, this is good news for those who want alternatives to WhatsApp or Telegram for communicating from one phone to another.

In Xataka | Meta will pay $1.4 billion to Texas for violating the privacy of its users. It used facial recognition without permission

164,000 galaxies and 13.7 billion years of cosmic history available to anyone

The James Webb Space Telescope has produced a super-detailed cosmic map covering 13.7 billion years of the Universe. No other telescope had been able to reach so far with such precision. Hubble tried, but did not achieve as much: what was invisible to it is now displayed majestically before our eyes.

Further and more precise. This new cosmic map has been possible thanks to the work of a team of scientists from the University of California, Riverside. They were in charge of analyzing a catalog known as COSMOS-Web, which includes the most extensive compilation of data from this telescope to date. In a patch of sky equivalent to three full moons, they have seen what until now was invisible.

James Webb's superpowers. We know that the Universe is expanding, so galaxies are moving further away, like dots painted on a balloon that keeps inflating. Since light is a wave, the waves emitted by these galaxies also stretch. That means longer wavelengths which, in the electromagnetic spectrum, correspond to the infrared. This is known as redshift. The older and more distant a galaxy is, the more of that stretching its light will have experienced, so the greater its redshift. Therefore, to detect very old galaxies, it is necessary to use instruments very good at detecting this infrared radiation. That is where James Webb comes into play, with an instrument called NIRCam whose specialty is precisely that. Furthermore, thanks to the size of its mirror, with an area about 7 times larger than Hubble's, much more light can be captured and more precise images obtained.

Lifting the cosmic veil. James Webb also has the ability to look through the clouds of gas and dust that normally surround younger stars and planets. That is something Hubble cannot do either, so many structures are revealed that were invisible to its predecessor.

What Hubble didn't see.
Unlike James Webb, Hubble specializes in detecting mostly the visible and ultraviolet parts of the spectrum. For this reason, the oldest structures in the Universe had gone unnoticed. By comparing the James Webb cosmic map with the one made with Hubble, it has been seen that what previously seemed like a single structure is actually many. The sharpness of certain structures that seemed very diffuse has also increased. In short, the resolution has improved: distances are better measured and some structures are better distinguished from others.

We can all see it. The catalog that has just been created contains 164,000 galaxies and a video showing the movement they have experienced over 13.7 billion years. It is the furthest journey through the Universe made with one of these maps. And the best thing is that all this information is open access, so anyone can consult it. Scientists who wish to do so will be able to study it in search of data that may have gone unnoticed by the researchers at the University of California. In short, teamwork is the goal: just as James Webb works as a team with Hubble, and soon will also do so with Roman, scientists on Earth should do the same.

Image | Image taken by James Webb that is not part of the map (NASA)

In Xataka | We had been studying the planets of TRAPPIST-1 for years with great hope. James Webb just dashed it
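As a numeric footnote to the redshift explanation above: the stretching of a light wave by cosmic expansion follows lambda_obs = lambda_emit × (1 + z). The sketch below is purely illustrative (the wavelengths and redshifts are standard reference values, not data from the COSMOS-Web catalog) and shows why light from very distant galaxies lands in the infrared, where NIRCam observes:

```python
def observed_wavelength(emitted_nm: float, z: float) -> float:
    """Cosmological redshift stretches light: lambda_obs = lambda_emit * (1 + z)."""
    return emitted_nm * (1 + z)

# Lyman-alpha, a strong line emitted in the ultraviolet at ~121.6 nm:
LYMAN_ALPHA = 121.6

for z in (0, 2, 7, 10):
    obs = observed_wavelength(LYMAN_ALPHA, z)
    band = "UV" if obs < 380 else "visible" if obs < 750 else "infrared"
    print(f"z={z:>2}: observed at {obs:7.1f} nm ({band})")
```

At redshift 7, ultraviolet Lyman-alpha light arrives at nearly 1,000 nm, beyond the visible range, which is why an infrared instrument is needed to see the oldest galaxies.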

Europe has been depending on Amazon, Google and Microsoft for its most critical data for years. It is about to cut off their access

The European Commission is taking action. The body is expected to present its "Technological Sovereignty Package" on May 27. This directive will include a series of measures aimed at boosting the EU's strategic autonomy in sensitive areas, and that means something remarkable: ceasing to depend, as far as possible, on US hyperscalers to store critical data.

The fear of the off button. The measures respond to growing political instability and to recent cases that have demonstrated the power the US has over European technological infrastructure. In May Microsoft "cancelled" the email of Karim Khan, a prosecutor who had been directly named in an executive order from Donald Trump. Microsoft denied it, but the damage had already been done, and these problems have raised fears that Trump could use a kind of "off button" against European institutions that depend on the hardware and software infrastructure provided by companies like Microsoft, Google or Amazon.

Legal espionage. The CLOUD Act (Clarifying Lawful Overseas Use of Data Act) is a 2018 US law that allows law enforcement to force US-based technology companies (such as Google, Microsoft or Amazon) to hand over data regardless of where it is stored, whether inside or outside the United States. This law updates the Stored Communications Act to prioritize control over data rather than its location. In other words: if you use the services of US hyperscalers, the US may end up accessing your data. And since you have accepted their terms of use, you agree to let them legally spy on you if they "need to."

If you want my critical data, you'll have to protect it. The new regulations require service providers who want to work with critical European data to demonstrate that they are not subject to requests from non-EU governments. This automatically excludes Microsoft, Google and Amazon, because all three are subject to the CLOUD Act.
Europe is thus looking for providers that can guarantee that critical data will not be in the hands of companies that may then have to hand it over to foreign powers.

Europe depends on the American cloud. The reality is that Amazon (AWS), Microsoft (Azure) and Google (Google Cloud) currently control more than 70% of the cloud computing market on the old continent. Losing these institutional contracts would mean a significant financial blow, but it also sends a powerful signal to European private companies: if Brussels does not trust the US with its secrets, why should European corporations? The domino effect could be huge.

Europe has its own clouds. This directive would give an important opportunity to initiatives that seemed stalled, like GAIA-X, but there are also companies with their own infrastructure such as OVH (France) or T-Systems (Germany). There are significant technical challenges in that area, because US hyperscalers have been refining their offering over the past two decades. However, Brussels seems willing to accept a somewhat less efficient or complete service in exchange for greater autonomy. The options exist, no doubt, but the challenge is enormous.

Migrating is going to be expensive. It is one thing to make the decision and quite another to complete a migration that will require moving decades of data and systems to a different infrastructure. Current data centers would have to be expanded to meet demand, according to some analyses, and that would mean a cost of between €14 billion and €24 billion. Consulting firms like Forrester are not at all convinced that the EU can achieve cloud sovereignty, and other experts also make it clear that Europe will not abandon the hyperscalers.

Traceability. In addition to changing suppliers, the Commission also wants to impose strict transparency requirements. AI systems that have access to that data must be auditable by the newly created EU AI Office.
The Commission wants to know who has access to the code, who maintains the servers and who has the technical capacity to manage and even intercept such data transfers.

Data too sensitive. In comments to CNBC, EU officials explained that there are active debates demanding that financial, judicial or health data used at the government level and in the public sector sit on sovereign cloud infrastructure. The same goes for military data, of course, and there are already movements in that direction.

Fragmented internet. The move confirms that the world appears to be heading toward a fragmented internet with important geopolitical boundaries. While the US tries to defend its technology against China, Europe and the rest of the world are trying to avoid, or at least mitigate, their excessive dependence on American technological solutions.

Image | İsmail Enes Ayhan and François Genon

In Xataka | Europe no longer trusts Google. That is why several start-ups are designing an independent payment system on Android

Models point to the worst El Niño in 140 years, and one of the key reports is published on Thursday

All the meteorological agencies in the world are looking at the same building on the east coast of the United States. On Thursday, May 14, before markets open, in College Park, Maryland, a room full of oceanographers and meteorologists will discuss a four-page PDF. In that PDF the future of the planet will be written. It sounds epic, but it is more prosaic than it seems: it will not be written clearly, sharply, or with absolute certainty, but it will be there.

What's in that PDF? NOAA's Climate Prediction Center (CPC) publishes its ENSO Diagnostic Discussion on the second Thursday of each month. It is the most important report from the global El Niño monitoring systems and, from what the models are saying, the probability of a "very strong" El Niño is going to exceed 25% (and growing).

But if it's monthly… why is this specific report important? Because the index the agency uses to monitor and predict ENSO has changed. Until this year, NOAA used ONI: an index that measures the sea surface temperature anomaly, but which does not discount the average anomaly produced by climate change. Predictions under the new index (let's call it RONI) are expected to come out significantly lower than under the old one. If the magnitudes shoot up despite the correction, things will look worse. This report matters because it is the first that will capture the "acceleration" of El Niño at full strength.

What would this entail? Each new NOAA report translates into a cascade of decisions in agricultural, energy, fishing and fire policy. The last major El Niño (2023-24) coincided with 2023 being the second warmest year on record and 2024 the warmest ever: a strong 2026-27 El Niño could push 2027 to another global record, and the impacts are not well measured.
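The difference between the classic index and the new one can be sketched in a couple of lines. This is a deliberate simplification (the real indices are computed from three-month running means of sea surface temperature in the Niño-3.4 region), and the anomaly values below are invented for illustration:

```python
def oni(nino34_anomaly: float) -> float:
    """Classic ONI: the raw Niño-3.4 SST anomaly versus a baseline period."""
    return nino34_anomaly

def roni(nino34_anomaly: float, tropical_mean_anomaly: float) -> float:
    """'Relative' ONI: subtract the warming shared by the whole tropical
    ocean, so the climate-change background does not inflate the signal."""
    return nino34_anomaly - tropical_mean_anomaly

# Hypothetical values in degrees C, purely illustrative:
nino34 = 1.8    # raw anomaly in the Niño-3.4 region
tropics = 0.6   # mean anomaly across the tropical oceans

print(f"ONI : {oni(nino34):+.1f} C")             # looks 'very strong'
print(f"RONI: {roni(nino34, tropics):+.1f} C")   # weaker after the correction
```

This is why a high value under the corrected index would be more alarming: the background warming has already been discounted, so whatever remains is attributable to El Niño itself.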
It is true that between March and May the reliability of ENSO forecasts drops sharply (because the equatorial Pacific anomalies go through their transition phase); but, in the absence of the June report, this is the best clue we have.

What can we expect? We already know that there is a 61% chance that El Niño will be with us between May and July 2026, and a 25% chance that it will be "very strong." The important thing is to keep in mind what that means. The 61% measures the probability that the equatorial Pacific crosses the threshold of what we define as El Niño. Unfortunately, it does not measure how much it will rain in Cádiz, nor what will happen to the crops in Misiones, nor how many hurricanes the Yucatán will see. It is worth remembering that, during the warm phase (that is, during El Niño), the absence of the strong trade winds that cool the surface of the equatorial Pacific causes the temperature of that area of the ocean to skyrocket. It is this which, through different atmospheric teleconnections, disrupts weather systems all over the world. What we are not clear about is exactly how. The effects vary by region ("drier conditions than normal in certain parts of the world, while in others it causes more precipitation. Some countries have to deal with major droughts and others with torrential rains," says AEMET); but when we talk about temperatures there is no doubt: El Niño is synonymous with heat. Everything else remains to be written.

Image | Xataka

In Xataka | "It is so extreme that it is difficult to believe": El Niño forecasts depict an event of unprecedented intensity.

“In five years, robots and AI will have to pay taxes for the middle and lower class”

They say the devil knows more because he is old than because he is the devil. So when it comes to envisioning the future of technology, few voices carry the weight of Bill Gates. After all, he was one of the leading protagonists of the revolution that brought the personal computer into our lives. The co-founder of Microsoft gave an interview to the Australian Financial Review in which he presented his vision of the impact of AI on employment and warned of something already being debated in some political and technological circles: if AI and robotics are going to reduce the need for labor, how will the subsistence of those who lose their jobs be guaranteed?

Taxation of the future: robots that pay taxes. The billionaire raises a concern that other technology billionaires like Elon Musk or Sam Altman have already expressed on numerous occasions. As Gates explained in his interview, the arrival of AI and robotics in industrial production will have a direct impact on millions of middle- and lower-class workers, who may lose their jobs without the option of moving into one of the newly created positions expected to replace the current ones. As Gates put it: "We have not yet reached the point where it is necessary to completely change tax structures, but we may do so within five years." He suggests that the solution could be to "shift the tax burden from labor, at least from medium or low-income workers, to capital, or specifically to the taxation of robots or artificial intelligence." His proposal is that, if a robot or an algorithm takes over a person's job, that machine should contribute financially, also replacing the employee in their tax obligations.
Gates does not ask for innovation to be halted, but rather that the benefits of automation not remain solely in the hands of those who own the technology; the gains from this advance should be distributed across society as a whole. The debate, he insists, must happen now, before the displacement of workers is irreversible.

On the verge of an inevitable transformation. The Microsoft founder acknowledges that the current focus is on the productivity offered by AI and robots, but points out that his real concern is how governments are going to manage the displacement of human workers from their jobs. It is not a question of whether it will happen (something he takes for granted), but of when and how fast. The International Monetary Fund has already warned that up to 40% of global jobs have some degree of exposure to AI, with a special impact on middle-class workers and administrative positions, which are much more susceptible to automation. Gates argues that governments must begin to design fiscal policies adapted to an economy in which a growing share of the work will not be done by a tax-paying employee but by automated systems.

Most AI companies will fail. In his remarks, Gates also took time to analyze the current landscape of technology companies in the AI race, and he did so with a serious warning: "If you chose the right company, like Microsoft, Google or Apple, you will have done very well. But most AI companies will fail. It is difficult for a non-technical investor to distinguish which ones will prosper." He advises not to get carried away by inflated valuations and to bet on established names. The warning comes at a time of massive investment in AI projects, with valuations of these companies skyrocketing even before they have demonstrated that their products are really competitive.
As in the internet boom of the late 1990s with the dotcoms, when the dust settles only a few players will still be standing.

Global competition and monopoly risk. Beyond AI's impact on employment, Gates warned about geopolitical competition in the development of this technology, in this kind of space race we are living through: "What we are seeing now is fierce competition." China, for example, offers AI models for free, which puts pressure on other companies to set very low prices. "China offers free models and the rest of the companies offer very, very low prices. We would not want a single country or a single company to be the only one good at AI. But I do not see things going that way, at least for now," Gates said of the AI race between the US and China.

In Xataka | While technology companies dispense with juniors to replace them with AI, IBM is doing the opposite: snapping up bargains

Image | Flickr, amazon

We had been searching for the origin of the most massive black holes for years. The answer is a cosmic carom of extreme violence

All black holes are the product of very violent processes. However, there are some for which the known processes are insufficient. Now, an international team of scientists has worked out how the most massive black holes in the Universe form. It is a process so violent that it needs a huge star cluster to hold it together.

Two groups of black holes. This team of scientists analyzed the LIGO-Virgo-KAGRA Gravitational Wave Transient Catalog (GWTC-4), which includes 153 detections of black hole mergers through gravitational waves. By analyzing all the available data with a focus on the spin of the black holes, they saw that all of them can be divided into two large groups. On one hand, black holes of lower mass, which arose from ordinary stellar collapse. On the other, very massive black holes, arising from secondary mergers in the environment of dense star clusters.

Okay, let's explain. Generally, black holes form when a very massive star that has run out of fuel collapses. This gives rise to an explosion in which the outer layers of the star are expelled, leaving only a very dense core. It is so dense that it generates an enormous gravitational pull, and nothing can escape from it. On the other hand, there are holes so massive that they do not fit this process. They are believed to be second-generation black holes: two black holes merge, and then the result merges with another black hole, becoming far more immense. That would be the second group detected in the GWTC-4 catalog.

Something doesn't add up. This black hole merger process is so violent that, as soon as the first merger occurs, the result should fly away like a rocket. For it to stay in place and merge with a third black hole, something is needed to retain it. These scientists have found that this something is densely populated star clusters.
There are so many stars in them that their combined gravitational attraction keeps the black hole in place.

And what does spin have to do with it? Spin is a parameter that describes the rotation of black holes. When they form in the conventional way, the spin is predictable and aligned with the star that gave rise to the black hole. On the other hand, when they form through a process as violent as these consecutive mergers, the spin takes a random direction, but with a value predictable from the sum of the spins of the merging black holes. These scientists therefore saw that all the data matched that hypothesis: consecutive mergers in the environment of a densely populated star cluster.

A forbidden zone. These scientists also found a forbidden band of stellar masses in which black holes cannot form: there are small ones and huge ones, but not medium ones. Although this had been suspected, the complete dataset they have obtained gives a twist to what is known about the formation of black holes.

Relationship with nuclear physics. As these scientists explain, this detected mass limit seems to be related to a series of nuclear reactions that take place inside stars. Stellar nuclear reactions are nuclear fusion. Humans have learned to control nuclear fission, but it poses risks that would be solved if we also mastered fusion. So far it has been a complicated challenge, but perhaps these new findings, obtained thanks to gravitational wave analysis, could shed a little more light on this research. Everything adds up.

Image | NASA, ESA, STScI and A. Sarajedini (University of Florida)/NASA, ESA, CSA, Ralf Crawford (STScI)

In Xataka | What happens if you fall into a black hole, explained simply in an overwhelming NASA simulation
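The "fly away like a rocket" argument above can be made semi-quantitative: the merger remnant receives a gravitational-wave recoil "kick", and it only stays in the cluster if the cluster's escape velocity exceeds that kick. The sketch below uses illustrative cluster parameters and a plausible order-of-magnitude kick; none of these numbers come from the study:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
PARSEC = 3.086e16  # meters per parsec

def escape_velocity_kms(mass_solar: float, radius_pc: float) -> float:
    """v_esc = sqrt(2 G M / R) for a cluster of mass M and radius R, in km/s."""
    m = mass_solar * M_SUN
    r = radius_pc * PARSEC
    return math.sqrt(2 * G * m / r) / 1000

# Illustrative clusters (assumed masses and radii):
open_cluster = escape_velocity_kms(1e3, 2)     # small open cluster: a few km/s
nuclear_cluster = escape_velocity_kms(1e7, 2)  # dense nuclear star cluster: ~200 km/s

typical_kick = 200  # km/s, a plausible order of magnitude for GW recoil

print(f"open cluster   : {open_cluster:6.1f} km/s vs kick {typical_kick} km/s")
print(f"nuclear cluster: {nuclear_cluster:6.1f} km/s vs kick {typical_kick} km/s")
```

Only environments with very deep gravitational wells, such as massive, dense clusters, can hold on to a kicked remnant long enough for it to merge again, which is the retention mechanism the article describes.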

2,500 years ago, Athens suffered an epidemic that marked the end of its golden age. Science is determined to find out what caused it

“Words are insufficient when trying to describe this disease. As for the suffering, it seemed almost beyond what is humanly bearable.” Although it may sound like recent news about the hantavirus, only scarier, that comment is actually more than 2,000 years old. The chronicler Thucydides wrote it in his ‘History of the Peloponnesian War’ to convey the terrible plague that devastated Athens around 430 BC, an ailment that he himself suffered and that took the lives of some 75,000 people. For centuries that epidemic has been remembered as the ‘plague of Athens’, although we still don't know exactly what caused it. Now a group of Greek researchers has shed some more light on that dark episode.

Epidemic detectives. In a hyperconnected world, in which people can travel thousands of kilometers in a few hours and blocking a remote strait in the Middle East is enough to put the world economy in check, the specter of pandemics feels ever closer. The truth, though, is that humanity has been dealing with them for centuries. Before the COVID pandemic we had, for example, the 1918 flu or the disastrous Black Death, which devastated Europe between 1346 and 1353 and (by some estimates) reached case fatality rates of 60% in some regions. Long before either of them, in the times of Classical Greece, another equally devastating epidemic was recorded: the plague of Athens. Thanks to authors like Thucydides, who besides being a chronicler suffered it himself, today we can learn in detail how that outbreak unfolded and what it was like to live through it, an outbreak that left tens of thousands dead. The episode mattered not only for its death toll, between 75,000 and 100,000 over the four years from 430 to 426 BC; one of the deceased was Pericles, a historic leader of Athens. In fact, experts usually agree that the plague precipitated the decline of the Athenian Golden Age and that its death toll facilitated the city's final defeat in the war against Sparta.

The great unknown.
Despite this historical value, the Athenian plague remains shrouded in unknowns. We know when it developed, we know where, and there is even evidence suggesting that the initial outbreak occurred in sub-Saharan Africa, spread to Egypt and Libya and then reached Athens via Piraeus. What is not clear is what exactly caused the plague and why it was so disastrous, even though Thucydides took care to describe all its symptoms. Now a team from the University of Athens (NKUA) has set out to clear up this mystery by analyzing the symptoms described by the chronicler and comparing them with those of known ailments. They have published the result in the journal AMHA.

A pulse on history. If tracking a viral outbreak is difficult in 2026, the task becomes daunting when we are talking about one of the first known epidemics in human history. To face such a challenge, Dr. Dimosthenis Papadimitrakis and his colleagues had an idea: they looked at the symptoms described by Thucydides and other sources, selected 17 known diseases that more or less fit that symptomatology, and created a “metric system” with different scores to determine which of them best matched the epidemic that hit Athens 2,400 years ago.

“The most terrible thing, despair”. Whether out of his zeal as a chronicler or because he himself suffered the disease, Thucydides detailed the symptoms of those who contracted the Athenian plague: migraines, high fever, redness and inflammation of the eyes, bad breath, sneezing, coughing and profound gastrointestinal distress, with nausea, vomiting, spasms and painful diarrhea. Over time, rashes, pustules and ulcers appeared on the patient's skin, especially around the abdomen. Those who could not withstand the disease died after seven to nine days, after experiencing a burning so intense that it drove them to strip off their clothes or even immerse themselves in cold water.
“Gangrene of the extremities and eyes was common among both survivors and victims,” the experts note, recalling that it was not unusual for patients who survived the plague to be left with amnesia. “The most terrible thing was the despair into which people fell when they realized that they had contracted the plague. They immediately adopted an attitude of absolute hopelessness and, by giving in in this way, they lost their capacity for resistance,” Thucydides reflects. “Words are insufficient when trying to give a general image of the illness.”

Ruling out candidates. With that starting point, Papadimitrakis and his colleagues drew up a list of diseases that the Athenians of 2,400 years ago could have contracted and that matched, to a greater or lesser extent, the symptoms described by Thucydides. They came up with 17 potential ‘candidates’, including cholera, measles, scarlet fever, tuberculosis, Ebola, malaria, smallpox, bubonic plague, ergotism and Lassa fever. Then, with that chart on the table, they asked several questions: Which of those diseases cause rashes and gangrene? How many are transmitted between humans? And what historical evidence is there for each of these ailments? This analysis led them to a series of conclusions, although the team warns that they are only probability-based hypotheses, not firm and unquestionable truths. “The plague of Athens presents difficulties in identifying the causal agent due to several factors. The main source of information is the accounts of Thucydides, but his lack of medical knowledge and the lapse of up to 20 years between the events and their documentation can lead to erroneous interpretations,” the authors explain. “Furthermore, the inability to isolate or culture the responsible microorganism poses a major obstacle. Even if preserved bodies of plague victims were discovered, the microbes would have decomposed over time.”

And what is the conclusion?
That of the diseases analyzed, the strongest candidate is typhoid fever. “It appears to meet most of the criteria, so it is considered the most likely agent,” the researchers summarize. Furthermore, in a necropolis from the time of the epidemic, remains of the bacteria that trigger this disease …
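The scoring approach the researchers describe can be sketched as a toy in a few lines. To be clear: the symptom sets, criteria and weights below are invented for illustration, not the study's actual metric; the idea is simply that each candidate disease earns points for each criterion it matches, and the highest scorer is the most likely agent.

```python
# Hypothetical sketch of a symptom-matching score: one point per
# criterion a candidate disease shares with the reported picture.
# Feature sets here are invented for the example, not the paper's data.
symptoms_reported = {"fever", "rash", "gangrene", "diarrhea", "human_to_human"}

candidates = {
    "typhoid fever": {"fever", "rash", "diarrhea", "human_to_human"},
    "smallpox":      {"fever", "rash", "human_to_human"},
    "ergotism":      {"gangrene"},
}

# Score = size of the overlap between reported and candidate features
scores = {
    name: len(features & symptoms_reported)
    for name, features in candidates.items()
}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

The real study used 17 candidates and a richer set of weighted criteria, but the ranking logic, score every candidate against the same evidence and compare, is the same.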

When the fathers of quantum physics discovered the fundamental ideas of reality, they found that a Jesuit had already been there 200 years earlier

The story is a classic of popular science: 200 years before the birth of quantum physics, the Jesuit Ruđer Bošković anticipated central ideas of 20th century physics: field theory, the uncertainty principle and even dark energy. What's more, he did it alone. What Bošković did, as Héctor Farrés points out, is incredible. It is real, important and beyond doubt (Heisenberg himself acknowledged it in 1958), but so is what he didn't do. The latter is, in fact, the most interesting part.

What Bošković knew. In 1758, the Jesuit (one of the great mathematicians of the age, who had even helped repair the dome of St. Peter's) published in Vienna ‘Philosophiae naturalis theoria redacta ad unicam legem virium in natura existentium’. In this book he developed ideas he had already presented almost 15 years earlier in Rome: that matter was made neither of extended solid corpuscles (as Newtonian physics maintained) nor of unextended metaphysical monads (as Leibniz thought). For Bošković, matter is essentially composed of dimensionless points that exist only as points of force. In essence, Bošković believed that Newton's inverse-square law was a ‘limiting case’ (valid for planetary bodies) of a different equation governing the relationship of all things in nature. This idea alone, that scale matters, that the behavior of forces could change radically with it, deserves a place in the history of physics. Why? Because it is the piece that lets us stop picturing matter as impenetrable ‘bodies’ and understand impenetrability as an effect: it gave mathematical substance to atomism. And the most interesting thing is that his later influence is real and documented: there is a chain of readings that leads from these ideas to those of William Rowan Hamilton, the most direct precursor of quantum mechanics.
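The shape of such a force law is easy to illustrate. Bošković's actual curve oscillates through several sign changes; the sketch below is a deliberately minimal stand-in with just one sign change: repulsive at short range, attractive at long range, and tending to an inverse-square attraction as distance grows. The functional form and constants are chosen for illustration only.

```python
# Toy force law in the spirit of Bošković's curve: a short-range
# repulsive term plus a long-range inverse-square attraction.
# F > 0 means repulsion, F < 0 means attraction.
def force(r, a=1.0, g=1.0):
    return a / r**4 - g / r**2

for r in (0.5, 1.0, 2.0, 10.0):
    print(f"r={r:>5}: F={force(r):+.4f}")

# At large r the inverse-square term dominates, so the ratio to
# Newton's law approaches 1: the 'limiting case' Bošković described.
ratio = force(100.0) / (-1.0 / 100.0**2)
print(f"ratio to inverse-square at r=100: {ratio:.4f}")
```

Note how at r = 1 the toy force vanishes: an equilibrium separation below which points repel and above which they attract, which is exactly the kind of behavior that makes impenetrability an effect of the force law rather than a property of solid bodies.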
Apparently, Werner Heisenberg, of uncertainty-principle fame, even said in 1958 that “the remarkable concept that forces are repulsive at small distances and must be attractive at greater distances has played a decisive role in modern atomic physics. (…) Bohr’s quantum theory of the atom can be precisely related to this concept, and the study of the atomic nucleus during the last thirty years has taught us that the particles that constitute the nucleus, protons and neutrons, are bound together by precisely such a force.”

However, we should not exaggerate either. As Borges said of Kafka, authors create their own precursors. That is, as Heisenberg himself put it, Bošković’s work “contains numerous ideas that have only achieved full expression in modern physics in the last fifty years.” They were brilliant intuitions that are fully understood in the light of quantum physics, not seeds that logically contained all of 20th century physics within them.

A very common mistake. Too common, in fact. We tend to approach history from what we already know, and there, of course, the similarities shine in the middle of the night. The reality is that what we see are usually ‘pareidolias’: things that say more about us and how our brains work than about what happened in the past.

Image | Xataka

In Xataka | One of the greatest philosophers of the 20th century already identified the problem of Generation Z: “Not tolerating boredom”
