Fed up with paying almost 8 euros for a Guinness, someone thought of setting up an index to find cheap beer

How delicious that first beer right after leaving work or after a paddle tennis game, and how infuriating to discover the price has gone up. Matt Cortland paid €7.80 for a pint of Guinness in Dublin in March 2026 and didn’t like it one bit (the price, not the beer). So instead of berating the waiter or posting an angry Google review like some people do, he adopted another strategy, slightly more laborious but much more effective judging by the results: a very complete price index showing where to drink the best pint and at what price. Because revenge, like beer, is best served cold. The project. It’s called Guinndex and it is independent of the famous Irish beer brand. You go to the website, type a pub, a city, a county or a postcode into the box, and it returns pubs and the cost of a pint, along with useful information such as location and rating. Or you zoom in on a traffic-light map to see which taverns look cheaper than others. A good way to save if you travel to Ireland and fancy a pint of Guinness. It even has very diverse rankings, from how long you have to work to earn a pint (depending on salary) to pubs named after animals or the best pub names (praise be the “Hairy Lemon”). Today it has almost 6,500 registered pubs across the island’s 32 counties and almost 1,300 verified prices, a number still rising thanks to anonymous contributions from users. The price index for Dublin | Guinndex. Why it matters. Because the Irish Central Statistics Office stopped tracking the price of a pint in 2011, leaving a data gap of more than a decade in a country where Guinness is much more than a beer. And although Guinness is almost a religion in Ireland, the problem is universal: no one knows for sure whether they are being overcharged relative to the going rate, or by how much. The Guinndex fills that gap with real, verified data, not estimates.
Furthermore, it does so publicly and for free, providing an objective reference so that consumers have information and can put pressure on prices. It’s the market, friend. Beyond the anecdote of finding where to drink cheaper, what it shows is relevant: the cost of executing a complex idea has plummeted so much that a single developer can now build a project of this magnitude in just 48 hours, when it used to take weeks of work, a budget and a team. Context. Matt Cortland likes AI, data and Guinness, as he himself admits on the project website. He is an American engineer based in London with strong ties to Ireland: his partner is Irish, and he lived and trained there on a George Mitchell scholarship while completing the Creative Digital Media master’s degree at TU Dublin. He is not just a tourist being fleeced. The project arrived at a critical time: Diageo, the company that owns Guinness, had applied several price rises in a row, and some pubs had taken the opportunity to inflate margins. If you’re not careful you can pay up to €11 for a pint, although the average price is €6.94 in Dublin and €6.06 nationwide. How he did it. With an AI agent named Rachel that sounded human, understood Irish humour and spoke with a Northern Irish accent (after several tests, he concluded that this worked best), as its author explains. The task was simple and quick: call, ask the price of a pint of Guinness, say thank you and hang up. Few people noticed it was a chatbot, and the responses were of all kinds, including bartenders who offered to buy her a round. Over the St. Patrick’s weekend the agent called 3,000 pubs, more than 2,000 calls were answered, and more than a thousand pubs provided a price: the Guinndex base was ready.
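The extraction step that Cortland delegated to Claude can be pictured with a much cruder stand-in: a regex that pulls the first euro amount out of a call transcript. This is an illustrative sketch only, not the project's actual code; the function name and the transcript strings are invented.

```python
import re

def extract_pint_price(transcript):
    """Return the first euro price mentioned in a call transcript,
    or None if no price is found. A deliberately crude stand-in for
    the LLM-based extraction the real pipeline used."""
    # Decimal amounts, including the European comma form: "6.50" or "6,50"
    m = re.search(r"(\d{1,2})[.,](\d{2})\b", transcript)
    if m:
        euros, cents = m.groups()
        return int(euros) + int(cents) / 100
    # Whole-euro amounts: "7 euro" / "7 euros"
    m = re.search(r"(\d{1,2})\s*euros?\b", transcript, re.IGNORECASE)
    if m:
        return float(m.group(1))
    return None

print(extract_pint_price("Ah sure, a pint of Guinness is 6,50 here"))  # 6.5
print(extract_pint_price("That'd be 7 euro, love"))                    # 7.0
print(extract_pint_price("We only do Murphy's, sorry"))                # None
```

In the real project an LLM handles the messy phrasing a regex cannot ("seven eighty", "same as last year"); the sketch only illustrates where that extraction step sits in the pipeline.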
The technical stack was as standard as they come: the Google Maps API, ElevenLabs for the voice and agent logic, Twilio for making the phone calls, and Claude for extracting Guinness prices from the transcripts. Cortland explains that what cost him the most was time, since he only invested about 200 euros. The consequences. The most immediate impact is behavioral: Cortland recounts that the owner of one pub lowered the price of his Guinness by €0.40 and then updated the information in the Guinndex himself. When price transparency exists and is available to everyone, it can change behavior. The bigger consequence, however, is the technological moment we live in: three APIs, 200 euros and a weekend are enough to build a project from scratch with real utility, one that is already moving prices. The bottleneck is no longer money or infrastructure: it is knowing which problem is worth solving. In Xataka | Spain can tell itself as many times as it wants that it hates Cruzcampo. The figures say a very different thing. In Xataka | We humans like beer. The big question is whether we like it enough to have invented agriculture. Cover image | Guinndex and Christopher Zapf

We thought that AI was going to collapse the electrical grid. The solution is to “unplug” it 18 days a year

Daily headlines bombard us with Artificial Intelligence’s insatiable hunger, painting a future in which data centers devour our infrastructure. Reality, however, hides a fascinating irony: the same technology that clogs the cables today could be our greatest ally. According to Deloitte estimates, AI will optimize global systems enough to save more than 3,700 TWh by 2030, almost four times the energy consumed by all the data centers on the planet combined. But to get there, you first have to turn the machines on today. And the solution is surprisingly analog. Paweł Czyżak, of the analysis center Ember and one of the most authoritative voices on the European energy transition, sums it up with a simple idea: a data center does not need to operate at full power every hour of the year. Faced with system collapse, the industry’s new survival dogma is clear: “connect now and operate flexibly.” The heart attack of the grid. We have been victims of what was once defined as the “tyranny of 24/7”: algorithms do not sleep and demand uninterrupted supply. This voracity has caused a heart attack in Europe’s traditional data epicenters (the “FLAP-D” markets: Frankfurt, London, Amsterdam, Paris and Dublin), almost completely paralyzing new deployments. The bottleneck is no longer latest-generation microchips; what is missing are transformers and free electrons. To this physical collapse add the bureaucratic one. The European University Institute (EUI) warns that connection queues are a critical funnel: in countries such as the United Kingdom or Italy, the capacity requested exceeds peak national demand by more than 10 times. All of this is aggravated by speculative “zombie” projects that block entry for legitimate developers. The obstacles, as detailed in the recent study by Camus, encoord and Princeton’s ZERO Lab, form a double wall: there is a lack of cables for day-to-day operation and a lack of clean capacity built to provide backup.
Flexibility as a lifesaver. Is it possible to “turn off” part of the AI brain without the system crashing? Yes. A recent trial led by Nebius, Emerald AI and National Grid showed that an AI cluster could cut its consumption by 30% in just 40 seconds to relieve the grid, keeping critical tasks intact. Google already boasts of having reached 1 GW of “demand response” by combining batteries and the ability to move loads between regions. As Czyżak explains, moving just 5% of the load (the equivalent of a few critical hours per year) unblocks the grid massively. In fact, this strategy would save more natural gas than a country like Denmark consumes in electricity generation, by sparing utilities from firing up expensive and polluting combined-cycle plants to cover demand peaks. For its part, the Camus and Princeton report proposes scaling this with two mechanisms. Flexible connections: the center operates normally 99% of the time, but during the scarce 40 to 70 hours a year of extreme grid saturation it reduces its computing or draws on its own batteries. BYOC agreements (Bring Your Own Capacity): big tech finances its own clean generation capacity instead of waiting for the state to modernize the infrastructure. The combination is magical: it cuts the wait to connect to the grid from 7 years to just 2. For a technology company, that means starting to bill three years earlier, generating net returns of between 1 and 4 billion dollars per site. The citizen will not pay the bill. On the social level, the transition to this flexible model brings excellent news for the average citizen. Princeton ZERO Lab’s detailed modeling confirms that a flexible data center (under BYOC schemes) absorbs practically all of the incremental costs it imposes on the electrical system. In other words, the billions needed to host the cloud will not be passed on to household electricity bills.
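The “flexible connection” arithmetic above is easy to check. The sketch below uses illustrative numbers only: 60 stress hours falls inside the 40-70 hours per year the Camus/Princeton report cites, and the 30% cut matches the Nebius/Emerald AI trial. It computes what fraction of annual energy a data center actually gives up.

```python
def curtailed_energy_share(stress_hours=60, cut_fraction=0.30, total_hours=8760):
    """Fraction of a data center's annual energy sacrificed under a
    flexible connection: full power all year, except during a handful
    of grid-stress hours when load drops by cut_fraction."""
    # Normalized units: 1 energy unit per hour at full power
    lost = stress_hours * cut_fraction
    return lost / total_hours

share = curtailed_energy_share()
print(f"{share:.2%}")  # about 0.21% of annual compute
```

Even the worst case in the report’s range (70 hours at a 30% cut) forgoes under a quarter of a percent of annual compute, which is why trading it for a connection queue five years shorter looks so lopsided.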
On the contrary, by making the most of the existing grid instead of building massive new lines, the fixed costs are spread among more actors. In Spain, bodies such as the CNMC are already applying “flexible access permits,” requiring by law that operators accept controlled cuts in emergencies to protect the stability of the system. The plug that will rule the world. In the frenetic geopolitical and business race to dominate the future of Artificial Intelligence, the narrative has changed. It is no longer enough to design the fastest microchip or employ the most brilliant engineers: today absolute victory belongs to whoever has a free plug. But rather than desperately burning gas or waiting a decade for governments to bury thousands of kilometers of copper, the industry has found a pragmatic way out. Demand flexibility from Big Tech not only lets them switch on their servers years earlier; it protects citizens’ bills, squeezes more out of 20th-century infrastructure and banishes the dangerous ghost of a Europe forced to relapse into its old addiction to fossil fuels. Image | Photo by Scott Rodgerson on Unsplash. In Xataka | There is no energy for so many data centers and the consequence is clear: half of those planned for 2026 in the US are in danger

We thought that the price of World Cup tickets in the US was going to be the biggest nonsense. Wait until you travel by train

The World Cup is a universal spectacle, but prices during this summer’s tournament in North America (United States, Mexico and Canada) will not exactly fit every budget. Especially if you want to enjoy the final, which will be played on July 19 at MetLife Stadium in New Jersey. And not just because the tickets sell at exorbitant prices: the region’s public transportation operator has revealed that round-trip train tickets between Manhattan and MetLife will cost $150. That decision has already generated intense controversy. What has happened? The celebration of the World Cup in the US is being marred by the enormous cost it will impose on fans. Until now we knew that the privileged few who want to follow the matches inside the stadiums will have to pay stratospheric sums for tickets, especially for the final at MetLife Stadium at the end of July. That was relatively predictable. Now we know something more: even the public-transport ride to the stadium will be priced like gold. Are they that expensive? Yes. A week ago The New York Times reported that round-trip tickets to MetLife from New York’s Pennsylvania Station would cost more than $100, although the public transportation operator, NJ Transit (NJT), was reluctant to confirm the information. The mystery did not last long. On Friday, when announcing its mobility plan for the World Cup, the company revealed (almost in passing) that the Times’ leak had fallen short. “Non-transferable, non-refundable, round-trip train tickets will go on sale exclusively to ticket holders on May 13 through NJ Transit for $150,” stated the operator when detailing the transportation services that will connect MetLife Stadium, temporarily renamed New York New Jersey Stadium to conform to FIFA’s sponsorship policy.
In the same statement NJT explains that round-trip bus tickets (also non-transferable and non-refundable) will sell for $80. Is it more expensive than normal? A lot more. NBC News recalled these days that a round-trip ticket to MetLife Stadium usually costs $12.90, so the fare for those who want to take the train on the day of the final will be about 11 times higher than normal, far above what fans traveling between Penn Station (New York) and MetLife pay to enjoy NFL Jets or Giants games. And although the price of bus tickets will also quadruple in Boston, where four matches will be played, there have been international competitions in which fans with tickets could use public transport for free. In the case of the US, The Wall Street Journal recalls that the original 2018 pact between host cities and FIFA included free transportation, but the requirement was relaxed a few years ago. Now fans must pay $150 for a trip that takes less than half an hour by car. Has it generated controversy? Yes. Because of the amount itself ($150), but also because the NJT plan does not include reduced fares, which means that children and seniors will have to pay the same as everyone else. It matters because MetLife Stadium will host a total of eight World Cup games featuring, among others, the teams of Brazil, France, Germany and England, and among those events is the most significant of all: the final. Those who want to skip the train or bus and drive to MetLife will not have it easy either. The World Cup will considerably cut parking availability in the area, which explains, among other things, why passes to park at a nearby shopping center lot are being offered for $225, as NBC News has revealed. Why does it go up so much?
That question connects directly with the political debate that has broken out in New Jersey around the World Cup, its cost to the public coffers and the return it will bring the region. Governor Mikie Sherrill (Democratic Party) says she “inherited” an agreement under which FIFA “does not contribute a single dollar” toward transportation, and warned that NJ Transit will be forced to foot “a bill of 48 million dollars” to move the tens of thousands of fans who will come to watch the games. MetLife Stadium seats more than 80,000 spectators, and Sherrill’s message, like the one NJT conveyed to The New York Times, is clear: “The cost of the eight matches will not be borne by our regular public transport users.” In other words, the fans (if not FIFA itself) are first in line to pay for the transportation the competition requires. Sherrill’s position has caused tensions with the federation, which warns of the “deterrent” effect the train fares will have and points out that MetLife has hosted other macro-events without the organizers having to pay for transportation. The debate has also touched on the income FIFA will earn from the tournament and the return for the US. Is it just transportation? The truth is that no. The transport controversy adds to another that goes back a long way: the price of tickets to the World Cup matches themselves. A few weeks ago FIFA made headlines because tickets for the final were selling for up to $10,990. Not only are these astronomical figures that threaten to become “the most expensive in history,” as the BBC warns; they also far exceed those of a few months ago. In March, after the president of FIFA acknowledged that prices could “go up or down according to demand,” the OCU denounced the use of “dynamic pricing.” The fares have already put Euroconsumers on alert. Images | …
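The fare multiple quoted above is trivial to verify from the two figures in the text ($150 for the World Cup round trip versus the usual $12.90); a one-line sanity check:

```python
def fare_multiple(event_fare, regular_fare):
    """How many times the regular price an event fare represents."""
    return event_fare / regular_fare

train = fare_multiple(150, 12.90)
print(f"World Cup train fare is {train:.1f}x the regular fare")  # 11.6x
```

The exact ratio is about 11.6, which the article rounds down to “11 times higher.”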

We thought that Voyager 1 had already given everything it could. NASA continues to turn off parts to keep it alive

Some 25 billion kilometers from Earth, Voyager 1 continues to send us data from interstellar space, farther than any other craft built by humanity. The probe was launched in 1977 and, almost half a century later, it remains operational in an increasingly delicate condition: to keep it alive, the mission team is shutting down parts of the spacecraft itself. That is exactly what has just happened with one of its scientific instruments, in a maneuver that reveals the delicate moment the mission is going through. The maneuver. On April 17, engineers at the Jet Propulsion Laboratory in Southern California sent the command to switch off the Low-Energy Charged Particle experiment, better known as LECP. It is an instrument dedicated to measuring low-energy charged particles, including ions, electrons and cosmic rays, from both our solar system and the galaxy. The decision was not improvised: according to NASA, this instrument was next in the shutdown order agreed years ago by the science and engineering teams to cut consumption without terminating the mission. There are no solar panels. To understand why NASA has reached this point, we have to look at how Voyager 1 is powered. The probe does not run on solar panels but on a radioisotope thermoelectric generator, which converts the heat released by the decay of plutonium into electricity. This system has sustained the mission for decades, but its capacity is not infinite. According to NASA, both Voyager 1 and Voyager 2 lose about 4 watts of power per year, a small loss on paper, but decisive when you have been managing every watt with extreme care for almost half a century. The scare that accelerated the decision. Although the shutdown of the LECP was part of a previously defined roadmap, a recent episode forced the team to move more carefully. During a routine roll maneuver on February 27, Voyager 1’s power levels dropped unexpectedly.
The US agency explains that any further drop could trigger the ship’s undervoltage protection system, designed to disconnect components on its own to protect the craft. A calculated “pruning”. The shutdown sequence was decided long ago, in joint conversations between those who design the scientific side of the mission and those who keep it technically alive. Of the 10 instruments each Voyager carried, seven have already been turned off. Moreover, the LECP will not be completely disconnected: the small motor that lets the sensor rotate to scan in all directions will remain on, because it consumes barely 0.5 watts and keeps open a remote option to reactivate the instrument later. The plan that comes next. With this shutdown NASA does not consider the issue closed; rather, it buys time to attempt a deeper intervention. According to the agency, switching off the LECP should give Voyager 1 about a year of respite. During that time, engineers want to complete a more ambitious energy adjustment for the two probes, dubbed “Big Bang”: the idea is to change several energy-consuming devices at once, turning off some and replacing others with lower-consumption alternatives, to conserve the necessary heat and keep the scientific instruments operating for as long as possible. When will the maneuver be attempted? NASA will first test this configuration on Voyager 2, which is closer to Earth and has slightly more power. The tests are planned for May and June 2026 and, if they go well, the team will try to apply the same maneuver on Voyager 1 no earlier than July. Images | NASA. In Xataka | The paradox of artificial gravity: Einstein told us how to do it, engineering tells us it is almost impossible
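The power budget driving these shutdowns can be sketched from first principles. The snippet below models only the radioactive decay of the Pu-238 fuel (half-life 87.7 years); the ~470 W launch output is the commonly cited combined figure for Voyager’s three RTGs, not a number from this article, and real output falls faster because the thermocouples that convert the heat also degrade.

```python
PU238_HALF_LIFE_YEARS = 87.7   # plutonium-238, the RTG heat source

def rtg_power(initial_watts, years_since_launch):
    """Electrical output predicted by fuel decay alone (exponential
    decay with the Pu-238 half-life); thermocouple wear is ignored."""
    return initial_watts * 0.5 ** (years_since_launch / PU238_HALF_LIFE_YEARS)

now = rtg_power(470, 48)                        # ~48 years after the 1977 launch
drop = rtg_power(470, 48) - rtg_power(470, 49)  # loss over the next year
print(f"{now:.0f} W now, losing {drop:.1f} W/year to decay alone")
```

Under these assumptions, fuel decay accounts for only around 2.5 of the roughly 4 watts NASA says each probe loses per year; the remainder comes from degradation of the thermoelectric converters, which the sketch deliberately leaves out.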

We thought we were 8 billion people on the entire planet. Until some researchers started crunching the numbers

In November 2022 the UN celebrated that we had reached 8 billion humans on Earth. These are estimates, of course, but beyond the figure, the really interesting part is that in 2023 we fell below the replacement rate and that humanity will peak toward the end of the century before, inevitably, starting to decline. But to what extent can we trust those counts? The question has been on the table for some time and, according to a 2025 study, we have been counting wrong. So wrong that we have left several hundred million people out. Can we trust the numbers? “Calculating the number of people on the planet is an inexact science.” That was demographer Jakub Bijak’s comment to the BBC in mid-2024, just as the World Population Prospects study came out. He added that the only thing you can be sure of when predicting population figures is the lack of certainty. That, mind you, does not mean demographers pull figures out of thin air. “It is a difficult thing based on our experience, knowledge and every piece of information we have access to,” said Toshiko Kanera, an expert in demographic forecasts. Demographers draw on the data and trends of each country since 1950, but what if the counting had not been done correctly? We are missing millions. In a 2025 study published in Nature, researchers at Aalto University in Finland show how the data sets demographers rely on “profoundly and systematically” underestimate population figures around the world. The serious part is that we would be talking about hundreds of millions more people living on Earth. Example of the tools demographers use in their analysis; each corresponds to a different bias. Rural areas. Josias Láng-Ritter is one of the researchers in charge of the study and he points to the counts carried out in one specific segment: the rural population.
“For the first time, our study provides evidence that a significant proportion of the rural population could be missing from global population data sets,” he notes. As we said, we are not talking about a few million, but billions. “Depending on the data set used, rural populations have been underestimated by between 53% and 84% over the period studied. The results are notable, since these data sets have been used in thousands of studies and have widely supported decision-making, yet their accuracy had not been systematically evaluated,” the researcher comments. The map shows the location of the 307 rural areas analyzed in the study; the populations reported there were found to be underestimated by between 53% and 84% | Aalto University. Biases. Attempts to review this data are not new, but previous research had focused on specific countries or urban areas. The Aalto University researchers wanted a more global picture, comparing the five most widely used population data sets in the world. They used maps that divide the planet into high-resolution grids and took something very specific as a reference: resettlement figures from more than 300 rural dam projects in 35 countries. Why dams? Because when a dam is built, the population living in the area to be flooded is relocated, and accurate resettlement data is usually available. Comparing that population data from 1975 to 2010, the researchers found that the 2010 maps were more accurate, but still left out between 32% and 77% of the rural population. Between 2015 and 2020 the data sets were updated, but the researchers still believe the underestimation of the rural population persists, and that it is a problem present in every region of the world.
Governments lack the resources to collect accurate data in these rural regions, so there is a huge discrepancy between the real population and the one reported on the population maps used for demographic studies, and that influences decision-making. Average percentage of rural population underestimated (red and orange) and overestimated (blue) | Aalto University. And it matters. Current estimates place 43% of the world’s 8.2 billion inhabitants in rural areas, about 3,526 million people, and if that percentage has been underestimated by between 53% and 84%, we are not exactly talking about a small population. Knowing how many we are is essential for a simple reason: the distribution of resources. No data. The lack of accurate demographic records can affect political decision-making. Láng-Ritter gives the example of social decisions: “In many countries there may not be enough data available at the national level, so they rely on global population maps to support their decisions: Do we need a paved road or a hospital? How much medicine is needed in a specific area? How many people could be affected by natural disasters like earthquakes or floods?” he says. Doing quick math, in the best scenario, that of a 53% deviation in the rural population, we would be talking about 1,869 million people who were never counted. In the worst case, that of the 84% not registered, about 2,962 million people. In the Nature study they give the example of Paraguay, whose 2012 census may have left out a quarter of the population. Reviewing the methods. In the team’s analysis some countries fare better than others. They point to Finland as an example of reliable data, even in rural regions, because it began keeping digital population records 30 years ago.
However, in countries where such thorough digital registration has taken longer to implement due to crises of one kind or another, the differences between the real population and the estimated one can be significant. “To provide rural communities …”
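The article’s quick math is easy to reproduce. The sketch below uses the figures quoted above and reads “underestimated by X%” the way the article does: the missing people amount to X% of the reported rural count.

```python
def missing_rural_millions(world_pop=8200, rural_share=0.43,
                           under_low=0.53, under_high=0.84):
    """Range of uncounted rural people, in millions, implied by the
    53-84% underestimation found across the 307 dam-resettlement areas."""
    rural = world_pop * rural_share            # ~3,526 million people
    return rural * under_low, rural * under_high

low, high = missing_rural_millions()
print(f"between {low:,.0f} and {high:,.0f} million people uncounted")
```

The result matches the 1,869 and 2,962 million figures in the text; whether that is the right reading of “underestimated by X%” is a separate methodological question the article does not settle.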

If you thought the crisis in Hormuz was bad enough, the war in Ukraine has triggered another maritime drama in Europe: the Gulf of Finland

About five years ago the container ship Ever Given became stuck in the Suez Canal for six days, blocking one of the most important commercial arteries in the world and leaving hundreds of ships trapped and waiting. That incident, caused by a failed maneuver and adverse wind conditions, was enough to disrupt global supply chains in a matter of hours. A new sea front. While global attention focuses on the Strait of Hormuz, the war in Ukraine has opened another critical scenario much closer to Europe: the Gulf of Finland, a small but key space for Russian energy exports. There, far from spectacular drones or large fleets, the conflict manifests itself in a quieter but just as revealing way, with ships detained, routes blocked and growing tension between actors trying to avoid a direct escalation. This new focus shows that the war is being fought not only on the land front but also at the nerve centers of maritime trade. Ukraine attacks, and a collapse. The situation originates in a clear Kyiv strategy: hit the ports that are key to Russian oil exports, such as Ust-Luga and Primorsk, which generate a fundamental part of the income that finances the war. The attacks have drastically reduced the operational capacity of these facilities, leaving whole days without activity and causing an immediate chain effect. The result: an unprecedented maritime traffic jam, with dozens of oil tankers (many of them linked to Russia’s so-called “shadow fleet”) piling up while they wait to load. A system at the limit. Politico noted this week that this traffic jam in the Gulf of Finland is not just a striking image but a symptom of something deeper: an energy and logistics system beginning to fracture under the pressure of war. Unlike conventional vessels, these tankers cannot easily be redirected to other ports because of the risk of being detained or sanctioned, which forces them to remain anchored for days or weeks.
As a result, there is an unusual concentration of aging and, in many cases, unsafe ships in European waters that were not prepared to absorb that volume. Europe trapped between control and escalation. In this scenario, countries like Estonia and Finland are in a particularly delicate position: despite being within the NATO framework, they have chosen not to intervene directly against these ships. The reason is clear: any attempt to stop or board an oil tanker could trigger a Russian military response, as already happened when a Russian fighter intervened to protect one of these vessels. Since then, Moscow has reinforced its naval presence in the area, making it clear that it considers these strategic routes a red line. The mirror of Hormuz. What happens in the Gulf of Finland connects directly with the crisis in Hormuz: in both cases the war moves toward maritime chokepoints where control of traffic becomes a strategic tool. The difference is that here there is no formal blockade but an indirect disruption that generates similar effects, with stopped ships, tense routes and rattled markets. In both scenarios, it is enough to interfere just enough to collapse the system, without the need for a total shutdown. A war that spreads across the map. The result is a conflict no longer limited to Ukraine or the Middle East, but one that extends to the critical nodes of global trade, affecting Europe directly. The Gulf of Finland has thus become another hot spot where energy, legal and military interests intersect in an extremely fragile and volatile balance. And what seemed like a localized war is proving to have a much greater reach, generating new sources of tension that, as in Hormuz, can escalate quickly and without warning.
Image | LAC, NormanEinstein In Xataka | If fog was deadly in Ukraine’s winter, spring is offering Russia a key advantage: greenery In Xataka | Ukraine is close to what no one has achieved in a war: shooting down missiles for less than a million dollars

Booking has been hacked. If you thought phishing was dangerous, wait until you see the follow-up phishing attacks

Yesterday’s Basic-Fit hack was not the only relevant cybersecurity event of recent days. Last weekend several Booking users received emails with far-from-reassuring content: in these messages the company reported that a cyber attacker could have had access to information about their reservations. On Monday Booking confirmed that the security breach existed, but it has not given many details about the problem. Your name and reservations were leaked; your card details were not. The information accessed by the attacker(s) includes names, email addresses, phone numbers and booking details. However, Booking has stressed that users’ financial data was not part of this unauthorized access, and neither were users’ home addresses. To mitigate possible problems, the company forced a reset of the backup PINs of all affected reservations, both active and past. Too many unknowns. Although it has confirmed the incident, Booking has not clarified it further, and it is not clear whether its systems were hacked directly or the problem occurred through other means. There are also no details on the number of users affected, nor on whether the problem is global in scope or limited to certain countries or regions. Booking has said it will inform affected users individually, without giving figures. According to its own website, Booking manages hundreds of millions of reservations a year and is estimated to have about 135 million users of its mobile app. Phishing attacks have already started. These kinds of data thefts are exploited for massive phishing campaigns, and it appears such attacks have already begun. At least one user reported on Reddit having received a suspicious WhatsApp message containing details of his reservation and personal information. That seems to confirm that the attackers were already using the stolen data to deceive customers before the public announcement. But beware of “follow-up” phishing.
But in this case the risk is somewhat greater, because this is exactly the kind of platform from which we are not surprised to receive messages tracking a reservation (in the style of “Only one week left until your trip!”). Attackers can now fraudulently generate precisely these kinds of smishing messages, leveraging the reservation data they have extracted to appear legitimate. If you are a Booking customer with a pending reservation, be especially careful if you receive one of these follow-up messages.

It’s not the first time. In 2021, Dutch regulators fined Booking.com €475,000 after a hack exposed the data of more than 4,000 customers, including credit card information in some cases. On that occasion, Booking notified the Dutch authorities of the cyberattack 22 days late, well beyond the 72-hour limit required by the GDPR, which is why the company was fined. In June 2024, the platform itself warned that phishing attacks against its clients had increased by 900% thanks to the use of AI. The company has reported the new security breach to the Dutch authorities, but it remains to be seen whether it again took too long and could face further fines.

What to do if you are a Booking user. In theory, nothing, if you have not received an email from Booking.com notifying you of the problem. If you do receive one, it is important to distrust any message, call or WhatsApp that mentions details of your reservation, even if it seems legitimate. Attackers may have data about your reservations and may be using it to deceive you. Do not provide your financial data through any channel other than the platform’s official website or app. The leaked data can also be used for phishing attacks impersonating other services that use your name or email, since this kind of information is usually sold on and reused by other groups running massive phishing campaigns.

In Xataka | A family paid 1,800 euros for a tourist house in Galicia.
Upon arrival there was no house and no response on Booking
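The advice above boils down to one rule: a message knowing your reservation details is not proof of legitimacy, because the attackers have those details too. As a purely illustrative sketch (the single-domain allow-list and the `suspicious_links` helper are assumptions for the example, not Booking’s actual anti-phishing logic), checking where a link actually points catches the typical lookalike domain:

```python
import re
from urllib.parse import urlparse

def suspicious_links(message: str) -> list[str]:
    """Return links whose host is not booking.com or a subdomain of it.

    Naive illustration only: the message text may quote your real
    reservation, but the link target is what gives the scam away.
    """
    flagged = []
    for url in re.findall(r"https?://\S+", message):
        host = urlparse(url).hostname or ""
        if host != "booking.com" and not host.endswith(".booking.com"):
            flagged.append(url)
    return flagged

# A lookalike domain that merely *starts* with "booking.com" is flagged:
msg = "1 week left for your trip! Confirm at https://booking.com-verify.example/pay"
print(suspicious_links(msg))
```

A real subdomain such as `https://secure.booking.com/help` would pass the check, while `booking.com-verify.example` fails it, since the hostname comparison looks at the registered domain rather than the start of the string.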

We knew that lynxes were smart, but not that smart. Five females from Toledo have just rewritten what we thought we knew about wild felids

Science works like this: one day a member of the Hunting Resources Research Institute (IREC) is reviewing camera traps, and the next, that research team is rewriting many of the things we thought we knew about terrestrial carnivores. All thanks to a handful of mothers taking care of their young.

What has happened? As I said, a team led by IREC has used camera traps to document for the first time how female Iberian lynxes deliberately immerse freshly hunted rabbits in pools of water before giving them to their young. It may seem like a mere ethological curiosity, but we are talking about the first known case (eight different events, at five different water troughs) of deliberate manipulation of prey with water by wild felids. A complex cognitive behavior we did not think was possible. And it is striking because it is not a quick dip, nor anything subtle or easily confused with something else: the lynxes keep the prey submerged for more than 60 seconds without letting go, and they do it, clearly, on purpose.

Why do they do it? The truth is that we don’t know. The researchers suggest the females could be using the rabbits as a vehicle to carry water to their young during especially hot periods. It should also be noted that the cubs have only just been weaned at that time of year. But, again, we don’t know for sure.

Why is it important? Until now we had found many cases of animals washing their food in water (Japanese and Thai macaques, great apes in captivity, wild boars and cockatoos), but all in omnivorous or frugivorous species that used this manipulation to remove sand and dirt. We had never seen a carnivore do it. And the interest goes beyond that: the finding not only challenges the idea that terrestrial carnivores capture and cache their prey without manipulating it; it also questions the idea that solitary lynxes have little capacity for social transmission.
This finding suggests the opposite: that there is something we could call a “lynx culture”. The things that set us apart keep shrinking.

We know so little… That is the main conclusion of the series of studies this team is developing in the Montes de Toledo: although we have been living alongside these animals and plants for centuries, there are many things (too many) that we still do not know. Above all when they have to do with this: with animals that keep getting closer to what we have, for years, called ‘humanity’.

Image | Wildlife Ecology and Management Research Group of the Hunting Resources Research Institute

In Xataka | The question is no longer whether reintroducing the lynx in Aragon makes sense: it is what we are going to do to stop the rabbits

We thought that quenching hunger with Ozempic was the definitive remedy for obesity. Then we looked at the muscle

The revolution of drugs such as Ozempic, Wegovy or Mounjaro has undoubtedly marked a before and after in the approach to obesity, which previously went to the operating room when measures focused on lifestyle changes failed. A priori, we are sold the idea of a very pronounced weight loss, but the reality is that many patients are falling into protein malnutrition and losing a large amount of muscle mass.

We are getting more data. We do not know drugs completely when they begin to be marketed; as patients use them, new side effects emerge, along with situations that pharmaceutical companies had not initially imagined. Here, a revealing new study, to be presented at the European Obesity Congress, has put figures on the malnutrition effects that accompany successful treatments. And all of it indicates that the strategy followed with Ozempic must change, so that doctors can give precise indications on the nutritional plan to follow during treatment and avoid serious health problems.

Not having an appetite is bad. A priori, those who take Ozempic want to have less desire to eat, to the point where eating practically becomes an obligation just to survive. But the problem is that people end up having too little hunger, as the new study has shown. It analyzed over 5,700 days of nutritional data from 332 overweight adults between July 2025 and 2026.

What was seen. Of all of these, the 116 users taking drugs like Ozempic consumed drastically fewer calories than the control group, which was to be expected; the most important finding, however, was that protein intake plummeted when the drugs were taken. Specifically, medicated patients consumed an average of 53.8 grams of protein per day; adjusted for their body weight, that amounts to just 0.6 g/kg/day.
To put it in perspective, 88% of these patients fell well below the official recommendation of 0.8 g/kg/day of protein, and far from the optimal levels for preserving muscle during weight loss.

The reason. Lack of hunger literally makes people skip meals, simply because they do not feel the physiological need to put anything in their mouths. The study found that patients taking Ozempic or similar drugs skipped 40.4% of dinners, 31.3% of breakfasts and 30.5% of lunches. By reducing eating to a few moments of the day, it becomes almost impossible to reach the roughly 25 grams of protein per meal that the body needs to synthesize new muscle and maintain the structure it has.

It has consequences. In medicine, the loss of muscle mass is called sarcopenia, and until now it was mainly associated with people who were not physically active, such as the elderly or the bedridden. Scientific reviews suggest that between 25% and 40% of all the weight lost by users taking Ozempic is muscle: roughly speaking, for every two kilos of fat lost, one kilo of muscle goes with them. In older adults or patients with type 2 diabetes, the situation is even more serious, since high doses of semaglutide accelerate sarcopenia, degrading metrics vital for longevity and quality of life such as grip strength or walking speed. In addition, severe calorie restriction brings deficiencies of micronutrients such as vitamin D, vitamin B12 or iron.

How to avoid it. Medical guidelines increasingly point to a personalized approach in which a highly nutritious diet is established to prevent the patient from developing a macronutrient deficiency while losing weight. That is why it is already recommended to increase protein consumption during weight loss to 1-1.6 grams per kilo of body weight, prioritizing a minimum of 20-30 grams at each meal.
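The arithmetic behind these figures is easy to check. A minimal sketch (the ~90 kg mean body weight is inferred here from the study’s own averages rather than stated in it, and `protein_targets` is a hypothetical helper, not a clinical tool):

```python
def protein_targets(weight_kg: float, meals_per_day: int = 3):
    """Daily protein range (1.0-1.6 g/kg, the guideline cited in the
    article) and a per-meal floor of roughly 25 g."""
    daily_low = 1.0 * weight_kg
    daily_high = 1.6 * weight_kg
    per_meal = max(daily_low / meals_per_day, 25.0)
    return daily_low, daily_high, per_meal

# The study reports 53.8 g/day on average, equal to 0.6 g/kg/day,
# which implies a mean body weight of about 53.8 / 0.6 ≈ 90 kg.
weight = 53.8 / 0.6
low, high, per_meal = protein_targets(weight)
print(round(weight), round(low), round(high), round(per_meal))
# For a ~90 kg patient the guideline works out to roughly 90-143 g/day,
# i.e. far above the 53.8 g/day actually observed.
```

In other words, the medicated patients in the study were eating barely more than half of even the lower bound of the recommended range.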
In addition, skipping meals should be off the table: meal times should be well defined, eating even small amounts at different points throughout the day to avoid prolonged fasting.

Maintain muscle. This should be the primary objective, and that is why, along with a protein-rich diet, strength exercise with weights, bands or even calisthenics should be encouraged. This way, the body gets the signal to maintain its muscle mass even as fat is lost due to the medication. The objective is to lose weight without removing the bricks that are literally holding up our body: if we manage to lose weight but are left without muscle mass, quality of life will not be the best.

Images | Haberdoedas, Anastase Maragos

In Xataka | Ozempic’s “great rebound”, in figures: science reveals that the weight returns four times faster than with a diet

Anthropic has become the darling of AI and has sought a partner to guarantee its future. It’s not the one we thought

When we think about the big players in artificial intelligence, we tend to draw pretty clear lines between competitors and allies. Anthropic and Google usually appear on the same board, yes, but as direct rivals that develop their own models and compete for the same ground. So the fact that they now appear linked in the same agreement draws attention from the first moment. The firm led by Dario Amodei has closed an alliance with Google and Broadcom to secure next-generation computing capacity, and that move, beyond the technical side, sends a message that does not go unnoticed.

If we go into the details of the announcement, what is relevant is not only who participates, but the scale of what has been signed. Anthropic speaks of multiple gigawatts of next-generation TPU capacity that it expects to come online from 2027, an infrastructure designed to support its famous Claude models. In its statement it insists that demand from its clients has accelerated this year, and it presents the move as a direct response to that pressure. In fact, it describes it as its biggest bet on computing so far, although Amazon remains its main cloud provider.

The unexpected partner in the battle for computing

The agreement makes a lot of sense if we look at the figures the company has shared. In 2024, it registered annualized revenues above $30 billion and more than 1,000 business customers each spending over $1 million a year, up from more than 500 in February. All of this undoubtedly translates into a greater load on its infrastructure. And that is where this move fits in, not so much as an isolated strategic play, but as a response to that growth. As we can see, the agreement has two distinct pieces. On one side is Broadcom, a semiconductor company that has benefited greatly from the rise of AI.
On the other side is the Mountain View giant, which, in addition to providing infrastructure driven by its focus on TPUs, also competes directly in model development. And that is where the agreement gets interesting, because it mixes technical collaboration with a competitive relationship that already existed.

It is also worth pausing on where Anthropic stands, because it helps to understand why it can close such a deal. The company has been building its position by stepping away from the race for the flashiest features and focusing on the business environment, where security, control and reliability outweigh first impressions. This approach has allowed it to excel in tasks such as programming, with Claude Code, and security, with the new Mythos. And, little by little, it has been earning something that is not achieved overnight: the trust of large companies.

But there is more. Anthropic makes it clear that Claude runs on AWS Trainium, Google TPUs and NVIDIA GPUs, and adds that this variety allows it to improve performance and resilience. That gives us a pretty clear clue about what it is doing now. Rather than betting everything on a single supplier or a single family of chips, it is consolidating a more flexible base to sustain its growth. And in an industry so stressed by hardware demand, that decision makes a lot of sense.

Images | Anthropic

In Xataka | The “token economy” is broken: flat AI programming fees are mathematically unsustainable
