Five technology deals to grab before MediaMarkt’s VAT-free Day ends in a few hours

If you are looking for a good mobile phone, headphones or an e-reader to devour digital books after Christmas, MediaMarkt’s VAT-free Day campaign will be active for a few more hours. It ends tomorrow, January 13 at 9:00 a.m., so in this article we review the five best technology deals available:

- Sony WH-1000XM5SA for 189.26 euros: a very reasonable price for one of the brand’s best headphones.
- Google Pixel 10 Pro for 751.24 euros: the best price the store has offered (without coupons) on this Google phone.
- Kindle Paperwhite for 139.67 euros: Amazon’s eReader with the best quality-price ratio.
- Samsung Galaxy Watch8 for 230.58 euros: a very elegant smartwatch.
- Samsung TQ55S85FAUXXC for 774.38 euros: a TV with a very low price for a set with an OLED panel.

Sony WH-1000XM5SA

If you are looking for good headphones, MediaMarkt has the Sony WH-1000XM5SA right now at one of the best prices we have seen to date. For 189.26 euros, we are talking about a model that offers very good active noise cancellation. They are very comfortable and their battery lasts approximately 30 hours of use with ANC enabled. This version includes a soft carrying case.

The price could vary. We earn commission from these links.

Google Pixel 10 Pro

Google’s mobile phones have been dropping in price in recent months, and now we can find one of the best discounts yet on the Google Pixel 10 Pro. On the VAT-free Day it is available for 751.24 euros, and it is ideal if you are looking for a compact phone with a 6.3-inch screen, a very elegant design and an excellent camera configuration. Google Pixel 10 Pro (128GB).

The price could vary. We earn commission from these links.

Kindle Paperwhite

One of the most popular eReaders on Amazon is the Kindle Paperwhite, since it has the best quality-price ratio, especially when it is on sale. Now, during the VAT-free Day, it is available for 139.67 euros.
It features a seven-inch screen, a good size for reading at home or taking on a trip. It also offers good battery life of up to 12 weeks, its screen is anti-reflective and it is water resistant (IPX8).

The price could vary. We earn commission from these links.

Samsung Galaxy Watch8

If what you are looking for is a good smartwatch, the Samsung Galaxy Watch8 in its 44 mm Bluetooth configuration has dropped to 230.58 euros. It comes with a good assortment of sensors to monitor physical activity, has 32 GB of internal storage and runs Wear OS. In addition, it incorporates a pair of side buttons, includes Google Gemini and has more than 100 sports modes. Samsung Galaxy Watch8 (44mm).

The price could vary. We earn commission from these links.

Samsung TQ55S85FAUXXC

Today we can find some televisions with OLED panels for less than 1,000 euros, as is the case with the Samsung TQ55S85FAUXXC, which on MediaMarkt’s VAT-free Day has dropped to 774.38 euros. It is a 55-inch smart TV that incorporates an anti-reflective panel with OLED technology, in addition to a 100 Hz refresh rate, compatibility with HDR10+ and Dolby Atmos, Alexa and HDMI 2.1.

The price could vary. We earn commission from these links.

Some of the links in this article are affiliate links and may provide a benefit to Xataka. Offers may vary depending on availability.

Images | MediaMarkt and Compradicción (header), Sony, Google, Amazon, Samsung

In Xataka | The best mobile phones: we have tested them and here are our reviews

In Xataka | The best smartwatches (2026): our reviews and videos are here
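A footnote on the arithmetic behind these deals: Spain’s standard VAT rate is 21%, so a “VAT-free” price corresponds to dividing the regular price by 1.21. The sketch below is ours, not MediaMarkt’s pricing logic, and the round reference prices in the comments are our own back-calculation from the article’s figures:

```python
# Illustrative sketch (not from the article): a "VAT-free Day" price in
# Spain corresponds to removing the standard 21% VAT from the regular price.
VAT_RATE = 0.21

def vat_free_price(regular_price: float) -> float:
    """Return the price with Spanish VAT (21%) stripped, rounded to cents."""
    return round(regular_price / (1 + VAT_RATE), 2)

# The article's headline prices match round regular prices
# (reference prices are our back-calculation, not stated in the article):
assert vat_free_price(229.00) == 189.26   # Sony WH-1000XM5SA
assert vat_free_price(909.00) == 751.24   # Google Pixel 10 Pro
assert vat_free_price(169.00) == 139.67   # Kindle Paperwhite
assert vat_free_price(279.00) == 230.58   # Samsung Galaxy Watch8
assert vat_free_price(937.00) == 774.38   # Samsung TQ55S85FAUXXC
```

In other words, each discount is simply the sticker price net of tax, which is why the figures look so oddly specific.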

The US has just started live-fire exercises with its nuclear aircraft carrier. And it has done so in waters claimed by China

Since the end of the Cold War, naval presence has been one of the pillars of the United States’ strategic posture in the Asia-Pacific, an architecture designed to guarantee open trade routes and deter unilateral changes to the status quo. However, the rise of Beijing as a maritime power and the transformation of the South China Sea into one of the most disputed spaces on the planet have turned each naval movement into something more than simple military routine, loading it with readings of all kinds. That is why Washington’s latest move is so important.

A deployment with high strategic value. The deployment of the nuclear supercarrier USS Abraham Lincoln at the end of November took place with almost total discretion, without official statements from the Pentagon or public indications about its area of operations, a common practice when the US Navy wants to preserve freedom of strategic maneuver. This silence coincided with a moment loaded with internal symbolism, as the Abraham Lincoln took over from the USS Nimitz, the dean of the fleet, which returned to the United States after completing its last operational mission before beginning a long process of retirement and recycling. The handover is not a simple exchange of platforms, but rather a demonstration of how Washington maintains its global presence seamlessly while renewing the core of its naval power in an orderly fashion.

Guam as a logistics anchor. We have covered this before. The battle group’s stopover in Guam reinforced the island’s role as one of the less visible but more decisive pillars of US military architecture in the Indo-Pacific. Guam works as a forward node from which prolonged operations are sustained, large units are resupplied and forces deployed thousands of kilometers from the continental territory are coordinated.
That the Abraham Lincoln was the second aircraft carrier to visit the island in a few weeks underscored the importance of this enclave at a time when the USS George Washington, the only aircraft carrier permanently based in Japan, remains out of commission for maintenance, demonstrating that asset rotation does not imply a real reduction in presence, but rather a carefully calculated redeployment.

The “routine” in the South China Sea. The subsequent entry of the Abraham Lincoln into the South China Sea is part of a long-term American strategy based on normalizing its naval presence in waters that Beijing considers its own. Washington is not looking for a specific gesture or a spectacular demonstration, but for something more subtle and persistent: to operate regularly so that absence does not let territorial claims consolidate by default. By presenting these activities as routine, the United States intends to reduce China’s capacity to define the narrative, keeping open sea lanes that are essential for global trade and regional strategic balance.

Demonstration of capabilities without escalation. During its recent activity, the combat group has combined live-fire exercises, resupply operations at sea and flights of the F-35C, the fifth-generation carrier-based fighter, composing a complete picture of its operational capability without resorting to explicit political messages. Added to this are tests of defensive systems such as the Phalanx and the escort of Arleigh Burke destroyers, capable of operating in anti-aircraft, anti-submarine and land-attack missions. The package conveys a clear signal of preparedness and self-reliance, one based on observable facts rather than public statements, and designed to deter without provoking unnecessary escalation.

Strategic persistence against Beijing.
With more than four decades of service, a profound mid-life modernization, and a track record that ranges from humanitarian evacuations to high-intensity conflicts, the Abraham Lincoln represents the material continuity of American naval strategy. Its presence near China does not respond to a specific crisis or situation, but to a structural logic that defines the Indo-Pacific as a central theater for the United States. In a context of growing competition and transition of the international order, the underlying message is that Washington has no intention of withdrawing or giving up operational space, and that its naval power will continue to be a constant, visible and functional factor in the region for years to come.

Image | US Navy

In Xataka | The US has detected a naval advantage over China: the catapult of Beijing’s aircraft carriers comes with a “factory” defect

In Xataka | The US pitted its invincible aircraft carrier against a tiny Swedish submarine. The humiliation was legendary for years

Microsoft continues to confuse the world with its Copilot obsession. Almost no one is sure whether Office is alive or not

“But then, does Office exist or not?” It is a question that seems trivial, but it is not, and with good reason: constant name and brand changes have made the Microsoft office suite the latest victim of the company’s obsession with AI and its avalanche of products bearing the Copilot surname.

The usual Office is no longer what it was. The evolution of Office was relatively stable until 2020. The suite, officially launched in 1990, brought together all the office applications Microsoft already had and would later expand. This is how we soon saw an Office that consisted of Word, Excel, PowerPoint, OneNote, Outlook and even Access and other tools.

Changes and more changes. Since then the suite has been undergoing paradigm shifts… and name changes:

- 2010: The Office 365 brand is introduced as a cloud version of the traditional office suite. The goal: to compete with Google Docs.
- 2013: After the launch of Office 2013, Microsoft begins to promote the Office 365 service as the main way to access its office tools.
- 2017: Microsoft presents a second evolution of these services, this time aimed at companies, which it names Microsoft 365. This platform combined Office 365 with volume licenses for Windows 10 Enterprise, as well as some additional solutions.
- 2020: Office 365 changes its name to Microsoft 365.
- 2022: Microsoft announces that the “Microsoft Office” branding will be abandoned in favor of the “Microsoft 365” brand. Even so, Microsoft continues to sell perpetual Microsoft Office licenses for local installations; the latest version today is Microsoft Office 2024.
- 2025: Microsoft renames the Microsoft 365 app to Microsoft 365 Copilot, referring to the “Office/Microsoft 365 Hub.” This application is essentially an aggregator of the different Microsoft office tools (Word, Excel, etc.).

And Perplexity adds fuel to the fire.
A few days ago Perplexity published a tweet that seemed to indicate that Microsoft had just changed the name from “Office” to “Microsoft 365 Copilot app.” In reality, what had been renamed, as Windows Latest points out, was the “Office/Microsoft 365 Hub”, and that name change had already been announced a year ago, in January 2025, as we noted. Perplexity also added that this decision had turned “400 million users into ‘AI users’ overnight.” Both the tweet and that statement were somewhat exaggerated, and did not help clarify an already confusing situation.

Microsoft clarifies it. Microsoft officials have told The Verge and other outlets that: “We have not made any recent changes to the names of our Office applications. Word, Excel and PowerPoint, the Office applications included in the Microsoft 365 productivity suite, remain unchanged. In November 2022, we renamed the Office hub app for web and mobile to the Microsoft 365 app. In January 2025, we updated it to the Microsoft 365 Copilot app to reflect its role in bringing the Copilot and Microsoft 365 productivity experiences together in one place.”

More trouble with the Office.com website. Although Microsoft hasn’t exactly “killed” the Office brand, it doesn’t seem to want it used much either. In fact, if one goes to the office.com website, what you see as soon as it loads is a message that says “Welcome to the Microsoft 365 Copilot application”, or in other words, that “hub” or aggregator from which you can launch the different office tools in the Microsoft suite. It does not seem like a fortunate decision, like others along this line in recent times.

How to destroy a recognizable and recognized brand. The truth is that Office was a brand recognized by users, but for years Microsoft has wanted to transform it into part of something bigger.
The intention, we believe, was to make it clear that Microsoft 365 was more than traditional office tools, but the only thing these changes have achieved is to add more and more confusion. Office is still alive as a product and as a brand, but it has ended up being absorbed by these new brands and, of course, by Microsoft’s obsession with AI and with Copilot.

In Xataka | Thanks again, Microsoft, for letting us buy Office 2024 instead of putting up with another subscription

When a town found a dead whale on its beach, it decided to dynamite it. 55 years later, they still celebrate it

One of the most excessive and gory stories you will ever hear is also one of the funniest, because for a change it does not involve the suffering of any living being, but rather a series of unfortunate decisions and a systematic ignorance of the laws of physics. It is the story of the Oregon exploding whale, a crazy event that just turned 55 years old… and is still being celebrated.

The problem. On November 12, 1970, engineers from the Oregon Highway Division, the agency in charge of day-to-day road traffic, encountered an unusual dilemma on the beach of the small coastal town of Florence: getting rid of a dead eight-ton sperm whale that had been decomposing in the sun for three days. After consulting with the Navy about demolition techniques, the team decided to apply a solution as direct as it was disastrous: half a ton of dynamite (twenty boxes), in the hope of pulverizing the cetacean. The seagulls would take care of cleaning up the remains.

Good marines, bad advisors. The consultation turned out to be counterproductive. The military advised on demolition with explosives, their specialty, but no one consulted marine biologists or coastal wildlife experts. Walter Umenhofer, a local businessman with military experience, warned Thornton that twenty boxes of dynamite was excessive: he recommended twenty individual sticks or, failing that, a much larger amount to completely pulverize the organic tissue. His advice was ignored.

Boom. The detonation, at 3:45 PM, unleashed a 30-meter-high apocalypse of sand and grease, throwing whale fragments in all directions. Blocks of tissue and muscle the size of coffee tables fell on spectators located at a supposedly safe distance of more than 400 meters from the explosion point. The screams of excitement from the hundred or so spectators turned into screams of horror as fragments of tissue fell from the sky. Some of the pieces of blubber, almost a meter long, crushed the roof of a vehicle.
The smell of burning flesh lingered for days, and the seagulls never appeared. The decision of George Thornton, the man responsible for the operation, lacked technical basis from the beginning. In a previous interview he admitted: “I’m sure it will work. The only thing we’re not sure about is exactly how much dynamite we’ll need to break this… thing up, so the seagulls and crabs and other scavengers can clean it up.” Thornton decided to treat the cetacean like a rock on a road: half a ton of explosives strategically placed under the animal, in the hope that the force would propel the remains into the Pacific.

What to do with a whale. Cetacean strandings have posed logistical dilemmas for coastal authorities for decades. Before the development of unified scientific protocols (which prioritize scientific necropsy over rapid disposal), methods for dealing with dead whales often relied on improvisation. The most common options included burial on the beach, towing out to sea for sinking, or simply allowing the animal to decompose naturally. Today, disposal methods have evolved: countries such as South Africa, Iceland and Australia continue to use controlled explosives after towing cetaceans out to sea, but the United States ended up abandoning the practice. When 41 sperm whales stranded near Florence in 1979, authorities buried them without hesitation.

Hunting. In 1970, Oregon lacked specific guidelines for these cases. The Oregon Highway Division had jurisdiction over state beaches (an administrative quirk arising from the legal consideration of coastlines as part of the public highway system) but no expertise in marine biology. When the sperm whale arrived in Florence, George Thornton publicly admitted that he had been assigned to the case “because his supervisor had gone hunting”.
The closest precedent had been successful because of its modesty: two years earlier, in 1968, authorities in Long Beach, Washington, had managed a similar stranding through a conventional burial, without incident.

The unforgettable video. Everything was immortalized by KATU journalist Paul Linnman, who arrived on the scene initially frustrated by what he considered a menial assignment… until he found out the amount of dynamite involved. With cameraman Doug Brazil he documented the event on 16mm film with live magnetically recorded audio, a format that, unlike video, would retain its visual quality for decades.

Afterwards. After the disaster, most of the sperm whale remained intact on the beach. Highway Division workers spent the afternoon manually burying the remains, including huge sections of the animal that had not moved from the explosion point. Thornton told Bacon that same afternoon that everything had gone “well… except that the explosion dug a hole in the sand beneath the whale,” directing the force upward rather than toward the ocean. Decades later, Thornton continued to defend the operation as a technical success distorted by hostile media coverage.

It goes viral. For two decades the incident remained a regional anecdote, until comedian Dave Barry resurrected the story in his Miami Herald column on May 20, 1990, titled “The Far Side Comes to Life in Oregon,” in reference to the immortal series by Gary Larson. His description of the event introduced the American public to the concept of the “epic fail” before the digital age popularized the term. The Oregon Department of Transportation received calls from angry people convinced the incident had occurred recently. That makes the exploding whale one of the first stories to go viral on the internet.

Beyond the meme. The phenomenon transcended the purely digital. In 2015, Oregon indie musician Sufjan Stevens released the song ‘Exploding Whale’, with the line “Embrace the epic failure of my exploding whale”.
Of course, the event appeared on ‘The Simpsons’, in the 2010 episode ‘The Squirt and the Whale’. In 2020, the Oregon Historical Society commissioned a 4K restoration of the original 16mm news footage.

The laughs. 55 years later, that fiasco of public management has been transformed into folklore and local heritage. In 2024, Florence declared November “Exploding Whale Month”, and the city celebrates the anniversary with a festival that culminates with the “Superlative …

How we learned to take care of what lives on a screen

Thirty years after its launch in Japan, the Tamagotchi is still recognizable at first glance: the egg shape, the three buttons, the screen that barely shows a few animated pixels. Everything seems taken straight from the nineties, and yet we are not facing an object frozen in time. Bandai has continued to push new versions and the product continues to find an audience, both among those who remember it from childhood and among new generations who did not experience its original heyday.

This journey, from global phenomenon to persistent cultural icon, cannot be explained only as a returning fashion. The Tamagotchi installed a different relationship with a device: in its original version it was not used when you felt like it, but when the creature demanded it. Caring, feeding, cleaning and accepting consequences were part of the deal, with a radical element for an electronic toy of the time: there was no pause button to rescue you from neglect. What we interpret today as the “attention economy” did not yet have a name, but the mechanics were already there.

A Japanese toy that taught how to coexist with the digital

Functionally, the Tamagotchi is a basic simulation of care and growth encapsulated in a pocket-sized object. The device executes a set of rules that determine the state of the digital creature, rules that the user can only partially modulate through specific actions repeated over time. There is no complex learning curve nor a clearly defined ending, and therein lies part of its uniqueness compared to other electronic toys of its time. The important thing is not to “win”, but to sustain the bond. The interest is not in mastering the system, but in living with it. That logic, deliberately open, allowed the Tamagotchi to transcend the usual framework of the electronic toy and integrate into the daily routine of those who used it.
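The care-and-growth rules described above amount to a tiny state machine that keeps running whether or not the user is present. The following is a minimal sketch of that idea, not Bandai’s actual firmware logic: the stats, decay rates and thresholds are invented for illustration.

```python
# Illustrative sketch of a Tamagotchi-style care loop. The stats, decay
# rates and thresholds here are invented, not Bandai's actual rules.
class VirtualPet:
    def __init__(self):
        self.hunger = 0      # grows over time; feed() lowers it
        self.mess = 0        # grows over time; clean() resets it
        self.health = 10     # drops when care is neglected
        self.alive = True

    def tick(self):
        """One unit of real time passing. There is no pause button:
        neglect accumulates whether or not the user is looking."""
        if not self.alive:
            return
        self.hunger += 1
        self.mess += 1
        if self.hunger > 5 or self.mess > 5:
            self.health -= 1     # consequence of neglect
        if self.health <= 0:
            self.alive = False   # no rescue from carelessness

    # The user can only modulate the state through specific actions:
    def feed(self):
        self.hunger = max(0, self.hunger - 3)

    def clean(self):
        self.mess = 0

pet = VirtualPet()
for hour in range(8):        # a neglected working day
    pet.tick()
print(pet.alive, pet.health)  # prints: True 7
```

The point of the sketch is the asymmetry the article describes: `tick()` runs regardless of the user, while `feed()` and `clean()` only partially counteract it, so the “game” is sustained attention rather than a winnable session.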
It was not about sitting down to play, but about accepting a presence that could demand intervention throughout the day, regardless of context. It is a small distinction on paper, but a huge one in practice, because it moves the game from a “play session” to an ongoing relationship.

To understand why this product appeared in Japan in 1996, it helps to look at the industrial context without turning it into a closed cultural explanation. Bandai operated in a mature market for toys and licenses, and in the mid-nineties it was looking for formats capable of connecting with a young audience that already lived with electronics. Japan, furthermore, was an environment especially accustomed to portable personal objects, from music players to consoles, and to characters turned into everyday icons. All of that didn’t “cause” the Tamagotchi, but it did make it more legible.

The key is that the Tamagotchi did not rely on a well-known franchise or a previous story. Its appeal was based on a simple, portable and easy-to-communicate idea, reinforced by an aesthetic close to the Japanese visual culture of the time, where the small and the expressive were already part of the landscape. That combination helped the concept be adopted quickly and, above all, shared naturally. Not as a strange device, but as a personal item that was carried around.

Although the Tamagotchi is often spoken of as a singular invention, its origin is the result of a very specific collaboration. Akihiro Yokoi, president of WiZ, proposed the initial concept of a portable virtual pet and presented it to Bandai in the mid-nineties. There, Aki Maita, responsible for the project within the company, was the one who transformed that idea into a viable product from a technical and commercial point of view. This double authorship matters because it avoids the easy story of the solitary genius and better describes how many consumer phenomena are born.
The process included testing with real users before launch, something unusual in the development of electronic toys at the time. These tests allowed Bandai to adjust both the design and the focus of the product, and they revealed a key fact: interest was strongest among teenagers, especially girls, which influenced the final aesthetic and the way the product was presented. It is not a minor detail, because it explains why the Tamagotchi became a social and visible object, not a device to be hidden.

As for the name, it was not a minor detail or an afterthought either. “Tamagotchi” was born from a deliberate combination of tamago, the Japanese word for “egg”, and “watch”, referring to an often-consulted object, adapted phonetically into Japanese. That choice reveals how the product was conceived from the beginning: not as a toy used occasionally, but as something carried around and looked at frequently.

As we said, in the original model the device did not offer full control to the user. There was no way to freeze the system or protect the creature from the consequences of carelessness, and that harshness was built into the proposal. The asymmetry, in which the user responded more than commanded, altered the traditional relationship between player and toy and increased the emotional cost of abandonment. Bandai assumed from the beginning that this lack of indulgence was part of the experience. The Tamagotchi was not designed to please, but to demand consistency and generate involvement. This logic, which today we associate with much more sophisticated digital dynamics, was key to making the bond with the creature feel less instrumental and closer to an everyday responsibility.

When the Tamagotchi left Japan, it did so as difficult-to-anticipate phenomena often do: faster than the market could absorb.
In May 1997 it arrived in the United States and, from there, spread to other markets, including European countries, in a very short period. Bandai went from managing a domestic launch to dealing with a global product, with supply problems, resale and a constant presence in media that did not always know how to frame that …

force the United States out of its comfort zone

If we were asked today which country is leading the race for artificial intelligence, the most immediate answer would probably still be the United States. And it would not be a wild guess. For decades, the country has set the pace of technological innovation, and a good part of the digital tools we use daily come from its large companies. However, that leadership is no longer as incontestable as it once was. The board is beginning to move, and there is an actor closing the distance at a speed that is difficult to ignore. That actor is China.

The question is no longer whether China competes, but how it got here. How a country identified for years as the world’s factory, associated with mass production and cheap labor, has become a benchmark for innovation and the technological vanguard. In a new video on Xataka’s YouTube channel, our colleague Francisco Franconi analyzes this process in detail and puts figures, context and nuance to a phenomenon that we are seeing develop almost in real time and that can alter the balance of power in the global technology sector.

China is no longer just the world’s factory: it is building its own path in AI

“China should be years behind the United States in the development of AI. It is a fact, since between 85 and 95% of the global market for chips used in this sector belongs to Nvidia,” explains Franconi. The data is key, but it does not explain everything. The race for artificial intelligence is not only played out in the field of semiconductors. There are other structural factors that are equally decisive, and one of them is energy. The video delves into the enormous energy gap that separates the two countries and why this aspect is crucial to understanding Chinese progress. As Franconi points out, energy “is necessary to build chip factories, supercomputers and processing centers.
Without it there is no industrial growth.” To contextualize this statement, the analysis draws on data from the International Energy Agency that helps measure the real scope of this advantage and its direct impact on industrial and technological development.

Another axis of the video is resilience. Specifically, China’s ability to adapt and keep moving forward despite the sanctions and restrictions imposed by successive US administrations. Franconi focuses on the repeated limitations affecting Nvidia, but also examines the case of Huawei and the role that startups such as DeepSeek play in this new scenario.

Talent appears as another of the fundamental pillars of this race. “A relevant fact is that China has a greater number of graduates in science, technology, engineering and mathematics, but the most shocking fact is that 50% of the world’s AI researchers are of Chinese origin,” says Franconi. A figure that helps explain why the Asian country is gaining weight so quickly in artificial intelligence development and research.

The video also covers the current ecosystem of language models competing in the market and offers a clear snapshot of the positions that China and the United States occupy in this technological race. An analysis that leads to our colleague’s conclusions about where this global contest is heading and what implications it may have in the medium and long term.

You can watch the full video now on the Xataka YouTube channel. And, of course, we invite you to leave your comments both there and on this article.

Images | Xataka

In Xataka | Huawei is coming back. And not everyone is prepared for what is coming

We have accepted that sport is “medicine” for the body. Now science is discovering its side effects

Physical exercise can be prescribed like a drug in doctors’ offices, even though it does not come packaged in a simple pill. The evidence behind it has made it abundantly clear that playing sports can prevent a large number of chronic diseases and even help us reach a very good old age. But behind all this there is also a negative side to physical exercise: its side effects.

If we accept exercise as a drug, we must also accept that every drug has a leaflet, specific doses and, of course, some adverse effects. That is why, as a society, we have the problem of having begun to sell “exercising” in a generic way, ignoring the fine print that it carries, as the Spanish Heart Foundation itself recognizes. And it has a very simple solution: personalizing physical exercise for each patient.

The problem with the metaphor. The slogan “exercise as medicine” is undoubtedly an excellent marketing campaign within the world of public health, but for science it has several important flaws. As different scientific studies point out, exercise does not act like a classic drug, since it does not produce a predictable response in a patient the way a pill does. This forces us to always bear in mind that the effect can be very different for each person. By calling exercise a drug, we can render invisible the diversity of individual responses. And there is no universal “squat pill”: that exercise can be very beneficial for one person, but for another it can be the origin of an overload pathology. And all because we throw ourselves into exercise without planning how to do it, since we find it very easy to pick up some weights and start building biceps.

The damage in numbers. We often hear that staying sitting on the couch is a great danger, and it is true, because many diseases are related to a sedentary lifestyle.
But according to different studies done in the United States, people who meet or exceed the recommendations for moderate or vigorous exercise have a 44 to 66% greater chance of developing musculoskeletal injuries than subjects who remain inactive. In addition, although cardiovascular health improves with physical exercise (the heart lowers its resting rate, for example), the body’s “maintenance cost” increases dramatically with the amount of exercise done.

A question of biases. Without a doubt, this is one of the most critical points that the scientific literature reveals: the lack of transparency in clinical trials related to exercise. This was seen in an analysis of 103 trials on knee osteoarthritis, which found that 6% of the participants suffered direct harm from the exercise. But the most worrying thing is not the number, but the underreporting: many patients who abandon studies due to pain or discomfort are not classified as “victims of adverse effects”, which generates an artificially high perception of safety. This problem is repeated in oncology, where the motto “exercise is medicine in oncology” coexists with non-trivial adverse events that have forced researchers to propose much stricter monitoring systems to protect patients.

We sometimes overdo it. The underlying problem in this case is undoubtedly recommending intensive or complex programs without a clear benefit/harm ratio compared to a much simpler alternative. But, on the other hand, we also fall into the phenomenon of “quaternary prevention”: making medicine focus on avoiding harm from its own interventions, overmedicalizing and nullifying the benefits of physical exercise.

The necessary consensus. Accordingly, the authors who popularized the concept of ‘exercise as medicine’ explicitly recognize that exercise is not without risks.
Even the WHO itself maintains in its guidelines that inactivity is the greatest population-level risk, but there is fine print that must be taken into account:

- Exercise should be ‘prescribed’ starting at low intensity, not at maximum intensity from day one. Otherwise, a person who has spent years on a couch may start lifting heavy weights and end up injured.
- Pain is not always bad, and the patient must be educated to understand that fatigue from the gym does not have to be medicalized with pills.
- Patients with cardiac risk must be evaluated, to prevent uncontrolled exercise from aggravating their condition.
- Exercise should be supervised.

The conclusion is that exercise is obviously necessary and without a doubt one of the practices that can prevent the onset of many diseases. But we always have to be aware of what we do: loading the body with a large amount of exercise from minute zero can cause significant injuries or aggravate diseases that are already present. Training in a gym with coaches who can advise on the right progression curve can be an interesting way to get the benefits of exercise without the consequences of doing it aggressively.

Images | Jonathan Borba

In Xataka | Doing cardio or strength training: for science there is no debate about which is the ideal exercise after 50

The US electrical grid cannot support so many data centers, so operators have had an idea: disconnect them to avoid blackouts

One third of all the world’s data centers are in the US, and that is putting a huge burden on the electrical grid. One consequence consumers are already noticing is price increases on their bills, but electricity operators foresee another problem: blackouts.

What is happening. As the WSJ reports, the US power grid is beginning to strain, with grid operators expecting blackouts during periods of high demand. The solution they propose is to make data centers disconnect from the grid and temporarily run on their own energy reserves. The technology companies have not been amused and speak of “discriminatory measures.”

Why it matters. In 2023, data centers already consumed 4% of all the country’s electricity, and forecasts say that by 2028 that share will rise to 12%. The grid is not prepared for so much demand and, although it is already being expanded, new data centers are being built faster. Grid operators face a difficult dilemma: powering data centers while maintaining supply to consumers.

‘Kill switch’. PJM Interconnection is the organization that oversees the energy market in the Midwest, where the problem of price increases has already been felt. The concern about blackouts is on the table, and PJM has proposed that technology companies create their own energy sources or accept having their supply cut off if the grid becomes too saturated. They are not the only ones to raise something like this: with demand expected to double by 2035, Texas passed a law last year that contemplates a ‘kill switch’ allowing large consumers, such as data centers, to be disconnected at times when the grid is under “extreme stress.”

What the tech companies say. As we said, the companies that own these data centers have not been very happy with the proposal.
The Data Center Coalition, which includes companies such as Google, Microsoft and AWS, has stated that the proposal is discriminatory, since data centers need a reliable and stable grid. They also warn that depending on their own energy reserves could have a negative environmental impact, by forcing them to use solutions such as diesel generators.

Waiting times. There is an intermediate scenario in which technology companies can benefit from accepting these conditions. Because the electrical infrastructure cannot support so much demand, data centers have to wait several years to be connected to the grid, normally between 3 and 5 years, with cases of up to 8 years. Southwest Power Pool, the grid operator in Texas, has offered data centers a deal: access to the grid sooner in exchange for agreeing to be disconnected during times of high demand. According to a recent study funded by Google, data centers with more flexible connections (i.e., those that build their own power sources and accept temporary disconnections) typically connect to the grid several years faster than those that do not.

Bring your own energy. Despite the reluctance towards that off switch, generating your own energy is the most realistic solution and the one the industry seems to be moving towards. Google recently bought an electrical company in order to secure its own energy. Other big tech companies such as Amazon, Microsoft, Oracle and xAI are also exploring their own energy solutions, such as natural gas and solar panels.

Image | Google

In Xataka | Drastically reducing data center consumption is crucial for AI. And China has had an idea: submerge them in the sea

How the Sinaloa Cartel turned the marble industry into its methamphetamine logistics center

If Walter White had exchanged the New Mexico desert for the Mediterranean coast, his story would not have been very different from the one the National Police has just revealed. In the series Breaking Bad, the Albuquerque chemist hid his money under the sand and used a car wash to launder his “blue empire.” In the province of Alicante, by contrast, the setting has been a marble warehouse, an armored underground bunker and a statue of Popeye that, instead of spinach, held the purest “crystal” of the Sinaloa Cartel. As if it were a Vince Gilligan script, “Operation Saga” has revealed that the largest methamphetamine network in Europe did not operate from the margins, but from the heart of the marble industry between Novelda and Monforte del Cid. The Heisenberg of this plot decided that marble blocks were the perfect container for the expansionist ambitions of the empire that Joaquín “El Chapo” Guzmán once founded.

The end of “Operation Saga”. The National Police, in a strategic alliance with the US DEA, has put the final lock on an investigation that began in 2023. According to the official press release, this second phase has culminated in the total dismantling of a Spanish-Mexican organization responsible for turning Spain into the continent’s main methamphetamine hub. If the first phase, in May 2024, already left a record 1,800 kilos of drugs seized, this new blow has ended with nine key arrests. The operation has managed to decapitate the infrastructure that the Sinaloa Cartel had woven between Tenerife, Madrid, Valencia and Alicante.

A 40-kilo Popeye. The network’s logistics were as ingenious as they were sophisticated. According to local media, the drugs traveled from Mexico hidden inside imposing marble blocks that were legally imported. Once in Spain, the organization used the business structure of a well-known marble company in the area to move them. The most surreal example came in July 2024.
The police intercepted a statue of Popeye, five feet tall and in metallic colors, bound for Tenerife. Its base was not solid metal: it contained 40 kilos of methamphetamine. The recipient, a “historic drug trafficker” on the island, was waiting for the shipment without knowing that the figure had been under police surveillance for months. This seizure confirmed that, after the 2024 blow, the organization was desperately trying to refinance itself.

The bunker and the salary of silence. Civil engineering in the service of crime reached its zenith in a Novelda warehouse, where agents found an underground bunker hidden under a heavy steel plate and a large stone block, holding almost 3,000,000 euros in cash. While the money accumulated in Alicante, in the Madrid neighborhood of Malasaña the organization supported a member of the Sinaloa Cartel “in reserve”. This man lived semi-cloistered in an apartment he barely left, receiving a salary of 2,500 euros per month exclusively in exchange for his silence, since he knew the details of how the 1,800 kilos of the first phase had entered the country.

A global drug network. The logistical brain was not at street level. The leader of the drug transporters, a Spaniard with a record of property crimes, coordinated movements between Mexico and Spain through criminal teleworking from Dubai. From there he supervised not only the “crystal” but also secondary shipments, such as 38 kilos of marijuana intercepted in Finland.

But why Spain? The answer lies in the science of waste: wastewater analysis (measuring drug metabolites) is the definitive tool for gauging actual consumption. Although methamphetamine consumption in Spain is lower than that of cocaine, EUDA studies place Spain and the Netherlands as distribution hubs.
In areas like Euskadi, for example, amphetamine records in wastewater already show peaks eighty times the national average, an unequivocal sign that the market is there.

The end of one era (or the beginning of another). The operation, directed by Investigative Court number 6 of the National Court, has also seized seven luxury watches, geolocation devices and ammunition. With this, the Police consider the most powerful synthetic drug network in Europe dismantled. However, as Commissioner Alberto Morales warns, Sinaloa’s persistence is legendary. Since 2009 they have tried to settle in Spain in every possible way: from “El Chapo’s” cousin, detained in the Palace Hotel in 2012, to the drug laboratories of “Los Chapitos” dismantled in Toledo in 2024. Today the Novelda bunker is empty and Popeye rests in the Canillas police facilities, but the authorities are clear that the “European dream” of the Mexican cartels is far from over.

Image | lifestyle.sustainability and freepik

In Xataka | There is a huge gap between what we think medical marijuana does and what it actually does.

Neptune’s closest neighbor is Mercury

There is information we have stored in our heads since we first learned it, such as prepositions or the planets that make up the Solar System. And that has its drawbacks: having to recite the whole list to get to the item you want or, if you are of a certain age, ending the string of planets with Pluto. Old spoiler: Pluto was demoted in 2006, although some scientists question the definition of a planet and, therefore, whether it belongs on that list.

What is the closest planet to Earth? Faced with that question, and tempted to recite the memorized list, many people will probably say Venus and just as many will say Mars. Reality is more nuanced: although the situation changes frequently, the answer generally considered correct is Venus. In fact, a look at the distances between each pair of planets would lead to that same conclusion. Well, yes, but no.

Mercury is the winning horse. NASA refers to Venus as “our closest planetary neighbor”, and although this is true if we ask which planet comes closest to Earth, it is not true if we want to know which planet is closest on average. Here things change, and there is a new winner: Mercury. Mercury is the innermost planet in the Solar System, but on average it spends more time near Earth than Venus does. What’s more, Mercury is on average the closest planet to every other planet in the Solar System.

How is proximity between planets measured? The usual method simply subtracts the mean radius of the inner orbit from that of the outer orbit. Thus, the average distance between Earth (1 AU) and Venus (0.72 AU) would be 0.28 AU; at their furthest, Venus is 1.72 AU from Earth. But although it is intuitive to take the average distance between two concentric ellipses as the difference of their radii, in reality that difference only gives the distance between their closest points.
A more precise mathematical method that considers time. Averaging the two scenarios above improves the calculation, but it is still imprecise, explain scientists Tom Stockman, Gabriel Monroe and Samuel Cordner. So, in work published by the American Institute of Physics, they devised a more precise mathematical method that averages the distance between planets over time, and in this scenario everything changes, not just for Earth but for all the planets. The method, called the point-circle method (PCM), models orbits as concentric, coplanar circles. Since a planet spends the same amount of time at each point of a circular orbit, the average distance can be calculated by integrating over all possible positions. Using this method, Venus is on average 1.14 AU from Earth and Mercury only 1.04 AU. As they explain: “We observe that the distance between two orbiting bodies is at a minimum when the inner orbit is smaller. That observation gives rise to what we call the whirly-dirly corollary (a reference to an episode of the series Rick and Morty): for two bodies with roughly coplanar, concentric and circular orbits, the average distance between them decreases as the radius of the inner orbit decreases.”

The checks. The research team ran a simulation that computed the positions of the eight planets over 10,000 years and recorded their distances. The results differed by 300% from the traditional method, but by less than 1% from the point-circle method.

Mercury is the closest to all. This finding does not only affect Earth; it generalizes to any pair of bodies with approximately circular, concentric and coplanar orbits. With this method, the average distance between two bodies depends on the radius of the inner orbit: the smaller the inner orbit, the smaller the average distance. In short: Mercury is the closest planet to Earth, but also to Neptune and even to the demoted Pluto.
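The point-circle idea is easy to reproduce numerically. Below is a minimal Python sketch (our own illustration, not the researchers’ code) that integrates the distance between two bodies on concentric, coplanar, circular orbits over all relative angles, which are equally likely in time; the function name and the semi-major-axis values (0.723 AU for Venus, 0.387 AU for Mercury) are our assumptions for the example.

```python
import math

def avg_orbit_distance(r_inner, r_outer, steps=200_000):
    """Point-circle method: average the chord length between a point on
    a circle of radius r_outer and a point on a circle of radius r_inner,
    over all relative angles theta (uniform in time for circular orbits)."""
    total = 0.0
    for i in range(steps):
        theta = 2 * math.pi * i / steps
        # Law of cosines gives the distance for a given relative angle.
        total += math.sqrt(r_inner**2 + r_outer**2
                           - 2 * r_inner * r_outer * math.cos(theta))
    return total / steps

# Orbital radii in AU (Earth = 1.0)
print(round(avg_orbit_distance(0.723, 1.0), 2))  # Venus–Earth   -> 1.14
print(round(avg_orbit_distance(0.387, 1.0), 2))  # Mercury–Earth -> 1.04
```

The result matches the figures in the text: Mercury averages about 1.04 AU from Earth versus Venus’s 1.14 AU, because a smaller inner orbit keeps the body closer, on average, to any outer one.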
Beyond changing the paradigm of how we think about distances between planets, this finding may also be useful for estimating communications with satellites.

In Xataka | Poland and Spain are the European countries that have increased their contribution to space the most. For very different reasons

In Xataka | A planet has just disappeared: NASA’s Hubble telescope has captured a violent cosmic event that changes everything

Cover | NASA Hubble Space Telescope
