The four things that change in the 2026 tax return

We are going to run through the main new features of the 2025 income tax return, which is the one we will file during 2026. Although it is filed this year, it is called the 2025 Income because it settles the last fiscal year. According to the 2025 Income calendar, you will be able to start filing on April 8. As every year, the Treasury has introduced some new features that seek to streamline the return or account for situations that were previously overlooked, so it is worth knowing whether they affect you. 2025 Income news. Below is a list of the four new features you will find in the tax return we are going to file in 2026. They are the following:

New deduction for low-income workers: taxpayers whose income from work is below 18,276 euros per year and who have no other income above 6,500 euros may apply a new deduction of up to 340 euros. The 340-euro maximum goes to those earning the Minimum Interprofessional Wage, and it gradually decreases until it disappears at the upper income limit.

Tax incentives for sustainability: deductions are maintained for the purchase of electric vehicles, the installation of charging points and energy-efficiency improvements to homes. If you invest in sustainable solutions, your tax burden will be reduced.

Higher taxation of higher savings income: those who have more must pay more. Savings income (dividends, interest, capital gains) above 300,000 euros will be taxed at the maximum rate of 30%; income between 200,000 and 300,000 euros at 27%, between 50,000 and 200,000 euros at 23%, between 6,000 and 50,000 euros at 21%, and up to 6,000 euros at only 19%.

Changes in personal income tax brackets and regional particularities: at the state level, the top bracket of the savings base has been raised from 14% to 15%, and adjustments are planned for taxpayers residing abroad.
In addition, some autonomous communities may introduce further modifications that affect the final amount due. The bracket structure of the general tax base is otherwise maintained; only these small changes have been introduced. In Xataka Basics | IRPF withholding calculator 2025: how to use it online to know your minimum withholding recommended by the Treasury
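The savings-income brackets listed above are easier to check with a small sketch. This is a minimal, unofficial calculation that assumes the rates apply marginally per bracket, as is usual in the IRPF; the function and the exact mechanics are illustrative, not the Treasury's own rules:

```python
def savings_tax(base: float) -> float:
    """Progressive tax on savings income using the brackets quoted in the article.

    Illustrative sketch only: assumes each rate applies marginally to the slice
    of income inside its bracket (standard IRPF behaviour).
    """
    brackets = [  # (upper limit in euros, rate)
        (6_000, 0.19),
        (50_000, 0.21),
        (200_000, 0.23),
        (300_000, 0.27),
        (float("inf"), 0.30),
    ]
    tax, lower = 0.0, 0.0
    for upper, rate in brackets:
        if base > lower:
            tax += (min(base, upper) - lower) * rate
        lower = upper
    return tax

# 10,000 euros of savings income: 6,000 at 19% plus 4,000 at 21%.
print(savings_tax(10_000))
```

Running it for 10,000 euros gives 1,980 euros of tax, which is why the article can say small savers "only" pay 19%: the higher rates never touch the first 6,000 euros.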

We believed that GLP-1 drugs were only going to change obesity. They have just turned how we treat addictions upside down

The famous GLP-1 receptor agonists, among which stars such as Ozempic stand out, have revolutionized the treatment of type 2 diabetes and obesity. However, for some time patients and doctors had been reporting a "side effect" as surprising as it was hopeful: the treatment seemed to take away the urge to drink alcohol or smoke. New routes. What began as a trickle of anecdotes in doctors' offices has become the target of study for several research teams, who see here a new way of understanding the mechanism of addictions in humans. Now, a recent study published in The BMJ, backed by new clinical trials, suggests that these medications could be key to treating substance use disorders. How it looked. The heavyweight of this new research is a gigantic cohort study published in 2026 that analyzed the data of 606,434 United States veterans with type 2 diabetes. Patients were divided into two groups: those who started treatment with GLP-1 drugs such as Ozempic, and those who took SGLT2 inhibitors, one of the accepted treatments for advanced type 2 diabetes. The results. The most striking data came when analyzing patients who already had a history of addictions. In this group, the use of Ozempic resulted in a dramatic decrease in addiction problems requiring urgent treatment, but also a lower rate of hospital admissions, lower drug-related mortality, a drop in overdoses, and even a significant reduction in suicidal ideation and attempts. The trials. Although observational studies are very valuable, you also have to go to the laboratory to see what is happening. Here, a 2025 randomized trial showed that taking Ozempic dramatically reduced alcohol self-administration in a laboratory setting. Patients reported less craving for a drink or a cigarette, fewer days of heavy consumption and, incidentally, a decrease in the number of cigarettes they smoked per day.
Earlier, a study published in 2022 showed that exenatide did not generally reduce the days of consumption of these substances, but it did show that the drug had a direct effect on specific parts of the brain related to the reward centers. Why? That a drug designed for the pancreas affects our relationship with alcohol and tobacco naturally raises many questions. The answer lies in the brain: some reviews suggest that GLP-1 receptors do not only regulate blood sugar or slow gastric emptying. These receptors are also found in key brain areas that control the dopamine pathway, which is why, by activating them, drugs such as semaglutide or liraglutide attenuate the sensation of reward. In rodents, for example, they block the reinforcement produced by substances such as cocaine, opioids or nicotine and, basically, the drug stops "feeling good." A paradigm shift. As we see every day, sustained drug use can have devastating consequences for the lives of users and those around them. The problem is that right now there are few approved pharmacological therapies to support these patients, which makes any clue pointing to a new therapeutic door welcome. Although more research and large-scale Phase III trials are needed before regulatory agencies officially approve their psychiatric use, GLP-1 drugs appear to be doing something medicine has been seeking for decades: "satiating" not only physical hunger, but also the brain's chemical hunger. Images | lilartsy In Xataka | Ozempic not only eliminates hunger, it is rewriting the supermarket receipt: goodbye to ultra-processed foods and spending on snacks

OpenAI is now the bad guy of AI. GPT-5.4 will have to be very good to change that

The soap opera around the Department of Defense has, in recent days, clarified the public perception of two of the leading companies in AI. Suddenly Anthropic is the good guy in the movie and OpenAI is the bad guy. And, whether precisely for that reason or not, Sam Altman's team has decided that now was the time to launch a new and promising AI model: GPT-5.4. Hello GPT-5.4. In its official announcement, OpenAI explains that the new model is available in two variants: GPT-5.4 Thinking and, for those who want "maximum performance in complex tasks", GPT-5.4 Pro. We are looking at a foundational model that is better than ever at reasoning and programming and, above all, at one very fashionable thing: "agentic flows". In other words: doing things for us. The "Use My Computer" mode, the protagonist. It is a loose translation, but it is more or less what OpenAI highlights as probably the great novelty of this model. As the announcement puts it, this is their first model "with native computer use capabilities." It can take control of our machine and act autonomously, completing complex cycles of actions and solving problems as they arise. Not only that: according to its creators, GPT-5.4 "is our most token-efficient reasoning model, using significantly fewer tokens to solve problems than GPT-5.2." In other words: AI doing things for us will be cheaper, and it will do them even better. It uses the computer better than us. The benchmarks certainly seem to point to fantastic performance in these tasks. In the OSWorld-Verified test, which measures a model's ability to navigate a desktop environment using screenshots and virtual mouse and keyboard actions, GPT-5.4 achieves a 75% success rate. That is not only better than GPT-5.2's 47.3%: it even exceeds human performance, which is 72.4% according to the creators of this benchmark.
Other tests of this type that evaluate an AI model's ability to navigate also make it clear that GPT-5.4 is well ahead of its predecessors. The ARC-AGI result is scary. Machines were supposed to struggle with the abstract reasoning problems humans are naturally good at, but oh well. In recent times we have seen the ARC-AGI 2 test, which seemed a real challenge for AI models, become increasingly tractable for them. GPT-5.4 takes another bite out of that reality: its Pro version already solves 83.3% of the tasks (73.3% for the standard model), when GPT-5.2's rate was 52.9%. It is a simply brutal jump, and although in other tasks the leap is less notable (it programs somewhat better according to SWE-Bench Pro, but not by much), it is clear that we are facing an extraordinary model. Perfect for OpenClaw? That ability seems tailor-made for OpenClaw, the AI agent that has become a phenomenon in recent weeks. OpenAI ended up hiring its creator and is in some way the "owner" of the project, and this performance in agentic tasks should be very useful for everything OpenClaw does, which is basically that: managing your machine for you. That is where GPT-5.4 can really come into its own. And you can trust it more. According to OpenAI, GPT-5.4 is now better at answering questions that require gathering information from multiple sources, "identifying the most relevant ones, particularly for 'needle in a haystack' type questions, and synthesizing them into a clear and well-reasoned answer." What's more: they rate it as their most fact-grounded model and say it is 33% less likely to state something false compared to GPT-5.2. But be careful: it is very, very expensive. These capabilities, however, will not come cheap.
With this launch OpenAI has updated its prices, making it clear that if you want the best, you will have to pay for it. The "standard" GPT-5.4 model costs $2.50 per million input tokens and $15 per million output tokens, while the Pro costs a whopping $30 and $180 respectively. Claude Opus 4.6, until now considered the best AI model, costs $10 per million input tokens and $25 per million output tokens: it was already expensive, but GPT-5.4 Pro makes it look almost like a "bargain" AI model. Trying to stop the bleeding. The model arrives at a delicate moment. According to various sources, ChatGPT has lost 1.5 million users since OpenAI announced its agreement with the Department of Defense. That decision provoked heavy criticism, a "cancel ChatGPT" movement on social networks, and internal tensions. Before the scandal there was already talk of the potential arrival of GPT-5.4, but the launch now clearly takes on a double meaning. It doesn't just have to be better than everyone else: it has to redeem OpenAI. And above all, OpenAI needs a win. Public perception seems clear: OpenAI has been suffering lately, whether from internal dramas, talent drains or temporarily falling behind in the performance of its models. GPT-5.4 is not a simple evolution of its flagship model, because what OpenAI needs is for this model to succeed and convince people to "love" ChatGPT again (figuratively, you know what we mean). We'll see if it succeeds. In Xataka | Sam Altman says he's terrified of a world where AI companies believe themselves to be more powerful than the government. It's just what you're building
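To put those per-million-token prices in perspective, here is a quick, hypothetical cost comparison using only the figures quoted above; the token counts for the sample request are invented for illustration:

```python
# Prices per million tokens (input, output), as quoted in the article.
PRICES = {
    "GPT-5.4": (2.50, 15.00),
    "GPT-5.4 Pro": (30.00, 180.00),
    "Claude Opus 4.6": (10.00, 25.00),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed per-million-token rates."""
    in_rate, out_rate = PRICES[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

# A hypothetical long agentic task: 200k input tokens, 20k output tokens.
for model in PRICES:
    print(f"{model}: ${request_cost(model, 200_000, 20_000):.2f}")
```

For that sample request, the standard GPT-5.4 comes out cheapest ($0.80), Claude Opus 4.6 costs $2.50, and GPT-5.4 Pro costs $9.60, which is why the Pro tier makes even Opus look like a "bargain".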

A textbook "if something works, don't change it", but with two beastly hearts

Apple is having a big week. The company is not attending the Mobile World Congress, but it has decided to make noise of its own this week. Yesterday it presented the iPhone 17e, and today it was the laptops' turn. If last year they launched the MacBook Pro with an M5 processor, it was clear we would soon see its bigger siblings. Below are all the details of the new MacBook Pro with M5 Pro and M5 Max: beasts with eight times more AI performance (but with immense fine print). First of all, the spec sheet of the new laptops.

Spec sheet of the MacBook Pro with M5 Pro and M5 Max:

Screen: 14-inch model: 14.2-inch Liquid Retina XDR, 3,024 x 1,964 pixels, ProMotion with up to 120 Hz adaptive refresh rate, 1,600 nits HDR (peak). 16-inch model: 16.2-inch Liquid Retina XDR, 3,456 x 2,234 pixels, ProMotion with up to 120 Hz adaptive refresh rate, 1,600 nits HDR (peak).
Processor: 14-inch: Apple M5, M5 Pro or M5 Max. 16-inch: Apple M5 Pro or M5 Max.
RAM: 14-inch: 16 to 128 GB. 16-inch: 24 to 128 GB.
Storage: 1 TB to 8 TB on both models.
Ports: SDXC card slot, HDMI port, 3.5 mm headphone jack, MagSafe 3 port and 3x Thunderbolt 4 (USB-C) on both models.
Weight: 1.55 kg (14-inch) and 2.14 kg (16-inch).
Webcam and sound: 12 MP Center Stage and top-down view camera, plus six Dolby Atmos speakers, on both models.
Battery: 14-inch: 72.4 Wh, fast charging with a 96 W adapter. 16-inch: 100 Wh, fast charging with a 140 W adapter.
Operating system: macOS 26 Tahoe.
Price: from 1,929 euros (14-inch) and from 3,049 euros (16-inch).

Identical on the outside. We already said it in our analysis of the MacBook Pro with M5: Apple is one of the few companies that can afford to launch a generation without it being noticed. With the iPhone this was set in stone with the notch, and now with the 'Dynamic Island'; with the MacBook the same thing is happening.
They made the change by introducing the notch and bringing connection ports back and, since then, the changes have been more internal than external. In this sense, the new 14- and 16-inch MacBook Pro are no exception. In development…

The subtle change on your screen that your eyes will appreciate after eight hours of Excel

If you work or study for many hours every day in front of a PC screen, you will very likely end up with tired eyes: reading lots of text, watching videos, jumping from one Excel sheet to another (and then another, and another), writing, editing images or videos, and a very long list of tasks sustained over time will almost certainly cause eye fatigue that is as uncomfortable as it is unhealthy. If you have no choice but to sit in front of the monitor for a good handful of hours each day (quite common these days), you can always do your part to minimize it or even remedy it: get up every now and then, don't stare for too long at a time and look at distant objects, sit at a distance that doesn't strain your eyes, and similar tricks. On top of all that, here is an idea you may not have considered. It won't solve your life, but it will gain you visual comfort (I say this from experience): use a monitor with a high refresh rate. Even if you don't play games. Because having more than 60 Hz (120 Hz, 144 Hz and even more) is an excellent idea outside the field of video games. And (spoiler) nowadays the price difference between monitors is so small that the little extra investment is very much worth it. What is refresh rate (and what does it affect)? Okay, on paper, speaking purely of specs, 120 Hz is better than 60 Hz, and 144 Hz is better than 120 Hz. That much is clear. But what exactly are we talking about? We are talking about hertz, which in short determine the number of frames per second that the screen is capable of displaying. In other words: the number of still images that appear, one after another, in one second.
The greater that number, the more fluid the image, whether it is a video game, a video or the apps we use in our daily lives. The latter is exactly what we are looking for here. In practice, having more than 60 Hz, and therefore a higher refresh rate, translates into fluidity. Fluidity in everything: transitions, application effects, window and cursor movements, much smoother scrolling and, ultimately, everything that generates movement on the screen. It may seem minor, but in the long run, after spending hours in front of the monitor, it is noticeable. A one-way road. At this point we must take the 'price factor' into account. Not long ago, going above the base 60 Hz that monitors start from was expensive. Now, however, with refresh rates that even exceed 360 Hz, we find 144 Hz (or 120 Hz, or 165 Hz, and even higher) options at great prices. For a fairly contained investment you can make a huge leap in quality. The good and the bad, face to face. Although the theory is simple, the differences between one type of monitor and another can be confusing if you are not familiar with them. This summary should clarify things:

60 Hz. The good 🟢: cheaper (and you can allocate more budget to other specifications: resolution, size…). The bad 🔴: less fluidity, and somewhat outdated in 2026, since you can make the jump to 144 Hz or more at similar prices. Ideal for: users who do not feel visual fatigue, who are comfortable at 60 Hz and do not want to spend more.

144 Hz. The good 🟢: great fluidity and the standard for gaming today. The bad 🔴: slightly more expensive than 60 Hz and, in general, a flashier gaming design. Ideal for: gamers (or non-gamers) who want a leap in visual comfort.

Which one may interest you more: let's do the math. As we have already mentioned, price is no longer a determining factor when deciding between a 60 Hz monitor and one with 144 Hz or more.
Even so, if 60 Hz is enough for you, you can put that extra investment into other aspects of the screen, such as the resolution, the diagonal or the format. Actual use: 60 Hz is enough for you and you prefer to spend what 144 Hz would cost you on an ultrawide, because you need more horizontal space on screen. What you get: an experience similar to your previous 60 Hz monitors, but you gain in the other characteristics that matter to you (bigger diagonal, different format, more resolution…). If, on the other hand, you notice that after the work day your eyes are very tired, and applications, transitions and other movements are not as fluid as you would like, then going from 60 Hz to 144 Hz or more is an excellent decision. Spending only a little more than you would on a 60 Hz monitor, you double (or more) its refresh rate, and your eyes will thank you. Actual use: it bothers you that the operating system's animations, scrolling or the cursor moving across the screen go in fits and starts, so you decide to go above 60 Hz. What you get: from the first second everything runs more smoothly and is easier on the eyes. Where before there were almost imperceptible but real stutters, now everything flows. It even looks like you've upgraded to a better PC! In summary: 👉 Choose 60 Hz if: you don't notice visual fatigue because you don't spend too many hours in front of the screen, you don't want to spend more, and you don't play games or plan to in the short term. … Read more
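The fluidity difference described above ultimately comes down to how long each still image stays on screen. A quick sketch of the math:

```python
# Frame time: how long each still image stays on screen at a given refresh
# rate. Higher Hz -> shorter gap between frames -> smoother scrolling,
# cursor movement and animations.
def frame_time_ms(hz: float) -> float:
    """Milliseconds each frame is displayed at a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 120, 144, 165, 360):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
```

Going from 60 to 144 Hz cuts the per-frame interval from roughly 16.7 ms to about 6.9 ms, which is the "everything flows" effect the paragraphs above describe.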

This is how climate change multiplied the devastation of the DANA in Valencia

October 29, 2024 is marked as one of the most tragic days in the recent history of Spain due to the DANA that hit the region of Valencia and left 230 fatalities, billions in economic losses and rainfall that shattered records. No wonder: stations like Turís accumulated 771.8 mm in just 16 hours, and the national record for rainfall in one hour was broken with 184.6 mm. Now the investigations into it are arriving. Climate change. We know this effect is altering the hydrological cycle at a global level, but a new and exhaustive study published in Nature, led by researcher Carlos Calvo-Sancho, has managed to measure exactly how, and by how much, this storm was 'doped' by anthropogenic global warming. Most interestingly, it opens the door to these phenomena becoming more common in the coming years. Pure physics. Days after the disaster, rapid attribution initiatives such as Attribution and ClimaMeter had already estimated, from the most basic parameters, that this meteorological event had been twice as likely and 13% more intense due to climate change. At the time those were just preliminary figures that required confirmation and, above all, sitting down to analyze properly. That analysis has arrived many months later in a new work that goes far beyond those quick figures and focuses on the physical fundamentals. Here the researchers used very high resolution simulations under an approach called 'Pseudo-Global Warming'. A simulation. This approach consists of recreating the October 2024 storm on a computer to reproduce the devastation that occurred, and then simulating it again with the effects of global warming removed from the formula. This is achieved by returning the atmosphere to pre-industrial conditions, the usual reference point when talking about climate change. The data.
By comparing both simulated worlds, the supercomputer results showed the tremendous impact of the human hand on the storm. The most interesting results can be summarized in four points: six-hour rainfall rates intensified by 21% under current climate conditions. The territory affected by rains exceeding 180 liters per square meter, AEMET's red-alert threshold, expanded by 55%. The total volume of water falling directly on the Júcar River basin increased by 19%. And the intensity of one-hour rainfall increased at a rate of 20% per degree Celsius of warming, which is very relevant. To understand it, we have to turn to the Clausius-Clapeyron relationship, which dictates that the atmosphere should hold about 7% more water vapor for each extra degree of temperature, a rate far exceeded here. Why? The obvious question, both in the affected areas and elsewhere in Spain, is: why did it rain so much more than basic theory dictates? The science suggests it all started with unusually high temperatures at the surface of the Mediterranean Sea, which reached record levels in the summer of 2024. This injected a huge amount of water vapor into the system, and when comparing the current simulation with the pre-industrial one, the scientists detected, among other things, an 11.9% increase in precipitable water and 11.9% more violent and faster updrafts. The perfect cocktail. In short, the greater amount of water evaporated from the sea by high sea and air temperatures not only caused more rain: it also triggered an aerodynamic and thermal domino effect that made the storm much larger, longer lasting and more destructive than could have been expected. Towards the future.
These findings are important to understand exactly what happened, but they also sound a big warning: extreme hydrometeorological phenomena in the western Mediterranean are evolving aggressively. The study highlights that the future scenarios projected by climatologists are already here, making it urgent and vital to rethink our urban planning and adaptation strategies to prepare for storms that, as we keep seeing, are going to be increasingly aggressive. Images | EMU Chris LeBoutillier In Xataka | Some say worrying about climate change is a "first world problem." A macro survey proves them right
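The Clausius-Clapeyron figure cited above (about 7% more water vapor per extra degree) can be sketched as a simple compounding factor. This is a back-of-the-envelope illustration, not the study's methodology; the 20% per-degree figure is the one-hour rain-intensity scaling reported for this storm:

```python
# Back-of-the-envelope scaling: ~7% more atmospheric water vapour per extra
# degree Celsius (Clausius-Clapeyron, as cited in the article), versus the
# ~20% per degree rain-intensity scaling the study reports for this storm.
CC_RATE = 0.07        # Clausius-Clapeyron baseline, per degree C
OBSERVED_RATE = 0.20  # one-hour rain-intensity scaling reported, per degree C

def scaling_factor(delta_t: float, rate: float) -> float:
    """Multiplicative increase after delta_t degrees of warming at a given rate."""
    return (1 + rate) ** delta_t

for dt in (1.0, 1.5, 2.0):
    print(f"+{dt} C: CC baseline x{scaling_factor(dt, CC_RATE):.3f}, "
          f"storm scaling x{scaling_factor(dt, OBSERVED_RATE):.3f}")
```

The point of the comparison is visible immediately: at every level of warming, the storm's reported scaling grows far faster than the purely thermodynamic baseline, which is the gap the researchers attribute to dynamical effects such as stronger updrafts.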

Thousands of people change their clothes right after work. Neuroscience has something to say: they are right

The sound is almost universal: the jingle of keys in the entryway, immediately followed by a zipper being lowered, a button being released or a bra being unclasped. For millions of people, the day doesn't end when they clock out of the office or close their laptop, but the moment they take off their stiff jeans, suit or uniform and slip into something soft. That sigh of relief is not just physical; it is the acoustic signal that the brain has just changed gears. The Scandinavians, experts in naming the intangible, are clear about it. The Danes use the term hyggebukser for those pants you would never wear to go out, but which are so comfortable that, secretly, they are your favorites. But this goes beyond a Nordic trend. Meik Wiking, director of the Happiness Research Institute, explains in his book Hygge Home that the purpose of this clothing is to offer "a break for your responsible, stressed and compliant adult self." It's about creating a soft sensation that prompts the brain to feel safe, allowing us to "experience the happiness of simple pleasures knowing there is nothing to worry about." To understand why this gesture has become vital, we must first understand what we have lost. Historically, work and home clothes were not so different until the Industrial Revolution standardized indoor workspaces. In the modern era, however, the line has become dangerously blurred. As journalist Amanda Mull points out, we are experiencing a "seepage" of work into home life. Before, taking off the uniform guaranteed mental freedom. Now, "many people wear the same jeans they wore to work to cook dinner, with their cell phones and laptops never too far away," which prevents the mind and body from truly disconnecting from productive work. This phenomenon worsened after the pandemic. Five years after the health crisis, the fashion sector is still "knocked out", as Herald points out.
Consumers have changed their priorities: they prefer to invest in experiences rather than formal clothing, and the rise of remote work has reduced the need for complex wardrobes. According to Eduardo Zamácola, president of Acotex, in statements to the same outlet: "People go to work with versatile, casual-style garments; the dressier pieces have taken a back seat." However, this permanent comfort comes at a price. Although remote work has been shown to make us happier and to let us sleep 27 minutes more on average, it has also brought new challenges for separating leisure from work time. The science of "enclothed cognition". This is where science validates intuition. Changing clothes is not a superficial matter; it is a cognitive tool. Researchers Hajo Adam and Adam D. Galinsky coined the term enclothed cognition to describe how clothing systematically influences the wearer's psychological processes. In their famous experiment, they showed that subjects wearing a lab coat described as a "doctor's coat" showed greater sustained attention than those wearing the same coat described as a "painter's coat". The conclusion is fascinating: the effect depends on two simultaneous factors, "the physical experience of wearing the clothing and its symbolic meaning." Extrapolated to our living room, the logic holds: if your brain associates tracksuits or pajamas with "absolute rest", putting them on will physiologically activate relaxation. But if you wear those same clothes to work, you break the symbolic association and the cognitive "spell" disappears. This connects directly with the theory of "role transitions". Researchers Blake Ashforth and Glen Kreiner explain that we need "micro-transitions", or rites of passage, to cross the boundaries between our different roles (from employee to parent, from boss to partner).
Changing clothes acts as a physical and psychological boundary that facilitates this transition, preventing the stress of one role from contaminating the other. Ritual as anxiolytic. From clinical psychology, the act of changing is understood as a direct message to our biology. "Clothing works as a direct message to the brain. Taking off your outer clothing (…) is a very clear way of telling your nervous system 'you can slow down now,'" psychologist Marta Calderero explains to Vogue. It is pure contextual learning. Furthermore, the act itself has power. A study published in Organizational Behavior and Human Decision Processes confirms that rituals, defined as predefined sequences of symbolic actions, are effective tools to regain a sense of control and reduce anxiety. Performing the ritual of changing clothes when you get home reduces uncertainty and prepares you for a different mental state. But be careful: comfort should not mean sloppiness. Style expert Anuschka Rees warns in her book The Curated Closet about the importance of identity at home. As she points out: "Not just any old cloth will do. Choosing clothes that also represent you when you are at home, not just when you go out or when people see you, is super important on an identity level." Home clothes should be a "healing wardrobe", lovingly chosen to generate real well-being. For those working from home, the strategy must be even stricter. Psychologist Isabel Aranda warns that "the fact that you wear the same clothes all day transmits a flat rhythm and makes every day seem the same", distorting our perception of time and affecting our biorhythms. The recommendation: even if you don't go out, change. Wear one set of clothes to work and a different one to rest. "It's a way of telling your body that you're still active," says Aranda.
Interestingly, there is a counterpoint in the corporate world known as the "red-sneakers effect", where breaking the dress code (like Mark Zuckerberg with his sweatshirt) can denote status and power. In the privacy of the home, however, we do not seek power over others, but power over our own well-being. In an increasingly volatile and uncertain outside world, where fashion and work schedules have lost their rigid structure, home remains our refuge. Changing clothes when crossing the … Read more

AI is starting to change that dynamic

For years, Apple was more than just a phone manufacturer: it was the customer everyone wanted to keep happy, the company that could negotiate with suppliers and reserve capacity in advance. But that era is beginning to break down for a very specific reason: the industry has started buying hardware for artificial intelligence on an enormous scale, and that new appetite is reordering priorities. AI companies are willing to pay more and lock in supply up front, a shift that is beginning to put pressure on something Apple has protected like a treasure: its margins. Memory, the bottleneck. The easiest example to understand is something as everyday as the storage and speed of the iPhone. It's no secret that memory chips are in short supply because of the AI explosion, and that is pushing prices up. Tim Cook hinted at it in the last earnings call, acknowledging limitations in chip supply and that memory prices were rising "significantly." Less comfortable terrain. The Wall Street Journal points out that AI giants are willing to close deals on very attractive terms for suppliers, including securing supply with firm commitments and advance payments. This context gives companies like Samsung Electronics and SK Hynix room to raise prices on certain DRAM chips destined for Apple. There is friction even on the less visible side: many engineers who previously worked on improving smartphone displays now also spend time on specialized glass for packaging advanced AI chips. It is a silent fight for capacity and attention. An alternative to TSMC. The newspaper says TSMC is doing more business with NVIDIA and other AI companies. Consequently, according to anonymous sources with knowledge of the plans, Apple is exploring the option of manufacturing some less advanced processors with another supplier. No name has been released yet, but it would be a fairly important change in the Cupertino company's supply chain.
How it affects us. In the short term, the hit lands on the income statement: if components get more expensive, margins suffer, even at a company used to operating with plenty of headroom. For consumers the picture is more ambiguous. The well-known analyst Ming-Chi Kuo estimates that Apple does not expect to raise the price of the next iPhone if its configurations are similar to the iPhone 17's. That does not relieve the pressure, but it suggests the adjustment could come through other avenues, from configurations to tighter margins.

Images | Apple

In Xataka | The AI bill for Meta has grown by 400% since 2023: Zuckerberg wants to lead the sector at any cost in 2026

Mercadona and the rest of the supermarkets spend tons of paper on receipts that no one reads. Now they want to change it

You go to the supermarket, buy a couple of things (just enough for dinner), head to the checkout, get your receipt, stuff it in your pocket and walk to the parking lot with your bag. Pure routine, our daily bread. If the retail employers' association achieves its goal, however, one element of that scene will change radically. Which one? The receipt you end up throwing away without even reading it.

What has happened? Every year supermarkets print millions upon millions of paper strips that in many cases list only a handful of items, so they end up in the trash without anyone having so much as glanced at them. It is a waste of resources, for chains like Dia, Lidl and Mercadona, but also for the environment. So Asedas (the Spanish Association of Distributors, Self-Service Stores and Supermarkets) has had an idea: it wants receipts to be printed only when the customer asks for one.

What do they want? elEconomista broke the news on Thursday. Asedas has proposed that the Government slightly tweak the rules governing receipts so that they are no longer printed systematically. That does not mean receipts would stop being issued, or that customers would no longer get a record of what they bought and how much they were charged; the change concerns the medium. The idea, explains Ignacio García, head of Asedas, is "that the ticket continues to be generated electronically for control purposes, but that it is printed on paper at the consumer's request." In other words, the customer could ask for either the physical or the digital receipt. Right now, as elEconomista recalls, the regulations require supermarkets to deliver the receipt in one of two formats: paper or digital. The problem? Since not all customers are willing to hand over their data (including their email address) to the chains, in the end stores have no choice but to print it. And that is not all.
The employers' association's data show that many of our supermarket trips involve buying only a handful of items, so the receipts record small transactions for low amounts that we do not even review. The result: those slips go from the printer straight to the trash. It is not even unusual for customers to refuse them when the cashier offers.

Is it that serious? "Our companies have been confirming for years that, in about a third of transactions, the receipt is abandoned at the checkout," García confirms. That is hardly surprising given the shopping-basket data Asedas handles. By its estimates, 30% of supermarket transactions are near-urgent visits in which we take home at most four products and spend less than 10 euros. In 60% of cases, purchases involve between five and 25 products, with average receipts of between 10 and 50 euros. Only the remaining 10% are actually large shops. In practice, the fact that every transaction ends up on a printed receipt means supermarkets generate about 5 billion receipts a year, consuming almost 4,500 tonnes of paper at a cost running into the millions.

Is it important? Beyond the millions of receipts printed each year and what that costs in tonnes of paper and euros, Asedas' proposal is interesting for at least two reasons. The first is who is making it: Asedas claims to be "the leading food distribution business organization in Spain," covering 19,200 retail stores and 495 wholesalers, and its members include companies such as Mercadona, Lidl, Aldi and Dia. The other key is that the idea is in line with what is already done in other European countries. France, for example, stopped generating receipts by default in 2023 precisely because of the amount of paper they consumed. That does not mean receipts no longer exist there, but they must be requested.
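The figures above can be sanity-checked with some quick arithmetic. A minimal sketch, assuming the roughly 5 billion annual receipts and 4,500 tonnes of paper the article attributes to Asedas:

```python
# Rough sanity check of the receipt figures cited in the article
# (assumed inputs: ~5 billion receipts/year, ~4,500 tonnes of paper).
tickets_per_year = 5_000_000_000
paper_tonnes = 4_500

# 1 tonne = 1,000,000 grams
grams_per_ticket = paper_tonnes * 1_000_000 / tickets_per_year
print(f"{grams_per_ticket:.2f} g of paper per receipt")
```

That works out to under a gram of paper per receipt, a plausible weight for a short thermal-paper strip, so the two headline numbers are at least mutually consistent.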
In the Netherlands, Switzerland and Sweden there have also been changes to how receipts are generated. In Spain itself, some large chains have been moving toward digital receipts for some time.

Images | Xataka Mobile and Wikipedia

In Xataka | There was a time not too long ago when the future of supermarkets seemed like Amazon Go. Now Amazon Go is dead

What Zealandia explains about climate change

For decades, geography books taught us that the world was divided into six continents. In 2017, the scientific community made official the existence of a new "intruder" of colossal dimensions: Zealandia. Spanning 4.9 million square kilometers, an area equivalent to the entire European Union, this mass of continental crust separated from Australia and Antarctica about 80 million years ago. What makes Zealandia an absolute anomaly is not just its size, but how well it has stayed hidden. Unlike the other continents, 94% of its body lies sunk beneath the Pacific. Only its highest peaks manage to poke above the surface, forming the land we walk on today as New Zealand and New Caledonia. This geographical shyness kept it a "ghost continent" until technology finally let us peer into the abyss.

The mission to rescue the past from the abyss. Everything changed in the summer of 2017. Expedition 371 of the International Ocean Discovery Program (IODP) was no pleasure cruise: the ship JOIDES Resolution set sail on an almost surgical mission. For two months, 32 scientists worked flat out, in 24-hour shifts, to extract cores from the seabed: cylinders of rock and sediment recovered from almost five kilometers down. These sediment cores are not just mud and stone. They are, in the words of paleontologist Laia Alegret, in statements collected by The Conversation, authentic "libraries of climate history." The findings were surprising: despite lying under the sea today, the sites yielded pollen and spores from land plants, along with thousands of microfossils of organisms that live only in very warm, shallow waters. This confirmed that Zealandia was not always an underwater world: it had periods as dry land covered in vegetation.

The "mirror" of future climate change. The relevance of Zealandia goes far beyond geological curiosity.
According to researchers from Rice University, this submerged continent is a "critical region" for climate science, precisely because it is one of the places where current climate models show the greatest deficiencies. If models cannot accurately reproduce Zealandia's past climate, they warn, their predictions of future global warming may be incomplete or biased. Attention focuses especially on the Eocene, between 53 and 41 million years ago, a time when the Earth functioned as a true "greenhouse planet": carbon dioxide concentrations were much higher than today and there were no permanent polar ice caps. Studying this period in Zealandia lets scientists "look back at our future," offering a glimpse of how the planet would respond to extreme greenhouse conditions similar to those we could reach in the coming centuries.

One of the hottest spots. One of the most disturbing findings was the identification of episodes of rapid warming (rapid in geological terms, that is, on scales of thousands of years) during which ocean currents changed unexpectedly. The sediments reveal the arrival of deep water masses originating near Antarctica, a phenomenon difficult to explain in a warm world without permanent ice. This discovery, as The Conversation underlines, challenges the current understanding of how heat is redistributed in the oceans and forces a rethink of some basic assumptions about global ocean circulation.

The violent birth in the "Ring of Fire." Zealandia's history is a geological roller coaster driven by plate tectonics. According to the results published by the expedition's co-chief scientists, Rupert Sutherland and Gerald Dickens, the continent was sculpted by two major tectonic events. The great divorce: first, it was torn from Australia and Antarctica 85 million years ago, stretching and thinning until it sank. The resurrection of subduction: about 50 million years ago, something "globally significant" happened.
What scientists call a "massive subduction rupture" began, giving rise to the Pacific Ring of Fire. In simplified terms, this process made huge portions of the seafloor buckle, lifted parts of Zealandia temporarily above sea level, and then sank the continent back more than a kilometer to its current configuration. It was not a local phenomenon: these tectonic forces altered the direction and speed of movement of many tectonic plates across the planet, in one of the largest geodynamic readjustments of the last 80 million years.

Microfossils and the response of life. To reconstruct these movements with surgical precision, scientists rely on benthic foraminifera. These single-celled shelled organisms are "diagnostics of the deep": by analyzing their remains in the ship's laboratories, researchers can determine whether a rock stratum belonged to a shallow beach or an abyssal plain. Complementary studies, such as those published in Paleoceanography and Paleoclimatology and in Marine Micropaleontology, analyze the biotic response to hyperthermals (peaks of extreme heat). The results indicate that marine life does not react uniformly: the magnitude and speed of warming determine whether ecosystems adapt, reorganize or become stressed. These data are essential for improving predictive models of current climate change.

A sea of discoveries at risk. Zealandia's exploration has shown that there are still continents to discover and that the ocean floor holds answers to the most pressing questions about our climate survival. But science depends not only on curiosity; it depends on investment. Despite the scientific success of the 2017 expedition, some participating countries have since not followed through with adequate funding.
This leaves up in the air future expeditions that could continue to unravel the mysteries of this seventh, submerged continent, a territory that, though hidden under thousands of meters of water, has much to say about the air we will breathe tomorrow.

Image | Unsplash and World Data Center for Geophysics & Marine Geology

In Xataka | For thousands of years, human beings have avoided crossing the Taklamakan Desert. Now China is raising fish there
