While the news says the olive grove is dying amid solar panels and expropriations, the data says the opposite

Solar panels that destroy olive trees, massive expropriations in Jaén, warnings that Spain will have to import olive oil… If we go by the last few months of news about the world of olives, the conclusion seems clear: these are bad times for the olive grove. And yet the data does not confirm it. In fact, as Datadista points out, the surface area of this crop has not stopped growing in the last 10 years.

The olive grove keeps growing. With the sole exception of a small dip in 2022 (0.08%, recovered in 2023), the hectares of olive grove have increased every year. That does not mean there is no problem; almost the opposite. The olive grove grows, but it does so in a deeply unequal way: irrigated land gains ground over dryland, super-intensive hedge olive groves spread over land previously devoted to cereal or cotton, and investment funds concentrate on areas with more water. In that sense, this is not a story about the disappearance of the olive tree. It is about a crop changing so much and so fast that it will soon be unrecognizable.

What the data says. On paper, the data is clear. According to provisional figures from the 2025 Survey on Crop Areas and Yields (ESYRCE) of the Ministry of Agriculture, Fisheries and Food (MAPA), the olive grove area in Spain reached 2,873,396 hectares, 1.63% more than in 2024 and 5% more than in 2015. Looked at closely, though, that data tells a curious (and sometimes counterintuitive) story. For example: the olive grove is already the largest irrigated crop in the country. Why is this change happening? Above all, because irrigated olive groves are a safer bet than dryland ones. If it were possible, the entire Spanish olive grove would switch to irrigation overnight. The interesting question, then, is why the accelerated change is happening now.
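A back-of-the-envelope check of those ESYRCE percentages (a sketch, assuming they are simple year-over-year totals rather than adjusted series):

```python
# Back-out the earlier areas implied by the ESYRCE figures cited above.
area_2025 = 2_873_396  # hectares, provisional ESYRCE 2025

# 1.63% more than 2024 -> implied 2024 area
area_2024 = area_2025 / 1.0163

# 5% more than 2015 -> implied 2015 area
area_2015 = area_2025 / 1.05

print(f"Implied 2024 area: {area_2024:,.0f} ha")
print(f"Implied 2015 area: {area_2015:,.0f} ha")
print(f"Net gain since 2015: {area_2025 - area_2015:,.0f} ha")
```

The implied net gain over the decade is on the order of 135,000 hectares, consistent with the article's picture of steady growth.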
According to a February 2026 report from Datadista, the explanation has a first and last name: investment funds. In the last decade, these funds have gone from 45 to 1,000, investment in "Iberian agribusiness" has tripled, and that money is converting many hectares into super-intensive olive groves (while the traditional ones are abandoned). And the situation feeds on itself: the growth of the super-intensive irrigated olive grove cushions supply volatility and therefore contains price spikes, which makes the sector even more attractive to investors. This is precisely what secures the future of olive oil. Even if it comes at the cost of changing it completely.

Image | Vasilis Caravitis

In Xataka | The very high prices of olive oil are just one symptom. The real problem is a sector on the way to disaster

The RAM crisis is very good news for someone. That someone is Samsung

The great supply crisis of 2026 stars memory chips. Samsung, SK Hynix and Micron control 90% of global DRAM production, and can currently cover only about 60% of projected demand. That is bad news for consumers, and excellent news for giants that cannot sell memory fast enough.

Just ask Samsung. Samsung Electronics has published its financial report for the first quarter of 2026. The company recorded revenue of 133.9 trillion won and, to put it in perspective, that is its all-time quarterly high, a 43% increase over the previous quarter.

Memory, memory, memory. The figure is even more striking if we look at the Device Solutions division, which houses its memory business. It recorded a sales increase of 86% over the previous quarter, with a record operating profit. Samsung itself attributes the boom to much higher demand and, to no one's surprise, to selling prices that have risen across the industry.

It is not an isolated event. Memory- and semiconductor-related sales will continue to see strong demand throughout the second quarter, Samsung predicts. The company wants to keep capitalizing on demand for GPUs, CPUs and DRAM, expecting advances in agentic AI to keep accelerating demand growth.

Why it matters. Samsung's results are not only good news for the company's shareholders: they reflect a change in the industry that is here to stay. The RAM crisis will change the price of the products we buy for good, will push companies that have never manufactured memory to consider doing so, like Tesla, and puts manufacturers like Samsung in a position of power they have not enjoyed in years.

The new Samsung. Samsung has always been relevant in semiconductors and memory, but this division currently accounts for 94% of the company's total operating profit.
Virtually every won Samsung earns comes from its Device Solutions business (RAM and chips). And what about mobile phones? Although Samsung's near future will be led by a single division, the company gives enough clues about its plans in a territory that touches the average user closely: smartphones. Its DX division (which houses smartphones) grew 19%, with more sales and more profit than the previous quarter. Samsung expects a slight drop in revenue next quarter, although it will keep focusing on three clear pillars: the high end, foldables and the A series.

In Xataka | There is a company that has grown 3,000% in the stock market, even beating the performance of Nvidia: Sandisk
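A quick arithmetic check on the headline figure above (a sketch, assuming the 43% is a plain quarter-over-quarter change):

```python
# Infer the previous quarter's revenue from the record figure cited above.
revenue_q1_2026 = 133.9  # trillion won, all-time quarterly high

# 43% up on the previous quarter -> implied prior-quarter revenue
revenue_prev = revenue_q1_2026 / 1.43

print(f"Implied previous quarter: ~{revenue_prev:.1f} trillion won")
```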

DeepSeek V4 is here. It’s good news for efficiency and bad news for the myth

DeepSeek has published its V4 model under the MIT license, with notable improvements in code and an architecture designed with Chinese chips in mind. It has also admitted, in its own technical report, that it is three to six months behind the leading Western models. For a laboratory that changed the global AI narrative a little over a year ago, that is much more than a nuance.

Why it matters. DeepSeek became a symbol in January 2025. Its moment shook the markets, called the logic of the American tech stock market into question and convinced half the world that China could compete head-to-head at the AI frontier at a fraction of the cost. V4 does not destroy that story, but it does complicate it a bit. China's most important AI laboratory arrives with a model that its own engineers describe as a step, not a leap.

The context. V4 has taken longer than expected to arrive. According to industry sources gathered by 36Kr, DeepSeek suffered a serious training failure in mid-2025 while trying to migrate its infrastructure from NVIDIA to Huawei's Ascend chips. Internal opinions on technical direction were not aligned, and the founder, Liang Wenfeng, imposed conditions that were difficult to execute. The result: months of delay and a model that, moreover, is still not multimodal, postponed for lack of computing capacity and cash.

Between the lines. The most interesting part of V4 is its architecture. The model introduces TileLang, a domain-specific language that decouples low-level code from CUDA (the NVIDIA standard) and compiles it for different chips. It also incorporates MegaMoE, a kernel designed to reduce latency in expert parallelism that already runs on Ascend hardware. But V4's training has continued on NVIDIA GPUs. Independence is, for now, more an aspiration than an accomplished fact.

Turning point.
While DeepSeek looked inward, the Chinese market has reorganized without it: Doubao, from ByteDance, has become the most downloaded chatbot in China; MiniMax and Z.ai have gone public; Alibaba has achieved broad adoption thanks to vertical applications. DeepSeek never wanted to build a consumer product, and the market has not waited for it. The internal bill has also come due: the laboratory has lost key talent to Tencent, ByteDance and Xiaomi in practically every area. Liang Wenfeng refused to give up 20% of the company to an unidentified large investor. And now, for the first time, DeepSeek is opening an external funding round.

The main loser? The narrative of Chinese open source as a real alternative to the closed Western model has taken a hit. A Qwen employee told 36Kr that "the golden age of nonprofit AI development is over."

The big question. Whether DeepSeek can regain the lost ground. That depends largely on Huawei, whose Ascend 950 promises to scale well with V4; but 750,000 units are equivalent, adjusted for quality, to a week of American production. The gap will not be closed with ingenious architectures. It will be closed with silicon.

In Xataka | Companies around the world face an irresolvable dilemma: either they are with China or with the US; with both it is no longer possible

Featured image | Solen Feyissa

AI enters the era of CPUs. To no one’s surprise, this is bad news for the consumer.

The current state of data centers is redefining the production lines of the main technology players. What seemed like a contained crisis in RAM and SSD prices has become a tsunami sweeping through product lines and the consumer market. Data centers need the same components as consumers and the rest of the industry, and there was one component that seemed safe: processors. Not anymore.

Danger! During its first-quarter 2026 results presentation, Intel gave a worrying piece of information: the ratio of CPUs to GPUs in data centers could soon reach 1:1. So far we have talked about memory and GPUs as the primary data-center hardware, but there has to be a CPU running the show, and until recently there was one CPU for every eight GPUs. That is starting to change because of agentic AI. Training is still important, but we now seem to be entering the era of inference, and that is where CPUs excel. David Zinsner, Intel's chief financial officer, explained on the call with investors that the CPU/GPU ratio had already gone from 1:8 to 1:4 (one CPU for every four GPUs), but that agentic AI is blowing up CPU memory demands, pushing toward the aforementioned 1:1.

Change of course. You can probably anticipate where this is heading. As we read in Tom's Hardware, this has triggered a reaction: Intel is starting to shift its production lines, reducing capacity for consumer products and increasing output of Xeon processors, which are intended for servers and continuous workloads. All because delivery times for server CPUs currently stretch to about six months, and Intel cannot allow AMD, which also has server products, to beat it in that race.

Consequences. Price increases. Across the board.
Xeon CPU prices are estimated to have risen 10% to 20% in March because of that shortage, but consumer CPUs have also gone up 5% to 10%. And it will not stop there: another 10% increase is expected in the second half of the year. In fact, it is the same story as with mobile phones, with Intel noting that the consumer PC market will shrink by double digits this year. The same thing is happening here as in the other segments: manufacturers raise prices for everyone (hyperscalers and consumers alike), but allocate their production to data centers because those customers buy the most volume. Intel does not mind if there are no CPUs left for users, because consumers are no longer the bulk of its market; what it cannot afford is to keep hyperscalers waiting for more than six months, especially if data centers are expected to need even more CPUs in the short term.

The objective: the great foundry. Intel has been in the doldrums for a few years. In the consumer segment, AMD's Ryzen chips have beaten it to the punch, the Arc GPUs have never quite come together, and things were not going well. A string of poor results led the United States government to invest $2 billion to 'rescue' the company with one goal: to make it the largest American foundry. Because, even when things were going badly, Intel remained one of the few companies in the world with the capacity to make chips, alongside TSMC and Samsung. It has some of the best machines on the market, and that government bailout soon began to bear fruit with clients such as Apple and Nvidia. During its results presentation a few days ago, Intel declared a net loss of $3.7 billion, and yet the stock rose 20%. The reason: investors are not looking at Intel's present but at its future, and the changes applied in recent months seem to be going in the right direction.

They are not the only ones.
This change of direction and production lines is not exclusive to Intel. We have seen it at other companies, but Intel is the one openly advocating leaving the consumer out in order to prioritize the big customer: Big Tech. Something similar happened with Samsung a few days ago, when it was reported that the company had begun moving its LPDDR4 memory production lines over to LPDDR5. That type of memory is better, but also more expensive, which means devices that previously mounted LPDDR4 (low-end mini PCs, entry- and mid-range phones) will have to jump straight to LPDDR5: faster, but pricier. In the end, the translation is the same as always: as users, we will have to tighten our belts and hold on to the devices we already have a while longer. How long? Until 2027 if you ask some, 2028 if you ask others, or even… 2030.

Images | Intel

In Xataka | There is no energy for so many data centers and the consequence is clear: half of those planned for 2026 in the US are in danger
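The price rises described in this piece compound. A minimal sketch (assuming the expected second-half increase applies on top of the March one, which the article implies but does not state outright):

```python
# Cumulative effect of the reported Xeon price rises: 10-20% in March,
# plus an expected ~10% more in the second half of the year.
for march_rise in (0.10, 0.20):
    cumulative = (1 + march_rise) * 1.10  # H2 rise compounds on March's
    print(f"March +{march_rise:.0%} -> cumulative +{cumulative - 1:.0%}")
```

Under that assumption, server CPU prices would end the year 21% to 32% above their pre-shortage level.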

There are people reselling tickets to the World Cup final for 2.3 million dollars. Great news for FIFA

It is still too early to know whether the 2026 World Cup will be a success, a failure, or will join FIFA's long chronicle without pain or glory. What we can already say is that enjoying the tournament in situ will not come cheap. Especially if you aspire to see the final, to be played on July 19 at MetLife Stadium in New Jersey. The cost of its tickets has been mired in controversy for months, but the debate has soured after listings came to light on the resale market for the price of a 200 m² apartment in the center of Madrid. All with FIFA's tacit blessing.

What has happened? Although there is still more than a month until the opening match, the North American World Cup (to be played across Mexico, Canada and the US) is already earning the dubious honor of being the most expensive in history. Fans cried foul last December, when the first tickets went on sale, but the rates offered then look like a 'bargain' compared with those now being fetched on the secondary market. In that secondary trade, channeled through FIFA itself, there are passes on offer for what a 200 m² apartment in Madrid costs.

Is it really selling for that much? Yes. The Associated Press (AP) agency broke the news, but a look at the ticket resale platform hosted on FIFA's website confirms it. If we search for available passes for the July 19 final, we will see people reselling them for more than two million dollars. To be precise, there are at least four seats on sale in the lower stand (behind the goal) for a whopping $2,299,998.85. Not all tickets cost the same, but resale prices are generally out of most people's reach. The cheapest seats, category 3, are offered for $10,900.
If you want a spot with better views and more comfort, you can add a few thousand more to that figure and purchase higher-category passes for $16,100, $33,800, $43,200 or even $207,000. The prize goes to the listings at $2.3 million and $991,500, which is what a seller is asking for seats in the front area. On Wednesday, FIFA itself put a new block of tickets up for sale on its direct-sales platform, where seats for the final could be found for $10,990.

Who controls these rates? Direct-sale tickets are issued by FIFA itself, but things change in the secondary market. There, in the so-called "Resale/Exchange Market", the federation does not control prices, although it does take a considerable cut of the business. On each transaction it pockets a commission split into two parts: one, of 15%, is charged to whoever buys the ticket; the other, of the same value, is borne by whoever gives up the ticket for resale. As The Guardian explains, that means that if one of the $2.3 million tickets is finally sold, FIFA would deposit $690,000 into its account.

But… how is that possible? In previous editions of the World Cup, resale prices were capped at face value, but this time FIFA has changed its approach. Why? First, to adapt to the market of the host countries, especially the United States, which will host the most matches of the tournament. Second, FIFA hopes that by channeling the buying and selling itself, it will discourage the use of portals such as StubHub. "FIFA has established a ticketing and secondary market model that reflects standard ticket market practices for major sporting and entertainment events in host countries," it argues in a statement cited by the Associated Press. "Resale facilitation fees are aligned with industry standards in the North American sports and entertainment sectors."

Is it an isolated controversy?
The controversy has flared up now because of the prices being reached on resale, but the truth is that ticket costs have been a matter of debate since the first sales phase, activated in December 2025. The focus has been both on the prices themselves and on the system FIFA applies to the sale: 'variable pricing', similar to dynamic rates. Consumer organizations such as the OCU have already raised their voices for that very reason. For reference, in December tickets for the final were already selling for between $4,185 and $8,680. And this despite the initial promise to offer $60 tickets in the group stage. "They only exist as ridiculous green splotches on the edge of seating maps, little more than mirages of inclusion," quips Bryan Armen in The Guardian.

Does it only happen with tickets? No. Tickets are so expensive because, FIFA argues, they are one of its main sources of income. However, match passes are not the only thing priced as if it were gold. In recent days another controversy has arisen around the US World Cup that has little to do with sport: public transportation. The New Jersey rail operator has decided that those who want round-trip tickets to travel from Manhattan to MetLife and watch the July 19 final there will have to pay 150 dollars. That is more than 11 times what the same service costs on a normal day, when it runs around $12.90.

Images | FIFA and Wikipedia

In Xataka | Mexico City is already noticing the economic effect of the World Cup: it is losing homes and gaining Airbnb apartments
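As a quick sanity check of the figures in this piece: FIFA's resale commission is 15% charged to the buyer plus 15% withheld from the seller, so a minimal sketch of its cut on the headline listing looks like this:

```python
# FIFA's take on one resale, per the commission structure described above.
listing_price = 2_299_998.85  # the four lower-stand seats behind the goal

buyer_fee = listing_price * 0.15   # 15% added on the buyer's side
seller_fee = listing_price * 0.15  # 15% withheld from the seller
fifa_cut = buyer_fee + seller_fee

print(f"FIFA's take on one such sale: ~${fifa_cut:,.0f}")
```

The total comes to roughly $690,000, matching the figure The Guardian reports.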

The oceans are smoking, and the bad news is that the phrase is starting to stop being a metaphor

Global sea surface temperature is once again approaching the 2024 records, Arctic ice has marked its historic winter minimum, and the average temperature is 1.43 degrees above pre-industrial levels. What's more, the Earth's energy imbalance has reached its highest level in 65 years. And all of this without El Niño being active. So I have to correct myself: what is happening is not that the sea is smoking. That is a huge understatement. The oceans have gone up a gear, and we have been caught completely off guard.

What is happening? According to Copernicus data, in March the global average temperature was 13.94 °C: 0.53 °C above the 1991-2020 average and 1.48 °C above the pre-industrial temperature. It is not the warmest March on record, but it is close. In contrast, February 2026 was one of the three coldest in the last 14 years. And that is curious because we are in ENSO-neutral conditions: the 2024 record was reached with El Niño pumping heat out of the Pacific; now we are in complete normality. That has experts all over the world worried.

And the sea? At sea, things are more complicated, because the surface temperature is very close to the 2024 records. Moreover, it is not a one-off spike: it is the result of a sustained rise throughout the whole of March. Some areas (the subtropical and northeastern North Atlantic, the North and South Pacific) are already at record values. The big question is what will happen at the end of the year and, above all, at the beginning of next year, if an El Niño develops and reaches its peak intensity.

Well, but this doesn't affect us much, right? That depends on what we mean by 'affect'. What is beyond doubt is that, even though temperatures are rising all over the world, the Mediterranean has become the great laboratory for every detected and undetected climate risk. After all, the Mare Nostrum is warming up to 20% faster than the global average.
And that has clear and direct consequences in the water: from mass die-offs of vertebrates to the decline of seagrass meadows to enormous fish mortality. It is a sea dying little by little; a sea that drags us with it, because the heat of the Mediterranean injects more water vapor into the air and fuels extreme precipitation events. The DANA in Valencia is a reminder of all this. In other words, the scenario is known. What remains is deciding what we do to prepare for it.

Images | BenBaso | Xataka

In Xataka | Something strange has happened in the stratospheric polar vortex. And it is a hint of the winter that awaits Spain
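As a quick arithmetic aside, the two anomalies quoted in this piece imply the reference baselines directly (a sketch, assuming both figures refer to the same March global mean):

```python
# Derive the implied reference baselines from the Copernicus anomalies above.
march_mean = 13.94            # °C, March global average temperature
anomaly_1991_2020 = 0.53      # °C above the 1991-2020 mean
anomaly_preindustrial = 1.48  # °C above the pre-industrial level

baseline_1991_2020 = march_mean - anomaly_1991_2020
baseline_preindustrial = march_mean - anomaly_preindustrial

print(f"Implied 1991-2020 March mean: {baseline_1991_2020:.2f} °C")
print(f"Implied pre-industrial March mean: {baseline_preindustrial:.2f} °C")
```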

James Webb has bad news for the largest natural laboratory for rocky planets, but there is still some hope

The star TRAPPIST-1 and the seven known planets that surround it are a natural laboratory in which the evolution of rocky planets can be studied. This has led many scientists to focus their attention on the system, in search of a possibly habitable planet. However, observations made by an international team of astronomers with the help of the James Webb Space Telescope are not very encouraging.

Planets without an atmosphere. The James Webb Space Telescope carries a very powerful instrument for analyzing infrared radiation, with which it can gauge the temperature of the planets it observes. Planets emit infrared radiation whose intensity grows with their temperature, so a thermal map can be made. That is what these astronomers have done. They initially focused on two of the planets orbiting TRAPPIST-1: TRAPPIST-1b and TRAPPIST-1c. The resulting heat maps show that neither planet has an atmosphere. They may have had one once, but TRAPPIST-1 itself possibly destroyed it. It is a very uninspiring result for the search for habitable planets in this system.

Lights and shadows of TRAPPIST-1. So far, seven exoplanets have been discovered orbiting TRAPPIST-1, all packed very close together: in fact, their seven orbits would fit within the distance between Mercury and the Sun. The thing is, this red dwarf is much less energetic than our Sun, so the temperatures need not be stifling. All these planets are rocky, like Earth, and some are in fact very similar in size, so there could be an exoplanet with conditions similar to ours. The problem is that red dwarfs emit a lot of radiation and energetic particle flows that can destroy their planets' atmospheres. And, of course, without an atmosphere there is no life.

Tidal locking. All the planets in the TRAPPIST-1 system are tidally locked. This means that their rotation period and their orbital period around the red dwarf are synchronized.
As a result, one side is continuously exposed to the star while the opposite side never faces it: on one side it is always day, and on the other it is always night. (Image: NASA/JPL-Caltech)

Extreme temperatures. When a planet is tidally locked, two situations are possible, depending on whether or not it has an atmosphere. With an atmosphere, heat flows from the day side to the night side, so the whole planet settles at a stable average temperature. Without one, the dark side can freeze while the illuminated side scorches. In the two exoplanets analyzed by James Webb, temperatures are around 100-200 °C on the illuminated side and around -200 °C on the dark side. That confirms the absence of an atmosphere.

And now what? Despite this hard blow, there is still hope. The two exoplanets analyzed are not in the star's habitable zone: the range of distances at which the temperature is right for water, if there is any, to remain liquid. In that zone sit only TRAPPIST-1e, TRAPPIST-1f and TRAPPIST-1g, and the first of them has a density and size very similar to Earth's. James Webb's attention is now focused on this exoplanet, to repeat the process. If it turns out to have an atmosphere, it can stay on the list of potentially habitable planets.

It's still interesting. Despite this first blow, TRAPPIST-1 remains a very interesting system for understanding the evolution of rocky planets. Earth was lucky not to lose its atmosphere; beyond that, the evolutionary paths can be similar. Besides, we have not yet ruled out that TRAPPIST-1e has an atmosphere. Step by step.

Image | NASA, ESA, CSA, Joseph Olmsted (STScI)

In Xataka | There is only one chance every 11,000 years to reach the planet Sedna. Some Italians want to use this nuclear engine
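For intuition about those day-side figures, here is a minimal sketch using the standard equilibrium-temperature formula. The stellar temperature, radius and orbital distance are approximate published values for TRAPPIST-1 and TRAPPIST-1b, assumed for illustration; they do not come from the article.

```python
import math

# Assumed, approximate parameters (not from the article):
T_star = 2566.0           # K, effective temperature of TRAPPIST-1
R_star = 0.119 * 6.957e8  # m, stellar radius (~0.119 solar radii)
a = 0.0115 * 1.496e11     # m, orbital distance of TRAPPIST-1b

# Zero albedo, full day/night heat redistribution (i.e. with an atmosphere):
T_eq = T_star * math.sqrt(R_star / (2 * a))

# No redistribution at all: the subsolar point runs much hotter.
T_subsolar = T_star * math.sqrt(R_star / a)

print(f"Equilibrium temperature (with redistribution): ~{T_eq:.0f} K")
print(f"Subsolar temperature (airless planet): ~{T_subsolar:.0f} K")
```

The sketch gives roughly 400 K with redistribution versus well over 550 K at the subsolar point without it, which illustrates why a large day/night contrast is the signature of a missing atmosphere.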

We already know where many of the microplastics that enter our body end up. We have bad news

That microplastics have managed to enter our bodies is something we already knew in detail, especially considering that we have found them in the lungs, in the placenta and even in the testicles. However, questions remained about where they accumulate in the greatest quantities and what consequences that has. Science is now starting to answer them.

What they have seen. Thanks to a recent study published in the journal Environmental Science and Ecotechnology, it has been shown that bile contains not only the cholesterol crystals that end up forming gallstones, but also microplastics. And worst of all, these have a direct impact on the premature aging of the cells that line our biliary tract.

How do they know? To reach this conclusion, the researchers analyzed 14 samples of human bile: five from healthy patients without gallstones and nine from patients with gallstones. The results were conclusive: they found microplastics in the samples, mainly two of the most common polymers in our daily lives, polyethylene (PE) and polyethylene terephthalate (PET). The particles ranged between 20 and 50 micrometers in size, a very relevant detail, since at those sizes they can cross biological barriers, traverse the gut-liver axis and end up in the gallbladder.

There is more. Besides the mere presence of plastic in bile, patients with gallstones showed a higher load of microplastics. That matters because it fits with recent research suggesting these particles could act as 'seeds' around which cholesterol clusters to form the dreaded gallstones.

What do they do? This is the key point of these studies, since we still have little idea of the damage microplastics can do to our bodies.
The study points out that microplastics in bile cause mitochondrial dysfunction and promote the aging of cholangiocytes, the cells that line the bile ducts. In previous experiments, the livers of laboratory mice exposed to environmental concentrations of microplastics showed altered bile acid metabolism and liver damage; in humans, oxidative stress increases. The important thing is that in both cases the bile duct cells lose their ability to function correctly and age prematurely, which in the long term could be linked to liver and bile duct diseases.

Can it be mitigated? Amid the bad news, the scientific literature suggests there are ways to soften the blow. One of the great protectors is melatonin, which appears to combat the oxidative stress and mitochondrial dysfunction generated by these synthetic intruders. In parallel, other recent experiments with human liver organoids have shown that the damage caused by microplastics improves when ursodeoxycholic acid is administered, the drug used to 'dissolve' gallstones. The 'magic' of this compound is that it increases bile flow, suggesting that promoting a kind of natural "washing" of the bile duct could help reduce toxicity.

A problem. The confirmation of bile as a "hidden reservoir" of microplastics underlines an undeniable reality: plastic pollution is no longer just an environmental problem in our oceans, but a systemic public-health problem about which we are learning more and more. The longer-term consequences, such as a possible relationship with gallbladder cancer, remain to be seen.

Images | Freepik

In Xataka | An 18-year-old girl has created the definitive weapon against microplastics: a filter that eliminates 96% of them from water

It takes 7,000 GPUs to simulate a tiny quantum processor. Although it may not seem like it, it is excellent news.

The complexity of quantum computers is extraordinary. Their construction can rely on several very different strategies, such as superconducting qubits, ion traps or neutral atoms, among other technologies, but they all have something in common: to a large extent, their power is a consequence of their complexity, the complexity inherent in any device designed to exploit the laws of quantum physics. The surprising thing is that, despite their sophistication and exoticism, it is already possible to accurately simulate a small quantum processor using conventional hardware. In fact, a research group from the Quantum Systems Accelerator and the Division of Applied Mathematics and Computational Research at the University of California at Berkeley (USA) has achieved it. This is not the first time a quantum processor has been simulated, but until now no one had managed to do it while emulating every physical detail before manufacture.

A new era begins in quantum chip design. Here is a striking fact: the Berkeley researchers mentioned above carried out their simulation of a quantum chip on the Perlmutter supercomputer, which contains 7,168 NVIDIA GPUs. To achieve their purpose they used almost all of those GPUs for 24 uninterrupted hours, so the computational effort was clearly titanic. But they pulled it off: they modeled a multilayer quantum chip 10 mm wide and 0.3 mm thick, accurately simulating how signals travel and interact within this processor. This statement from Andy Nonaka, one of the scientists at the Berkeley Quantum Systems Accelerator, expresses clearly why this milestone is so important: "I'm not aware of anyone having ever done physical modeling of microelectronic circuits at the full scale of the Perlmutter system.
We were using almost 7,000 GPUs (…) We divided the chip into 11 billion grid cells and were able to run over a million time steps in seven hours, allowing us to evaluate three circuit configurations in a single day. These simulations would not have been possible in this time frame without the complete system."

What really makes the difference is the precision with which they carried out the design and simulation of their quantum processor. "We perform a full-wave physics-level simulation, which means we care about what material is used in the chip, its design, how the metal is wired (using niobium or other types of metal wires), how the resonators are built, what the size, shape and material used are (…) We care about those physical details and we include them in our model," Nonaka says.

A priori, we might conclude that using almost 7,000 GPUs for 24 hours, with all the computational effort and energy expenditure that implies, to simulate a quantum chip just 10 mm wide and 0.3 mm thick is not much of a success. But it is. Thanks to this technology, it will now be possible to design quantum hardware in less time and more efficiently. Bert de Jong, director of the Berkeley Quantum Systems Accelerator, invites us to look at the future of quantum computing with optimism: "This unprecedented simulation is a critical step in accelerating the design and development of quantum hardware. More powerful, higher-performance chips will unlock new capabilities for researchers and open new avenues in science."

Image | Generated by Xataka with Gemini

More information | ScienceDaily

In Xataka | We already know what the chips that will arrive until 2039 will be like. The machine that will allow them to be manufactured is close
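A rough throughput check of the run described in this piece, using only the figures quoted (a sketch; the real workload is of course not evenly divisible like this):

```python
# Back-of-the-envelope throughput of the Perlmutter simulation above.
gpus = 7_168   # NVIDIA GPUs in the Perlmutter system
cells = 11e9   # grid cells the chip was divided into
steps = 1e6    # "over a million time steps"
hours = 7.0    # wall-clock time for those steps

cell_updates = cells * steps
seconds = hours * 3600

print(f"Grid cells per GPU: ~{cells / gpus:,.0f}")
print(f"Cell updates per second (whole machine): ~{cell_updates / seconds:.2e}")
print(f"Cell updates per second per GPU: ~{cell_updates / seconds / gpus:.2e}")
```

Each GPU handles roughly 1.5 million grid cells, and the machine as a whole sustains on the order of 10^11 cell updates per second, which gives a feel for why the full system was needed to finish in a day.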

There is a wrong way to open the dishwasher. And it is bad news for your kitchen furniture.

Practically half of Spanish homes have a dishwasher. And some of the most reputable manufacturers have had to step in to explain why it's not a good idea to open the door as soon as the washing cycle ends. Don't be impatient. Opening the dishwasher as soon as the wash cycle is finished is not the best idea, according to the manufacturers themselves. Some, like Bosch, recommend waiting for it to cool a little before opening the door so that less steam escapes through the opening. Others, such as Siemens, make exactly the same recommendation: let the appliance cool down slightly after the program ends before opening the door. Why. Opening the dishwasher at the end of the cycle is not particularly dangerous, but it can cause problems in your kitchen in the long term. Manufacturers point out that the steam released when the door is opened while still quite hot can end up damaging kitchen furniture. Similarly, dishwashers use their own residual heat to help the dishes dry properly. All this without counting the most obvious detail: if we open the appliance when it is full of high-temperature steam, we run a greater risk of suffering a slight burn. How the dishwasher works. To understand why the dishwasher generates so much steam at the end of the cycle, it is worth briefly reviewing how it works. Basically, it is a closed circuit that pumps hot water through rotating spray arms. When you choose the program, the door locks. Water enters the base and the dishwasher heats it. When the water is hot, the pump pushes it under pressure into the spray arms. The water hits the dishes and washes away the dirt. The detergent is released and begins to break down grease and food debris. Finally, the residual heat dries the dishes little by little. The heat problem. Aware that opening the dishwasher as soon as it finishes is common practice in homes, manufacturers have been devising solutions for years to prevent excess steam.
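The closed-circuit cycle described above can be sketched as a simple ordered sequence of phases. This is an illustrative model only; the phase names and the final advice string are assumptions for the example, not any manufacturer's actual firmware:

```python
# Illustrative model of a dishwasher wash cycle as an ordered state sequence.
# Phase names are assumptions for the example, not real appliance firmware.

PHASES = [
    "fill",          # water enters the base of the machine
    "heat",          # the dishwasher heats the water
    "spray",         # the pump pushes hot water through the rotating arms
    "detergent",     # detergent is released and breaks down grease
    "drain",         # dirty water is pumped out
    "residual_dry",  # residual heat dries the dishes little by little
]

def run_cycle():
    """Step through the phases in order, then report the manufacturers' advice."""
    for phase in PHASES:
        print(f"phase: {phase}")
    # Bosch and Siemens both suggest letting the appliance cool first:
    return "let it cool before opening the door"

print(run_cycle())
```

The point of the sequence is that drying is the last phase and relies on heat still inside the machine, which is exactly why opening the door early interrupts it.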
Some appliances of this type have an automatic opening system: the door opens just a few centimeters to let the steam escape little by little, so that when we finish opening it manually we avoid that initial blast. Other, even more advanced systems use zeolites. Zeolites are microporous aluminosilicates (of mineral or synthetic origin) with a crystalline structure and a huge internal surface area. When they adsorb water molecules the process is exothermic, that is, they release heat. This allows part of the steam to be captured and its heat reused for drying. Increasingly advanced solutions for a problem as simple as it is common. In Xataka | A user bought a next-generation connected dishwasher. That's where his nightmare began
