It’s Sandisk, and it has gained almost 3,000%

Sandisk’s rise on the stock market is worth studying carefully. The stock has gained almost 3,000% in the last year, another example of how AI fever and the RAM crisis are shaping the technology industry and the markets. A company many of us knew for its USB drives and memory cards has become one of the most talked-about names of 2026.

Separation from Western Digital. In February 2025, Sandisk completed its separation from Western Digital and began trading independently on the Nasdaq. For almost a decade, the company had been buried under the umbrella of its parent, tied to a conventional hard-drive business that was growing slowly while the AI sector accelerated and lifted the rest of the companies making critical components for data centers. When it went public as an independent company, the stock started at around $48. Nothing strange so far.

AI once again plays its role in the market. Sandisk has not actually invented anything revolutionary; everything has to do with data-center storage demand. Flash memory manufacturer Kioxia warned last January that its NAND supply for this entire year was already sold out. Stories like that push companies to start looking for alternatives. Data centers dedicated to AI need to store gigantic volumes of data to train models and run inference. When traditional hard drives (HDDs) ran short, they turned to SSDs. And when SSDs also became scarce, prices skyrocketed. According to Kingston, another manufacturer of flash memory products, NAND prices rose 246% over the course of 2025. Sandisk, one of the world’s leading NAND manufacturers, found itself at the center of this perfect storm, and in the best possible position.

3,000%.
Sandisk’s revenue in the fiscal second quarter of 2026 reached 3.03 billion dollars, a 61% year-on-year increase, while earnings per share more than quintupled compared to the same period of the previous year. Data-center revenue grew 76% year over year in that same quarter, according to the company’s own results filed with the SEC. The company has publicly acknowledged that its factories are working at full capacity. As a result, the accumulated value of its shares has skyrocketed by more than 3,000%, going from just $30 to surpassing the $1,000 barrier in just over a year.

The key is Kioxia. A fundamental part of Sandisk’s competitive advantage is its historic alliance with Japan’s Kioxia, formerly Toshiba Memory. This joint venture of more than twenty years lets them share the astronomical costs of chip manufacturing, which translates into margins higher than those of most of their rivals. As we explained in late January, when NAND prices rise, Sandisk does not need to invest in new factories to earn much more: the additional revenue falls straight to the profit margin. It is the equivalent of almost free money in a bull market. The data-center segment currently represents more than 55% of Sandisk’s quarterly sales, compared with the 30% it represented before the split from WD.

Entry into the Nasdaq-100. A few days ago, Sandisk joined the Nasdaq-100 index, which mechanically forces all ETFs and index funds tracking that index to buy shares of the company. This carryover effect has given an additional boost to a stock that was already flying.

And now what? The big question is how long this can last. NAND prices have risen 60% in the first quarter of 2026, with forecasts of another 70-75% increase in the coming months. Micron’s CEO has already declared publicly that the memory shortage will last until 2027.
As shared by Ind Money, several analysis firms maintain that the crisis could extend until 2028. We will see. As for Samsung, SK Hynix and Kioxia, the triad of memory companies, they plan to significantly increase their production in the coming years in response to the shortage.

Cover image | Sandisk

In Xataka | China has banned another AI startup from exporting talent and research: little by little, it is “nationalizing” AI

More than 600 km/h on a line that has accumulated years of delays

In 2015, a seven-car prototype in Japan made us dream of the tremendous speeds the trains of the future would reach, with Japan as the main standard-bearer. The L0 Series train hit 603 km/h on the Yamanashi test line, becoming the fastest manned railway vehicle ever recorded at the time. More than a decade later, that record still stands, although the promise of commercial service has yet to materialize. And the line that is supposed to bring it to travelers keeps accumulating years of delays.

Magnetic levitation. The L0 Series runs on superconducting magnetic levitation: powerful magnets along the guideway and in the train interact to lift the vehicle, completely eliminating physical contact with the track. No friction, no mechanical noise, no wear, and heart-stopping speeds. The system is known as SCMaglev and uses electrodynamic suspension, different from the system used by the Shanghai maglev. Japan National Railways began researching this type of propulsion in 1962 with a clear objective: connecting Tokyo and Osaka in one hour. They have been chasing that dream for more than six decades.

Chūō Shinkansen. This is the maglev line under construction between Tokyo and Nagoya, with plans to extend it to Osaka. It will run between Shinagawa and Nagoya stations, with stops in Sagamihara, Kōfu, Iida and Nakatsugawa. The line is not intended to replace the legendary Tokaido Shinkansen; it will exist to offer travelers a much faster alternative. It would connect Tokyo and Nagoya in 40 minutes and, later, Tokyo and Osaka in 67 minutes, at a maximum speed of 505 km/h. Today the fastest Nozomi (Japan’s fastest high-speed train service) takes around two and a half hours between the two cities. With the Chūō Shinkansen, roughly 90% of the 286-kilometer route to Nagoya will pass through tunnels, instead of following the coast as the Tokaido does.
This decision is also the root of many of its problems.

A prefecture and a river. The main obstacle was that the then governor of Shizuoka, Kawakatsu Heita, denied permission to drill one of the tunnels under the Japanese Southern Alps for environmental reasons. His argument was that the impact studies had been carried out with little rigor and that the excavations could affect the flow of the Oi River. The section in question amounted to just 8.9 kilometers of tunnel inside Shizuoka, but it was enough to block the entire project for years: without that section, the rest of the work could not be completed. The current governor of the region, Yasutomo Suzuki, has since authorized the prior geotechnical survey, but the works have yet to move forward.

A calendar full of delays. In 2024, JR Central president Shunsuke Niwa publicly ruled out opening in 2027 and targeted 2034 as the new earliest date. But the story does not end there: last October, JR Central pushed the opening back to 2035. Construction costs have already ballooned by more than 50% to 11 trillion yen (about 61 billion euros), according to RailTech. The section to Osaka, for its part, would not arrive until 2037 at best.

The threat from China. In July of last year, during the World High Speed Congress held in Beijing, state-owned CRRC presented a maglev prototype designed to reach 600 km/h. The train runs on rubber wheels at low speed and switches to magnetic levitation above 150 km/h. Asia Times reports that commercial use is still a long way off, and that market demand, rather than technology, is the main obstacle. But there is more: the T-Flight project from the state company CASIC, which combines magnetic levitation with hyperloop-style vacuum tubes, already reached 623 km/h in tests in 2024, with the goal of exceeding 1,000 km/h soon.
China has also operated, for years, the only commercial maglev in the world that runs regularly: the Shanghai Maglev, which travels at 430 km/h.

Cover image | Maglev.net

In Xataka | The Mayan Train has become a nightmare for Mexico: what seemed like a great plan has run into the courts

There is a risk with AI agents and accumulated errors: that they become a game of "broken telephone"

In the game of “broken telephone,” a group of people passes a message from one person to the next in secret. What usually happens is that the original message has little to do with what the last recipient hears. The problem we are seeing is that something similar can happen with the promising AI agents.

Accumulated errors. Toby Ord, a researcher at the University of Oxford, recently published a study on AI agents. In it he discussed how these systems suffer from accumulated, or compound, error. An AI agent autonomously chains several stages to solve a problem we give it, for example, writing code for a certain task, but if it makes an error at one stage, that error carries over and becomes more worrying at the next stage, and the next, and the one after that. The accuracy of the answer is thus compromised and may end up having little (or nothing) to do with an actual solution to the problem we wanted to solve.

AI can program, but not for long stretches at a time. What Ord proposed was the notion of a “half-life” for an AI agent, which helps estimate the success rate as a function of the length of the task the agent is asked to solve. By definition, an agent with a two-hour half-life would have a 50% success rate on two-hour tasks. The message is stark: the longer an AI agent works, the further its success rate declines. Benjamin Todd, another AI expert, put it differently: an AI can program for an hour (almost) without errors, but not for 10 hours straight. These are not exact or definitive figures, but they express the same problem: AI agents cannot, at least for now, run indefinitely, because accumulated errors drag down the success rate.

Humans are not spared either. But be careful, because something very similar happens with human performance on prolonged tasks.
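The two simple models described above, independent per-step errors and a success-rate half-life, can be sketched in a few lines of Python. This is an illustrative sketch under a constant per-step error assumption; the function names and example numbers are ours, not code or figures from Ord's study:

```python
# Sketch of how small per-step errors compound in chained tasks.
# Illustrative only: a constant, independent per-step error rate
# is an assumption, not a claim from Ord's study.

def chain_success(per_step_error: float, steps: int) -> float:
    """Probability that all `steps` stages succeed when each one
    fails independently with probability `per_step_error`."""
    return (1.0 - per_step_error) ** steps

def half_life_success(task_hours: float, half_life_hours: float) -> float:
    """Success rate under a 'half-life' model: each additional
    half-life of task length halves the chance of success."""
    return 0.5 ** (task_hours / half_life_hours)

# A 2% error per stage looks harmless, but over 100 chained stages
# the overall success rate collapses to roughly 13%.
print(round(chain_success(0.02, 100), 2))  # ~0.13

# An agent with a two-hour half-life: 50% on two-hour tasks,
# only 25% on four-hour tasks.
print(half_life_success(2.0, 2.0))  # 0.5
print(half_life_success(4.0, 2.0))  # 0.25
```

Under both models the decay is exponential, which is why chaining even very reliable steps eventually pushes the overall success rate toward zero.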
In Ord’s study, the empirical success rate was shown to fall sharply: after 15 minutes it is already down to roughly 75%, after an hour and a half it is 50%, and after 16 hours just 20%. We can all make mistakes when performing chained tasks, and a mistake in one task compromises all subsequent development in the chain even further.

LeCun already warned us. Yann LeCun, who leads AI research at Meta, has long been pointing out the problems with LLMs. In June 2023 he explained how autoregressive LLMs cannot guarantee factual, non-toxic responses: with every token a model generates there is some probability of stepping outside the set of correct answers, and the longer the response, the harder it is for it to remain correct.

That is why error correction matters. To avoid the problem, we need to reduce the error rate of AI models. This is well understood in software engineering, where early code review is always recommended, following a “shift left” strategy in the software development cycle: the sooner an error is detected, the easier and cheaper it is to correct. The opposite also holds: the cost of fixing an error grows exponentially the later it is detected in the life cycle. Other experts suggest that reinforcement learning (RL) could solve the problem; LeCun responded that it would, if we had infinite data with which to polish the model’s behavior, which we do not.

More than agents: multi-agents. Anthropic recently demonstrated a way to mitigate errors (and the accumulated errors that follow) even further: using multi-agent systems.
That is, multiple AI agents work in parallel and then compare their results to determine the optimal path or solution. The graph shows the length of the tasks that AI agents have been able to complete over recent years. The study reveals that the length of tasks an AI agent can complete with a 50% success rate doubles roughly every seven months. In other words: agents are improving steadily (and notably) over time.

But models and agents keep improving (or do they?). Todd himself pointed out something important that invites optimism about the problem. “The error rate of AI models is being halved roughly every five months,” he explained. At that rate, AI agents may be able to successfully complete dozens of chained tasks within a year and a half, and hundreds a year and a half after that. The New York Times disagreed, recently noting that although the models are increasingly powerful, they also “hallucinate” more than previous generations. The “system card” for o3 and o4-mini points precisely to a real problem with error rates and “hallucinations” in both models.

In Xataka | Hallucinations are still the Achilles heel of AI: OpenAI’s latest models make things up more than they should

The news “There is a risk with AI agents and accumulated errors: that they become a game of broken telephone” was originally published in Xataka by Javier Pastor.

Accumulated sediments are a huge problem for reservoirs. And in the Ebro they have taken drastic measures

Reservoirs, both those for hydroelectric use and those for consumptive use, are a vital element in the hydrological panorama. For some time, however, experts have been warning of a problem that worsens over time and affects their functionality: the sediment problem.

Half a year of works. The works begun last August to recover a drain of the Ebro reservoir will foreseeably extend into 2026, according to the newspaper El Diario Montañés. Besides improving one of the reservoir’s drains, the works aim to restore the functionality it has lost to sediment accumulation. The tasks, the local paper explains, will require a team of divers to deal with the 3.5 meters of silt accumulated next to the drain gates. The works, with a budget of 2.5 million euros, involve installing in each duct safety gates with a bypass and gates for flow regulation. The Arija reservoir has two drains, one lateral and the other located at the dam. It is the latter that, as a consequence of sediment accumulation, has lost the ability to perform its function.

Key reservoir. The Ebro reservoir, or Pantano de Arija, is a key element in the Ebro river basin and one of the largest reservoirs in this hydrographic demarcation (behind those of Mequinenza and Canelles). It sits in the immediate vicinity of the Cantabrian town of Reinosa, on the border between Cantabria and Castilla y León in the province of Burgos. According to the latest data, the reservoir now holds 348 hm³ of water, 64.3% of its capacity (541 hm³). Figures that do not always reflect reality, precisely because of the sediment problem.

Limiting the capacity. Sediments do not only affect the functionality of reservoir drains: they also limit their capacity.
Decades of use have led to a significant accumulation of sludge and sediments in the reservoirs, a volume that implies a significant reduction in their storage capacity. Estimates of this loss vary widely, but the most pessimistic speak of up to 40% of volume lost in some basins. A study of 110 reservoirs produced a more optimistic but still worrying estimate: a loss of 5%. The latest rains seem to have helped reverse the drought that still affected some areas, but our ability to prepare for the next drought is limited by this accumulation of sediments in the reservoirs.

Where the sediments are missing. As if that were not enough, alongside the problem of surplus sediments in one place there is the problem of sediments missing in another: in this case, in the Ebro Delta. The delta is nothing more than the result of the accumulation of sediments carried down by the river. The construction of numerous dams in this basin has reduced the amount of this material reaching the mouth, which, together with natural coastal erosion, has put the delta ecosystem at risk. An ecosystem on which not only the local fauna depends, but also part of the region’s agriculture and economy.

In Xataka | In a corner of Andalusia the reservoirs are at 94% of their capacity. It seems like excellent news, but it is not quite

Image | Josu Aramberri, CC BY-SA 3.0
