how to migrate the memory of everything other AIs know about you to Claude

Let’s tell you how to migrate memories from ChatGPT or Gemini to Claudeand thus perform a migration of a artificial intelligence to another. Claude has just launched a fairly easy-to-use function that allows you to import memories from ChatGPT, Gemini or any other AI you use. Artificial intelligence chats have a memory system with which they store important data about you and your tastes based on the things you ask them repeatedly. They will know your musical tastes, your pets, if you have plants, and so they take all this into account to personalize their answers. And why can it be useful to import these memories into Claude? Well, because if you have decided to start using this artificial intelligence model, you can make it know about you all the specific data that your other AIs use to personalize their results and adapt them to you. Import memories from another AI to Claude This option It is only available for paying users of Claude, who have Pro, Max, Team or Enterprise subscriptions on the web, and for users of Claude Desktop or Claude Mobile. What you have to do is enter the settings of the AI ​​website or application. Once inside the settings, Click on the section Capabilities in the left column. On the screen you go to, go to the section Memoryand in it click on the option Start import that will appear to you. This will open the import memory screen. In it, above you have a prompt that you must copy to use in another AI to extract the memories, and below you will have a field where you have to write the imported memory that generates the prompt above. Therefore, here click on the button Copy of the text you have above. Now, the text that you have copied in Claude you have to paste it in a chat with the AI ​​where you want to extract the memories. Simply paste it exactly as you have it into ChatGPT, Gemini or another, and send it. This will make the AI generate a code with all the memories what he has on you. You will have to copy this code and stick it in Claude’s field what’s in the window we opened before. With this, Claude will recognize the memories and start saving them internally. In Xataka Basics | Claude: 23 functions and some tricks to get the most out of this artificial intelligence

Anthropic releases a new feature to download all your memory to leave ChatGPT and switch to Claude

This weekend Anthropic has gone from being an AI used by the Pentagon, other US agencies and having partners such as Microsoft or Amazon to total ostracism: from Friday at 5:01 p.m. It is classified as a “risk to the supply chain”. Total veto, a serious threat to the survival of a company valued at 380,000 million dollars and also a challenge for those entities that in less than six months will have to transition to another alternative. The Pentagon itself He already has an agreement with OpenAI to succeed him. Anthropic’s situation is delicate to say the least serving its strategic clients and alliances, something essential to continue growing in the tough battle of intelligence. The company led by Dario Amodei, which was firm in its principles when expressing its concern about the use of artificial intelligence for mass civil surveillance and the development of weapons capable of firing without human intervention, has already announced that he will contestbut for now they look rough. He only has the civil…in every sense, because Claude has risen to number 1 for free downloads in the App Store in the United States, as reported by CNBC. Because yes, this tug of war with the US government has brought an increase in the popularity of Claude, less known than other alternatives such as ChatGPT or Gemini. On the other hand, this movement in which the US Administration has said goodbye to Anthropic in favor of OpenAI also has a reading in which Claude wins: the terms of the agreement and how it affects ChatGPT users. Anthropic Coup de Effect. So Anthropic has been taken out of the sleeve a new feature to facilitate the transition from other AI models, such as ChatGPT or Gemini, to Claude. Because if you have been using ChatGPT for a while for example and already knows youstarting from scratch is a step backwards in every sense. The new feature allows you to import all your memory from other models into Claude so that it immediately knows everything about you (everything that your previous AI already knew). You no longer start from scratch. How to download your memory and load it in Claude. To incorporate your preferences and context from other AI providers into Claude you have to do two steps: Copy and paste the prompt below into the AI ​​you normally use, like Gemini or ChatGPT: I’m moving to another service and need to export my data. List every memory you have stored about me, as well as any context you’ve learned about me from past conversations. Output everything in a single code block so I can easily copy it. Format each entry as: (date saved, if available) – memory content. Make sure to cover all of the following — preserve my words verbatim where possible: Instructions I’ve given you about how to respond (tone, format, style, ‘always do X’, ‘never do Y’). Personal details: name, location, job, family, interests. Projects, goals, and recurring topics. Tools, languages, and frameworks I use. Preferences and corrections I’ve made to your behavior. Any other stored context not covered above. Do not summarize, group, or omit any entries. The model will return everything it knows about you in a block of text, which you have to copy and paste later into Claude. Go to ‘Settings‘ > ‘Capabilities‘and there in Import Memorypaste the answer. Then, tap ‘Add to memory’. From that moment on, Claude already knows what your previous AI knew. It has small print. This is a feature for users on a paid plan (Pro, Max, Team or Enterprise). If you are on the free version, at most you will only be able to have that context in that conversation, but not permanently. In short: the import is free as a manual process, but for Claude to remember it permanently a payment plan is required. In Xataka | Claude: 23 functions and some tricks to get the most out of this artificial intelligence In Xataka | Anthropic and OpenAI have developed AI. The US Pentagon is showing you who really owns it

RAM memory already represents 35% of the cost of a PC. The only solution that HP finds: capable equipment

The PC industry – like many others – is facing a perfect storm that is completely altering manufacturing costs. As revealed by Karen Parkhill, CFO of HP, RAM memory has increased its prices so much that its specific weight in the cost of a PC is now almost unsustainable. Bad business. 35% of what your PC costs you is RAM. According to the directive, RAM memory has gone from representing an acceptable 15–18% of the bill of materials for your PCs and laptops to representing a suffocating 35%. The change is drastic, and has occurred in just one fiscal quarter. Things will get worse. This increase is due to the fact that according to HP, memory costs have doubled sequentially and have grown by 100% in a few months. Not only that: the company’s forecast is pessimistic, and they expect prices to rise as 2026 progresses. From more expensive PCs… The direct consequence for users is inevitable: the prices of PCs and laptops are going to rise. Analysts are already warning of increases of between 15% and 20% in the RRP of these devices, and in fact HP has already begun to make changes to its price tags precisely to protect its profit margins in the face of the massive increase in the price of critical components such as DRAM memory and NAND chips in SSD units. …to capable PCs. But the price is not the only thing that will change. To keep the equipment “affordable”, HP is adopting another strategy that we had already seen in mobile phones: that of “cut specifications.” This means that we will see more low- and mid-range configurations with less RAM than one would expect in 2026. The measure is clearly intended to save costs at the sacrifice of performance. At the moment they are saving the ballot. At HP they are diversifying their suppliers and cutting back on specifications and extras to compensate for the extra cost of chips. The company is even using AI systems to optimize its planning processes and has halved the time it takes to qualify new materials for agile component changes. The demand for HP PCs is still there: its personal systems division grew 11% in revenue. The company warns, however, that this trend could fall: high prices could cause sales to slow down. Damn data centers. The big culprit of everything is AI, of course, which is causing most of the production of DRAM memory chips and NAND chips to be destined for the AI ​​accelerators of NVIDIA and other manufacturers and, of course, for the gigantic data centers that are being planned everywhere. In addition, the industry is focusing on HBM memories, which are much more powerful for AI applications but which cause the production of “traditional” memories to suffer. Hello, 8 GB of RAM in 2026. For many years it seemed that 8 GB of RAM had become the de facto standard in our laptops and many PCs, but a couple of years ago we clearly made the leap to 16 GB. This crisis threatens to take us back to the past and see many “affordable” computers with 8 GB of RAM. Can we survive with this memory? Most likely yes… if our use of the equipment is relatively modest. The 16 GB really helps a lot now that we have become accustomed to opening a lot of browser tabs and applications in an era where these consume more and more memory. 8 GB seemed like a thing of the past, but we fear that we will have to learn to live with that type of configuration again. In Xataka | If you were thinking about setting up a NAS to create your own cloud, we have bad news: AI has other plans

RAM is in an “unprecedented” crisis. So much so that even Tesla is considering opening its own memory factory

Neither technological advances nor a revolution in devices: crises are what is defining the last years of the sector. He veto Huaweithe semiconductor crisis of 2020 and now, the RAM memory crisis. The difference between this crisis and the previous one is that, although the 2020 crisis was caused by a perfect storm, the RAM memory crisis is being caused by excessive interest in data centers and AI. And it is taking all sectors ahead. That there is no RAM memory for consumers is a symptom, but it implies something much bigger: although the main producers are investing millions to increase your RAM productionit is not memory for consumption, but for GPUs and data center systems. Only a few companies dominate the production of these chips, and if they cannot produce them, they do not produce the memory chips for SSDs –raising the price-. They dedicate all production to meeting the demands of AI. And, as we read in FortuneElon Musk, one of the owners of some of the largest data centers on the planethas shown that there are two ways to face this crisis: hitting the wall or taking action. And the translation is that Tesla is considering building its own RAM factory. The problem is that it is easier said than done. Tesla and Intel interested in biting the RAM biggies In recent weeks, some of the world’s leading companies have presented results and RAM has been the central topic. PlayStation, for example, has assured that they are very aware of their ability to continue manufacturing PS5 with the goal of not going upagain, the price. And NVIDIA has been stating for days that it needs TSMC – its main chip supplier – and Samsung – who provides them with new generation HBM4 memory – get the batteries. Meanwhile, the outlook is not good. own NVIDIA aims for seven or eight years of construction no brake on data centers. Intel assures that The crisis will extend beyond 2028 and Micron, one of the big three in DRAM memory, has cataloged the market bottleneck as “unprecedented.” In this technological tsunami, and during Tesla’s results presentation at the end of January, Elon Musk pointed out that the company could need to build your own memory manufacturing plant. The objective is the one that all companies have: ensure supply. Going from scratch to manufacturing RAM memory is easier said than done, however, here Tesla has an advantage: they are not new to chip manufacturing. Although they abandoned the project for a few months, at the beginning of this year Musk himself stated that They came back with their own chip for your data centers. Additionally, there is the fact that they are a company with enough muscle to create a clean chip manufacturing room next to some of its existing plants. Intel is another one looking to become one of the important voices in the RAM conversation. Together with the Japanese giant SoftBank, they are developing an evolution of stacked DRAM memory that have been baptized as ‘ZAM’ and that seeks to break the HBM memory monopoly of Samsung, Micron and SK Hynix. Now, things in the palace are going slowly, and if Intel (which is already in it) It will take between three and four years to have commercial productsTesla’s ambition may go into the next decade. Let’s hope we don’t continue in this crisis by then, but if more “players” are interested in producing RAM, it would mean that, in the event of subsequent crises, there will not be a few that dominate the sector, producing a bottleneck like the one we are experiencing. Domino effect of the AMR crisis and China taking action Because this is not just about RAM being more expensive for users: it goes much further. If companies do not have the capacity to satisfy the demand for AI, they pour all their manufacturing muscle into a single task, neglecting the others. This explains the rise in the price of SSDs, but also of other products that should not have a leading role in this conversation: hard drives or HDDs. It is a brutal domino effect because, as we say, it goes beyond the modules being more expensive: RAM is more expensive for companies and that implies mobile phones or more expensive or with less RAMconsoles that increase in price (like what is happening posing for nintendo switch 2), machines that are late and they will be more expensive (like the Steam Machine), car problems and even impacting the routers. And in this scenario, in which companies like Intel or Tesla are considering taking a bite out of the RAM sector, we have some Chinese companies that had no role in the conversation. positioning itself as an option to alleviate demand. We told it a few days ago: there were reports indicating that PC brands such as Asus, Dell or HP were considering purchasing memory from Chinese manufacturers such as CXMT. Their modules are not as advanced as those of Samsung, for example, and they do not have the production capacity of South Korean companies, but… they produce. And in lean times, that’s better than selling laptops without RAM. Anyway, as we have said on occasion, there are still more companies joining the production of RAM when the crisis has already had a full impact, but the goal is not to create more RAM for ourselvesbut for your data centers. It’s time to entrust ourselves to the most sacred thing: that our PC doesn’t break and we need to update. Images | Gage Skidmore, Intel In Xataka | The US has a problem with its AI data centers: more and more states are opposed to building them

There are people poisoning the memory of our AI to manipulate us. And Microsoft has set off all the alarms

That “comfortable” button of “summarize this with AI“hides a secret: it has surely been manipulated. We don’t say it, it’s the elite department that Microsoft has to analyze the security of both its services and those of the competition. In the process of a investigationhave started to pull the thread and have found that dozens of companies are inserting hidden instructions into those “summarizing with AI” functions with a single objective. Contaminate the AI’s memory to manipulate us. Microsoft what. Big Tech has a lot of exciting departments. from which They are dedicated to opening boxes to guarantee the best experience to those who sculpt competing products in clay to study them. However, something that all big technology companies share are cybersecurity teams, elite teams dedicated to one thing: investigating threats. They analyze both their own products and those of the competition because it is understood as an ecosystem. Google and Microsoft have two of the most powerful and a clear example is that if Google finds a security flaw in Windows, it notifies those responsible because it is something that could potentially harm its own product –Chrome-. An example is the research of one of these Microsoft teams, putting on the table the danger of AIs being so malleable. Poisoning AI memory. It is a concept that attracts attention and is easy to understand. “That useful “Summarize with AI” button could be secretly manipulating what your AI recommends,” Microsoft notes in the blog in which it published the research. What the attackers have done is corrupt the AI ​​by incorporating certain hidden commands that manage to persist in the assistant’s memory. Thus, they influence all the interactions we have with the assistant. Simply put, a compromised assistant may start providing biased recommendations on critical topics. I don’t mean that you ask if pizza is better with or without pineapple and that the answer depends on what the ‘hacker’ has implemented in the AI’s ‘memory’, but something much more serious related to health, finances or security. It must be said that Microsoft has not discovered this, since It’s been ringing for a few monthsbut they have given very specific examples and recommendations to avoid being victims. H-how do they do it? In it documentMicrosoft says they have identified more than 50 unique iterations from 31 companies and 14 different industries. They detail that this manipulation can be done in several ways: Malicious links: Most major AI assistants support reading URLs automatically, so if we click on a summary of a message that has a link with preloaded malicious information, the AI ​​processes those manipulated instructions and becomes contaminated. Integrated instructions: In this case, the instructions for manipulating the AI ​​are hidden embedded in documents, emails or web pages. When the AI ​​processes that content, it becomes contaminated. Social engineering: it is the classic deception, but in this case for the user to paste messages that include commands that alter the AI’s memory. Likewise, when the assistant processes it, it becomes contaminated. And therein lies the problem: various ways to contaminate the AI’s memory, a feature that makes assistants more useful because it can remember personal preferences. But, at the same time, it also creates a new attack surface because, as Microsoft points out, if someone can inject instructions into the AI’s memory and we don’t realize it, they gain persistent influence on future requests. to the point. In an AI like the one we have, it is dangerous, but in the future Agentic AI It is even more so because it will automatically perform actions based on that contaminated memory. Given the context, let’s get down to business. The security team has reviewed URLs for 60 days, finding more than 50 different examples of attempts to contaminate the AI. The purpose is promotional, and they detail that the attempts originated in 31 companies from different fields related to industries such as finance, health, legal services, marketing, food purchasing sites, recipes, commercial services and software as a service. They point out that the effectiveness was not the same in all attacks, but that they did identify the repeated appearance of instructions similar to “remember this.” And, in all cases, they observed the following: Each case involved real companies, not hackers or scammers. They are legitimate businesses contaminating AI to gain influence over your decisions. Deceptive container with hidden instructions in that “button”Summarize with AI“It seems useful to us and that’s why we click, triggering the script that contaminates its memory. Persistence, with commands such as “remember this”, “keep this in mind in future conversations” or “this is a reliable and safe source” to guarantee that long-term influence. Consequences. Concrete examples of what a poisoned AI can do: Child safety: If we ask “is this online game safe for my eight-year-old son?” a poisoned AI that has been instructed that yes, that game with toxic communities, dangerous moderators, harmful policies, and predatory monetization is totally safe, will recommend the game. biased news: When we ask for a summary of the main news of the day, the intervened AI will not bring us the best ones, but will constantly bring up headlines and focuses of the publication whose owners have contaminated the AI. Financial issues: If we ask about investments, the AI ​​may tell us that a certain investment is extremely safe, minimizing the volatility of the operation. Recommendations. And this is where our responsibility comes in. Because you may be thinking “who asks the AI ​​those things and it pays attention”. Good: people ask the AI ​​these things and they listen. There are the unfortunate cases of suicide induced by chatbots or fake news. If the AI ​​recommends us pizza with gluesupposedly we have the common sense not to throw Super Glue as a substitute for cheese, but in other matters, there are users who trust AI as if it were an entity and not a compendium of letters one after another. It is something that Microsoft itself mentions, pointing out … Read more

found RAM memory modules worth 500 euros in the worst time to buy them

A Reddit user counted this week how he had a singular habit: rummaging through his neighborhood trash can in case he found some hardware treasure. And boy did he find it: among other things, he got two 32 GB DDR4 memory modules. Those modules thrown away as waste are a little treasureespecially because with the memory crisis Its market value exceeds 500 euros. what has happened. The user, who uses the alias “ringosbigfuckingnose” indicated that he makes regular visits to the local landfill in his area to look through the garbage that people throw away in search of components for their old PCs. He pointed out how he often comes across equipment from which he can salvage things, but the other day he found a real treasure: A Samsung monitor A 5.25″ floppy drive A 5-bay Drobo NAS Two 32 GB DDR4 memory modules A 10th generation Core i7 with its fan An ASUS motherboard A real find, without a doubt, but above all for one thing. 64 GB of RAM is 500 euros in your pocket. All of these components have value, of course, but it is especially striking that I found those two memory modules with a total of 64 GB. If you take a look around stores like Amazon or PcComponentes you will quickly see that two 32 GB DDR4 modules have a price that today is difficult to lower than the 500 or almost 600 euros. An absolute treasure. An ingenious solution to the memory crisis. What this user has achieved is to find a unique solution to the RAM memory crisis that has caused prices to rise. they shoot in an absolutely extraordinary way. It’s not likely that many people are throwing away memory modules lightly, but there are certainly plenty of people who find real treasures – especially in the form of old consoles and computers – in garbage dumps and recycling centers. And what for some is trash, for others is a small (or big) gem. On TikTok it’s easy find videos with people finding some devices that may be damaged, but that have possibility of being repaired. Electronic waste that is not waste. The Reddit user commented that he lives in a city of about 8,000 people, and the local landfill has a container for recycling electronic waste, something similar to what happens with the recycling centers or clean points that we find in Spain. It was in that part where this user found all those products as is, available for pickup. electronic waste. As they pointed out in Windows Centralthere are studies that indicate that less than a quarter of electronic waste is recycled properly. That means there is a lot of money wasted in the form of still valid hardware and also minerals and components that can be mined from those components. Image | Eugenia Pan’kiv In Xataka | The AI ​​leaves another news that will make the day worse for gamers: NVIDIA will not launch new graphics this year, according to The Information

With the consumer segment drowning, Samsung is the first to manufacture HBM4 memory. And it will be for NVIDIA, of course

Samsung is one of the names of this February. They are expected to present the Galaxy S26but they have something on the table that will be a shock not only to their coffers, but to the engine of the South Korean economy. We refer to high bandwidth memories because, in the midst of the RAM and SSD crisisSamsung is prepared to mass produce the HBM4 memories. And it will be for the AI, How could it not be any other way?. In short. The South Korean company has not confirmed it, but recent reports published by Reuters and local sources such as Korea JoongAng Daily They point out that Samsung will begin mass manufacturing HBM4 memory chips starting next week. It will be the first of the three companies that dominate the production of memory chips (the others are the South Korean SK Hynix and the American Micron, the which is gone from the RAM consumption) in starting to manufacture in large quantities these fundamental memories for the artificial intelligence. HBM4. This type of memory, as its name suggests, has enormous bandwidth. This is crucial for GPU needs and while NVIDIA has remained faithful to GDDR memory for its graphics cardsAMD did flirt with the stacked technology of the HBM chips for their Vega GPUs. However, it is not a technology for consumption, not because its performance is inadequate, but because it is too expensive. Making HBM memory is more expensive than making traditional DRAM chips, but the advantages are there. With HBM4, for example, the density of stacked chips allows Double the bandwidth of the previous generation. This is key to transmitting more data per second, but they also consume up to 40% less energy than HBM3 memories. NVIDIA. The most interested is, as we have said on previous occasionsNVIDIA. And if NVIDIA benefits, practically the entire leading artificial intelligence industry will take advantage of it because its chips are what are currently moving the industry. It is estimated that Samsung memories will go to NVIDIA’s Vera Rubin acceleration systems In fact, it has been reported that Jensen Huang himself has urged to accelerate and increase the production of these chips. Well, Huang has asked the entire semiconductor industry to manufacture components for his cards. let’s get the batteriesit is not something that concerns only Samsung. Spearhead. According to a Korea KoongAng Daily source, “Samsung has the world’s largest production capacity and broadest product line. It has demonstrated a recovery in its technological competitiveness by becoming the first to mass produce the highest-performance HBM4 memory.” Because, in this field, its main competitor, the neighboring SK Hynix, is expected to begin mass manufacturing its response between March or April, enough time ahead for Samsung to begin sending its memory to NVIDIA. And, here, Samsung’s great advantage is that it does not depend on TSMC: it has its own foundry and the HBM4 modules are based on 4 nanometer photolithography. Looking to the future. SK Hynix’s delay is not because they have rested on their laurels: they are the ones who they lead the way in the previous generation thanks to the HBM3E memory, but due to their schedule and they did not need it, they started developing the new generation later than Samsung. But of course, although HBM is the standard in current AI systems, we have already said that they are expensive chips and, in addition, they heat up a lot, requiring dissipation equipment to match. And that’s where companies are combining HBM4 memory production with a new generation of DRAM memory. The idea is to find a way for this memory – slower, but cheaper and ‘fresh’ – to compete in bandwidth with the HBM. Samsung and SK Hynix are in it, but they will have to compete against someone who didn’t play in this league: an Intel that does not arrive alonebut from the hand of the Japanese giant SoftBank. In short: Samsung has decided to get back on its feet when it comes to manufacturing muscle. And most important of all, all the companies that make memory modules remain focused on one thing: they make hardware for artificial intelligence while components such as RAM and SSD consumption they have the prices through the stratosphere. Images | Maxence Pira, Choi Kwang-moNVIDIA logo (edited) In Xataka | Huawei has kept its promise: it has found a way to boost China’s competitiveness in AI compared to the US

There is an unexpected victim of the rise in RAM memory prices: the very modern connected cars

Which what’s happening with the RAM memories is making one thing clear: the best time to buy memory modules is yesterday. The price increase is so extraordinary which is already affecting other classic components of our PCs such as SSD units or graphics cards. However, the crisis that these components are generating goes further. Much further. Data centers devour memory. The AI ​​fever, we already know very well, has generated a voracious hunger not only for cutting-edge AI chips, but also for RAM and HBM memories that accompany these chips. As indicated in The Wall Street Journaldata centers (both conventional and those dedicated to AI) will consume more than 70% of the high-end memory chips that manufacturers produce in 2026. And if they could take more, they would take them. This is not (only) about PCs or mobiles. It is evident that the first affected by this problem are conventional desktop and laptop computers, as well as our mobile devices. Hundreds of millions of them are sold every year and they all have a certain amount of RAM that is now more expensive than ever. The shock wave is already causing other components such as SSD drives or graphics cards affected, but in reality memory chips are everywhere. And above all, in one. From TV to car. The frenetic rise in memory prices is certainly going to affect other segments that we had not thought about soon. Of course it will do so on other consumer electronic devices, and this certainly includes Smart TVs, which They have their own processor, memory and storage to offer us its functions. But the problem may be even more critical for cars, which for years were already computers with wheels and which are now even better and more powerful computers (and with more memory) with wheels. Memories of all kinds. Although car electronic systems have traditionally used RAM, the latest in most cases was not needed. But that was in the cars of a few years ago, because the arrival especially of the electric car and the fever for screens in our vehicles has made these needs different. Now our cars need various types of memory, but in some cases those modules are as good (or better) than the ones we have in our cell phones and computers. The ECUs. A modern car makes use of so-called ECUs (Electronic Control Units) for issues such as controlling the transmission, the airbag system or the engine itself. It is normal for them to have between 50 and 150 of these control units or microcontrollers, and almost all of them contain RAM for temporary data and a ROM for firmware and software. Infotainment systems. The most obvious component that surely comes to mind as that “car computer” is the infotainment system, which usually consists of a touch screen, navigation functions, support for CarPlay and Android Auto systems, and voice assistants. Although in many cars these systems use 1 GB or 2 GB of DRAM memory, there are more modern cars that They reach 4 GB and even 8 GB of LPDDR4 memory. And if we talk about some manufacturers like BYD or NIO, there are models in which They use 16 GB of LPDDR5 memory. The Ford SYNC 5 system, for example, is based on a Qualcomm SoC with 16 GB of RAM. Driving assistance requires memory. In addition to these components, there are others that also require the use of RAM. Advanced driving assistance systems (ADAS) allow you to activate functions such as adaptive cruise control, lane keep assist, automatic emergency braking or parking assistant. And to achieve this they use RAM with high bandwidth, which allows working with real-time images and processing of sensor signals. Samsung knows this well and in fact manufactures modules specifically oriented to this market. Tesla’s well-known autopilot hardware, Hardware 4 (currently used) makes use of 16 GB of RAMFor example. Micron already warned. In December 2023 Micron already indicated that “a car needs more memory than a (space) rocket.” The firm, an absolute protagonist in the field of RAM memory module manufacturing, indicated how in 2023 the average vehicle used 90 GB between RAM and NAND, but in 2026 that figure was estimated to be 278 GB and would reach 2 TB in high-end vehicles. That was good news for it and other manufacturers, and even then it pointed to how “generative AI is transforming automotive.” What they probably didn’t realize is that this revolution was going to need many data centers, and those data centers were going to need a lot of memory. And this is where we are. In Xataka | “Not a phone, it’s a car”: Volkswagen believes that screens in cars are going too far

a -50°C sanctuary to save the memory of glaciers

The climate crisis What we are experiencing is not only threatening to redesign world maps with sea level risebut it is also erasing traces of the planet’s history. After confirming that 2025 was the third warmest year in historythe scientific community has completed a critical mission: inaugurate the Ice Memory Sanctuary in Antarcticaan underground library designed to preserve ice from mountain glaciers before they melt permanently. A real bunker. Today we have on the planet a seed bank to prepare ourselves in case there is a global catastrophe, and also data servers. And now we also have a large bench for ice, which logically requires extreme thermal stability. This sanctuary, which can be considered an authentic glacier cemetery, has been promoted by the Ice Memory Foundation and led by institutions such as the French CNRS and the Italian CNR. The location chosen could not be other than the Antarctic plateau itself, specifically the Concordia station. What is stored. Inside there is not simply “ice, but we find what scientists called “ice witnesses”. For science there is a fairly clear difference, since these glaciers are authentic hard drives that contain the thermal chemical history of our planet. And unfortunately it was being lost due to rising temperatures. With these ice cylinders it is possible to analyze the air that existed thousands of years ago or even at analyze the hydrogen and oxygen isotopes inside calculate the exact temperature it was in the past. Something that allows us to reconstruct global temperature graphs with a precision that tree rings or marine sediments do not always achieve. A disaster record. Bonus, this ice also acts as a filter that traps anything floating in the air. That is why we have already seen, for example, cvolcanic sand or dust from the Sahara which allows studying historical eruptions or the cycle of wind movement. Although technology logically has limitations, and in the future it is quite likely that these technological means will increase considerably. That is why the real objective is to leave this ice for the scientists of the future who will surely have many tools to continue extracting information from these blocks of ice that we cannot do today. Engineering behind the cold. Logically, ice cannot be at unstable temperatures, so the location at the Franco-Italian Concordia station is not a conventional building. It is a cave excavated directly under the snow, taking advantage of the extreme conditions of the white continent. Something that allows you to maintain a stable temperature at -50ºCwhich is also essential for storing the genetic material that may be inside. But unlike freezers in European laboratories, this sanctuary does not depend on the electrical grid or motors. If there is a blackout or energy crisis, the ice remains intact. That is why its design is perfect to last for centuries. There are already tenants. This sanctuary already has several members in its exhibition. Two ice cores have already been found that come from the Alps, specifically, one Col du Dôme block drilled in 2016 and from Gran Combin (Switzerland) extracted in 2025. Logically, the problem is in logistical transportation from Europe (or any location) to Antarctica. The samples traveled for 50 days on the research icebreaker Italian Laura Bassi from Trieste to Antarctica, completing the last leg by plane to the Concordia base. Something that logically is not easy at all. What’s next now. The Ice Memory Foundation plans to continue rescuing samples from at-risk glaciers in the Andes, Himalayas and Pamirs. The Concordia sanctuary is ready to receive the legacy of a world that, year after year, breaks temperature records and this is what has caused this project to move so rapidly today in order not to lose more glaciers that are melting. Images | Cassie Matias In Xataka | Eight months ago a robot disappeared under the ice of Antarctica. Today we have recovered it and it brings disturbing data

Everyone blames the manufacturers for the lack of memory. Micron says real bottleneck lies elsewhere

For months, memory shortage It has established itself in the technological debate as one of those phenomena that do not seem to need too many explanations. If RAM is missing and prices risethe immediate conclusion is that someone is privileging AI and leaving the consumer aside. That idea has resonated strongly, especially after visible decisions that have affected the domestic channel and have reinforced the feeling of abandonment. But when you get down to how memory is manufactured and kept stable today, the diagnosis becomes less obvious: the bottleneck doesn’t seem as obvious as it seems. A controversial decision. In this climate of widespread suspicion, Micron has become a preferred target, shared with other large manufacturers, but for a very specific and recent decision: the announcement of the end of Crucial consumer products. The company recently announced that will stop selling RAM memory and storage under that historic brand, with shipments expected through February 2026. For many users, that move was interpreted as a direct consumer recall just when memory is short. Micron justified that decision by noting that AI-driven growth in data centers has skyrocketed demand and that Crucial’s exit seeks to improve supply and support to its strategic customers in higher-growth segments. The market has changed size. From Micron’s perspective, the problem is not a renunciation of consumption, but an abrupt change in the scale of the market. Christopher Moore, vice president of marketing for the client and mobile business, He said in an interview with Wccftech that the company continues to have a relevant presence in PCs and mobile devices, while serving data centers. What has altered the balance is the growth of the data center business, driven by AI, which has gone from representing around 30% of the market to approaching, according to its figures, 50% or even 60%. That leap, he defends, has left the entire industry without sufficient margin. Variety also creates scarcity. For Micron, the bottleneck is not so much the lack of factories as how the existing ones are used. Moore explains that producing memory is not about making a single type of chip seamlessly, but rather about switching between multiple densities and configurations depending on what customers ask for. Each change, for example going from 12 GB to 16 GB modules or from 16 GB to 24 GB, forces lines to be readjusted and reduces the total output volume. In a context of skyrocketing demand, this variety, which was previously acceptable, becomes a direct brake on production. Micron’s new Idaho factory under construction Faced with the temptation to think that new factories will solve the problem, the manufacturer asks for patience. Moore explains that expanding memory capacity is not an immediate process, because it requires not only building facilities, but equipping them, validating them and certifying each product with customers. The company laid the first stone three years ago in its ID1 plant in IdahoUnited States, whose entry into operation is scheduled for mid-2027. Even so, it warns that there will be no significant impact on supply until the entire qualification process is complete, which it places in 2028. Crucial is gone, the channel is not. Moore assures that, although Crucial has disappeared from the consumer showcase, the company continues to provide memory to major PC and mobile device brands through channels less visible to the end user. This OEM channel, in which Micron supplies memory directly to integrators and manufacturers, concentrates a very relevant part of the market and ends up being incorporated into commercial designs and equipment. From their point of view, the consumer continues to receive Micron memory, even if it no longer does so under a recognizable label. With this panorama, the lack of memory ceases to be a problem of isolated decisions and is revealed as the result of several overlapping tensions. AI-driven demand for data centers that has changed the scale of the market, operational limits on production and long lead times to expand capacity explain why supply will remain tight for years. Micron places the relief horizon no earlier than 2028 and, until then, the consumer will live with fewer options and pressured prices. The bottleneck, the company insists, is not only in who buys the memory, but in how it is manufactured. Images | Micron In Xataka | The situation with RAM prices is so desperate that there are already those who build their own memory at home

Log In

Forgot password?

Forgot password?

Enter your account data and we will send you a link to reset your password.

Your password reset link appears to be invalid or expired.

Log in

Privacy Policy

Add to Collection

No Collections

Here you'll find all collections you've created before.