Astronomical RAM prices are bad news for everyone, but especially for Apple

RAM prices have skyrocketed between 100% and 400% in just six months. 32 GB kits that cost $95 in the summer now sell for $400. Some stores in the United States have removed prices from the shelves and only quote them at the checkout, as if memory were lobster on Christmas Eve.

Why it matters. Samsung and SK Hynix have committed 40% of all global production to Stargate, OpenAI's infrastructure project. The three manufacturers that control 93% of the market are prioritizing servers over consumer devices. TrendForce predicts that entry-level smartphones will return to 4 GB of RAM in 2026 and that budget laptops will stay stuck at 8 GB. For the first time in decades, specifications are not improving but going backwards.

The paradox. The scarcity is caused by AI, but that same scarcity will undermine our ability to use AI locally. Data centers absorb all the memory to train huge models, yet users won't be able to run those models on their own computers because the price of the RAM they need has exploded. We will get the same as before, or less.

The main loser. Apple has the most to lose in this scenario. Meta, Google and Microsoft can keep serving their models from the cloud, as they have until now, but for two years Apple has bet heavily on local AI as its great differentiator: models that run on your device, privacy by design, and processing that doesn't depend on servers. The entire narrative of Apple Intelligence is built on having enough RAM and local computing power. iPhones have been gaining RAM precisely to run Apple Intelligence smoothly, closing the gap between base and Pro models. Macs with Apple Silicon have normalized 16 GB as the base across all models, after many years stuck at 8 GB.

The impossible dilemma. Apple has the financial muscle and preferential contracts to secure memory when others cannot.
But that doesn't solve its fundamental problem: it has two options, and neither is good. It can maintain specifications and raise prices, but there is a limit to what the market will tolerate. Or it can start cutting RAM, but that means compromising exactly the competitive advantage it has been selling for two years.

Between the lines. Other manufacturers can adapt by lowering specifications without breaking their value proposition too badly. Samsung can put 6 GB in a mid-range Galaxy and it will work much the same: its AI depends on Google's cloud. But Apple has committed to an architecture that requires powerful devices in the user's hands, and those devices are now much more expensive to manufacture. Private Cloud Compute helps, but it doesn't change the local-first narrative.

The unexpected twist. Apple Intelligence may end up being much more expensive than Apple had planned. Not because the technology is expensive, but because the raw materials needed to run it have become a scarce commodity. Apple is probably the company best positioned to weather this crisis thanks to its purchasing power (as we saw during the pandemic-era semiconductor crisis), but it is also the one with the most to lose strategically. Apple chose a different path from its competitors precisely when that path was about to become prohibitively expensive. Cloud AI scales with servers you can rent or expand; local AI scales only if every user has powerful hardware, and that hardware just got wildly expensive.

In summary. For the first time in years, Apple does not control the key variables of its strategy. It can pay more than anyone else for memory, but it cannot change the fact that only three companies manufacture it, or that those companies prefer selling to OpenAI and company rather than to phone and laptop makers serving the consumer market.
The era of cheap memory is over, and among its many consequences is the economic viability of Apple's great differentiating bet.

In Xataka | The RAM crisis is so extreme that it has achieved what seemed unthinkable: Apple's memory is "cheap"

Featured image | Georgiy Lyamin

The return of mobiles with 4 GB of RAM

The ASUS Zenfone 2 was presented at CES 2015 with one spec that stood out above all others: it had 4 GB of RAM. That seemed excessive. "Who needs 4 GB on a phone?" we asked ourselves. It didn't take long for that figure to become the norm, and not much longer until we moved on to 8, 12 or 16 GB of RAM. It seemed we could only keep going up, but no. Everything indicates that in 2026 we will once again see mobile phones with 4 GB of RAM in certain ranges, like the Pixel 3a in the image, a phone from six years ago. No small thing.

What has happened. The perfect storm of AI and data centers has turned DRAM and NAND memory into a luxury. Prices have skyrocketed, and the manufacturers of these components are already warning that the situation could continue. Against that backdrop comes a prediction from TrendForce analysts: mobile manufacturers could resurrect hardware configurations that seemed extinct, such as smartphones with 4 GB of RAM as standard in the entry range by 2026.

Less for the same. The trigger for this return to the past is the aforementioned increase in memory prices projected for the first months of 2026. This will put significant cost pressure on large manufacturers, who are watching memory take up a growing share of their bill of materials. And if they don't want to raise device prices, they will do something we won't like at all: cut technical specifications.

Android phones, the most affected. This scenario hits the entire Android ecosystem, especially the mid and low ranges. There, the amount of RAM had traditionally been a key marketing point, but to keep prices from skyrocketing, manufacturers will foreseeably be forced to cut specs. The question is how going without 6 or 8 GB of RAM will affect the fluidity of the Android experience.

The paradox of AI on mobile.
The most ironic thing is that this price increase is caused by the rise of AI and data centers, yet if our phones have less RAM, we won't be able to take full advantage of AI. The local models our phones run benefit precisely from having plenty of RAM, so cutting this spec directly cuts the capability of those local AI models.

The microSD slot returns. Curiously, this crisis may also bring back an old ally of users on tight budgets: the microSD slot. Given that NAND memory has also become more expensive, it is reasonable to think we won't see phones with large storage capacities at acceptable prices either. That opens the door to once again expanding storage with microSD cards.

And 8 GB laptops. For many years we had to put up with manufacturers offering laptops with 8 GB of RAM as standard. Recently, 16 GB finally seemed to have settled in as the norm, but according to TrendForce, although those models will continue to be offered, 8 GB configurations will once again be the most widely distributed in the mid-range. If we want more than that, we will likely notice it clearly in the price. This is bad news for all users, but perhaps especially for gaming, which has always been one step ahead in hardware requirements.

The era of cheap and abundant memory is over. If the forecasts for the first quarter of 2026 are already bad, they are even worse for the second quarter. According to TrendForce, that is when "more significant price fluctuations" will hit the PC and laptop market. The timing also coincides with Computex in June, the most important trade fair for desktop and laptop manufacturers.

In Xataka | 8 GB of RAM is not enough. Not even with Apple Silicon

In Xataka | There was only one way to lower the price of RAM: Samsung and SK Hynix have flatly refused

In 1995, a program came out that promised to double your PC's RAM. At best, all it did was not consume any extra

The 90s were wonderful in the world of software and hardware: epic trolling like the $299 price of the first PlayStation, the legendary Windows 95 key, or the PlayStation emulator presented by Steve Jobs himself. In the middle of the decade, a program came out that promised the impossible: doubling the amount of RAM in your PC. Its name was SoftRAM 95 and, although it makes us raise an eyebrow today, in its day it sold hundreds of thousands of copies at $80 each. Spoiler: it was of absolutely no use.

SoftRAM 95, the miracle solution for your PC's RAM. The launch of a program like this was a product of its time: users were, for perfectly understandable reasons, less savvy, and the industry was learning and developing everything as it went. At times, the shrewdest players were the ones who got results, but a company called Syncronys Softcorp learned its lesson the hard way.

The year was 1995 and Windows 95 was beginning to revolutionize homes. Although the Microsoft system made controlling a PC more accessible than ever (unfortunately for Steve Jobs), the hardware still had a brutal barrier to entry: the price. Computers were still expensive, very expensive, so skimping on components saved real money. RAM was one of those components for which you paid its weight in gold per KB, but... what if there were a program that, for a few dollars, doubled the amount of memory in your PC? And what if it did so without touching a single piece of your hardware?

That is where the Californian company Syncronys Softcorp saw an opportunity and, we can now say in bad faith, launched its program: SoftRAM 95. It went on sale in August 1995, and it is estimated to have sold a whopping 600,000 copies by December of that same year. For the era, that was truly extraordinary. The logical question is how it achieved what it promised.
The short answer is that it compressed memory: when the operating system needed to page data from RAM out to the hard drive, SoftRAM 95 compressed it before writing, reducing the disk space required and leaving more RAM available. The concept, roughly speaking, is sound, and the program's interface told you that yes, congratulations, you now had double the RAM.

The long answer is that it didn't do what it promised. Although the idea was technically on the right track, the process was hugely ambitious at the time for one reason: both RAM and the primitive hard drives of the day were so absurdly slow that the goal simply could not be met. Syncronys management knew this, but they didn't care: the money kept pouring in, with each license costing about $30.

Under the magnifying glass of the press... and Microsoft. Things quickly went wrong, however. PC Magazine subjected the software to an analysis the way these analyses should be done: testing whether the program actually did what it promised. Using blocks of data to evaluate whether compression was effective, they found that processing times were exactly the same with compressible data and with random data that could not be compressed. They concluded that the only thing SoftRAM did was show an animated screen that gave the user the impression it was working when, in reality, it was doing absolutely nothing.

Beyond the press, the ones who really got their hands on the software were Bryce Cogswell and Mark Russinovich, the engineers who would later found Sysinternals and eventually end up at Microsoft, and who dissected the program at the code level. They basically confirmed PC Magazine's well-founded suspicion and showed that the program never actually worked.
In other words, the paging driver (the component that was supposed to compress RAM pages on their way to the hard drive) shut itself down right at load time, so the program never did anything at all beyond displaying false numbers while the operating system worked exactly as it always had, whether SoftRAM was installed or not.

When I said earlier that Syncronys management knew, it was not a case of judging history with present-day eyes. When everything came to light, they admitted that no RAM compression was taking place, and it also emerged that they had shipped the software even though its own developers had warned that the product was not ready. And this wasn't a "launch it now, fix it later" situation like many current games, because in 1995 internet updates were not the norm.

Just when the company thought the worst was over, the US Federal Trade Commission arrived. Following its investigation, Syncronys finally acknowledged that it had misrepresented the performance of its product and was banned from selling any more copies of both SoftRAM for Windows 3.1 and SoftRAM 95. In total, the two versions put 700,000 copies on the market, and Syncronys declared bankruptcy in July 1998, owing $4.5 million.

The idea did not die with SoftRAM. In the end, the best that could be said of SoftRAM was that it didn't eat up your PC's resources; it was one of those attempts to sell anything at all to a still somewhat naive market. For PC World, alongside AOL and RealPlayer, SoftRAM ranks among the worst technology products of all time. But with 2025 eyes you may be wondering: what about solutions like Windows Vista's ReadyBoost or memory expansion on mobiles? That's a different matter. Although both promise to improve performance using "extra memory", they are very different from what SoftRAM did. ReadyBoost, for example, let you use a pen drive's memory as a cache to speed up access to frequently used data.
It acted as an extension of the system's virtual memory, and the theory is sound, but once again we ran into the speed limitations of USB drives...
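PC Magazine's test, described above, hinged on a simple property of genuine compressors: they shrink repetitive data dramatically but barely touch random data, whereas fake compression behaves identically on both. The following sketch illustrates that difference using Python's zlib as a stand-in for a page compressor; the 1 MiB buffer size and compression level are arbitrary choices for illustration, not anything SoftRAM itself used.

```python
import os
import time
import zlib

def compress_stats(data: bytes):
    """Compress a buffer; return (compressed/original size ratio, seconds)."""
    t0 = time.perf_counter()
    packed = zlib.compress(data, level=6)
    elapsed = time.perf_counter() - t0
    return len(packed) / len(data), elapsed

PAGE_RUN = 1024 * 1024                 # 1 MiB of simulated memory pages
compressible = b"A" * PAGE_RUN         # highly repetitive: compresses well
random_data = os.urandom(PAGE_RUN)     # incompressible by construction

ratio_c, t_c = compress_stats(compressible)
ratio_r, t_r = compress_stats(random_data)

# A real compressor shrinks repetitive data to a tiny fraction of its
# size and leaves random data essentially unchanged. SoftRAM showed no
# such difference in either size or timing, which is what gave it away.
print(f"compressible: ratio {ratio_c:.4f} in {t_c * 1000:.2f} ms")
print(f"random:       ratio {ratio_r:.4f} in {t_r * 1000:.2f} ms")
```

Run on any modern machine, the repetitive buffer collapses to well under 1% of its size while the random buffer stays at roughly 100%; identical behavior on both inputs, as PC Magazine observed, is only possible if no compression is happening.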

AI points to a huge change in our mobiles. One in which they will have (at least) 32 GB of RAM

For a year now, our mobiles have had AI functions. Google offers them with Gemini and Apple (more or less) with Apple Intelligence, but for now these functions are limited and confined to fairly modest tasks. Meanwhile, we are seeing our PCs gain access to more striking models. Our recent comparison of DeepSeek-R1-14B with models such as Llama 3.1-8B or Phi-4-14B showed that these models can genuinely run well on a Mac mini M4 with 16 GB of RAM.

On the Pixel, by contrast, Google offers its Gemini Nano model, which comes in two versions: one with 1.8B parameters and another with 3.25B. They are decent models, but still clearly below the performance of models like DeepSeek-R1-14B and the others mentioned.

The problem is that these models, especially as we scale up the size and number of parameters (14B, for example), need memory. Plenty of it. An LLM with seven billion parameters (7B) usually needs about 8 GB of memory, although some extra headroom (12 GB, for example) is recommended. Manufacturers know this, and even Apple has made a small effort here. In the iPhone 16, the jump from 6 to 8 GB was made largely for this reason, and the Google Pixel 9 models offer up to 16 GB of RAM for the same one: that headroom lets AI functions executed locally run fluidly.

But that jump may soon go further. It does not seem unreasonable to think that sooner rather than later we will see mobiles with at least 32 GB of RAM, precisely so they can run larger AI models and offer users more powerful options. Of course, the amount of memory is not all that matters. Our phones lack a dedicated GPU to accelerate these tasks, but a lot of progress is being made on increasingly powerful NPUs. The combination of both elements could enable an important shift toward increasingly versatile local models.
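The "roughly 8 GB for a 7B model" rule of thumb above follows directly from the bytes each weight occupies. A minimal back-of-the-envelope sketch; the 20% overhead factor for the KV cache, activations and runtime is an assumption for illustration, not a measured figure:

```python
# Rough estimate of the RAM an LLM needs, given its parameter count
# and the bit-width its weights are stored at.

def model_ram_gb(params_billions: float, bits_per_weight: int,
                 overhead: float = 1.2) -> float:
    """Weights (params * bytes each) plus an assumed ~20% runtime overhead."""
    bytes_per_weight = bits_per_weight / 8
    weight_bytes = params_billions * 1e9 * bytes_per_weight
    return weight_bytes * overhead / 1e9  # decimal GB

# Sizes mentioned in the article: Gemini Nano (1.8B, 3.25B), 7B, 14B.
for params in (1.8, 3.25, 7, 14):
    for bits in (16, 8, 4):
        gb = model_ram_gb(params, bits)
        print(f"{params:>5}B @ {bits:>2}-bit ~ {gb:5.1f} GB")
```

At 8 bits per weight, a 7B model lands around the ~8 GB the article cites, while a 14B model at 16-bit blows past any phone's memory, which is why both bigger RAM and more aggressive quantization matter.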
These hardware improvements in our mobiles are joined by optimization and "compression" techniques for AI models. Quantization, a kind of "rounding", reduces the size of large language models (LLMs), albeit at the cost of some precision. Quantization is already a very popular way of running large models on more modest machines, and besides lowering hardware requirements, it also improves efficiency.

All of this points to a not-too-distant future in which we will carry much more powerful models in our pockets. Models we can run at home, use even without an internet connection, and that keep the entire conversation private. There are many compelling advantages. Too many not to think that we may soon see mobile manufacturers boasting about 32 GB phones. Or who knows, even more.

In Xataka | The new Gemini demonstrates a Google ambition: that we talk non-stop with our mobile
