With the consumer segment drowning, Samsung becomes the first to manufacture HBM4 memory. And it will be for NVIDIA, of course

Samsung is one of the names of this February. The company is expected to present the Galaxy S26, but it has something else on the table that will be a boost not only for its coffers but for the engine of the South Korean economy: high-bandwidth memory. In the midst of the RAM and SSD crisis, Samsung is ready to mass produce HBM4 memory. And it will be for AI, how could it be otherwise?

In short. The South Korean company has not confirmed it, but recent reports published by Reuters and local outlets such as Korea JoongAng Daily point out that Samsung will begin mass manufacturing HBM4 memory chips starting next week. It will be the first of the three companies that dominate memory chip production (the others are South Korea's SK Hynix and the American Micron, which has withdrawn from consumer RAM) to start manufacturing, in large quantities, these memories that are fundamental for artificial intelligence.

HBM4. This type of memory, as its name suggests, offers enormous bandwidth. That is crucial for GPUs, and while NVIDIA has remained faithful to GDDR memory for its graphics cards, AMD did flirt with the stacked technology of HBM chips in its Vega GPUs. However, it is not a consumer technology: not because its performance is inadequate, but because it is too expensive. Making HBM memory costs more than making traditional DRAM chips, but the advantages are there. With HBM4, for example, the density of the stacked chips allows double the bandwidth of the previous generation. That is key to transmitting more data per second, and these chips also consume up to 40% less energy than HBM3 memory.

NVIDIA. The most interested party is, as we have said on previous occasions, NVIDIA. And if NVIDIA benefits, practically the entire leading artificial intelligence industry will benefit too, because its chips are what currently drive the industry.
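Where does that doubling come from? Mostly from the interface width: HBM4 widens the memory interface from the 1024 bits of previous generations to 2048 bits, so at a comparable per-pin speed the per-stack bandwidth doubles. A quick illustrative calculation (the pin speed below is an assumed example figure, not Samsung's exact part):

```python
# Back-of-the-envelope per-stack HBM bandwidth.
# HBM generations up to HBM3E use a 1024-bit interface; HBM4 widens
# it to 2048 bits. The 9.6 Gbps pin speed is an illustrative figure.

def stack_bandwidth_gb_s(interface_bits: int, pin_speed_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s (bits -> bytes)."""
    return interface_bits * pin_speed_gbps / 8

hbm3e = stack_bandwidth_gb_s(1024, 9.6)  # ~1229 GB/s per stack
hbm4 = stack_bandwidth_gb_s(2048, 9.6)   # ~2458 GB/s: double, at equal pin speed

print(f"HBM3E ~{hbm3e:.0f} GB/s, HBM4 ~{hbm4:.0f} GB/s ({hbm4 / hbm3e:.1f}x)")
```

The takeaway: even before vendors push pin speeds higher, the wider interface alone doubles what each stack can move per second.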
It is estimated that Samsung's memory will go into NVIDIA's Vera Rubin acceleration systems. In fact, it has been reported that Jensen Huang himself has urged Samsung to accelerate and increase production of these chips. Actually, Huang has asked the entire semiconductor industry to get moving and manufacture components for his cards; it is not something that concerns only Samsung.

Spearhead. According to a Korea JoongAng Daily source, "Samsung has the world's largest production capacity and broadest product line. It has demonstrated a recovery in its technological competitiveness by becoming the first to mass produce the highest-performance HBM4 memory." In this field, its main competitor, neighboring SK Hynix, is expected to begin mass manufacturing its response between March and April, enough of a head start for Samsung to begin shipping its memory to NVIDIA. And here Samsung's great advantage is that it does not depend on TSMC: it has its own foundry, and its HBM4 modules are based on 4-nanometer photolithography.

Looking to the future. SK Hynix's delay is not because it has rested on its laurels: it leads the previous generation thanks to its HBM3E memory, and given its schedule it simply did not need to start developing the new generation as early as Samsung did. Of course, although HBM is the standard in current AI systems, these are expensive chips that also run hot, requiring cooling equipment to match. That is why companies are pairing HBM4 production with a new generation of DRAM. The idea is to find a way for this memory, slower but cheaper and 'cooler', to compete with HBM in bandwidth. Samsung and SK Hynix are at it, but they will have to compete against someone who did not used to play in this league: an Intel that does not arrive alone, but hand in hand with the Japanese giant SoftBank.
In short: Samsung has decided to flex its manufacturing muscle again. And most important of all, every company that makes memory modules remains focused on one thing: building hardware for artificial intelligence while consumer components such as RAM and SSDs see their prices go through the roof.

Images | Maxence Pira, Choi Kwang-mo, NVIDIA logo (edited)

In Xataka | Huawei has kept its promise: it has found a way to boost China's competitiveness in AI compared to the US

In the midst of the RAM crisis, Samsung takes a leap with its HBM4 memory. And it is not good news for your wallet

We are in the middle of a RAM price crisis. The industry is a cake shared by three large producers, and data centers and artificial intelligence want to eat the whole cake. Samsung is one of the companies that manufactures memory for both consumers and data centers, and it will soon begin mass production of its latest high-bandwidth memory chips: HBM4. Don't ring the bells too soon.

HBM4. This technology represents a crucial advance in stacked memory. Its density allows double the bandwidth, key to transmitting more data per second, and these chips are also up to 40% more energy efficient than HBM3. In short: they consume less energy and suffer fewer bottlenecks, which translates into better data processing. Industry sources point out that Samsung will use its 10-nanometer-class D1c manufacturing process for the die of these HBM4 memories, with internal structures at 4 nm. It is a more advanced process than the 12-nanometer-class D1b used by its main rival, SK Hynix. In addition, it will achieve a data transfer speed of 11.7 Gbps, compared to the 9-10 Gbps of the current standard.

Hello, Nvidia. South Korean media report that these new Samsung HBM4 modules have passed Nvidia's certification testing, and it will be in February when the company starts mass manufacturing them. Where will they end up? Some in Nvidia's new AI acceleration system, called Vera Rubin; others at the heart of Google's seventh-generation TPUs. After these reports, the company's shares rose 5.3% on the Seoul market.

The enemy at home. In statements to South Korean media, Samsung representatives have said they feel quite confident about a new product that should clear up doubts about the company's ability to supply the demanding needs of data centers. The fifth-generation HBM3E memories were a stumbling block for the company, so major players in the AI industry looked next door: SK Hynix. Also South Korean, it is the second leg of memory chip manufacturing.
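To put that 11.7 Gbps figure in context, here is a rough sketch of what the per-pin speed difference means at the stack level. The 2048-bit interface width is the JEDEC HBM4 value, assumed here; the function is ours, purely for illustration:

```python
# Illustrative comparison of reported HBM4 per-pin data rates.
# 11.7 Gbps is the figure reported for Samsung; 9.0 Gbps is the low
# end of the range cited for the current standard. 2048 bits is the
# JEDEC HBM4 interface width (an assumption for this sketch).

INTERFACE_BITS = 2048

def stack_gb_s(pin_speed_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s for a given pin speed."""
    return INTERFACE_BITS * pin_speed_gbps / 8

samsung = stack_gb_s(11.7)   # ~2995 GB/s per stack
baseline = stack_gb_s(9.0)   # ~2304 GB/s per stack

print(f"~{samsung:.0f} vs ~{baseline:.0f} GB/s "
      f"({samsung / baseline - 1:.0%} more per stack)")
```

If these reported figures hold, the faster pins alone would give each Samsung stack roughly 30% more peak bandwidth than a stack running at the low end of the current range.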
The third is the American Micron Technology, at a considerable distance behind the two South Koreans. A year ago we reported that SK Hynix had achieved enormous efficiency in the DRAM stacking process used to create these HBM memories, which allowed it to be 8.8 times more efficient than Samsung or Micron and, therefore, to produce more modules for an industry that never stops asking for more. Meanwhile, the two South Koreans were racing to develop the new HBM4 generation, and Samsung seems to have struck the first blow. Of course, Hynix is also expected to begin mass production of these new memories around the same dates.

And the consumer... what? Well, nothing. If you were expecting good news about the price of RAM, no improvement is in sight. These HBM4 modules will go to Nvidia, but we recently noted that OpenAI had reached an agreement with Samsung and SK Hynix to be supplied with 900,000 wafers per month. That volume is equivalent to 39% of estimated global capacity... and for a single company. Translation? A bottleneck in the market, a manufacturing pace that may not meet that demand, and more bad news for the user. We have seen Micron abandon its consumer Crucial brand in favor of RAM for data centers, and the fact that Samsung and SK Hynix are focused en masse on HBM4 memory, even though it is not used in consumer devices, implies that this lucrative AI market is where they will concentrate.

In short: Samsung may be dominating the new generation of memory, but 2026 looks difficult for anyone who wants to build a PC, expand their RAM, buy a new phone or even wait for good news about the Steam Machine.

Image | TSMC, Google

In Xataka | RAM has become so, so expensive that there are manufacturers selling computers in an unprecedented way: "pre-assembled"
