With the consumer segment drowning, Samsung is the first to manufacture HBM4 memory. And it will be for NVIDIA, of course

Samsung is one of the names of the moment this February. The company is expected to unveil the Galaxy S26, but it has something else on the table that will be a jolt not only for its coffers but for the engine of the South Korean economy. We are talking about high-bandwidth memory: in the midst of the RAM and SSD crisis, Samsung is ready to mass-produce HBM4 memory.

And it will be for AI. How could it be otherwise?

In short. The South Korean company has not confirmed it, but recent reports from Reuters and local outlets such as the Korea JoongAng Daily point out that Samsung will begin mass-producing HBM4 memory chips as early as next week.

It will be the first of the three companies that dominate memory chip production (the others being South Korea's SK Hynix and the American Micron, which has stepped away from consumer RAM) to start manufacturing in large volumes these memories that are fundamental for artificial intelligence.

HBM4. This type of memory, as its name (High Bandwidth Memory) suggests, offers enormous bandwidth. That is crucial for GPUs, and while NVIDIA has stayed faithful to GDDR memory for its graphics cards, AMD did flirt with the stacked technology of HBM chips for its Vega GPUs. It is not, however, a technology for the consumer market; not because its performance is inadequate, but because it is too expensive.

Making HBM memory is more expensive than making traditional DRAM chips, but the advantages are there. With HBM4, for example, the density of the stacked dies makes it possible to double the bandwidth of the previous generation. That is key to moving more data per second, and these chips also consume up to 40% less energy than HBM3 memory.

NVIDIA. The most interested party is, as we have said on previous occasions, NVIDIA. And if NVIDIA benefits, practically the entire leading edge of the artificial intelligence industry benefits too, because its chips are what currently drive the sector. Samsung's memory is expected to go into NVIDIA's Vera Rubin acceleration systems.

In fact, it has been reported that Jensen Huang himself has urged the company to accelerate and increase production of these chips. In truth, Huang has asked the entire semiconductor industry to step on it and manufacture components for his cards; it is not something that concerns Samsung alone.

Spearhead. According to a Korea JoongAng Daily source, "Samsung has the world's largest production capacity and broadest product line. It has demonstrated a recovery in its technological competitiveness by becoming the first to mass produce the highest-performance HBM4 memory."

Because, in this field, its main competitor, neighboring SK Hynix, is not expected to begin mass production of its answer until March or April, which gives Samsung enough of a head start to begin shipping its memory to NVIDIA. And Samsung's great advantage here is that it does not depend on TSMC: it has its own foundry, and its HBM4 modules are built on a 4-nanometer lithography process.

Looking to the future. SK Hynix's delay is not because it has rested on its laurels: it leads the previous generation thanks to its HBM3E memory, and precisely because of that schedule it did not need to rush, so it started developing the new generation later than Samsung. But of course, although HBM is the standard in current AI systems, we have already said these are expensive chips and, on top of that, they run hot, requiring cooling to match.

And that is why companies are combining HBM4 production with a new generation of DRAM. The idea is to find a way for this memory, slower but cheaper and cooler, to compete with HBM in bandwidth. Samsung and SK Hynix are in that race, but they will have to compete against someone who until now did not play in this league: an Intel that does not arrive alone, but hand in hand with the Japanese giant SoftBank.

In short: Samsung has decided to get back on its feet when it comes to manufacturing muscle. And most important of all, every company that makes memory modules remains focused on one thing: making hardware for artificial intelligence, while consumer components such as RAM and SSDs have prices through the stratosphere.

Images | Maxence Pira, Choi Kwang-mo, NVIDIA logo (edited)

In Xataka | Huawei has kept its promise: it has found a way to boost China’s competitiveness in AI compared to the US
