TSMC and SK Hynix are suffocating Samsung. To defend itself, the company is already preparing a formidable weapon: 1 nm chips

Samsung Electronics has two major competitors in the semiconductor industry: TSMC and SK Hynix. The Taiwanese firm TSMC leads the market for manufacturing integrated circuits for third parties with a share close to 70%, according to the consulting firm TrendForce. Samsung is the second-largest chip producer for third parties, although with a 7.2% share it sits very far behind the leader. And the Chinese company SMIC (Semiconductor Manufacturing International Corp) is hot on its heels in third place with a 5.3% share.

Samsung's other big business is memory chips. In this market it competes with the American company Micron Technology, but its biggest rival is its fellow South Korean firm SK Hynix. In recent years Samsung has led the DRAM market with an approximate 40% share, while SK Hynix defended a very respectable 29%. Behind both was Micron Technology, with roughly 26%. During the first quarter of 2025, however, a very significant setback occurred. SK Hynix controls no less than 70% of the market for HBM (High Bandwidth Memory) chips, so its leadership in that sector is overwhelming. In DRAM the figures are much more even, although SK Hynix now leads there too.

TSMC and SK Hynix, then, are two big headaches for Samsung, but the company seems unwilling to throw in the towel.

Samsung plans to have its 1 nm photolithography ready in 2030

In February 2025 the Taiwan Economic Daily published a report asserting that TSMC plans to build a cutting-edge semiconductor plant expressly designed to produce 1 nm chips. It will be located in the Taiwanese city of Tainan, will be called 'Fab 25', will work with 12-inch wafers, will have six production lines and will begin large-scale manufacturing in 2030. That may seem a long way off, but it is not.
In fact, according to the newspaper Korea Economic Daily, Samsung is working hard to stay on TSMC's heels and, incidentally, to surpass SK Hynix. Its engineers have already been working on their 1 nm photolithography for many months, with the aim of concluding the research and development phase in 2030 and starting mass manufacturing in 2031. There is a lot at stake, but developing this technology is by no means a piece of cake. In fact, the company is currently trying to optimize the performance of its 2 nm nodes, because its Exynos 2600 processor for the Galaxy S26 and S26+ smartphones suffers when its performance and energy efficiency are compared with those of comparable chips manufactured on TSMC's 3 nm nodes.

Be that as it may, Samsung's future 1 nm production lines will benefit from the refinements the company is going to introduce in its 2 nm nodes. Above all, they will take advantage of forksheet technology, with which its engineers aim to leave behind the limitations of Gate-All-Around (GAA) transistors. Roughly speaking, forksheet will allow them to dramatically optimize the space on 1 nm chips by adding a non-conductive wall between transistors, with one purpose: to eliminate empty space and pack a higher density of transistors onto the same surface. It sounds really good. We will tell you more as soon as we have detailed information about this innovation.

Image | Generated by Xataka with Gemini

More information | Korea Economic Daily

In Xataka | We already know what the chips arriving through 2039 will look like. The machine that will make it possible to manufacture them is close

Google has made AI consume up to six times less memory. Micron, Samsung and SK Hynix are paying dearly

We have spent months mired in the memory crisis, but there may be a way out. Last week Google Research published a study revealing a technique called TurboQuant: a compression algorithm capable of shrinking the working memory of AI models by up to six times without appreciable loss of quality or performance. Great news for end users, who see a light at the end of the tunnel, but terrible news for manufacturers, whose golden age could come to an end.

Let's explain what the KV cache is. To understand TurboQuant you first have to understand what that memory it compresses actually is. When a language model processes a long conversation, it needs to remember the context. Each token it processes is stored in the so-called KV cache, a kind of working memory that grows as we chat. The longer the conversation, the more memory the model requires.

A key bottleneck. The KV cache is one of the main bottlenecks in the AI inference stage (that is, when we actually use the models), and one of the reasons data centers need so much RAM and HBM memory. TurboQuant uses a vector quantization method to compress this cache while maintaining the model's precision.

Pied Piper. As soon as the Google study appeared, the analogies with the plot of the series 'Silicon Valley' began. In it, the fictional startup managed to develop an extraordinarily efficient compression algorithm called Pied Piper that threatened to revolutionize the technology industry. Multiple references to the series appeared on social media these days; it has often been called visionary for reflecting what is happening with spectacular accuracy, even though it was a comedy.

Six times less memory. The Google Research paper states that this method can reduce the KV cache six-fold without an appreciable difference in performance in long conversations.
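The paper describes a vector-quantization scheme. As a rough illustration of the underlying idea only — this is not TurboQuant's actual algorithm, and every name and figure in the snippet is our own assumption — here is a minimal sketch of per-channel quantization of a KV-cache tensor:

```python
import numpy as np

# Minimal sketch: quantize a KV-cache tensor to low-bit integers and
# measure the reconstruction error. TurboQuant's real method is a vector
# quantizer; this per-channel scalar version just shows the basic idea.

def quantize_kv(cache: np.ndarray, bits: int = 4):
    """Quantize a (tokens, head_dim) float cache to signed integers."""
    qmax = 2 ** (bits - 1) - 1                   # e.g. 7 for 4-bit values
    scale = np.abs(cache).max(axis=0) / qmax     # one scale per channel
    scale[scale == 0] = 1.0                      # avoid division by zero
    q = np.clip(np.round(cache / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize_kv(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
kv = rng.normal(size=(1024, 128)).astype(np.float32)  # 1024 tokens, dim 128

q, scale = quantize_kv(kv, bits=4)
restored = dequantize_kv(q, scale)

# Packed 4-bit values would take 8x less space than float32; the mean
# reconstruction error stays modest relative to the signal.
rel_err = np.abs(restored - kv).mean() / np.abs(kv).mean()
print(f"mean relative error: {rel_err:.3f}")
```

Real schemes add tricks (outlier handling, per-block codebooks) to keep the error from degrading long-context quality, which is the hard part the Google paper claims to have solved.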
The researchers will present their results at an event next month and explain the two methods that make it practical. If they confirm what they have already teased, the implications are huge: less memory for inference means data centers can do the same work with much less hardware.

Google's DeepSeek moment. The discovery has some analysts calling this Google's 'DeepSeek moment'. A year ago the Chinese startup DeepSeek launched an AI model that competed with the best but had cost far less to develop. That shook the industry, and now we have another technical achievement pointing in the same direction. In AI, doing the same with less is crucial, given the enormous resources this technology demands. Some people have already run preliminary tests with TurboQuant and confirmed that the method does indeed work.

Micron, Samsung and SK Hynix pay dearly. The impact of this technique could be enormous, and it has already begun to show in the stock-market valuations of DRAM and HBM memory manufacturers. Companies such as Micron, Samsung, SK Hynix, SanDisk and Kioxia fell noticeably last week from their recent highs. One of them, for example, traded at around $471 on March 18, and its shares now sit at $357, a staggering 24.2% drop. The same has happened to the rest of the manufacturers, which had already been falling since that date but accelerated their decline with the appearance of TurboQuant.

But. The technique can in theory be applied only to the inference phase; the training of AI models is not affected by this compression, so huge amounts of memory will still be needed there. We will also have to wait for AI companies to actually start applying the system, if it is confirmed to work, before we can see its real impact. In theory it would give big tech a lot of room for maneuver to cut token prices even further, but it remains to be seen whether they do.
RAM prices fall. The impact of TurboQuant has also been evident in the prices of memory modules, which have dropped appreciably. For example, the Corsair Vengeance DDR5 32 GB 6,000 MHz (2x16 GB) kit was selling for 489.59 euros on Amazon until a few weeks ago, according to CamelCamelCamel, but right now it is at 339.89 euros, a notable discount. It is true that not all components are falling equally, but there are indeed cases where reductions appear to be happening.

In Xataka | The RAM crisis is destroying all of Valve's plans for its Steam Machine

Samsung and SK Hynix have flatly refused to increase production

The most basic response to a market with high demand and rising prices is simply to increase production. However, in a memory market going through a price crisis (largely driven by the 'perfect storm' of AI), the two players that control a huge share of the world's supply have opted for the opposite strategy. Samsung and SK Hynix have confirmed to their investors that they have no intention of flooding the market to alleviate the shortage. Their priority has changed: they prefer short-term profitability and safety against a possible AI bubble, which, as we have already noted, has led the industry to stop manufacturing for people in order to manufacture for the machines.

Slamming the door on mass production. The decision of the South Korean memory giants is firm and public. According to international media and the Korean newspaper Hankyung, Samsung has made clear at investor conferences that its strategy now focuses on minimizing the risk of overproduction rather than rapidly expanding capacity. This path already has tangible consequences: Samsung currently manages to fill only 70% of the DRAM orders it receives. SK Hynix, for its part, has openly admitted that 'it will be difficult to resolve the supply shortage' before the first half of 2027.

Goodbye to stable contracts. The new supply-restriction policy comes with a change in the commercial rules of the game. South Korean media report that Samsung has begun to reject long-term contracts. This prevents PC and phone manufacturers from protecting themselves against inflation, and the extra cost is passed on to the end user. It is the materialization of what Xiaomi executives warned about weeks ago: your next phone is going to be more expensive.

Sacrificing the consumer for a reason. The technical cause of the shortage is a transfer of resources.
Production lines are pivoting to the manufacture of HBM memory — even Intel's, in response to its current situation — which is essential for AI GPUs. The most drastic example is Micron, the third player, which has announced the closure of its consumer 'Crucial' division to focus on the data-center and AI sector. With production reserved for strategic clients, the consumer market becomes even more undersupplied.

Fear of the bubble. Behind this refusal to maximize production lies a trauma. Manufacturers fear that the AI fever is a bubble that will collapse after a massive expansion. Why does that matter? They would be left with oversized factories and millions in losses. They prefer to maintain the current shortage — which has driven memory prices up by as much as 300% in some cases — rather than risk a future excess of stock.

The projections are not optimistic. Analysts conclude that the crisis will extend beyond 2028. Memory demand is estimated to grow by 35% in 2026, while supply will grow by only 23%. With Micron withdrawing from the consumer market and the Korean giants throttling production, shortages will stop being a passing situation and become the market norm for years to come.

Cover image | Composition with images from PxHere and CEFICEFI for Wikimedia Commons

In Xataka | Samsung is dealing with one of the most challenging stages in its history. To succeed, it needs to get chip manufacturing right
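Taking the figures cited above at face value (35% demand growth versus 23% supply growth in 2026), a quick back-of-the-envelope projection shows how fast the gap widens. Extending those 2026 rates to later years is our assumption, not the analysts':

```python
# Rough projection of the supply/demand gap described in the text.
# The 35% and 23% growth rates are the quoted 2026 estimates; holding
# them constant through 2028 is purely illustrative.

demand_growth, supply_growth = 0.35, 0.23
demand = supply = 100.0  # index both sides at 100 for end of 2025

for year in (2026, 2027, 2028):
    demand *= 1 + demand_growth
    supply *= 1 + supply_growth
    shortfall = (demand - supply) / demand
    print(f"{year}: demand index {demand:.0f}, supply index {supply:.0f}, "
          f"unmet demand ≈ {shortfall:.0%}")
```

Under those assumptions the unmet share of demand roughly triples in three years, which is consistent with the analysts' view that the crisis would extend beyond 2028.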

OpenAI signs with Samsung and SK Hynix for a potential chip demand of 900,000 wafers per month. It is a staggering figure

A package of agreements was closed in Seoul that reflects how far the race for artificial intelligence is going. OpenAI sat down with Samsung and SK Hynix to advance its Stargate project, and the companies pointed to a goal that is striking on its own: 900,000 DRAM wafers per month. The plan, according to the parties, involves reinforcing memory production and studying new data centers in South Korea. All of this was announced after a series of meetings between Sam Altman, business leaders and President Lee Jae-myung himself.

The meeting at the Seoul presidential office brought Sam Altman together with the leaders of the aforementioned Asian conglomerates, in the presence of President Lee Jae-myung. The tone was shared: Korea seeks to consolidate itself as one of the three global powers in artificial intelligence, and OpenAI needs to anchor its Stargate project in regions with technological muscle. That fit explains both parties' interest in formalizing agreements that range from memory supply to the construction of new data centers, with a long-term view.

A goal that could strain the entire memory sector

The volume put on the table is out of proportion to the market. According to TechInsights, global production capacity for 300-millimeter DRAM wafers was about 2.07 million per month in 2024 and should grow to 2.25 million in 2025. Reaching 900,000 would therefore mean around 40% of all that capacity. No individual manufacturer comes close to such a figure on its own, so the magnitude of the agreement reflects both OpenAI's ambition and the growing pressure to secure the supply of advanced memory.

The signed documents include preliminary commitments to expand memory production and evaluate additional infrastructure in South Korea. Among them is the participation of Samsung SDS in the development of data centers, as well as Samsung C&T and Samsung Heavy Industries in their design and construction.
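The weight of the figure is easy to check against TechInsights' capacity estimates quoted above (a simple sanity calculation, nothing more):

```python
# Share of global 300 mm DRAM wafer capacity that 900,000 wafers/month
# would represent, using the TechInsights estimates cited in the text.

capacity_2024 = 2_070_000  # wafers per month, 2024
capacity_2025 = 2_250_000  # wafers per month, 2025 (projected)
openai_ask    =   900_000

share_2024 = openai_ask / capacity_2024
share_2025 = openai_ask / capacity_2025
print(f"share of 2024 capacity: {share_2024:.0%}")  # prints 43%
print(f"share of 2025 capacity: {share_2025:.0%}")  # prints 40%
```

Against the projected 2025 capacity the request works out to exactly 40%, which is the basis for the "around 40%" figure in the text.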
The Ministry of Science and ICT is considering evaluating sites outside the Seoul Metropolitan Area, and SK Telecom has signed an agreement to study the viability of a center in the southwest of the country. The deployment of ChatGPT Enterprise and API capabilities in corporate operations is also on the table.

A key point in all this is the difference between using a model and training one. When someone queries a chatbot, inference infrastructure is activated, which is much less demanding. But to train a new-generation system, thousands of chips are needed working in parallel, each accompanied by high-performance memory modules. That scale multiplies the need for servers, cooling systems and electrical power. In that context, securing hundreds of thousands of wafers per month does not look like an excess, but a way of ensuring that the next wave of models has the material support it needs.

Stargate data centers in the United States

OpenAI's computing muscle relies on huge alliances still on the drawing board. With Oracle and SoftBank, the company is preparing five data centers that would provide several gigawatts of capacity. Nvidia, meanwhile, has announced that it would invest up to 100 billion dollars and grant access to more than 10 gigawatts through its training systems.

OpenAI's trajectory cannot be understood without Microsoft, its first great partner. The initial bet of 1 billion dollars in 2019 and the subsequent 10-billion-dollar investment gave it access to the Azure cloud, key to training the models that propelled ChatGPT. Over time, however, Sam Altman's company has begun to reduce that dependence. Its latest moves mark a change of course toward infrastructure over which OpenAI has more direct control, a way of making sure it is not tied to a single supplier.

It should be remembered that many of these announcements remain preliminary. Letters of intent and memoranda signal the will to move forward, but the concrete details have not yet been closed.
At the scale Stargate proposes, the risks are evident: from bottlenecks in the production of high-performance memory to the availability of energy to feed facilities of several gigawatts. To this we must add the necessary permits and the complexity of coordinating projects with so many actors. For now, what has been signed opens a path, but it remains to be seen what materializes and on what timelines.

Images | Sam Altman | Samsung | SK Hynix | Xataka with Grok

In Xataka | I've been hooked on Sora 2 for two days: I'm generating absurd memes in which I am the protagonist, and I can't stop

Nvidia, TSMC and SK Hynix are the most powerful chip companies on the planet. None can afford to let any of the others fall

Nvidia dominates the global market for artificial intelligence (AI) chips with a share that over the last three years has oscillated between 80 and 94%, according to FourWeekMBA. Its leadership rests on very competitive hardware and on a software ecosystem in which CUDA (Compute Unified Device Architecture) plays an essential role. This technology brings together the compiler and the development tools programmers use to write software for Nvidia GPUs.

However, the company led by Jensen Huang has a fundamental partner: TSMC. Nvidia designs its AI chips, and this Taiwanese semiconductor manufacturer, the largest on the planet with a global share close to 60%, produces them. TSMC's iron-clad leadership is the result of its cutting-edge technology and its titanic production capacity. It has many important clients, such as AMD, Qualcomm, MediaTek and Broadcom, among many others, but thanks to the AI boom Nvidia has established itself as its second-best customer, behind only Apple.

TSMC is presumably about to start manufacturing 2 nm GPUs for Nvidia, but that is not the only thing it is going to do for one of its best customers. The Taiwanese company has also decided to launch a five-year plan to expand its capacity for manufacturing integrated circuits with its advanced CoWoS (Chip-on-Wafer-on-Substrate) packaging technology. According to Beth Kindig, of the I/O Fund, this technology will account for between 50 and 60% of the market in 2025, up from the 15% it held during 2024.

The synergy between these companies is indisputable

The high demand for GPUs built on Nvidia's Blackwell microarchitecture is largely responsible for the rollout of this plan. The company led by Jensen Huang will be able to respond better to its customers' needs and will see its competitiveness increase at a time when DeepSeek and other Chinese companies represent a challenge.
In March 2024 TSMC officially announced that it was building two CoWoS packaging plants in the town of Chiayi, in southern Taiwan. And that is not all: it was also weighing the option of opening another plant specialized in this advanced packaging technology in Japan, presumably on the island of Kyushu, where the company is currently building two cutting-edge semiconductor production plants. There is something else, too: the Chiayi plants will be equipped to work not only with CoWoS packaging but also with the advanced InFO and SoIC (System on Integrated Chips) technologies.

It is evident that TSMC wants to cover its back and look to the future to prevent its production capacity from being threatened by a bottleneck. An interesting note: CoWoS packaging is currently used for AMD's Instinct MI250 chips and for Nvidia's A100, H100, H200, B100 and B200 GPUs, as well as their derivatives. The revision used in the last two of those chips, the B100 and B200, is known as CoWoS-L. Before this year ends, TSMC will be able to process no less than 60,000 wafers per month with its advanced packaging technology.

The synergy between Nvidia and TSMC is indisputable, but this recipe requires a third ingredient: SK Hynix

This South Korean memory-chip manufacturer leads the market for the HBM (High Bandwidth Memory) chips that work side by side with AI GPUs with striking authority. Its market share exceeds 70%, so the remaining 30% is split between Samsung and Micron Technology. Behind them come the Chinese manufacturers Yangtze Memory Technologies Co. (YMTC) and CXMT (Changxin Memory Technologies). At the end of 2024, SK Hynix took advantage of an innovation forum organized by TSMC to showcase its mastery of HBM manufacturing.
According to SK Hynix itself, its MR-MUF process — broadly speaking, a technology that enables faster stacking of the DRAM dies than the TC-NCF process other companies use — has allowed it to achieve an efficiency 8.8 times higher than that of Samsung and Micron. Put simply, it manufactures its HBM chips much faster than its main competitors. As we can intuit, the speed at which a semiconductor company can produce its integrated circuits deeply conditions its competitiveness. Greater efficiency clearly allows it to supply its customers with more confidence, especially in a rising market like HBM. In addition, SK Hynix is manufacturing 12-layer HBM3E memory at scale while Samsung and Micron are struggling with their production.

In any case, both Samsung and SK Hynix are already working on the development of HBM4 memory with the aim of catapulting their competitiveness. And this is exactly where Nvidia comes in. SK Hynix announced in October 2024 that it intended to deliver the first HBM4 chips to its clients during the second half of 2025. Jensen Huang, however, asked it to bring the delivery forward — something confirmed by Chey Tae-won, the chairman of SK Group, so the information is absolutely reliable. Why does Nvidia need the HBM4 chips so urgently? Simply because it needs to pair its most capable AI chips with the fastest and most energy-efficient memory available. And in this field SK Hynix currently holds the upper hand.

Image | TSMC

In Xataka | South Korea fears US retaliation. To avoid it, its old lithography equipment gathers dust in a warehouse

Intel plans to go all-in on the market that has made South Korea's SK Hynix rich: memory for AI

The South Korean company SK Hynix leads the HBM (High Bandwidth Memory) market with striking authority. Its share exceeds 70%, so the remaining 30% is split between Samsung and the American memory manufacturer Micron Technology. These memories work side by side with GPUs for artificial intelligence (AI); in fact, one of SK Hynix's main clients, possibly the biggest, is Nvidia.

According to the consulting firm DataM Intelligence, the global market for AI data-center memory will grow by 24.5% annually, going from a volume of 13.67 billion dollars in 2024 to no less than 78.91 billion in 2032. For designers and manufacturers of integrated circuits, competing in a market with this growth potential is crucial, which is why several Chinese companies are planning to enter it. And for Intel it represents too juicy an opportunity to let slip.

Intel and SoftBank are working together on a new type of memory for AI

Manufacturing HBM memory is very complex. That is why this market is, for now, shared only by the three companies mentioned in the first paragraph of this article. Its great growth potential, however, will surely attract other players over the next few years. Intel is going to be one of them, although the interesting part is that it will not compete alone, nor will it fight for the HBM market. Together with the Japanese investment group SoftBank, the American company has founded a firm specialized in the design and manufacture of memory chips. Its name is Saimemory, and it was born expressly to compete head to head with SK Hynix, Samsung and Micron Technology. Its plan consists of developing a new type of high-performance stacked DRAM based on patents from Intel and several Japanese research centers, among them the University of Tokyo.
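DataM Intelligence's two figures are internally consistent with its growth rate; compounding 24.5% per year over the eight years from 2024 to 2032 reproduces the projection. A quick sanity check on the numbers quoted above:

```python
# Compound the 24.5% annual growth rate quoted by DataM Intelligence
# from the 2024 market size and compare with its 2032 projection.

value = 13.67  # billions of dollars, 2024
for _ in range(2032 - 2024):  # eight years of compounding
    value *= 1.245
print(f"projected 2032 market: ${value:.1f}B")  # prints $78.9B
```

The result matches the cited 78.91-billion-dollar figure almost exactly, so the projection is simply the 2024 base compounded at the stated CAGR.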
Intel and SoftBank have set out to complete the development of a prototype and evaluate its technical viability by 2027. In fact, they intend to mass-produce and market this stacked DRAM for AI before the end of this decade. The performance of HBM memory is very high, but, as mentioned a few lines above, it is difficult to manufacture. It is also expensive, dissipates a lot of energy as heat, and consumes a lot of electricity. Stacked DRAM, on paper, will be easier to produce, more efficient, and cheaper as well. If it truly lives up to the expectations it has generated, it may end up displacing HBM chips. In fact, Intel and SoftBank are by no means the only companies that trust in the potential of stacked DRAM; Samsung and Neo Semiconductor are also developing this type of chip, so before the decade is out the memory market will be much more competitive than it is today.

Image | Samsung

More information | Nikkei Asia

In Xataka | South Korea fears US retaliation. To avoid it, its old lithography equipment gathers dust in a warehouse

A former SK Hynix employee is suspected of delivering stolen technology to Huawei

Although it does not monopolize attention to the same extent as TSMC, Intel or Samsung, the South Korean company SK Hynix is one of the planet's leading designers and manufacturers of semiconductors. Its specialty is memory chips, and it is doing so well that it leads both the HBM (High Bandwidth Memory) market and the DRAM market. In fact, SK Hynix controls no less than 70% of the HBM chip market, so its leadership in this sector is overwhelming, and thanks to the high competitiveness of those chips it is Nvidia's supplier for its artificial intelligence (AI) GPUs. Samsung has an approximate share of 28%, and Micron is close to 18%. The success of SK Hynix rests, as we can intuit, on its capacity for innovation.

In the current climate, industrial espionage is the order of the day

The Seoul Central District Prosecutors' Office (South Korea) has charged a former SK Hynix employee with illegally transferring technology to the Chinese company Huawei. This person's identity has not yet been made public, but according to Digitimes Asia he is surnamed Kim, is South Korean, and worked at SK Hynix's subsidiary in China until 2022, when he left the South Korean company to join HiSilicon, the Huawei subsidiary specialized in semiconductor design.

The Prosecutors' Office is presumably acting at the request of SK Hynix. It seems the company has discovered that, during the negotiation of his hiring, Kim gave HiSilicon secret information about the advanced packaging technology used in the production of 3D NAND and HBM memory, as well as about multi-chiplet assembly and CMOS image sensors. These technologies are protected by SK Hynix's intellectual property rights, and the company's competitiveness largely resides in them.
According to the prosecution, Kim printed and took more than 11,000 photographs of internal SK Hynix files with the aim of leaving the company and using this confidential material to land a job with the competition on very advantageous terms. In fact, investigators say Kim presented the material he extracted from SK Hynix to at least two Chinese companies. One of them, HiSilicon, finally hired him. An important note: neither that company nor its parent, Huawei, asked him to hand over trade secrets; rather, according to the Prosecutors' Office, they accepted them as part of the hiring process.

More information | Digitimes Asia

In Xataka | China's future in the chip industry is in the hands of a single, almost unknown company: SiCarrier

Samsung is going through one of the most difficult stages in its history. Its problem is that it cannot keep up with TSMC and SK Hynix

Samsung is dealing with a complicated period. The statements made just a few hours ago by Han Jong-hee, co-CEO of the South Korean company, diagnose precisely what is happening: "First of all, we got it wrong with artificial intelligence (AI), which is evolving quickly." This mea culpa was aimed at the company's shareholders and shows that the leadership acknowledges not having made the right decisions in recent years. According to Reuters, an internal statement written by Jay Y. Lee, the company's chairman, admits that "our technological advantage has been compromised in all our businesses" and that the company is clinging to the status quo instead of generating disruptive change.

The loss of competitiveness is a big problem for Samsung

Han Jong-hee's statements clearly hint that Samsung has not managed to board the AI train, which the Taiwanese semiconductor manufacturer TSMC currently leads with hardly any opposition. TSMC produces the AI integrated circuits designed by Nvidia, AMD, Broadcom and other companies, which has carried it to the top of the global chip-manufacturing industry with an approximate share of 60%.

TSMC's economic performance is perceptibly better than Samsung's. The company led by C.C. Wei announced its 2024 financial results in mid-January, and they are extraordinary. Its revenue set a record, rising 34% over 2023. In December 2024 alone it exceeded 8.4 billion dollars, a figure 57.8% higher than in the same month of the previous year. The engine behind these figures is precisely semiconductors for AI applications.

But that is not all: the also South Korean SK Hynix is threatening Samsung's leadership in the memory-chip market.
In fact, SK Hynix has already surpassed Samsung in profits, thanks to the excellent reception of its HBM (High Bandwidth Memory) chips, which are designed to work side by side with AI GPUs. In this situation, Samsung's hope of returning to the path of growth rests on its 2 nm chips. And the first green shoots have already arrived: the Japanese chip designer Preferred Networks (PFN) and a South Korean company specialized in the design of neural processing units (NPUs) have already placed orders on its 2 nm lithographic node. In fact, Digitimes Asia reports that mass production of 2 nm integrated circuits has already begun at Samsung's South Korean plants. Whatever happens, we can be sure of one thing: 2025 will be the year of 2 nm semiconductors. TSMC starts from a very comfortable position, but in all likelihood Intel and Samsung will also go all in.

Image | Samsung

More information | SCMP | Reuters

In Xataka | South Korea fears US retaliation. To avoid it, its old lithography equipment gathers dust in a warehouse

Samsung has its biggest competitor at home. Its future in chips depends on its rivalry with SK Hynix

The South Korean semiconductor manufacturer SK Hynix is on a roll. Samsung's chip division dominates the memory market with an approximate 40% share, while SK Hynix defends a very respectable 29%. Behind both is the American firm Micron Technology, with roughly 26%. These are precisely the three companies that control the juicy market for the HBM (High Bandwidth Memory) chips that work hand in hand with GPUs for artificial intelligence (AI). In fact, SK Hynix is Nvidia's main memory supplier. And having the company led by Jensen Huang as a client helps. It helps a lot. So much so that, according to SCMP, SK Hynix has surpassed Samsung in profits, and it has done so precisely thanks to its high-performance memory.

It is not all good news, however. SK Hynix has predicted that sales of memory chips for consumer devices, such as smartphones and computers, will fall during 2025. "This year the memory chip market will be subject to great uncertainty because trade protectionism is growing and geopolitical risks are increasing. At the same time, PC and mobile phone companies are adjusting their inventories," stated Kim Woo-hyun, CFO of SK Hynix. This outlook anticipates a complicated 2025 for both Samsung and SK Hynix, although the latter, as we have seen, carries very positive momentum in the HBM market.

Together against China

The rivalry between Samsung and SK Hynix in the memory market is a fact, but the main threat to these South Korean companies actually comes from China. The memory integrated-circuit industry has enormous growth potential precisely because of the high demand for these chips driven by the proliferation of data centers for AI applications. And, as expected, Chinese semiconductor manufacturers do not want to be left out.
Changxin Memory Technologies (CXMT) is one of the Chinese companies specialized in the production of memory chips and, like other companies in the country led by Xi Jinping, it has chosen to compete in this attractive market by deploying a very aggressive pricing policy. Furthermore, CXMT in particular has increased its DRAM production capacity almost five-fold over the last four years, allowing it to raise its global market share to a very respectable 9%. That growth has placed it just behind Micron in market share, making it already the fourth-largest memory-chip manufacturer on the planet. To complicate matters further, the Chinese government is financially supporting its memory manufacturers in response to the sanctions deployed by the US and its allies, so the competitiveness of Chinese companies is on the rise.

Image | Samsung

More information | SCMP

In Xataka | South Korea fears US retaliation. To avoid it, its old lithography equipment gathers dust in a warehouse
