Samsung Electronics Co.’s 12-layer HBM3E, top, and other DDR modules are stacked in Seoul, South Korea, Thursday, April 4, 2024. Samsung’s profit rebounded sharply in the first quarter of 2024, reflecting a turnaround in the company’s core semiconductor division and stable sales of Galaxy S24 smartphones. Photographer: SeongJoon Cho/Bloomberg via Getty Images
High-performance memory chips are likely to remain in short supply this year, as explosive demand from AI outpaces production, according to analysts.
SK Hynix and Micron — two of the world’s largest memory chip suppliers — have sold out of high-bandwidth memory (HBM) chips for 2024, while supply for 2025 is also nearly depleted, according to the firms.
“We expect total memory supply to remain tight in 2024,” Kazunori Ito, director of equity research at Morningstar, said in a report last week.
Demand for AI chipsets has boosted the market for high-end memory chips, greatly benefiting firms such as Samsung Electronics and SK Hynix, two of the world’s largest memory chip makers. While SK Hynix already supplies HBM chips to Nvidia, the chipmaker is reportedly considering Samsung as a potential supplier as well.
High-performance memory chips play a crucial role in training large language models (LLMs) such as OpenAI’s ChatGPT, whose popularity has caused AI deployment to skyrocket. LLMs need these chips to remember details of past conversations with users and their preferences in order to generate human-like responses to queries.
“Manufacturing these chips is more complex and scaling up production is difficult. This likely creates a shortfall for the rest of 2024 and most of 2025,” said William Bailey, director at Nasdaq IR Intelligence.
HBM’s production cycle is 1.5 to 2 months longer than that of the DDR5 memory chips commonly found in PCs and servers, market research firm TrendForce said in March.
To meet the growing demand, SK Hynix plans to expand its production capacity by investing in state-of-the-art packaging facilities in Indiana in the U.S., as well as in its M15X fab in Cheongju and the Yongin semiconductor cluster in South Korea.
Samsung said during its first-quarter earnings call in April that its supply of HBM bits in 2024 “more than tripled from last year.” Bit capacity refers to the amount of data a memory chip can store.
“And we have already concluded discussions with our customers on this committed supply. In 2025, we will continue to expand supply by at least two times or more year over year, and we are already in smooth negotiations with our customers regarding that supply,” Samsung said.
Micron did not respond to CNBC’s request for comment.
Intense competition
Big tech companies Microsoft, Amazon and Google are spending billions to train their own LLMs to stay competitive, fueling demand for AI chips.
“Major buyers of AI chips — firms like Meta and Microsoft — have signaled that they plan to continue pouring resources into building AI infrastructure. This means they will be buying large quantities of AI chips, including HBM, at least through 2024,” said Chris Miller, author of Chip War, a book about the semiconductor industry.
Chipmakers are in a fierce race to produce the most advanced memory chips on the market to catch the AI boom.
SK Hynix said at a press conference earlier this month that it will begin mass production of its latest generation of HBM chips, the 12-layer HBM3E, in the third quarter, while Samsung Electronics, which was the first in the industry to ship samples of the latest chip, plans to do so in the second quarter.
“Samsung is currently ahead in the 12-layer HBM3E sampling process. If they can qualify earlier than their competitors, I guess it could get a majority share in late 2024 and 2025,” said SK Kim, executive director and analyst at Daiwa Securities.
https://www.cnbc.com/2024/05/14/ai-boom-to-keep-supply-of-high-end-memory-chips-tight-through-2024.html