The Samsung logo is shown on a glass door at the company’s Seocho building in Seoul on July 7, 2022. Samsung Electronics has been applying for tax breaks for 11 potential chip plants in Texas, representing about $192 billion in investment, according to filings submitted to Texas authorities.
Jung Yeon-je | AFP | Getty Images
Samsung Electronics on Tuesday said it has developed a new high-bandwidth memory chip that has “the highest capacity to date” in the industry.
The South Korean chip giant said the HBM3E 12H “increases both performance and capacity by more than 50%.”
“AI service providers in the industry increasingly need HBMs with higher capacity, and our new product HBM3E 12H is designed to meet this need,” said Yongcheol Bae, executive vice president of memory product planning at Samsung Electronics.
“This new memory solution is part of our drive to develop core technologies for high-stack HBM and provide technology leadership for the high-capacity HBM market in the AI era,” Bae said.
Samsung Electronics is the world’s largest manufacturer of dynamic random access memory chips, which are used in consumer devices such as smartphones and computers.
Generative AI models like OpenAI’s ChatGPT require a large number of high-performance memory chips. Such chips allow generative AI models to remember details of past conversations and user preferences in order to generate humanlike responses.
The AI boom continues to fuel chip makers. American chip designer Nvidia reported a 265% jump in revenue for its fiscal fourth quarter thanks to soaring demand for its GPUs, thousands of which are used to train and run models such as ChatGPT.
On a call with analysts, Nvidia CEO Jensen Huang said the company may not be able to sustain that level of growth or sales throughout the year.
“As AI applications grow exponentially, the HBM3E 12H is expected to be an optimal solution for future systems that require more memory. Its higher performance and capacity will enable customers to manage their resources more flexibly and reduce the total cost of ownership for data centers,” said Samsung Electronics.
Samsung said it has begun customer trials of the chip, and mass production of the HBM3E 12H is planned for the first half of 2024.
“I think the news will be positive for Samsung’s share price,” SK Kim, executive director of Daiwa Securities, told CNBC.
“Samsung was behind SK Hynix in HBM3 for Nvidia last year. Also, Micron yesterday announced mass production of the 24GB 8L HBM3E. I predict it will provide leadership in the higher-layer (12L)-based, higher-density (36GB) HBM3E product for Nvidia,” Kim said.
In September, Samsung struck a deal to supply Nvidia with its HBM3 high-bandwidth memory chips, according to a report by the Korea Economic Daily, which cited anonymous industry sources.
The report also said SK Hynix, South Korea’s second-largest memory chip maker, is leading the high-performance chip market. SK Hynix was previously known as the sole mass producer of HBM3 chips supplied to Nvidia, the report said.
Samsung said the HBM3E 12H has a 12-layer stack but applies an advanced thermal compression non-conductive film (NCF) that allows the 12-layer product to have the same height specification as 8-layer products, meeting current HBM package requirements. The result is a chip that packs more processing power without increasing its physical footprint.
“Samsung continued to reduce the thickness of its NCF material and achieved the industry’s smallest chip gap of seven micrometers (µm) while eliminating voids between layers,” Samsung said. “These efforts result in an improved vertical density of over 20% compared to its HBM3 8H product.”
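The capacity figures cited here and by Kim above are consistent with simple per-layer arithmetic. As a rough sketch (assuming each DRAM die in the stack holds 3GB, which follows from the 24GB 8-layer figure mentioned earlier), the jump from 8 to 12 layers at the same package height yields the 36GB capacity and 50% gain quoted:

```python
# Back-of-the-envelope check of the capacities cited in the article.
# Assumption: each DRAM die in the stack is 3 GB, derived from the
# 24 GB 8-layer HBM3E figure (24 GB / 8 layers = 3 GB per layer).
GB_PER_LAYER = 24 / 8  # 3 GB per die

hbm3e_8h = 8 * GB_PER_LAYER    # 24 GB: the 8-layer (8H) stack
hbm3e_12h = 12 * GB_PER_LAYER  # 36 GB: Samsung's 12-layer (12H) stack

capacity_gain = (hbm3e_12h - hbm3e_8h) / hbm3e_8h
print(f"12H capacity: {hbm3e_12h:.0f} GB ({capacity_gain:.0%} over 8H)")
# → 12H capacity: 36 GB (50% over 8H)
```

The thinner NCF is what makes this work: four extra dies fit into the same stack height, so capacity rises without changing the package footprint.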
https://www.cnbc.com/2024/02/27/samsung-unveils-new-memory-chip-with-highest-capacity-to-date-for-ai.html