
AI boom to keep supply of high-end memory chips tight through 2024

A 12-layer HBM3E module from Samsung Electronics Co., top, and other DDR modules arranged in Seoul, South Korea, on Thursday, April 4, 2024. Samsung’s profits rebounded sharply in the first quarter of 2024, reflecting a turnaround in the company’s pivotal semiconductor division and robust sales of Galaxy S24 smartphones. Photographer: SeongJoon Cho/Bloomberg via Getty Images

Bloomberg | Getty Images

Supply of high-performance memory chips will likely remain tight this year, analysts say, as explosive demand for AI drives a shortage of these chips.

SK Hynix and Micron – two of the world’s largest memory chip suppliers – are sold out of high-bandwidth memory chips for 2024, while stock for 2025 is also nearly exhausted, according to the companies.

“We expect general memory supply to remain constrained throughout 2024,” Kazunori Ito, director of equity research at Morningstar, said in a report last week.

Demand for AI chipsets has driven the high-end memory chip market, hugely benefiting companies such as Samsung Electronics and SK Hynix, the world’s two leading memory chip makers. While SK Hynix already supplies chips to Nvidia, the company is also reportedly considering Samsung as a potential supplier.

High-performance memory chips play a crucial role in training large language models (LLMs) such as OpenAI’s ChatGPT, whose popularity has driven a surge in AI adoption. LLMs need these chips to remember details of past conversations and user preferences in order to generate humanlike responses to queries.

“Manufacturing these chips is more complex and ramping up production has been difficult. This will likely lead to shortages through the end of 2024 and much of 2025,” said William Bailey, head of Nasdaq IR Intelligence.

HBM’s production cycle is 1.5 to 2 months longer than that of DDR5, the memory chip commonly found in personal computers and servers, market intelligence firm TrendForce said in March.


To meet growing demand, SK Hynix plans to expand its production capacity by investing in advanced packaging facilities in Indiana, U.S., as well as the M15X fab in Cheongju and the Yongin semiconductor cluster in South Korea.

Samsung, during its first-quarter earnings call in April, said its 2024 supply of HBM bits had “more than tripled from last year.” Bit capacity refers to the number of bits of data a memory chip can store.

“And we have already completed discussions with our customers on this committed supply. In 2025, we will continue to increase supply by at least two times or more year over year, and we are already in smooth talks with our customers over this supply,” Samsung said.

Micron did not respond to CNBC’s request for comment.

Intense competition

Big tech companies Microsoft, Amazon and Google are spending billions to train their own LLMs to stay competitive, fueling demand for AI chips.

“Large buyers of AI chips – companies like Meta and Microsoft – have indicated they plan to continue devoting resources to building AI infrastructure. This means they will buy large volumes of AI chips, including HBM, at least through 2024,” said Chris Miller, author of “Chip War,” a book on the semiconductor industry.

Chipmakers are in a fierce race to make the most advanced memory chips on the market in order to capitalize on the AI boom.

SK Hynix, at a press conference earlier this month, said it would begin mass production of its latest generation of HBM chips, the 12-layer HBM3E, in the third quarter, while Samsung Electronics plans to do so in the second quarter, having been the first in the industry to ship samples of the latest chip.

“Currently, Samsung is ahead in the 12-layer HBM3E sampling process. If they can get the qualification earlier than its peers, I guess they can get majority shares in late 2024 and 2025,” said SK Kim, executive director and analyst at Daiwa Securities.
