The Korea Herald

THE INVESTOR


[KH explains] How will Samsung’s 'No. 1 strategy' work for HBM leadership?

  • PUBLISHED: July 28, 2024 - 16:25
  • UPDATED: July 28, 2024 - 16:25

Nvidia CEO Jensen Huang’s handwritten signature, with the phrase “Jensen approved,” next to Samsung’s 12-stack HBM3E prototype displayed at the Samsung Electronics showroom at the GTC 2024 technology conference in San Jose, California, in March. (Samsung Electronics Executive Vice President Han Jin-man's social media)

The red-hot HBM, or high bandwidth memory, chip is becoming a headache for Samsung Electronics, the world’s No. 1 memory chip maker, as it runs months behind its crosstown rival SK hynix in supplying the high-value-added product to Nvidia, the US chip giant that enjoys a near monopoly in the burgeoning AI chip market.

Samsung’s fourth-generation HBM3 chips have reportedly only recently been cleared for use by Nvidia, while SK hynix, the No. 2 memory chip maker, has long been the main vendor, supplying the more advanced fifth-generation HBM3E chips since March this year.

For its part, Samsung says it is “ready” to supply the advanced AI chips to Nvidia, having developed both the fourth- and fifth-generation products early on. In February, the tech giant said it had developed the industry's first 36-gigabyte 12-layer HBM3E.

What has held back the market leader is Nvidia's qualification tests, which rivals SK hynix and Micron Technology have already passed.

“It is hard to judge from outside whether Samsung has failed in technology. But it is important to understand that HBM is a sort of customized product, and meeting the customer's requirement is the utmost priority," said Kim Yang-paeng, a researcher at the Korea Institute of Industrial Economics and Trade.

"SK hynix has been providing the HBM3 from 2022, and has a high understanding of Nvidia's needs. Samsung may need more time to learn them."

Last week, Reuters reported that Samsung passed the Nvidia tests for HBM3 but testing for HBM3E was still underway, raising speculation that its chip supply was imminent. Samsung declined to confirm the report.

‘No. 2 strategy’ missing

Industry watchers agree that HBM chips remain a tiny part of Samsung’s sprawling memory chip business for now. Even without meaningful HBM sales, the company posted 1.9 trillion won in operating profit from its chip business alone in the first quarter of this year. In the second quarter, that profit is estimated to have soared to some 6 trillion won.

The recent attention centers on a rare case in which the long-standing No. 1 player in the memory market has become a latecomer due to a strategic failure.

Since debuting the industry’s first HBM in 2013, SK hynix has poured considerable resources into the nascent chip technology over the past decade, which has started paying off with the AI boom.

Samsung also ran an HBM development team from 2015 and was the first to mass-produce the early versions. But in 2019, the company decided to scale back investment, citing poor profitability.

When Nvidia sought HBM3 to power its cutting-edge H100 GPU in 2022, Samsung was not ready, and SK hynix beat it to the orders.

"The problem is that Samsung failed to become the fastest one (in the HBM market)," an industry source said under the condition of anonymity.

"The company has been a 'conqueror' of the DRAM market for a decade. Now, it is experiencing the struggles of a latecomer."

Fighting back

Even though Samsung failed to secure an early edge in HBM chips, it has no intention of ditching the lucrative market. In 2023, the global HBM market was valued at $43.6 billion, and this year it is expected to more than triple to $169.1 billion. HBM accounted for 8.4 percent of DRAM sales last year, a figure expected to jump to 20.1 percent this year.

“For now, Nvidia is basically the only client for the cutting-edge HBM chips. But with more AI applications emerging, demand is expected to rise," Kim said.

"Other big techs like Google are also jumping into compete with Nvidia and design their own AI chips. HBM is very likely to be adopted for those products as well."

Some sources say Samsung may not be in a hurry to supply its current chips to Nvidia, because it would have to lower prices to gain an upper hand over SK hynix. Rather than catching up with its smaller rival, it may seek a “quantum jump” by focusing on next-generation chips.

“Currently, SK hynix is supplying HBM chips at high prices to Nvidia as the sole supplier. But it is unlikely for Samsung to take cheaper pricing as a strategy to outpace the smaller rival,” said an industry official who wished to be unnamed.

For the sixth-generation HBM4 chips, Samsung reportedly plans to start testing samples next year, with mass production set for 2026. During an earnings conference call on Thursday, SK hynix hinted that it would start supplying 12-layer HBM4 in the latter half of 2025, adding that it is also working on more advanced 16-layer HBM4 chips, demand for which is expected to take off from 2026.

In the meantime, Samsung is expected to unveil its second-quarter earnings on Wednesday, with keen attention being paid to its HBM-specific sales and plans.

By Jo He-rim (herim@heraldcorp.com)
