The Korea Herald

Tech

Nvidia CEO signals Samsung’s imminent shipment of AI chips

  • PUBLISHED : November 24, 2024 - 17:13
  • UPDATED : November 24, 2024 - 17:13

Jensen Huang, founder and CEO of Nvidia, speaks to the media following a convocation ceremony in Hong Kong, Saturday. (AP-Yonhap)

Samsung Electronics, the world's largest memory chip maker, is expected to supply its cutting-edge chips to the top graphics processing unit maker Nvidia soon, moving closer to securing a foothold in the lucrative AI chip market.

Speaking on the sidelines of a convocation ceremony at Hong Kong University of Science and Technology on Saturday, Nvidia CEO and founder Jensen Huang told Bloomberg Television that the company is "working as fast as it can to certify Samsung's AI memory chips."

Huang also said Nvidia is looking at both 8-layer and 12-layer HBM3E offerings from Samsung.

High bandwidth memory, or HBM, is an advanced package of vertically stacked DRAM chips, used mainly to boost the performance of advanced graphics processing units for AI applications. The latest fifth-generation HBM3E chips are used in Nvidia's Blackwell, currently the most advanced GPU and one in high demand among top AI tech giants.

Huang's remark comes as Samsung Electronics has been striving to supply high-value DRAM chip packages to Nvidia. Currently, Nvidia's key supplier is Samsung’s crosstown rival SK hynix.

Among the three chipmakers in the world that can produce HBM chips – Samsung, SK hynix and US-based Micron Technology – SK hynix has benefited the most from the burgeoning HBM market, having supplied its 8-layer HBM3E chips to Nvidia since March.

Now months behind SK hynix in the lucrative HBM market, Samsung's chip business earned about half the profit of its smaller rival in the July-September period.

In announcing the disappointing quarterly profit last month, Samsung said it expects to expand HBM3E sales in the fourth quarter of this year and that the product is anticipated to account for about 50 percent of total HBM sales during that period.

"There was a delay in commercializing HBM3E chips, but we have made meaningful progress by passing an important stage in the chip qualification test process with our major customer," Kim Jae-june, Samsung's executive vice president in charge of the memory chip business, said in the quarterly earnings call on Oct. 31.

Nvidia's Blackwell, which is said to dramatically boost AI data center performance and is expected to cost between $30,000 and $40,000 per unit, is in high demand among OpenAI, Microsoft, Meta and other firms building AI data centers.

In the latest earnings call last week, Huang said Blackwell production is going ahead "full steam" and that demand will exceed supply for several more quarters.

"We will deliver this quarter more Blackwells than we had previously estimated," Huang said. Since its introduction in March, Nvidia has shipped 13,000 Blackwell chips to customers.

By Jo He-rim (herim@heraldcorp.com)
