SK Hynix capitalizes on investment in HBM

Market news |
By Peter Clarke



SK Hynix is the leading supplier of high-bandwidth memory (HBM), and the company is benefiting from strong growth in shipments of servers for AI workloads, according to TrendForce.

In 2022 the three leading HBM suppliers were SK Hynix (50 percent), Samsung (40 percent) and Micron (10 percent), TrendForce estimates.

SK Hynix is presently the only manufacturer making DRAMs to the HBM3 standard. To prepare for the launches of Nvidia's H100 and AMD's MI300 in the second half of 2023, all three major suppliers are planning for mass production of HBM3 products; Samsung and Micron are expected to start towards the end of 2023 or early in 2024.

As a result of SK Hynix’s lead in HBM3, TrendForce expects the company to grow its market share to 53 percent by the end of 2023 at the expense of Samsung and Micron, whose shares are expected to drop to 38 percent and 9 percent, respectively.

Nvidia’s DL/ML AI servers are often equipped with four or eight high-end graphics cards and two mainstream x86 server CPUs.

AI server shipments jumping

TrendForce reckons that the shipment volume of servers with high-end general-purpose GPUs increased by 9 percent in 2022, with approximately 80 percent of these shipments concentrated among eight major cloud service providers in China and the US. In late 2023, Microsoft, Google, Meta, Baidu and ByteDance are expected to launch generative AI products and services, further boosting AI server shipments.

AI server shipments are expected to increase 15.4 percent in 2023 and to follow a 12.2 percent compound annual growth rate over the period 2023 to 2027.

Typical memory content of general and AI servers. Source: TrendForce.

The rising significance of AI servers and the exponential growth in AI models are also set to boost demand for memory chips, said TrendForce.

While general servers carry 500 to 600 Gbytes of DRAM, AI servers require significantly more, averaging between 1.2 and 1.7 Tbytes. With four or eight Nvidia A100 80-Gbyte GPUs, HBM usage would be around 320 Gbytes or 640 Gbytes, respectively.
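The HBM figures above are simply GPU count multiplied by per-GPU HBM capacity. A minimal sketch of that arithmetic, using the article's A100 figures (the function name is illustrative, not from any vendor API):

```python
# Per-GPU HBM capacity for an Nvidia A100, in Gbytes (figure from the article)
A100_HBM_GB = 80

def server_hbm_gb(gpu_count: int, per_gpu_gb: int = A100_HBM_GB) -> int:
    """Total HBM in a server: number of GPUs times per-GPU HBM capacity."""
    return gpu_count * per_gpu_gb

# Typical AI server configurations of four or eight GPUs
print(server_hbm_gb(4))  # 320 Gbytes
print(server_hbm_gb(8))  # 640 Gbytes
```

The same calculation scales directly to other accelerators by swapping in their per-GPU HBM capacity.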

Related links and articles:

www.trendforce.com

News articles:

NAND flash makers under pressure in falling market

YMTC gets $7 billion cash injection
