AI server shipments to leap up 40% in 2023

Market news
By Peter Clarke

Global AI server shipments will jump by 38.4 percent in 2023, according to market analysis firm TrendForce.

The company defines AI servers as computers outfitted with GPUs, FPGAs or ASICs used for AI training and inferencing. AI chip shipments are expected to increase by 46 percent.

Global AI server shipments from 2022 to 2026 (1,000 units). Source: TrendForce

The market is set to grow from 855,000 units in 2022 to 1.18 million in 2023, making AI servers about 9 percent of total server shipments. The strong growth is expected to continue, with 2.37 million AI servers shipping in 2026, a compound annual growth rate of about 29 percent over the period. By that stage, TrendForce reckons, AI servers will account for 15 percent of the total server market.
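The growth figures above can be checked with a quick calculation. The following sketch (using the article's rounded shipment numbers, so results differ slightly from TrendForce's quoted 38.4 percent) derives the year-on-year growth and the compound annual growth rate:

```python
# TrendForce shipment estimates, in thousands of units (rounded as quoted above)
units_2022 = 855
units_2023 = 1180
units_2026 = 2370

# Year-on-year growth for 2023
growth_2023 = units_2023 / units_2022 - 1
print(f"2023 growth: {growth_2023:.1%}")  # ~38%, matching the quoted 38.4%

# Compound annual growth rate over 2022-2026: (end / start)^(1/years) - 1
years = 2026 - 2022
cagr = (units_2026 / units_2022) ** (1 / years) - 1
print(f"CAGR 2022-2026: {cagr:.1%}")  # ~29%, matching the article
```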

Nvidia’s GPUs dominate the AI server market with a market share of between 60 and 70 percent. ASIC chips have about 20 percent of the market.

Nvidia’s A100 and A800 chips, based on the ‘Ampere’ architecture, are in demand in both the US and China, while the follow-on H100 and H800 chips, based on the Hopper architecture, are due to ramp in 2H23. The H100 and H800 are priced at about 2 to 2.5x the A100 and A800.

The A100 is reported to be priced at about $10,000, but Nvidia’s profit margin is considered large enough that it can offer discounts depending on the buyer’s purchase volume.

As a result, demand for high-bandwidth memory (HBM), a high-speed RAM interface used in GPUs, is poised to increase. The H100 GPU uses the HBM3 interface. TrendForce forecasts a 58 percent year-on-year increase in HBM demand for 2023, with a further boost of 30 percent expected in 2024.

Related links and articles:

News articles:

SK Hynix capitalizes on investment in HBM

ASML, TSMC, Synopsys join Nvidia for computation lithography

UK funds startup Lumai for optical AI
