World’s first 12 layer HBM4 DRAM chip for AI

Business news |
By Nick Flaherty


SK hynix is sampling the world’s first 12-layer stack of HBM4 memory chips to AI system developers, six months earlier than expected.

HBM4 is vital to Nvidia’s next generation of GPUs, Rubin, which is set to use a stack of 8 chips next year. The Rubin Ultra dual GPU will use a stack of 16 HBM4 chips, and Nvidia CEO Jensen Huang specifically asked SK hynix to accelerate its production.

Mass production of the 12-layer HBM4 devices, which have a bandwidth of 2TByte/s, is to start in the second half of 2025 following the certification process. The 12-chip stack can store 36GBytes of data.

SK hynix was the first memory maker to mass produce HBM3, in 2022, followed by 8- and 12-layer HBM3E devices in 2024. HBM3E chips will be used in the Blackwell Ultra GPU later this year, which carries 288GB of memory.

US supplier Micron Technology is aiming to ship HBM4 chips in 2026, while Samsung plans to have a 16-layer HBM4 stack in production on a 4nm process by the end of 2025, six months ahead of schedule.

“We have enhanced our position as a front-runner in the AI ecosystem following years of consistent efforts to overcome technological challenges in accordance with customer demands,” said Justin Kim, President & Head of AI Infra at SK hynix. “We are now ready to smoothly proceed with the performance certification and preparatory works for mass production, taking advantage of the experience we have built as the industry’s largest HBM provider.”

www.skhynix.com
