Nvidia CEO Jensen Huang has asked SK Hynix to supply the next generation of high-bandwidth memory (HBM) DRAMs six months earlier than previously planned due to high demand for AI chips, according to Korean papers quoting SK Group chair Chey Tae-won.
“The Nvidia chief asked us to step up the timeline of the HBM4 chip supply by six months,” Chey was quoted as saying at the SK AI Summit on Monday, November 4.
HBM4 is the next generation of a series of stacked-die DRAMs designed for high-speed data transfer.
SK Hynix has built a commanding position in AI DRAM and is currently introducing 12-layer, 36-Gbyte HBM3e DRAMs, with a plan to migrate to HBM4 late in 2025 or early in 2026.
The chair said that SK Hynix now plans to produce 12-layer HBM4 chips in the second half of 2025, earlier than its initial schedule of 2026.
Nvidia is a fabless chip company that relies on the foundry TSMC to make its chips. TSMC is also a leader in the advanced packaging for AI that brings memory and processor dies together.
“Nvidia, SK Hynix and TSMC are fortifying a trilateral partnership to develop the world’s top-notch chips,” Chey was quoted as saying.
Chey also said that, despite the demand for AI semiconductors, there are concerns about a collapse of the AI chip market.
“One of the reasons why we are still worried about the winter of AI is because we have yet to find specific use cases and revenue models in the industry, even if massive investments are being made,” he said.
Related links and articles:
News articles:
SK Hynix plans 30x improvement on HBM with custom variants
Nvidia details GPU roadmap with Rubin HBM4 memory
JEDEC preps finalization of HBM4 standard