CES 2024: SK hynix to show leading memory products for AI
SK hynix has announced that it will showcase the technology behind its ultra-high performance memory products, the core of future AI infrastructure, at CES 2024, including HBM3E, its recently announced next-generation DRAM for AI applications.
According to SK hynix, HBM3E, the extended version of HBM3, delivers the world's best specifications, building on the company's track record as the industry's sole mass producer of HBM3. As the supplier of the industry's largest volume of HBM (High Bandwidth Memory) products, SK hynix plans to mass produce HBM3E from the first half of next year and solidify its leadership in the AI memory market.
According to the company, the latest product meets the industry's highest standards not only in speed, the key specification for AI memory products, but also in all other categories, including capacity, heat dissipation and user-friendliness.
In terms of speed, HBM3E can process data at up to 1.15 TB per second, equivalent to processing more than 230 Full-HD movies of 5 GB each in a single second.
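The figure is easy to sanity-check with a back-of-the-envelope calculation; a minimal sketch, assuming the decimal convention of 1 TB = 1,000 GB that is typical in such marketing figures:

```python
# Sanity check of the HBM3E bandwidth claim: 1.15 TB/s vs.
# "more than 230 Full-HD movies of 5 GB each per second".
# Assumes decimal units: 1 TB = 1,000 GB.

bandwidth_tb_per_s = 1.15   # claimed peak bandwidth, TB/s
movie_size_gb = 5           # one Full-HD movie, GB

movies_per_second = bandwidth_tb_per_s * 1000 / movie_size_gb
print(f"{movies_per_second:.0f} movies per second")  # prints "230 movies per second"
```

The arithmetic works out to exactly 230, matching the "more than 230" claim at the margin.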
At CES 2024, SK hynix will highlight the importance of memory products in accelerating technological innovation in the AI era, as well as its competitiveness in the global memory markets.
HBM3E, which was successfully developed in August, is expected to be available to the largest AI technology companies with mass production starting from the first half of 2024.
Further, HBM3E comes with a 10 per cent improvement in heat dissipation by adopting the cutting-edge Advanced Mass Reflow Molded Underfill (MR-MUF) technology. It also provides backward compatibility, enabling adoption of the latest product even in systems designed for HBM3, without design or structural modifications.
SK hynix will also showcase a next-generation Compute Express Link (CXL) interface, a PCIe-based interconnect protocol, in a test product based on its Computational Memory Solution (CMS) memory. This test CMS memory integrates the computational functions of CXL with AiMX (Accelerator-in-Memory based Accelerator), a low-cost, high-efficiency accelerator card based on processing-in-memory chips for generative AI.
CXL memory, along with HBM, is one of the core products in the limelight amid the rise of AI technology. SK hynix plans to commercialize 96 GB and 128 GB CXL 2.0 memory products based on DDR5 in the second half of the year for shipment to AI customers.
Sungsoo Ryu, Head of DRAM Product Planning at SK hynix, said that through the development of HBM3E the company has strengthened its market leadership by further rounding out its HBM product lineup, which is in the spotlight amid the development of AI technology. “By increasing the supply share of the high-value HBM products, SK hynix will also seek a fast business turnaround.”
