
Micron begins production of a ‘better’ HBM3E memory

Micron Technology Inc. (Boise, Idaho) says it has begun volume production of HBM3E DRAM, the high-bandwidth memory format favoured for AI datacenters, claiming its parts consume 30 percent less power than competing HBM3E offerings.
Micron said its 24Gbyte, 8-high HBM3E components will be part of Nvidia's H200 Tensor Core GPUs, which are due to begin shipping in 2Q24.
As such, Micron may have leapfrogged rival SK Hynix, which has been the leading volume producer of memory to the original HBM3 standard. HBM3E is the extended version of HBM3 with superior specifications, and SK Hynix is expected to begin mass production of HBM3E memories in 1H24.
Micron chose to skip HBM3 and develop HBM3E directly. Its HBM3E offers pin speeds of 9.2Gbit/s and overall bandwidth of 1.2Tbyte/s per stack, the company said.
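As a rough sanity check of those figures, the per-stack bandwidth follows from the pin speed multiplied by the interface width. The sketch below assumes the standard 1024-bit HBM stack interface, which the article does not state.

```python
# Minimal sanity check of Micron's quoted HBM3E figures.
# Assumption (not stated in the article): a standard 1024-bit interface per HBM stack.
pin_speed_gbit_s = 9.2        # quoted per-pin data rate, Gbit/s
interface_width_bits = 1024   # assumed bits per stack

bandwidth_gbyte_s = pin_speed_gbit_s * interface_width_bits / 8
print(f"Per-stack bandwidth: {bandwidth_gbyte_s:.1f} GB/s")
# ~1177.6 GB/s, i.e. roughly the 1.2 TB/s figure Micron quotes
```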
The memory is made using Micron's 1-beta (1β) DRAM manufacturing process technology and through-silicon vias (TSVs). The company is preparing a 12-high, 36Gbyte HBM3E component for introduction in March 2024.
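The 8-high and 12-high capacities are mutually consistent if each DRAM die in the stack holds 3Gbytes (24Gbit); this is an inference from the quoted figures, not something the article states.

```python
# Hypothetical consistency check of the stack capacities quoted above.
gbyte_per_die = 24 / 8  # 24 GB across an 8-high stack implies 3 GB per die (assumption)
print(f"12-high stack capacity: {12 * gbyte_per_die:.0f} GB")  # 36 GB, matching the article
```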
Micron’s stock opened up almost 5 percent on the news.
Related links and articles:
News articles:
SK Hynix bouncing back on AI demand
CES 2024: SK hynix to show leading memory products for AI
AI accelerator chips boost high bandwidth memory
