Particle.news

Samsung Begins HBM4 Mass Production and Shipments, First to Commercialize Next‑Gen AI Memory

The early launch signals a bid to regain leadership in high‑bandwidth memory during a surge in AI data center demand.

Overview

  • Samsung reports HBM4 runs at a consistent 11.7 Gbps and can scale to 13 Gbps, surpassing JEDEC’s 8 Gbps baseline and HBM3E’s 9.6 Gbps.
  • Each 12‑layer stack delivers up to 3 TB/s of bandwidth and 36 GB of capacity, with 16‑layer versions expected to reach 48 GB.
  • Shipments began about a week earlier than planned following customer consultations, with designs targeting lower power use and cooling costs in servers.
  • The company projects HBM sales will more than triple in 2026 and outlines HBM4E sampling in the second half of 2026 with custom HBM samples in 2027.
  • Industry reporting points to NVIDIA as a likely major buyer while SK hynix readies its own ramp, and Samsung shares rose more than six percent after the announcement.