SK Hynix has unveiled its latest DRAM product, HBM3E, a memory designed specifically for AI applications. The announcement reinforces the company's position in the ultra-high-performance memory sector. Samples are already being shipped to customers for performance verification, and mass production is slated to begin in the first half of 2024.

SK Hynix HBM3E DRAM memory

HBM3E Achieves an Astounding 1.15TB/s Data Processing Speed

HBM (High Bandwidth Memory) stacks multiple DRAM chips vertically, which considerably amplifies data processing speeds. The technology has advanced from the first-generation HBM through to the most recent fifth generation, HBM3E, which is an enhanced version of its predecessor, HBM3.

With its extensive history in HBM production, SK Hynix commented on the release, “As the exclusive manufacturer of HBM3, our development of the world’s best-performing HBM3E showcases our deep expertise and commitment to leading the AI memory market.” The company plans to commence mass production of the new memory in the first half of next year.

The HBM3E boasts a staggering data processing speed of 1.15 terabytes per second. To put this in perspective, it’s capable of handling 230 full high-definition (FHD) movies, each 5 gigabytes in size, in just one second.
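The 230-movies figure follows directly from the quoted numbers. A quick sketch of the arithmetic, assuming the article's own values (1.15 TB/s of bandwidth, 5 GB per FHD movie) and a decimal conversion of 1 TB = 1,000 GB:

```python
# Figures taken from the article; the decimal TB-to-GB conversion is an assumption.
bandwidth_tb_per_s = 1.15   # claimed HBM3E data processing speed, TB/s
movie_size_gb = 5           # assumed size of one FHD movie, GB

bandwidth_gb_per_s = bandwidth_tb_per_s * 1000
movies_per_second = bandwidth_gb_per_s / movie_size_gb

print(round(movies_per_second))  # 230
```

In other words, 1,150 GB moved per second divided by 5 GB per movie gives the 230 movies per second cited by SK Hynix.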


Beyond its sheer speed, the product incorporates the latest Advanced MR-MUF technology, enhancing heat dissipation by a notable 10% compared to its previous generation. Furthermore, the HBM3E is designed with backward compatibility. This feature enables a seamless transition for customers, allowing the integration of HBM3E into existing HBM3-based systems without necessitating any design alterations.

The longstanding partnership between NVIDIA and SK Hynix further solidifies the promise of HBM3E. Ian Buck, VP of NVIDIA’s Hyperscale and HPC Division, expressed enthusiasm about the collaboration’s future, emphasizing the potential of the new HBM3E in revolutionizing AI computing.

Samsung is also set to start mass production of HBM chips tailored for AI in the latter half of 2023, competing directly with SK Hynix. In 2022, SK Hynix held a 50% share of the HBM market, Samsung had 40%, and Micron accounted for the remaining 10%. The HBM market constitutes just 1% of the overall DRAM segment.
