SK hynix Inc. announced today that it has successfully developed HBM3E, the next generation of the highest-specification DRAM for AI applications currently available, and said that customer evaluation of samples is underway.
The company said the successful development of HBM3E, the extended version of HBM3 that delivers the world’s best specifications, builds on its position as the industry’s sole mass producer of HBM3. Drawing on that experience as the supplier of the industry’s largest volume of HBM products, and on its mass-production readiness, SK hynix plans to mass-produce HBM3E from the first half of next year and solidify its unrivaled leadership in the AI memory market.
According to the company, the latest product not only meets the industry’s highest standard for speed, the key specification for AI memory products, but also leads in all other categories, including capacity, heat dissipation and user-friendliness.
In terms of speed, HBM3E can process up to 1.15 terabytes (TB) of data per second, equivalent to processing more than 230 Full-HD movies of 5 GB each in a single second.
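As a rough back-of-envelope check of that comparison (assuming decimal units, i.e. 1 TB = 1,000 GB, and the 5 GB movie size quoted in the release), the figure works out as follows; this is only an illustrative sketch, not a published specification.

# Back-of-envelope check of the quoted throughput comparison.
# Assumes decimal units (1 TB = 1,000 GB) and 5 GB per Full-HD movie,
# as stated in the release.

bandwidth_tb_per_s = 1.15   # quoted HBM3E bandwidth, TB per second
movie_size_gb = 5           # assumed size of one Full-HD movie, GB

movies_per_second = bandwidth_tb_per_s * 1000 / movie_size_gb
print(f"{movies_per_second:.0f} movies per second")  # -> 230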
In addition, the product delivers a 10% improvement in heat dissipation through the adoption of Advanced Mass Reflow Molded Underfill, or MR-MUF, the company’s cutting-edge packaging technology. It also provides backward compatibility, allowing the new product to be adopted on systems designed for HBM3 without any modification to their design or structure.