Samsung Unveils Next-Gen AI Memory HBM4E; Nvidia CEO Jensen Huang Hails It as 'World's Best'
Samsung Electronics has unveiled HBM4E, its next-generation High Bandwidth Memory (HBM), in an industry first at Nvidia's 'GTC 2026' held in the US. The newly revealed HBM4E supports a per-pin speed of 16 Gbps and a bandwidth of 4.0 TB/s, a significant step up from the maximum performance of the existing HBM4. Samsung Electronics leveraged its strengths as an Integrated Device Manufacturer (IDM), combining 10nm-class 6th-generation DRAM technology with 4nm-based die design capabilities, to deliver differentiated performance.
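The 4.0 TB/s figure follows from the per-pin rate multiplied by the stack's interface width. As a rough sanity check, assuming the standard 2048-bit HBM4 data interface (an assumption; the article does not state the pin count), the numbers line up:

```python
# Hedged sanity check of the quoted HBM4E bandwidth.
# ASSUMPTION: a 2048-bit (2048 data pin) interface per stack,
# as in the HBM4 standard; the article gives only the per-pin rate.
DATA_PINS = 2048       # assumed data pins per stack
GBPS_PER_PIN = 16      # per-pin data rate quoted in the article

total_gbps = DATA_PINS * GBPS_PER_PIN   # aggregate rate in Gbit/s
total_gb_per_s = total_gbps / 8         # convert bits to bytes

# 2048 pins x 16 Gbps = 32768 Gbit/s = 4096 GB/s, i.e. roughly
# the 4.0 TB/s the article quotes.
print(f"{total_gb_per_s} GB/s")
```

Under that assumption the result is 4096 GB/s, consistent with the article's rounded 4.0 TB/s figure.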
Nvidia CEO Jensen Huang personally visited Samsung Electronics' booth and praised the company's technology as the world's best. "Samsung has many of the world's best," Huang remarked, leaving a handwritten tribute reading "Amazing HBM4" on the displayed HBM4E product.
Huang also confirmed that Nvidia's next-generation AI chip, the 'Groq3 LPU', will be manufactured at Samsung Electronics' foundry, with mass production starting in the third quarter. Emphasizing the close partnership between the two companies on the foundry collaboration, he said, "I'm truly grateful to Samsung."
Samsung Electronics plans to triple its HBM production volume this year compared to last year and to raise HBM4's share of its total HBM shipments to more than half, strengthening its market leadership.