Samsung is working hard to secure Nvidia’s approval for its HBM3E high-bandwidth memory chips. Industry insiders recently reported that the company achieved strong scores in a key evaluation of HBM3E, hinting that approval is near. Nvidia’s CEO has now suggested the same, revealing that Samsung is in contention to supply HBM3E for Blackwell Ultra.
Samsung moves closer to supplying HBM3E for Nvidia’s AI chips
Nvidia recently announced Blackwell Ultra, an accelerated computing platform designed for AI reasoning, spanning training, post-training, and test-time scaling. Blackwell Ultra includes the NVIDIA GB300 NVL72 rack-scale solution and the NVIDIA HGX B300 NVL16 system.
What makes this announcement even more interesting is that Jensen Huang, CEO of artificial intelligence (AI) chip leader Nvidia, hinted that Blackwell Ultra might use Samsung’s fifth-generation high-bandwidth memory (HBM3E). He briefly remarked, “We are looking forward to Samsung’s participation,” adding that “Samsung has the ability to combine ASIC (custom chip) and memory in the base die (the core component installed at the bottom of the HBM).”
Earlier, at CES 2025, Huang said of Samsung’s HBM, “We are currently testing it and are confident that it will be successful,” but added, “Samsung needs to come up with a new design.” Since then, Samsung has been reworking its HBM3E design, particularly to resolve heat-related issues.
Beyond meeting the technical requirements of Nvidia’s AI chips, Samsung’s HBM3E must also pass qualification tests for Nvidia’s packaging process. Previous reports also suggested that Nvidia officials have visited Samsung’s Cheonan campus, where the final stages of HBM production take place, multiple times.
All things considered, it is clear that Nvidia is highly interested in having Samsung supply HBM3E chips. The collaboration between the two tech giants is important for advancing AI computing, as high-bandwidth memory plays a major role in boosting the performance of Nvidia’s GPUs. A Samsung executive also recently said that the company aims to ramp up production of its HBM3E AI memory chips as early as Q2 2025.