Samsung Ships Large Volume of SOCAMM Samples to NVIDIA

Apr 7, 2025 | News

Last month, we reported that Samsung is developing a new advanced memory module for AI processors, called SOCAMM (Small Outline Compression Attached Memory Module), for Nvidia. Now, a fresh report claims that the Korean firm has supplied Nvidia with a large volume of SOCAMM module samples. Notably, Samsung's shipment reportedly exceeds those of its competitors Micron and SK Hynix in quantity. It looks like Samsung wants to dominate the SOCAMM market, which could boost its struggling semiconductor business.

Nvidia's GB300 AI chip could feature Samsung's SOCAMM module

SOCAMM is a new type of AI memory designed to address power consumption, cost, and speed. Nvidia began developing SOCAMM last year with three suppliers: SK Hynix, Samsung, and Micron, which are now the major competitors in this field. Micron is the fastest supplier at the moment, though Samsung's sample volume is currently higher. With the largest supply capacity, Samsung could take an early lead in this emerging AI memory market. Nvidia will reportedly use Samsung's SOCAMM module in its latest AI chip, the Blackwell Ultra GB300, which could debut in the second quarter of 2025.

Samsung is aiming to become the top supplier of SOCAMM for Nvidia. Earlier, Jeon Young-hyun, head of Samsung's DS (Device Solutions) division, said, "We will never repeat the same mistake as HBM." This suggests that the Korean firm does not want to miss out on the newly emerging AI memory market. However, profit margins are currently uncertain, as a SOCAMM module is priced at only 25–33% of an HBM module.

In recent years, SOCAMM has emerged as an essential new memory module for the growing AI market. Its key advantages are low power consumption and high efficiency, since it uses LPDDR5X DRAM. Samsung's SOCAMM module reportedly consumes only 9.2 watts, making it 45% more power-efficient than DDR5 DRAM memory modules (DIMMs). Because it processes data quickly at a lower power draw, SOCAMM is well suited to AI processors.
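As a rough back-of-the-envelope check of those figures, the sketch below works out what the 9.2 W and 45% numbers would imply for a comparable DDR5 DIMM. It assumes "45% more power-efficient" means SOCAMM draws 45% less power than an equivalent DIMM, an interpretation the report does not spell out.

```python
# Rough arithmetic check of the reported SOCAMM power figures.
# Assumption (not stated in the report): "45% more power-efficient"
# is read as SOCAMM drawing 45% less power than a comparable DDR5 DIMM.

socamm_power_w = 9.2      # reported SOCAMM module power draw (watts)
efficiency_gain = 0.45    # reported improvement over a DDR5 DIMM

# Implied power draw of an equivalent DDR5 DIMM under that reading
implied_dimm_power_w = socamm_power_w / (1 - efficiency_gain)
savings_w = implied_dimm_power_w - socamm_power_w

print(f"Implied DDR5 DIMM power: ~{implied_dimm_power_w:.1f} W")
print(f"Implied saving per module: ~{savings_w:.1f} W")
# Under this assumption: roughly 16.7 W for the DIMM, saving about 7.5 W per module.
```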
