Last month, we reported that Samsung is developing SOCAMM (small outline compression attached memory module), a new advanced memory module for AI processors, for Nvidia. Now, a fresh report claims that the Korean firm has supplied Nvidia with a large volume of SOCAMM module samples. Surprisingly, Samsung's sample volume has exceeded that of its competitors Micron and SK Hynix. It looks like Samsung wants to dominate the SOCAMM market, which could boost its struggling semiconductor business.
Nvidia's GB300 AI chip could feature Samsung's SOCAMM module
SOCAMM is a new type of AI memory designed to address power consumption, cost, and speed. Nvidia began developing SOCAMM last year with three suppliers: SK Hynix, Samsung, and Micron, which are now the major competitors in this field. Micron has been the fastest supplier so far, but Samsung's sample volume is currently the highest. With the largest supply capacity of the three, Samsung could take an early lead in this emerging AI memory market. Nvidia will reportedly use Samsung's SOCAMM module in its latest AI chip, the Blackwell Ultra GB300, which could debut in the second quarter of 2025.
Samsung is aiming to become the top supplier of SOCAMM for Nvidia. Earlier, Jeon Young-hyun, head of Samsung's Device Solutions (DS) division, said, "We will never repeat the same mistake as HBM." This suggests that the Korean firm does not want to miss out on the newly emerging AI memory market. However, profit margins remain uncertain, as a SOCAMM module is priced at only 25-33% of an HBM module.
SOCAMM has emerged as an essential new memory module thanks to the growing AI market. Its key advantages are low power consumption and high efficiency, as it uses LPDDR5X DRAM. Samsung's SOCAMM module consumes only 9.2 watts of power, making it 45% more power-efficient than DDR5 DRAM memory modules (DIMMs). Since the SOCAMM module processes data quickly while drawing less power, it is well suited to AI processors.