
electronics-journal.com

SK hynix advances AI servers with 192GB SOCAMM2 memory

High-density LPDDR5X module improves bandwidth and power efficiency for NVIDIA Vera Rubin platforms, addressing AI memory bottlenecks.


Data centers, AI infrastructure, and high-performance computing systems require memory architectures capable of supporting increasingly complex workloads. In this context, SK hynix Inc. has begun mass production of its 192GB SOCAMM2 memory module, designed to deliver higher bandwidth and improved energy efficiency for next-generation AI servers.

The new module is based on LPDDR5X low-power DRAM built on the 1c nm process (sixth-generation 10nm-class technology). It adapts low-power memory, traditionally used in mobile devices, for servers, meeting the performance and efficiency requirements of large-scale AI workloads.

High-capacity memory for AI server architectures
SOCAMM2¹ (Small Outline Compression Attached Memory Module 2) is designed as a primary memory solution for AI servers, combining a compact form factor with scalability. Its compression connector enhances signal integrity and allows easier module replacement, supporting flexible system configurations.

Compared to conventional RDIMM² modules, the new solution delivers:
  • Higher bandwidth: More than double the data transfer rate
  • Improved energy efficiency: Over 75% reduction in power consumption at the same performance level
  • High density: 192GB capacity to support data-intensive AI workloads

These characteristics are particularly relevant for large-scale AI models, where memory throughput and efficiency directly influence system performance.
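
To get a rough sense of how these relative figures combine, the sketch below applies the cited deltas to an assumed RDIMM baseline. The absolute numbers (100 GB/s, 20 W) are illustrative placeholders, not published specifications.

```python
# Rough arithmetic on the relative figures cited above; the absolute
# RDIMM baseline values are illustrative assumptions, not vendor specs.
rdimm_bw = 100.0           # GB/s, assumed RDIMM baseline bandwidth
rdimm_power = 20.0         # W, assumed RDIMM baseline power draw

bw_gain = 2.0              # "more than double the data transfer rate"
power_per_perf_cut = 0.75  # "over 75% reduction in power per performance level"

socamm2_bw = rdimm_bw * bw_gain
# Power per unit of performance drops by 75%, applied at the new bandwidth:
socamm2_power = socamm2_bw * (rdimm_power / rdimm_bw) * (1 - power_per_perf_cut)

rdimm_eff = rdimm_bw / rdimm_power       # GB/s per watt
socamm2_eff = socamm2_bw / socamm2_power

print(f"RDIMM:   {rdimm_eff:.1f} GB/s per W")
print(f"SOCAMM2: {socamm2_eff:.1f} GB/s per W ({socamm2_eff / rdimm_eff:.0f}x)")
```

Under these assumptions, a 75% cut in power per unit of performance works out to roughly a fourfold gain in bandwidth per watt, independent of the baseline chosen.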

Addressing AI workload bottlenecks
Modern AI applications, especially large language models (LLMs) with hundreds of billions of parameters, require substantial memory bandwidth and capacity. Conventional memory architectures can become a limiting factor during both training and inference processes.
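
A back-of-the-envelope estimate makes the capacity pressure concrete. The sketch below sizes the weights alone of a few hypothetical model configurations; parameter counts and precisions are illustrative assumptions, and KV cache, activations, and framework overhead are ignored.

```python
# Back-of-the-envelope memory footprint for LLM weights.
# Model sizes and bytes-per-parameter are illustrative assumptions;
# KV cache, activations, and framework overhead are ignored.
MODULE_CAPACITY_GB = 192

def weight_footprint_gb(params_billion: float, bytes_per_param: int) -> float:
    """GB needed just to hold the weights (1 GB = 1e9 bytes here)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

for params_b, dtype, nbytes in [(70, "FP16", 2), (405, "FP16", 2), (405, "FP8", 1)]:
    gb = weight_footprint_gb(params_b, nbytes)
    modules = -(-gb // MODULE_CAPACITY_GB)  # ceiling division
    print(f"{params_b}B params @ {dtype}: {gb:.0f} GB -> {modules:.0f} x 192GB modules")
```

Even under these simplifying assumptions, a model in the hundreds of billions of parameters spans several 192GB modules at 16-bit precision, which is why per-module density matters alongside bandwidth.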

The SOCAMM2 module is designed to mitigate these constraints by enabling faster data movement and reducing energy overhead. This contributes to improved processing speed at the system level, particularly in AI server environments handling parallel computations.

The product is specifically optimized for the NVIDIA Vera Rubin platform, reflecting the increasing integration between memory technologies and AI computing architectures.

Transition toward energy-efficient AI memory
As AI infrastructure evolves, there is a growing shift toward solutions that balance performance with power consumption. Low-power DRAM technologies such as LPDDR5X are gaining relevance beyond mobile applications, particularly in data centers where energy efficiency is a critical factor.

The ability to operate high-capacity memory modules with reduced power demand supports both operational cost reduction and sustainability objectives in large-scale computing environments.

Scalable production for AI infrastructure demand
To meet demand from cloud service providers and AI system developers, SK hynix has established a stable mass production system for SOCAMM2. This ensures supply continuity for next-generation AI platforms requiring high-performance memory solutions.

The introduction of SOCAMM2 reflects broader industry trends toward specialized memory architectures tailored to AI workloads, where bandwidth, efficiency, and scalability are key selection criteria.

1 SOCAMM2 (Small Outline Compression Attached Memory Module 2): An AI server–optimized memory module based on LPDDR. It offers a slim form factor and high scalability, while its compression connector enhances signal integrity and allows for easy module replacement.
2 RDIMM (Registered Dual In-Line Memory Module): A DRAM module for servers and workstations that includes a register (buffer) chip to relay address and command signals between the memory controller and the DRAM chips on the module.

Edited by Natania Lyngdoh, Induportals Editor — Adapted by AI.

www.skhynix.com


