
Micron Ships HBM4 to Key Customers to Power Next-Gen AI Platforms

Micron HBM4, built on its 1β DRAM process and 12-high advanced packaging with MBIST, offers seamless integration for next-gen AI platforms.


The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology, Inc., has announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends Micron’s leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.
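For a rough sense of what the 36GB, 12-high configuration implies per layer (a back-of-the-envelope reading, not a detail from the announcement), the capacity works out to 3GB, or a 24Gb DRAM die, per layer. A minimal Python sketch of that arithmetic:

    # Hypothetical illustration: per-die capacity implied by a 36GB, 12-high HBM4 stack.
    # Only the 36GB / 12-high figures come from the announcement; the rest is derived.
    stack_capacity_gb = 36          # GB per stack (announced)
    dies_per_stack = 12             # 12-high stack (announced)
    per_die_gb = stack_capacity_gb / dies_per_stack
    per_die_gbit = per_die_gb * 8   # convert gigabytes to gigabits
    print(f"{per_die_gb:.0f} GB ({per_die_gbit:.0f} Gb) per DRAM die")
    # -> 3 GB (24 Gb) per DRAM die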

A leap forward
As use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation. This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively.
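For context, a rough and unofficial calculation: spreading 2.0 TB/s across a 2048-bit interface implies a per-pin data rate of about 7.8 Gb/s, and measured against the roughly 1.2 TB/s per stack commonly cited for HBM3E (an assumed baseline, not a figure from this announcement), 2.0 TB/s is consistent with the stated gain of more than 60%. A short Python sketch:

    # Rough, unofficial sanity check of the per-stack bandwidth figures.
    # The ~1.2 TB/s HBM3E baseline is an assumption, not part of this announcement.
    interface_bits = 2048                    # HBM4 interface width per stack (announced)
    stack_bw_tbps = 2.0                      # TB/s per stack (announced as ">2.0")
    per_pin_gbps = stack_bw_tbps * 1e12 * 8 / interface_bits / 1e9
    hbm3e_bw_tbps = 1.2                      # assumed HBM3E per-stack bandwidth
    gain_pct = (stack_bw_tbps / hbm3e_bw_tbps - 1) * 100
    print(f"Implied per-pin rate: ~{per_pin_gbps:.1f} Gb/s")       # ~7.8 Gb/s
    print(f"Gain vs. HBM3E baseline: ~{gain_pct:.0f}%")            # ~67%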

Additionally, Micron HBM4 offers over 20% better power efficiency than Micron’s previous-generation HBM3E products, which themselves set new industry benchmarks for HBM power efficiency. The improvement delivers maximum throughput with the lowest power consumption, maximizing data center efficiency.
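One way to interpret a power-efficiency claim like this (an interpretation, not Micron’s stated methodology): if efficiency is read as bits transferred per joule, a 20% improvement corresponds to roughly 17% less energy per bit at the same throughput. A minimal sketch:

    # Illustrative only: reading "20% better power efficiency" as 20% more bits per joule.
    # The methodology is an assumption, not Micron's definition.
    efficiency_gain = 0.20
    energy_per_bit_ratio = 1 / (1 + efficiency_gain)   # HBM4 vs. HBM3E at equal throughput
    print(f"Relative energy per bit: {energy_per_bit_ratio:.2f}x "
          f"(~{(1 - energy_per_bit_ratio) * 100:.0f}% lower)")
    # -> 0.83x, i.e. ~17% less energy per bit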

Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.

"Micron HBM4’s performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."

Intelligence Accelerated: Micron’s role in the AI revolution
For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for its customers’ most demanding solutions.

www.micron.com
