Micron Technology announces volume production of HBM3E high-bandwidth memory solution

On March 4, 2024, Micron Technology, Inc. (Nasdaq: MU), a leading global provider of memory and storage solutions, announced that it has begun volume production of its HBM3E high-bandwidth memory solution. Nvidia's H200 Tensor Core GPU will use Micron's 8-high stacked 24GB HBM3E memory and will begin shipping in the second quarter of 2024. With this milestone, Micron maintains its industry leadership and powers artificial intelligence (AI) solutions with HBM3E's exceptional performance and energy efficiency.

Micron HBM3E memory

As demand for artificial intelligence continues to surge, memory solutions are critical to meeting the growth in workload requirements. Micron's HBM3E solution meets this challenge head-on with:

- Excellent performance: Micron HBM3E delivers pin speeds of over 9.2 Gb/s and memory bandwidth of more than 1.2 TB/s, giving AI accelerators, supercomputers, and data centers ultra-high-speed data access.

- Excellent energy efficiency: Micron HBM3E consumes about 30% less power than competing products, leading the industry. To support growing AI demand and use cases, HBM3E delivers greater throughput at lower power consumption, improving key operational-expenditure metrics for data centers.

- Seamless scalability: Micron HBM3E is currently available in 24GB capacity, enabling data centers to seamlessly scale their AI applications. Whether used to train massive neural networks or accelerate inference tasks, Micron's solution provides the necessary memory bandwidth.
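The bandwidth figure in the first bullet follows from the pin speed and the stack's interface width. A minimal sanity-check sketch, assuming the 1024-bit-wide interface of a JEDEC HBM3-style stack (the function name is chosen here for illustration):

```python
# Rough check of the headline bandwidth figure for one HBM3E stack.
# Assumption: a 1024-bit (128-byte) interface per stack, per the HBM3-style design.

def hbm_bandwidth_tbps(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak per-stack bandwidth in TB/s: pin rate (Gb/s) x bus width, divided by 8 bits/byte."""
    return pin_rate_gbps * bus_width_bits / 8 / 1000  # GB/s -> TB/s

print(hbm_bandwidth_tbps(9.2))  # ~1.18 TB/s at 9.2 Gb/s per pin
```

At pin speeds somewhat above 9.2 Gb/s, the same arithmetic clears the "more than 1.2 TB/s" mark quoted above.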

Sumit Sadana, executive vice president and chief commercial officer of Micron, said: "Micron has delivered three major achievements with this landmark HBM3E product: industry-leading time to market, industry-leading performance, and outstanding energy efficiency. AI workloads rely heavily on memory bandwidth and capacity. With an industry-leading HBM3E and HBM4 product roadmap, as well as a full portfolio of DRAM and NAND solutions for AI applications, Micron is well positioned to support the substantial growth of artificial intelligence ahead."

Micron developed its industry-leading HBM3E design leveraging its 1β (1-beta) process technology, advanced through-silicon vias (TSVs), and other innovations that enable differentiated packaging solutions. As a long-standing memory leader in 2.5D/3D stacking and advanced packaging technology, Micron is honored to be a partner member of the TSMC 3DFabric Alliance, jointly building the future of semiconductor and system innovation.

Micron will sample its 12-high stacked 36GB HBM3E in March 2024, delivering more than 1.2 TB/s of bandwidth and superior energy efficiency ahead of competing products, further strengthening its leading position. Micron will be a sponsor of NVIDIA GTC, the global AI conference opening on March 18, where it will share more of its cutting-edge AI memory product portfolio and roadmap.
