Micron is the first to provide industry partners with high-speed, low-latency, high-capacity 128GB RDIMM memory based on monolithic 32Gb DRAM dies
DRAM built on Micron's advanced 1β process delivers data rates of up to 8,000 MT/s, offering a better solution for memory-intensive applications such as generative AI
Micron Technology, Inc. today announced industry-leading 128GB DDR5 RDIMM memory based on a monolithic 32Gb die, with best-in-class performance of up to 8,000 MT/s to support current and future data center workloads. This high-capacity, high-speed memory module addresses the performance and data-processing needs of a wide range of mission-critical applications in data center and cloud environments, such as artificial intelligence (AI), in-memory databases (IMDB), and workloads that require efficient multithreaded, multi-core general-purpose computing. Built with industry-leading 1β (1-beta) process technology, Micron's 128GB DDR5 RDIMM based on the 32Gb DDR5 DRAM die offers significant improvements over competing products that use 3DS through-silicon via (TSV) stacking in the following areas:
● More than 45% higher capacity density
● Up to 24% better energy efficiency
● Up to 16% lower latency
● Up to 28% better AI training performance
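To put the monolithic 32Gb die in context, a back-of-the-envelope calculation (our own illustration, not a figure from the announcement) shows why it simplifies a 128GB module: a 128GB RDIMM stores 1,024Gb of data, which fits in 32 monolithic 32Gb dies rather than 64 smaller 16Gb dies that would typically be 3DS TSV-stacked. A minimal sketch, ignoring ECC devices and rank organization:

```python
# Illustrative die-count arithmetic (assumption for illustration, not a Micron spec).
GIGABITS_PER_GIGABYTE = 8

def dram_dies_needed(module_capacity_gb: int, die_density_gbit: int) -> int:
    """Number of DRAM dies needed for the stated data capacity.

    Ignores extra ECC devices and rank organization, which vary by module design.
    """
    total_gbit = module_capacity_gb * GIGABITS_PER_GIGABYTE
    return total_gbit // die_density_gbit

print(dram_dies_needed(128, 32))  # 32 monolithic 32Gb dies
print(dram_dies_needed(128, 16))  # 64 x 16Gb dies, typically requiring 3DS TSV stacking
```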
Praveen Vaidyanathan, vice president and general manager of Micron's Compute Products Group, said: "We are very proud that Micron's 128GB DDR5 RDIMMs set a new benchmark for high-capacity, high-speed memory in the data center, providing the memory bandwidth and capacity needed for increasingly compute-intensive workloads. Micron will accelerate the delivery of advanced technology to better support the timely design and integration of high-capacity memory solutions, thereby advancing the data center ecosystem."
Micron's 32Gb DDR5 memory solution uses an innovative die architecture that enables superior array efficiency and greater capacity density per monolithic DRAM die. Its voltage-domain and refresh-management capabilities help optimize the power delivery network to achieve the stated energy-efficiency gains. In addition, the die's aspect ratio has been optimized to improve manufacturing efficiency for the high-capacity 32Gb DRAM die.
By employing AI-powered smart manufacturing, Micron's 1β technology node reached mature yields faster than any node in the company's history. Micron 128GB RDIMMs will ship in 2024 for platforms supporting 4800 MT/s, 5600 MT/s, and 6400 MT/s data rates, with support for up to 8,000 MT/s to follow.
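For context, the quoted data rates translate into rough peak per-DIMM bandwidth as sketched below (our own illustration, not a Micron figure), assuming the standard 64-bit DDR5 DIMM data width of 8 bytes per transfer; ECC bits and real-world efficiency are ignored, so sustained bandwidth will be lower:

```python
# Rough peak-bandwidth arithmetic for the quoted DDR5 data rates (illustrative assumption).
BYTES_PER_TRANSFER = 8  # 64-bit data bus across a DDR5 DIMM's two subchannels, ECC excluded

for rate_mts in (4800, 5600, 6400, 8000):
    peak_gb_s = rate_mts * BYTES_PER_TRANSFER / 1000  # MT/s x bytes per transfer -> GB/s
    print(f"{rate_mts} MT/s -> ~{peak_gb_s:.1f} GB/s peak per DIMM")
```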
Dan McNamara, senior vice president and general manager of AMD's server business unit, said: "Micron's 128GB RDIMMs will provide greater memory capacity per core for our latest fourth-generation AMD EPYC processors, and the monolithic 32Gb DRAM die will deliver a lower total cost of ownership for business-critical enterprise workloads such as AI, high-performance computing, and virtualization. As AMD launches its next generation of EPYC processors to help drive computing forward, the Micron 128GB RDIMM is expected to be one of our premier memory solutions, offering large capacity and outstanding per-core bandwidth to meet the needs of memory-intensive applications."
Dr. Dimitrios Ziakas, vice president of Memory and IO Technology at Intel, said: "We look forward to Micron's 128GB RDIMM solution based on the monolithic 32Gb die bringing improved bandwidth and performance per watt to the server and AI systems markets. Intel is evaluating this 32Gb die-based memory product for key DDR5 server platforms, with a view to the total cost of ownership for cloud, AI, and enterprise customers."
Micron's 32Gb DRAM die delivers the added bandwidth and energy efficiency needed to build 128GB, 256GB, and higher-capacity module solutions compliant with MCRDIMM and JEDEC MRDIMM standards, expanding the memory portfolio. With industry-leading innovation in process and design technology, Micron offers a range of memory products across RDIMM, MCRDIMM, MRDIMM, CXL, and LP form factors, enabling customers to easily integrate optimized solutions that meet the high-bandwidth, high-capacity, and low-power needs of AI and high-performance computing (HPC) applications.