    Micron Publishes Updated DRAM Roadmap: 32 Gb DDR5 DRAMs, GDDR7, HBMNext



    In addition to unveiling its first HBM3 memory products yesterday, Micron also published a fresh DRAM roadmap covering the coming years for its AI customers. As one of the world's largest memory manufacturers, Micron has a lot of interesting things planned, including high-capacity DDR5 memory devices and modules, GDDR7 chips for graphics cards and other bandwidth-hungry devices, and HBMNext for artificial intelligence and high-performance computing applications.


    32 Gb DDR5 ICs


    We all love inexpensive high-capacity memory modules, and it looks like Micron has us covered. Sometime late in the first half of 2024, the company plans to roll out its first 32 Gb DDR5 memory dies, which will be produced on its 1β (1-beta) manufacturing process. This is Micron's latest process node, and one that does not use extreme ultraviolet (EUV) lithography, instead relying on multipatterning.

    32 Gb DRAM dies will enable Micron to build 32 GB DDR5 modules using just eight memory devices on one side of the module. Such modules can be made today with Micron's current 16 Gb dies, but doing so requires either placing 16 DRAM packages across both sides of a memory module – driving up production costs – or placing two 16 Gb dies within a single DRAM package, which incurs its own costs due to the packaging required. 32 Gb ICs, by comparison, are easier to use, so 32 GB modules based on the denser dies should eventually cost less than today's 32 GB memory sticks.
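    To make that capacity math concrete, here is a minimal sketch in Python (the module_capacity_gb helper is hypothetical; the die densities and package counts are the article's figures):

        # Module capacity in GB: die density (Gbit) x die count / 8 bits per byte.
        def module_capacity_gb(die_density_gbit: int, dies_per_module: int) -> float:
            return die_density_gbit * dies_per_module / 8

        print(module_capacity_gb(16, 16))  # 16 Gb dies, 16 packages on both sides -> 32.0 GB
        print(module_capacity_gb(32, 8))   # 32 Gb dies, 8 packages on one side    -> 32.0 GB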

    But desktop matters aside, Micron's initial focus with its higher-density dies will be to build even higher-capacity data center-class parts, including RDIMMs, MRDIMMs, and CXL modules. Current high-performance AI models tend to be very large and memory constrained, so larger memory pools open the door both to even larger models and to lower inference costs, since additional instances can run on a single server.

    For 2024, Micron is planning to release 128 GB DDR5 modules based on these new dies. In addition, the company announced plans for 192+ GB and 256+ GB DDR5 modules for 2025, albeit without disclosing which chips these are set to use.
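    Running the same hypothetical helper from above, a 128 GB module works out to 32 of the new 32 Gb dies (ignoring the extra ECC devices a server RDIMM carries):

        print(module_capacity_gb(32, 32))  # 32 Gb dies, 32 dies per module -> 128.0 GB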

    Meanwhile, Micron's capacity-focused roadmap doesn't have much to say about bandwidth. While it would be unusual for newer DRAM dies not to clock at least somewhat higher, memory manufacturers as a whole have not offered much guidance about future DDR5 memory speeds. Especially with MRDIMMs in the pipeline, the focus is more on gaining additional speed through parallelism, rather than running individual DRAM cells faster. Though with this roadmap in particular, it's clear that Micron is more focused on promoting DDR5 capacity than promoting DDR5 performance.
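    As a minimal illustration of that parallelism point, consider the MRDIMM approach of multiplexing two ranks behind a buffer so the host interface runs at roughly twice the per-rank data rate (the figures below are illustrative examples, not from Micron's roadmap):

        per_rank_mt_s = 4400  # example per-rank DDR5 data rate
        muxed_ranks = 2       # ranks combined by the MRDIMM mux buffer
        print(per_rank_mt_s * muxed_ranks)  # -> 8800 MT/s at the host interface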

    GDDR7 in 1H 2024


    Micron was the first major memory maker to announce plans to roll out GDDR7 memory in the first half of 2024. Following up on that, the new roadmap has the company prepping 16 Gb and 24 Gb GDDR7 chips for late Q2 2024.

    As with Samsung, Micron's plans for its first-generation GDDR7 chips do not have them reaching the spec's highest transfer rate (36 GT/sec) right away; instead, Micron is aiming for a more modest and practical 32 GT/sec. That is still good enough to enable upwards of 50% greater bandwidth for next-generation graphics processors from AMD, Intel, and NVIDIA. And perhaps especially NVIDIA, since this roadmap also implies that we won't be seeing a GDDR7X from Micron, meaning that for the first time since 2018, NVIDIA won't have access to a specialty GDDR DRAM from Micron.
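    For a rough sense of what that means at the board level, here is a quick bandwidth sketch (the 256-bit bus is a hypothetical example; 32 GT/s is from the roadmap, 21 GT/s is a typical current GDDR6X speed):

        # Aggregate bandwidth in GB/s = transfer rate (GT/s) x bus width (bits) / 8.
        def bus_bandwidth_gb_s(gt_per_s: float, bus_width_bits: int) -> float:
            return gt_per_s * bus_width_bits / 8

        print(bus_bandwidth_gb_s(32, 256))  # GDDR7 at 32 GT/s  -> 1024.0 GB/s
        print(bus_bandwidth_gb_s(21, 256))  # GDDR6X at 21 GT/s ->  672.0 GB/s (GDDR7 is ~52% higher)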

    HBMNext in 2026


    In addition to GDDR7, which will be used by graphics cards, game consoles, and lower-end high-bandwidth applications like accelerators and networking equipment, Micron is also working on the forthcoming generations of its HBM memory for heavy-duty artificial intelligence (AI) and high-performance computing (HPC) applications.

    Micron expects its HBMNext (HBM4?) to be available in 36 GB and 64 GB capacities, which points to a variety of configurations, such as 12-Hi stacks of 24 Gb dies (36 GB) or 16-Hi stacks of 32 Gb dies (64 GB), though this is pure speculation at this point. As for performance, Micron is touting 1.5 TB/s to 2+ TB/s of bandwidth per stack, which points to data transfer rates in excess of 11.5 GT/s per pin.
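    A back-of-the-envelope check of those figures (assuming HBMNext keeps the 1024-bit per-stack interface of current HBM generations; the capacities and bandwidth targets are Micron's):

        STACK_BUS_BITS = 1024

        # Capacity: stack height x die density (Gbit) / 8 bits per byte.
        print(12 * 24 / 8)  # 12-Hi stack of 24 Gb dies -> 36.0 GB
        print(16 * 32 / 8)  # 16-Hi stack of 32 Gb dies -> 64.0 GB

        # Per-pin rate needed to hit 1.5 TB/s per stack.
        print(1.5e12 * 8 / STACK_BUS_BITS / 1e9)  # -> ~11.7 GT/s per pin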
     