The Register: SK hynix begins mass production of 36 GB 12-layer HBM3E

Source URL: https://www.theregister.com/2024/09/26/sk_hynix_hbm3e_production/
Source: The Register
Title: SK hynix begins mass production of 36 GB 12-layer HBM3E

Feedly Summary: Should be in Jensen Huang’s hands in less than 60 days
Korea’s SK hynix revealed on Thursday that it had become the first chip manufacturer to mass produce the much-anticipated 36 GB 12-layer HBM3E chip.…

AI Summary and Description: Yes

Summary: SK hynix has achieved a significant milestone by mass producing the 36 GB 12-layer HBM3E chip, which is set to meet the growing demands of AI and supercomputing. This new chip design enhances capacity and speed, providing a key advantage for memory-intensive applications, particularly in the context of AI advancements.

Detailed Description:
SK hynix’s announcement regarding the mass production of its innovative 36 GB 12-layer HBM3E chip marks a notable development in both the hardware and AI sectors. The chip is tailored to meet the increasing demand for advanced memory solutions in high-performance computing environments that heavily rely on AI technologies. Here are key points regarding the new chip and its implications:

– **Mass Production Announcement:**
  – SK hynix has become the first chip manufacturer to achieve mass production of a 36 GB HBM3E chip.
  – The chips are expected to be available to customers by year-end.

– **Technical Specifications:**
  – The new chips use a stacked design with through-silicon vias (TSVs), increasing memory density within a compact footprint.
  – Compared with the previous maximum-capacity 24 GB HBM3E (eight 3 GB DRAM dies), the new design stacks an additional four layers, raising capacity by 50% (the arithmetic is sketched after this list).
  – Each DRAM layer is 40% thinner than in the previous generation, which keeps the taller 12-layer stack within a comparable package height; advanced packaging addresses the warpage and reliability concerns that come with thinner dies.
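
To make the capacity arithmetic explicit, here is a minimal sketch in plain Python, using only the per-die capacity and layer counts reported above (the variable names are illustrative, not from the article):

```python
# Stack-capacity arithmetic for the HBM3E generations described above.
DIE_CAPACITY_GB = 3    # each DRAM die holds 3 GB
OLD_LAYERS = 8         # previous-generation 24 GB stack
NEW_LAYERS = 12        # new 12-layer stack

old_capacity = DIE_CAPACITY_GB * OLD_LAYERS     # 24 GB
new_capacity = DIE_CAPACITY_GB * NEW_LAYERS     # 36 GB
increase = (new_capacity - old_capacity) / old_capacity

print(f"{old_capacity} GB -> {new_capacity} GB (+{increase:.0%})")
# Output: 24 GB -> 36 GB (+50%)
```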

– **Performance Impact:**
  – The chips run at 9.6 Gbps per pin, the highest memory speed currently available, enabling the rapid memory access that AI processing demands.
  – For instance, a single GPU fitted with four of these HBM3E stacks could read all 70 billion parameters of a Large Language Model such as ‘Llama 3 70B’ 35 times per second (a back-of-the-envelope check follows this list).
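
The 35-reads-per-second figure can be sanity-checked with simple bandwidth arithmetic. The sketch below assumes the standard 1024-bit HBM interface per stack and 16-bit (2-byte) weights for the 70-billion-parameter model; neither assumption appears in the article.

```python
# Rough check of "reads Llama 3 70B's parameters 35 times per second".
# Assumptions not stated in the article: 1024-bit interface per stack,
# 2 bytes per parameter (FP16/BF16), and purely sequential reads.
PIN_SPEED_GBPS = 9.6      # Gbit/s per pin, as reported
PINS_PER_STACK = 1024     # standard HBM interface width (assumption)
STACKS = 4                # four HBM3E stacks on one GPU
PARAMS = 70e9             # Llama 3 70B parameter count
BYTES_PER_PARAM = 2       # 16-bit weights (assumption)

bandwidth_gb_s = PIN_SPEED_GBPS * PINS_PER_STACK / 8 * STACKS   # ~4915 GB/s
model_size_gb = PARAMS * BYTES_PER_PARAM / 1e9                  # 140 GB
reads_per_second = bandwidth_gb_s / model_size_gb               # ~35.1

print(f"{bandwidth_gb_s:.0f} GB/s aggregate, {model_size_gb:.0f} GB model, "
      f"{reads_per_second:.1f} reads/s")
```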

– **Market Demand and Trends:**
  – Demand for HBM chips is being driven chiefly by AI workloads, GPUs, and supercomputers.
  – Market analysts have flagged potential DRAM supply shortages as manufacturers prioritize HBM production to meet AI demand.

– **Impact on South Korean AI Ambitions:**
  – Following this announcement, SK hynix’s shares rose significantly, indicating strong market confidence.
  – The South Korean government aims to position the country among the top three AI nations globally, supported by initiatives including the establishment of a National AI Computing Center and regulatory reforms.

– **Collaborative Efforts:**
  – A “Presidential AI Committee” has been formed to oversee AI R&D, highlighting the government’s commitment to fostering technological advancement through collaboration between the public and private sectors.

In conclusion, SK hynix’s innovation in HBM3E chips not only showcases advancements in semiconductor technology but also emphasizes the growing confluence of hardware capabilities with AI applications, setting the stage for enhanced computational power in future technologies.