At the 2025 SK AI Summit held in Seoul, SK Group reaffirmed its commitment to scaling up its high-bandwidth memory (HBM) chip capabilities.
SK hynix, the conglomerate’s flagship semiconductor arm, announced plans to begin operations at a new HBM facility in Cheongju, South Korea, next year. The initiative is a direct response to the explosive rise in AI computing power demand, which has far outpaced available supply.
Chairman Chey Tae-won emphasized during the summit that the company’s strategy is shifting from sheer scale to operational efficiency and intelligent production, aiming to manage both the technical and economic challenges that accompany rapid AI infrastructure expansion.
Chey highlighted that the race to deliver more powerful AI chips is becoming increasingly constrained by manufacturing bottlenecks and material shortages, especially in the HBM segment. While demand for memory with ultra-high bandwidth continues to skyrocket, production remains hindered by long lead times and unpredictable client orders.
To address these issues, SK hynix is pursuing a “smart efficiency” model, a system-wide upgrade to optimize design, production, and logistics through AI-driven automation. The company is also collaborating with Nvidia to integrate advanced digital manufacturing platforms that streamline workflows, reduce error rates, and improve yields.
The upcoming Cheongju HBM plant will be complemented by a large-scale memory production cluster slated for completion by 2027. This network will operate in parallel with new facilities in Yongin, South Korea, and Indiana, United States, marking SK hynix’s most ambitious expansion plan yet.
The cluster will support mass production of HBM4, the sixth generation of high-bandwidth memory, whose development SK hynix completed in September 2025. Mass production is scheduled for Q4 2025, leveraging the company’s MR-MUF packaging technology, a process that stacks memory dies with enhanced reliability while reducing defects.
With SK hynix already commanding an estimated 62% share of the global HBM market, the new facility is expected to cement its leadership in AI-grade memory solutions while easing supply pressure for chipmakers such as Nvidia and AMD, as well as major hyperscalers.
Beyond chips, SK Group’s AI ambitions extend to data centers and energy efficiency. SK Telecom, another group affiliate, unveiled new plans to expand its AI data center operations while partnering with other SK companies to design energy-optimized facilities.
Korea’s emerging data infrastructure landscape includes a 3-gigawatt AI data center project set to break ground in 2025, alongside a BlackRock-backed hyperscale hub and potential OpenAI partnerships in infrastructure development. These projects emphasize advanced cooling techniques, like liquid and immersion cooling, and smarter power distribution systems to handle fluctuating AI workloads.
The alignment of SK hynix’s chip production with SK Telecom’s AI infrastructure strategy underscores the group’s integrated approach to the AI economy, combining hardware, data, and sustainability under one innovation banner.
The post SK hynix to Launch New HBM Chip Facility Amid Soaring AI Demand appeared first on CoinCentral.