Memory Shortage, Projected to Last Until 2030, Becomes Top AI Constraint
Memory chips have replaced electricity as the primary bottleneck for expanding artificial intelligence infrastructure, according to OpenAI Chief Operating Officer Brad Lightcap. Speaking at the Hill and Valley Forum in Washington on Tuesday, Lightcap stated that the massive procurement of AI accelerators, each packed with high-performance memory, is consuming global production capacity. This assessment realigns the industry's focus from data center power grids to the semiconductor supply chain.
The warning was strongly reinforced by SK Hynix Chairman Chey Tae-won, who projects the global memory chip shortage could extend to 2030. Chey anticipates a wafer supply shortfall of more than 20% as producers struggle to build new fabrication plants, a process that takes four to five years. He also cautioned that the industry's intense focus on high-bandwidth memory (HBM) for AI is straining the supply of traditional DRAM, threatening to drive up prices for consumer PCs and smartphones.
Scarcity Forces OpenAI to Shelve Sora, Chipmakers Invest Billions
The compute scarcity is forcing immediate and difficult strategic decisions across Silicon Valley. OpenAI has had to prioritize its resources, shelving its popular Sora AI video-generation app. Because video generation is one of the most compute-intensive AI tasks and the app did not generate revenue, the company is redirecting its limited capacity toward its primary growth engine, ChatGPT, and other enterprise offerings.
In response to the supply crunch, memory chip manufacturers are accelerating capital expenditures. SK Hynix, which holds a 57% share of the HBM market, recently placed an $8 billion order for advanced manufacturing tools from ASML and is constructing a new $13 billion HBM packaging plant. These market dynamics are fueling a financial boom for suppliers: rival Micron Technology recently reported quarterly revenue of $23.86 billion, a 196% increase year over year, driven by what it called massive demand for memory.