A surge in demand for artificial intelligence is set to radically reallocate data center spending toward memory, with its share of capital expenditure projected to approach one-third of the total by 2026, up from just 8% in 2023. This dramatic shift, forecast by research firm SemiAnalysis, points to a period of soaring prices and supply shortages for critical components like High-Bandwidth Memory (HBM) and DRAM.
The report forecasts that DRAM prices will more than double by 2026, with further double-digit increases in 2027. "The cost uplift is unprecedented," Dell Chief Operating Officer Jeff Clarke said on an earnings call, referring to the speed at which rising component costs are hitting the server industry.
According to SemiAnalysis, memory's share of hyperscale data center capex will climb to approximately 30% by 2026 and rise even further in 2027. The firm expects HBM, the vertically stacked memory essential for AI accelerators, to remain in short supply through 2027. The squeeze is already showing up in hardware pricing: Nvidia's B200 servers are expected to cost as much as 20% more by the end of the year, driven by rising memory prices.
The impending memory bottleneck creates clear winners and losers. Memory producers, including Samsung, SK Hynix, and Micron, are poised for a significant revenue and margin boost. Conversely, AI server manufacturers and major cloud providers will face sustained pressure on their profit margins. The situation may also widen the competitive gap between Nvidia and rivals like AMD, as Nvidia's scale allows it to secure "Very Very Preferred" pricing from suppliers, a discount not available to smaller-volume purchasers.
Nvidia's Advantage, AMD's Challenge
SemiAnalysis notes that Nvidia's preferential DRAM pricing effectively compresses its server costs and masks the severity of the supply shortage for the rest of the market. This purchasing power gives it a significant cost structure advantage over competitors.
AMD, with a smaller AI accelerator footprint, is more exposed to the escalating memory costs and cannot command similar discounts. This dynamic could hinder its ability to compete on price and scale in a market where memory is becoming a primary cost driver.
Supply Scramble Can't Keep Pace
In response to the demand, major memory manufacturers are shifting production capacity toward HBM and other high-margin enterprise DRAM products. However, this pivot constrains the supply of conventional DDR5 and LPDDR5 memory, driving prices up across the board.
Major new production facilities, such as Micron's $9.6 billion HBM plant in Hiroshima and SK Hynix's expansions in Icheon and Cheongju, are not expected to contribute substantial output until 2027 or 2028 at the earliest. According to SemiAnalysis, cloud providers have partially factored these price hikes into their 2026 spending guidance, but Wall Street estimates have not yet fully reflected the repricing expected in 2027, suggesting additional financial impact lies ahead.
This article is for informational purposes only and does not constitute investment advice.