The global race to build artificial intelligence infrastructure is reshaping the economics of the memory chip industry, driving record share gains for manufacturers and prompting executives to say a decades-old boom-and-bust cycle may finally be breaking down.
Shares of Micron Technology have surged more than 370% over the past year as demand for AI-related memory accelerates. Meanwhile, SanDisk — which returned to the public market in February last year after being spun out of Western Digital — has soared more than 1,100%, highlighting investor enthusiasm for companies tied to the AI supply chain.
For much of the past three decades, memory manufacturers operated under one of the semiconductor industry’s most volatile cycles. Prices for DRAM and NAND storage would spike during supply shortages, prompting producers to rapidly expand manufacturing capacity. The resulting oversupply would then drive prices down, triggering sharp downturns before the next recovery.
Executives across the technology sector now say artificial intelligence is fundamentally altering that dynamic.
“We will continue to raise prices because the industry will continue to raise prices,” said Antonio Neri, chief executive of Hewlett Packard Enterprise. “There is not enough supply for demand.”
The shift is being driven by the unprecedented computing requirements of modern AI systems. Training large language models and generative AI platforms requires vast clusters of processors working simultaneously, supported by massive pools of ultra-fast memory. That architecture is dramatically more memory-intensive than traditional computing environments used for enterprise software, personal computers, or smartphones.
High-bandwidth memory, commonly known as HBM, has emerged as one of the most critical components in AI hardware. The technology allows chips to access data far faster than conventional DRAM, making it essential for training large AI models and running advanced inference workloads.
Demand for HBM has surged so rapidly that technology companies are rushing to secure long-term supply contracts.
SK Hynix, one of the world’s largest memory manufacturers and a major supplier of HBM, said the industry is undergoing structural changes as customers increasingly prefer multi-year supply agreements.
“The company’s customers, including hyperscalers, have increasingly preferred long-term contracts over the one-year agreements that were more common in the past,” an SK Hynix spokesperson said.
Micron Technology has reported a similar shift, telling CNBC that customers are now willing to sign long-term agreements to lock in supply as competition intensifies for AI hardware components.
Those customers include some of the world’s largest technology companies, often referred to as hyperscalers because of the massive scale of their cloud computing infrastructure.
Executives say these companies are reserving memory capacity years in advance.
On Broadcom's latest earnings call, Chief Executive Hock Tan said the company has already secured supply commitments for key components through 2028 as demand for AI chips and systems accelerates.
Technology giants building their own AI hardware are also confronting supply constraints. Meta Platforms on Wednesday unveiled a new internally designed AI chip as part of its push to expand computing capacity for artificial intelligence workloads.
But even as the company ramps up hardware development, it remains concerned about securing sufficient memory.
“We’re absolutely worried about HBM supply,” said Yee Jiun Song, vice president of engineering at Meta. “But we think that we have secured our supply for what we’re planning to build out.”
The pressure on memory supply is being amplified by a massive wave of capital spending across the technology industry. Major cloud providers such as Amazon, Microsoft, Alphabet, and Meta are investing hundreds of billions of dollars in AI data centers to support the growing demand for generative AI services.
Each of those facilities requires enormous volumes of memory chips to feed data to powerful processors and graphics chips that train and run AI models.
As hyperscalers absorb increasing amounts of available supply, analysts say the balance of the memory market is shifting away from consumer electronics. Manufacturers of smartphones, PCs, and other consumer devices are finding themselves competing with data center operators for the same components, often at higher prices.
An executive at Seagate Technology told the South China Morning Post that memory price increases could become “the new normal” for the next several years. The long lead times required to expand semiconductor manufacturing capacity are reinforcing those expectations.
Building advanced memory fabrication plants costs tens of billions of dollars and can take several years to complete, meaning supply cannot quickly adjust to sudden surges in demand. As a result, industry executives believe meaningful relief from supply constraints may not arrive until at least 2027, when new facilities currently under construction begin operating at full scale.
The emergence of long-term contracts is also changing how the memory industry manages supply and pricing. Historically, most memory was sold on short-term contracts or even spot markets, leaving prices highly sensitive to shifts in demand.
Multi-year agreements with hyperscalers, by contrast, provide greater revenue visibility for manufacturers while ensuring customers receive priority access to scarce components.
That change could smooth the dramatic price swings that once defined the sector. For investors, the sharp rally in memory stocks is an indication that markets are increasingly convinced the industry is entering a new phase, one powered by sustained demand from artificial intelligence rather than the cyclical consumer electronics markets that dominated the past three decades.