Following a severe market downturn in 2022–2023, major memory manufacturers—
Samsung Electronics,
SK Hynix, and
Micron Technology—implemented strategic production cuts to stabilize pricing. By mid-2024, the rapid expansion of
generative AI services triggered unprecedented demand for specialized memory products, particularly
High Bandwidth Memory (HBM) used in AI accelerators and data center GPUs. Specialized components of the chip-making supply chain are also facing supply constraints due to high demand from AI applications. For example, glass cloth, a high-performance
glass fiber substrate used for power-efficient, high-speed data transfer and a crucial chip-making material, is in short supply:
Nitto Boseki, the Japanese firm holding a near-monopoly on its production, has been unable to meet increased demand, forcing chipmakers such as
Qualcomm,
Apple,
Nvidia, and
AMD to compete to secure supply for their chips. There are also reports of smaller electronics companies struggling to find suppliers for components such as
NAND flash. Memory suppliers are adapting to increased demand and market unpredictability by requiring prepayment or shorter payment terms, which makes it harder for smaller firms to acquire the capital they need to survive. By 2026, CPUs were also facing shortages, driven by limited fabrication capacity, the prioritisation of server CPUs, and steadily rising demand, with CPU prices forecast to increase by as much as 15%. The surge in memory demand has also strained other electronic components such as
hard disk drives, with reports that
Western Digital's hard disk supply for 2026 had been fully booked for enterprise applications before February 2026. A 2024
McKinsey analysis projected that global demand for AI-ready data center capacity would grow at approximately 33% annually through 2030, with AI workloads consuming roughly 70% of total data center capacity by the decade's end. In addition, according to
Kearney's State of Semiconductor 2025 Report, executives were already anticipating a shortage of chips at sub-8 nm process nodes, with memory chips cited as an acute source of concern. Multiple companies reported preparing for it through long-term agreements with RAM suppliers or by amassing additional inventory. On 24 March 2026,
Google announced
TurboQuant, a memory compression technology focused on
large language models (LLMs) and vector search engines, which it claimed achieves 6x lower memory consumption in tested local LLMs and an 8x performance improvement in tests run on
H100 accelerators. The technology is also a drop-in enhancement for existing inference pipelines. Amid speculation about memory demand trends, memory manufacturers including
SanDisk,
Micron,
Western Digital, and
Seagate, among other companies involved in memory manufacturing, experienced stock price declines. Prices of memory kits also fell in the following months, though they remained inflated.

== Causes ==