Nvidia and SK hynix are building an AI SSD that could be 10x faster
Partnership expands beyond HBM as SK hynix targets 100 million IOPS
by Skye Jacobs · TechSpot
In a nutshell: SK hynix is deepening its collaboration with Nvidia by developing a high-performance solid-state drive optimized for AI inferencing workloads. The partnership extends the companies' cooperation on high-bandwidth memory supply for Nvidia's AI GPUs, this time moving aggressively into NAND flash storage innovation.
At a recent conference, SK hynix Vice President Kim Cheon-seong announced that the Korean memory manufacturer is working with Nvidia on what could become a tenfold leap in SSD performance. According to Korean outlet Chosun, Kim said his firm was developing, alongside Nvidia, a new SSD with ten times the performance of current drives.
The companies are referring to the new concept under two code names: Storage Next at Nvidia and AI-NP – short for AI NAND performance – within SK hynix. The development remains at the proof-of-concept stage, with a prototype targeted for completion before the end of 2026.
SK hynix aims to push this next-generation AI SSD to 100 million input/output operations per second (IOPS) – dramatically higher than the throughput of conventional enterprise-grade SSDs. Reaching that scale would represent a major architectural leap, effectively bridging the gap between memory and storage in AI infrastructure.
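To put the 100 million IOPS target in perspective, a back-of-the-envelope comparison helps. The baseline and block-size figures below are illustrative assumptions (today's fastest enterprise NVMe drives deliver random-read rates on the order of a few million IOPS), not numbers from SK hynix or Nvidia:

```python
# Rough comparison of the stated 100M-IOPS goal against a typical
# enterprise NVMe SSD. BASELINE_IOPS and BLOCK_SIZE_BYTES are assumed
# illustrative values, not figures from the article.

TARGET_IOPS = 100_000_000      # SK hynix's stated AI-NP goal
BASELINE_IOPS = 3_000_000      # assumed high-end enterprise NVMe drive
BLOCK_SIZE_BYTES = 512         # assumed small-block access pattern

speedup = TARGET_IOPS / BASELINE_IOPS
throughput_gbs = TARGET_IOPS * BLOCK_SIZE_BYTES / 1e9  # GB/s at 512 B per op

print(f"Speedup over baseline: {speedup:.0f}x")
print(f"Throughput at 512 B blocks: {throughput_gbs:.1f} GB/s")
```

Even at tiny 512-byte accesses, 100 million IOPS implies over 50 GB/s of small-block throughput, which is why observers describe the target as bridging the memory-storage gap.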
The motivation for this effort stems from the data access bottlenecks found in current AI workloads. Today's large-scale inferencing models rely on continuous retrieval of vast numbers of model parameters, a task that conventional HBM or DRAM technologies cannot efficiently support at scale. The vision for Storage Next is to enable a pseudo-memory layer using NAND flash and advanced controller technologies explicitly designed for AI computation rather than general-purpose storage.
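The pseudo-memory idea can be sketched as a two-tier lookup: a small DRAM-resident cache of hot parameters backed by a much larger flash store. The class and method names below are purely illustrative assumptions, not anything disclosed by Nvidia or SK hynix:

```python
# Conceptual sketch of a "pseudo-memory" tier for model parameters:
# hot entries served from a small DRAM-resident cache, misses falling
# through to a flash-backed store. All names here are hypothetical.

class PseudoMemoryTier:
    def __init__(self, flash_store, cache_capacity=1024):
        self.flash_store = flash_store    # dict standing in for NAND-backed storage
        self.cache = {}                   # DRAM-resident hot set
        self.cache_capacity = cache_capacity

    def read(self, key):
        if key in self.cache:             # DRAM hit: fast path
            return self.cache[key]
        value = self.flash_store[key]     # miss: fetch from the flash tier
        if len(self.cache) >= self.cache_capacity:
            # evict the oldest insertion (simple FIFO policy)
            self.cache.pop(next(iter(self.cache)))
        self.cache[key] = value
        return value

# Usage: parameters land in the cache after their first flash access.
store = {f"param_{i}": i * 0.5 for i in range(10_000)}
tier = PseudoMemoryTier(store, cache_capacity=256)
print(tier.read("param_42"))  # first access hits the flash tier
print(tier.read("param_42"))  # second access is served from the DRAM cache
```

The point of the sketch is the division of labor: the controller-level innovation in Storage Next would aim to make the "flash tier" fast enough that misses stop being the dominant cost.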
The joint project also hints at potential ripple effects in the market. NAND supply chains are already under pressure from growing demand for cloud and AI services, and a specialized AI SSD could intensify that strain. Industry observers have raised the possibility of a DRAM-style supply crunch if such high-performance NAND solutions move into mainstream AI applications.
Both companies appear focused on overcoming throughput and energy-efficiency challenges that have become defining limits of current-generation AI infrastructure. By integrating more advanced NAND and controller architectures, SK hynix and Nvidia are effectively positioning flash storage to play a more active computational role in machine learning workloads – something conventional memory cannot economically achieve.