New MinIO AIStor integrations leverage NVIDIA's emerging infrastructure technologies and capabilities to rapidly deliver unparalleled innovation in AI storage at the multi-exabyte scale

REDWOOD CITY, Calif., March 17, 2025 -- To further support the demands of modern AI workloads, MinIO, the leader in AI data storage for the exascale data era, today at NVIDIA GTC unveiled three crucial, upcoming advancements to MinIO AIStor that deepen its support for the NVIDIA AI ecosystem. The new integrations will help users maximize the utilization and efficiency of their AI infrastructures while streamlining their management, freeing up personnel for more strategic AI activities.

The new MinIO AIStor features include:

  • Support for NVIDIA GPUDirect Storage (GDS) for object storage: Delivers a significant increase in CPU efficiency on the NVIDIA GPU server by bypassing the traditional data path through the CPU, freeing up compute for additional AI data processing while reducing infrastructure costs via support for Ethernet networking fabrics.
  • Native integration with NVIDIA BlueField-3 networking platform: Drives down object storage Total Cost of Ownership (TCO) while ensuring industry-leading performance, optimizing data-driven and AI workloads at petabyte to exabyte scale for modern enterprise environments.
  • Incorporation of NVIDIA NIM microservices into AIStor promptObject inference: Brings simplified deployment and management of inference infrastructure while also enabling AIStor's new S3 API promptObject, which allows users to "talk" to unstructured objects in the same way one would engage an LLM, to deliver faster inference via model optimizations for NVIDIA hardware.

"MinIO's strong alignment with NVIDIA allows us to rapidly innovate AI storage at multi-exabyte scale, leveraging their latest infrastructure," said AB Periasamy, co-founder and co-CEO, MinIO. "This approach delivers high-performance object storage on commodity hardware, enabling enterprises to future-proof their AI, maximize GPU utilization, and lower costs."

Maximizing the Utilization and Efficiency of AI Compute Infrastructure (GPUs and CPUs)

NVIDIA GPUDirect Storage (GDS) initially required InfiniBand, which necessitates specialized hardware. NVIDIA has since extended the benefits of GDS to Ethernet networks. This innovation provides flexibility, scalability, and cost efficiency, making it an ideal solution for accelerating AI adoption at scale for enterprises.

Renowned for its high performance, MinIO AIStor already fully utilizes the available per-node network bandwidth to feed data-hungry GPUs. AIStor's GDS implementation for object storage goes further, leveraging Ethernet fabrics to establish a direct data path between object storage and NVIDIA GPU memory, reducing the burden on GPU server CPUs. This drastically improves overall GPU server efficiency, freeing resources for additional AI-related compute. Together, MinIO AIStor and NVIDIA GDS deliver a more efficient, adaptable solution for scaling AI infrastructure, creating a streamlined, ultra-fast pipeline that turns data lakes into high-speed AI/ML training environments.

"We are excited to see MinIO bring AIStor to the NVIDIA AI ecosystem and to explore how AIStor and GPUDirect Storage together perform under the specific demands of our workloads," said Alex Timofeyev, Director, High Performance Compute Engineering and Operations, Recursion. "Our work requires high scalability, throughput, and efficiency in handling AI workloads. Based on preliminary testing, we believe that MinIO AIStor will increase CPU efficiency on our AI compute infrastructure, ultimately enhancing the performance and economics of our data environment."

Additionally, MinIO AIStor becomes the first and only object storage software to run natively on NVIDIA's BlueField-3 Data Processing Unit (DPU), made possible by AIStor's remarkably compact ~100MB footprint. This ultra-efficient, low-cost architecture completely eliminates the need for separate x64 CPUs, transforming what were already commodity storage servers into MinIO- and DPU-powered JBOFs (Just a Bunch of Flash).

MinIO AIStor leverages Arm's Scalable Vector Extension (SVE) instruction set to deliver MinIO's industry-leading object storage performance and inline data management features directly from NVIDIA BlueField-3 DPUs. This integration makes MinIO Spectrum-X ready, ensuring seamless integration with NVIDIA's next-generation networking stack for AI and high-performance workloads. It also gives customers out-of-the-box compatibility with GPUDirect Storage for object storage, optimizing GPU server efficiency by minimizing data movement overhead.

Maximizing Inference Performance and Streamlining Infrastructure Management

MinIO AIStor simplifies AI-powered interactions with stored objects by incorporating NVIDIA NIM microservices into the AIStor promptObject inference infrastructure. NIM provides pre-built Docker containers and Helm charts, while the NVIDIA GPU Operator automates the deployment and management of drivers and the rest of the inference stack on the NVIDIA GPU server.

MinIO AIStor, leveraging NVIDIA NIM microservices, accelerates time to value and frees personnel from manual data pipeline and infrastructure building, enabling them to concentrate on strategic AI initiatives. In addition, NVIDIA NIM model optimizations for NVIDIA hardware deliver accelerated promptObject inference results.
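Because promptObject is exposed as an S3 API extension, "talking" to a stored object can be pictured as an ordinary HTTP request that carries a prompt alongside the object's bucket and key. The short Python sketch below builds such a request; the `?prompt` query flag, the JSON body shape, and the `aistor.example.com` endpoint are illustrative assumptions, not the documented AIStor API.

```python
import json
from urllib.parse import quote

# Hypothetical sketch of an AIStor promptObject request. The "?prompt"
# query flag and JSON payload shape are assumptions for illustration;
# consult the AIStor documentation for the actual S3 API extension.
def build_prompt_object_request(endpoint: str, bucket: str, key: str, prompt: str):
    """Build the URL and JSON body for a promptObject-style call."""
    # Percent-encode the bucket and key, keeping "/" separators in the key.
    url = f"{endpoint}/{quote(bucket)}/{quote(key, safe='/')}?prompt"
    body = json.dumps({"prompt": prompt})
    return url, body

url, body = build_prompt_object_request(
    "https://aistor.example.com",
    "research-docs",
    "reports/q1-summary.pdf",
    "What are the key findings in this document?",
)
print(url)   # → https://aistor.example.com/research-docs/reports/q1-summary.pdf?prompt
print(body)  # → {"prompt": "What are the key findings in this document?"}
```

In a real deployment the request would also carry the usual S3 authentication (e.g., SigV4-signed headers), omitted here for brevity.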

These new features and integrations are open to beta customers under private preview. MinIO AIStor support for NVIDIA GDS and native integration with NVIDIA BlueField-3 networking platform will be released in alignment with NVIDIA's GA calendar.

To request a demo, or to learn more about each feature and integration, visit min.io.

About MinIO

MinIO is the leader in high-performance object storage for AI. With 2B+ Docker downloads and 50k+ stars on GitHub, MinIO is used by more than half of the Fortune 500 to achieve performance at scale at a fraction of the cost of the public cloud providers. MinIO AIStor is uniquely designed to meet the flexibility and exascale requirements of AI, empowering organizations to fully capitalize on existing AI investments and address emerging infrastructure challenges while delivering continuous business value. Founded in November 2014 by industry visionaries AB Periasamy and Garima Kapoor, MinIO is the world's fastest-growing object store.

Media Contact: Tucker Hallowell, Inkhouse, minio@inkhouse.com 

Logo - https://mma.prnewswire.com/media/2360095/MinIO_Logo.jpg

View original content: https://www.prnewswire.co.uk/news-releases/minio-deepens-support-for-the-nvidia-ai-ecosystem-302402465.html