Extreme Codesign Across NVIDIA Vera CPU, Rubin GPU, NVLink 6 Switch, ConnectX-9 SuperNIC, BlueField-4 DPU and Spectrum-6 Ethernet Switch Slashes Training Time and Inference Token Generation Cost

News Summary:
- The Rubin platform harnesses extreme codesign across hardware and software to deliver up to a 10x reduction in inference token cost and a 4x reduction in the number of GPUs needed to train mixture-of-experts (MoE) models, compared with the NVIDIA Blackwell platform.
- NVIDIA Spectrum-X Ethernet Photonics switch systems deliver 5x improved power efficiency and uptime.
- New NVIDIA Inference Context Memory Storage Platform with NVIDIA BlueField-4 storage processor to accelerate agentic AI reasoning.
- Microsoft’s next-generation Fairwater AI superfactories — featuring NVIDIA Vera Rubin NVL72 rack-scale systems — will scale to hundreds of thousands of NVIDIA Vera Rubin Superchips.
- CoreWeave among first to offer NVIDIA Rubin, operated through CoreWeave Mission Control for flexibility and performance.
- Expanded collaboration with Red Hat to deliver a complete AI stack optimized for the Rubin platform with Red Hat Enterprise Linux, Red Hat OpenShift and Red Hat AI.

CES—NVIDIA today kickstarted the next generation of AI with the launch of the NVIDIA Rubin platform, comprising six new chips designed to deliver one incredible AI supercomputer. NVIDIA Rubin sets a new standard for building, deploying and securing the world’s largest and most advanced AI systems at the lowest cost to accelerate mainstream AI adoption.

The Rubin platform uses extreme codesign across the six chips — the NVIDIA Vera CPU, NVIDIA Rubin GPU, NVIDIA NVLink™ 6 Switch, NVIDIA ConnectX®-9 SuperNIC, NVIDIA BlueField®-4 DPU and NVIDIA Spectrum™-6 Ethernet Switch — to slash training time and inference token costs.

“Rubin arrives at exactly the right moment, as AI computing demand for both training and inference is going through the roof,” said Jensen Huang, founder and CEO of NVIDIA. “With our annual cadence of delivering a new generation of A...