Nvidia says new server rack has more bandwidth than the entire internet


Nvidia shared new details about Vera Rubin, its computing platform for AI data centers, at the CES tech conference in Las Vegas on Monday. The system is now in production and will ship in the second half of 2026. According to CNN, the release could shape the future of AI given the industry’s heavy reliance on Nvidia technology.

New Storage System Targets AI Demands

Nvidia says Vera Rubin introduces a new storage system. The goal is to help AI models handle complex requests more quickly. The company claims its upcoming server rack, called Vera Rubin NVL72, provides more bandwidth than the entire internet.

CEO Jensen Huang explained the shift from chatbots to AI agents on stage. A video showed a person building a personal assistant using a tabletop robot connected to multiple AI models. The robot could recall to-do lists and even tell a dog to get off the couch. Huang said such tasks were unimaginable years ago but are now simple with large language models.

Dion Harris, Nvidia’s senior director of high-performance computing and AI hyperscale solutions, told reporters the bottleneck is shifting from compute to context management. He added that storage can no longer be an afterthought.

Major Cloud Providers Will Deploy Platform

Microsoft, Amazon Web Services, Google Cloud and CoreWeave will be among the first to deploy Vera Rubin. Computing companies like Dell and Cisco plan to add the new chips to their data centers. AI labs including OpenAI, Anthropic, Meta and xAI are expected to use the technology for training and queries.

Competition and Spending Concerns Grow

Nvidia became the world’s first $5 trillion company last year. But the company faces concerns about an AI bubble and growing competition. Google and OpenAI are developing their own chips to reduce reliance on Nvidia. Chipmaker AMD also competes in the space.

Meta, Microsoft and Amazon have spent tens of billions on AI infrastructure this year. McKinsey & Company expects companies to invest nearly $7 trillion in data center infrastructure globally by 2030. Ben Barringer of Quilter Cheviot said nobody wants to be beholden to Nvidia, and companies are trying to diversify their chip suppliers.

Huang addressed funding questions in his opening remarks. He said companies are shifting budgets from classical computing to artificial intelligence. That is where the money is coming from, he said.
