Nvidia Corp. Chief Executive Officer Jensen Huang unveiled a new batch of products and services tied to artificial intelligence, looking to further capitalize on a frenzy that has made his company the world’s most valuable chipmaker.
The wide-ranging new lineup includes an AI supercomputer platform called DGX GH200, which will help tech companies create successors to ChatGPT, Huang told the audience at the Computex show in Taiwan. Microsoft Corp., Meta Platforms Inc. and Alphabet Inc.’s Google are expected to be some of the first users of that equipment.
Nvidia also is teaming up with WPP Plc to use AI and the metaverse to lower the cost of producing advertising. It’s releasing a networking offering designed to turbocharge the flow of information within data centers. And the company is even looking to change how people interact with video games: A service called Nvidia ACE for Games will use AI to enliven background characters and give them more personality.
Huang argued the traditional architecture of the tech industry is no longer improving fast enough to keep up with complex computing tasks. To realize the full potential of AI, customers are increasingly turning to accelerated computing and graphics processing units, or GPUs, like those made by Nvidia.
“We have reached the tipping point of a new computing era,” Huang said, as he paced the stage in a trademark leather jacket.
The flurry of announcements underscores Nvidia’s shift from a maker of computer graphics chips to a company at the center of the AI boom. Last week, Huang gave a stunning sales forecast for the current quarter — almost $4 billion above analysts’ estimates — fueled by demand for data-center chips that handle AI tasks. That sent the stock to a record high and put Nvidia on the brink of a $1 trillion valuation — a first for the chip industry.
Huang also showed off the mind-bending capabilities of generative AI, which takes text prompts and produces other media in response. In one case, he asked for music to match the mood of early morning. In another, he laid out a handful of lyrics and then used AI to transform the idea into a bouncy pop tune.
“Everyone is a creator now,” he said.
The DGX computer is another attempt to keep data center operators hooked on Nvidia’s products. Microsoft, Google and their peers are all racing to develop services similar to OpenAI Inc.’s ChatGPT chatbot — and that requires plenty of computing horsepower. To satisfy this appetite, Nvidia is both offering equipment for data centers and building its own supercomputers that customers can use. That includes two new supercomputers in Taiwan, the company said.
One of the biggest AI bottlenecks is the speed at which data moves within data centers. Nvidia’s Spectrum-X, a networking system that uses technology acquired in the 2020 purchase of Mellanox Technologies, will address that issue. And the company is building a data center in Israel to demonstrate how effective it is.
The WPP partnership, meanwhile, will streamline the creation of advertising content. The UK advertising titan will use Nvidia’s Omniverse technology to create “virtual twins” of products that can be manipulated to customize ads and reduce the need for costly reshoots.
Nvidia’s original business was selling graphics cards to gamers, and it’s returning to that world with the ACE offering. The service will address the problem of NPCs, or nonplayer characters, the background figures that populate video games. NPCs typically give repetitive responses with scripted dialogue, and that limited range has made them the subject of ridicule in memes and even the Ryan Reynolds movie “Free Guy.”
Nvidia ACE will listen to what the gamer says to a character, convert it into text and then feed that into a generative AI program to create a more natural, off-the-cuff response. The Santa Clara, California-based company is currently testing the service and will add guardrails to ensure that responses aren’t inappropriate or offensive.
--With assistance from Mayumi Negishi and Peter Elstrom.