San José (United States) (AFP) - Nvidia on Monday unveiled its latest family of chips for powering artificial intelligence, as it seeks to consolidate its position as the dominant supplier of hardware for the AI frenzy.
“We need bigger GPUs. So ladies and gentlemen, I would like to introduce you to a very, very big GPU,” said CEO Jensen Huang at a developers conference in California, referring to the graphics processors that are vitally important to creating generative AI.
The event, dubbed the “AI Woodstock” by Wedbush analyst Dan Ives, has become a can’t-miss date on big tech’s calendar due to Nvidia’s singular role in the AI revolution that has taken the world by storm since the introduction of ChatGPT in late 2022.
“I hope you realize this is not a concert, this is a developers conference,” Huang joked as he took the stage in a packed arena usually reserved for ice hockey games and concerts.
Nvidia’s powerful GPU chips and software are an integral ingredient in the creation of generative AI, with rivals like AMD or Intel still struggling to match the power and efficiency of the company’s blockbuster H100 product, launched in 2022.
Apple, Microsoft and Amazon have also developed chips with AI in mind, but for now are stuck trying to get their hands on Nvidia’s coveted products in order to deliver on their own AI promises.
That linchpin role in the AI revolution has seen Nvidia’s share price rise roughly 250 percent over the past 12 months, lifting the company above Amazon in market capitalization, behind only Microsoft and Apple.
- ‘Insane’ -
Not letting up, Nvidia told the audience of developers and tech executives it was releasing an even more powerful processor and accompanying software, on a platform called Blackwell, named after David Blackwell, the first Black academic inducted into the National Academy of Sciences.
Blackwell GPUs, combined into AI “superchips,” would train AI models four times faster than the previous generation, Nvidia said.
“The rate at which computing is advancing is insane,” Huang said.
They would also be 25 times more energy efficient, Nvidia said, a key claim at a time when generative AI is criticized for its ravenous appetite for energy and natural resources compared with more conventional computing.
GPUs were initially developed to improve the graphics quality of video games, but the company run by Jensen Huang figured out they were perfectly suited to training the large language models (LLMs) that underpin generative AI interfaces such as ChatGPT.
Unlike rivals Intel, Micron and Texas Instruments, Nvidia, like AMD, does not manufacture its own semiconductors, instead relying on subcontractors, mainly Taiwan Semiconductor Manufacturing Co (TSMC).
Given geopolitical tensions between China and Taiwan, that reliance is a potential weak spot, and the United States has banned Nvidia from selling its most powerful chips to Chinese companies.