
NVIDIA GPUs powered the AI revolution. Its new Blackwell chips are up to 30 times faster

In less than two years, NVIDIA’s H100 chips, which are used by nearly every AI company in the world to train large language models that power services like ChatGPT, have made it one of the most valuable companies in the world. On Monday, NVIDIA announced a next-generation platform called Blackwell, whose chips are between seven and 30 times faster than the H100 and use 25 times less power.

“Blackwell GPUs are the engine that drives this new industrial revolution,” NVIDIA CEO Jensen Huang said at the company’s annual GTC event in San Jose, which was attended by thousands of developers and which some compared to a Taylor Swift concert. “Generative AI is the defining technology of our time. By working with the world’s most dynamic companies, we will realize the promise of AI for every industry,” Huang added in a press release.

NVIDIA’s Blackwell chips are named after David Harold Blackwell, a mathematician who specialized in game theory and statistics. NVIDIA claims Blackwell is the most powerful chip in the world. It offers AI companies a significant performance upgrade, delivering 20 petaflops of compute compared with just 4 petaflops from the H100. Much of that speed comes from the 208 billion transistors in Blackwell chips, compared to 80 billion in the H100. To achieve this, NVIDIA connected two large dies that can communicate with each other at speeds of up to 10 terabytes per second.
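For a rough sense of scale, here is a back-of-the-envelope comparison of the figures quoted above, written as a small Python sketch. It uses only the numbers cited in this article; the larger seven-to-30x claims presumably reflect workload-specific measurements NVIDIA has not detailed here, so this is illustrative arithmetic rather than a benchmark.

# Back-of-the-envelope ratios from the spec figures quoted in this article.
blackwell_pflops = 20          # petaflops cited for a Blackwell GPU
h100_pflops = 4                # petaflops cited for the H100
blackwell_transistors = 208e9  # 208 billion transistors
h100_transistors = 80e9        # 80 billion transistors

print(f"Raw compute ratio: {blackwell_pflops / h100_pflops:.1f}x")            # -> 5.0x
print(f"Transistor ratio:  {blackwell_transistors / h100_transistors:.1f}x")  # -> 2.6x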

In a sign of how dependent our modern AI revolution is on NVIDIA chips, the company’s press release includes endorsements from executives who collectively lead companies worth trillions of dollars. They include OpenAI CEO Sam Altman, Microsoft CEO Satya Nadella, Alphabet CEO Sundar Pichai, Meta CEO Mark Zuckerberg, Google DeepMind CEO Demis Hassabis, Oracle Chairman Larry Ellison, Dell CEO Michael Dell and Tesla CEO Elon Musk.

“There’s nothing better than NVIDIA hardware for AI right now,” Musk said in the statement. “Blackwell offers huge leaps in performance and will accelerate our ability to deliver leading-edge models. We are excited to continue working with NVIDIA to advance AI computing,” Altman added.

NVIDIA did not disclose how much the Blackwell chips will cost. Its H100 chips currently run between $25,000 and $40,000 apiece, according to CNBC, and entire systems powered by these chips can cost as much as $200,000.

Despite their price, NVIDIA chips are in high demand. Last year, delivery wait times stretched as long as 11 months. And access to NVIDIA’s AI chips is increasingly seen as a status symbol for tech companies looking to attract AI talent. Earlier this year, Zuckerberg touted the company’s efforts to build “a huge amount of infrastructure” to power Meta’s AI efforts. “At the end of this year,” Zuckerberg wrote, “we’ll have ~350k Nvidia H100s — and overall almost 600k H100s equivalents of compute if you include other GPUs.”

https://www.engadget.com/nvidias-gpus-powered-the-ai-revolution-its-new-blackwell-chips-are-up-to-30-times-faster-001059577.html?src=rss
