Pat Gelsinger, CEO of Intel, speaking on CNBC’s Squawk Box at the WEF Annual Meeting in Davos, Switzerland, on Jan. 16, 2024.
Intel on Tuesday unveiled its latest artificial intelligence chip, called Gaudi 3, as chipmakers rush to produce semiconductors that can train and deploy big AI models, such as the one underpinning OpenAI’s ChatGPT.
Intel says the new Gaudi 3 chip is more than twice as power-efficient as Nvidia’s H100 GPU and can run AI models one-and-a-half times as fast. It also comes in different configurations, such as a bundle of eight Gaudi 3 chips on one motherboard or a card that can slot into existing systems.
Intel tested the chip on models like Meta’s open-source Llama and the Abu Dhabi-backed Falcon. It said Gaudi 3 can help train or deploy models, including Stable Diffusion or OpenAI’s Whisper model for speech recognition.
Nvidia holds an estimated 80% of the AI chip market with its graphics processors, known as GPUs, which have been the high-end chips of choice for AI builders over the past year.
Intel said that the new Gaudi 3 chips would be available to customers in the third quarter, and companies including Dell, Hewlett Packard Enterprise, and Supermicro will build systems with the chips. Intel didn’t provide a price range for Gaudi 3.
“We do expect it to be highly competitive” with Nvidia’s latest chips, said Das Kamhout, vice president of Xeon software at Intel, on a call with reporters. “From our competitive pricing, our distinctive open integrated network on chip, we’re using industry-standard Ethernet. We believe it’s a strong offering.”
The data center AI market is also expected to grow as cloud providers and businesses build infrastructure to deploy AI software, suggesting there is room for other competitors even if Nvidia continues to make the vast majority of AI chips.
Running generative AI and buying Nvidia GPUs can be expensive, and companies are looking for additional…