Google's TPUs Are Giving NVIDIA a Run for Its Money
"I think some folks need some sleep," CEO Sundar Pichai said on a recent Google podcast about the company's Gemini 3 launch and AI ambitions.
Nvidia’s customers have a big incentive to explore cheaper alternatives. Bernstein, an investment-research firm, estimates that Nvidia’s GPUs account for over two-thirds of the cost of a typical AI server rack.
Google initially faced pressure amid fears that it had fallen behind in the AI race and was losing ground to rival AI models. Google Cloud serves as the backbone for many AI applications and is fueling the company's next stage of growth. Physical AI, such as Waymo and Gemini Robotics, may be Google's biggest opportunity.
Developers have been busy updating Google Messages and are working to change how you save media, share your location, and interact with Gemini.
The internet giant has released new AI software and struck deals, such as a chip tie-up with Anthropic PBC, that have reassured investors the company won’t easily lose to ChatGPT creator OpenAI and other rivals.
This means that Google still needs Nvidia GPUs, used in tandem with its own TPUs, to get the combination of speed and energy efficiency it needs to compete. It further suggests that, even if the reports are true and Google's power-miserly chips cut into Nvidia's business, Nvidia will remain the dominant player in the data center GPU space.
When OpenAI unleashed ChatGPT, it awoke Google from its slumber. As Google now reimagines its business in the age of AI, big questions lie ahead.
A lot of value was unlocked when the company beat back the government’s breakup efforts. Now, Google is making strides in the AI race while its core business offers it financial flexibility.