OpenAI Shifts ChatGPT AI Chips From Nvidia to Google Cloud
OpenAI has begun renting Google's tensor processing units (TPUs) to power ChatGPT and other AI products, its first significant use of non-Nvidia chips. The shift away from Microsoft data centers and Nvidia GPUs is aimed at reducing the high cost of inference computing. Google, however, is limiting OpenAI's access to its most advanced TPUs, reserving them for internal projects such as its Gemini language model to preserve a competitive edge. The deal marks a strategic effort by OpenAI to diversify its AI chip supply amid growing demand, while Google expands a TPU customer base that already includes major tech players such as Apple. The arrangement could disrupt the AI hardware market by challenging Nvidia's dominance and encouraging more cross-company collaboration in the sector, reflecting industry dynamics in which cost, availability, and technological innovation drive new infrastructure strategies for AI companies.


- Total News Sources: 3
- Left: 1
- Center: 1
- Right: 0
- Unrated: 1
- Last Updated: 6 days ago
- Bias Distribution: 50% Center