Despite spending heavily on high-performance GPUs, major companies are increasingly developing their own custom chips.
Amazon continues to refine its Trainium and Inferentia chips, emphasizing a favorable performance-to-price ratio. OpenAI is also reportedly designing its own chips, and Microsoft is advancing its own AI chip designs.
According to a recent report from The Information, Google is exploring partnerships with data center providers that would host its custom tensor processing units (TPUs) and rent them out to customers. The report says the company is actively pitching the chips to cloud computing providers such as CoreWeave, Crusoe, and Fluidstack.
With Nvidia currently dominating the AI hardware landscape, companies are seeking to diversify their supply chains and reduce their reliance on a single provider for the hardware that powers their AI initiatives.