AI firm Anthropic has sealed a multi-year cloud deal with Google, granting it access to up to one million Tensor Processing Units (TPUs), Google's proprietary AI accelerators. The commitment, announced Thursday, represents one of the largest single compute procurement deals in the AI industry and deepens the alliance between the two firms. The TPU rollout is scheduled to commence later this year and scale through 2026, primarily augmenting Anthropic's training clusters in the US. For now, the payment terms, including any equity components or usage-based structures, have not been disclosed.
"Anthropic's choice to significantly expand its usage of TPUs reflects the strong price-performance and efficiency its teams have seen with TPUs for several years," said Thomas Kurian, CEO of Google Cloud. "We are continuing to innovate and drive further efficiencies and increased capacity of our TPUs, building on our already mature AI accelerator portfolio, including our seventh generation TPU, Ironwood."

"Anthropic and Google have a longstanding partnership and this latest expansion will help us continue to grow the compute we need to define the frontier of AI," Krishna Rao, Anthropic's CFO, said in an official statement.
The scale of the transaction, estimated to be worth tens of billions of dollars, is projected to deliver more than one gigawatt of computing capacity to Anthropic's infrastructure by 2026. Industry analysts indicate that establishing a facility of this magnitude could require an overall investment of approximately $50 billion, with chip infrastructure constituting the majority of the expenditure. By tapping Google's vast TPU infrastructure, the company gains both scale and diversification, insulating itself from overreliance on Amazon Web Services while maintaining flexibility across multiple vendors. The additional capacity is expected to accelerate training cycles for its Claude models and future-generation systems, while lowering costs compared to GPU-based setups.
The agreement builds upon the partnership that first began in 2023, when Google made a minority investment in Anthropic and began hosting its Claude language models on Google Cloud. Anthropic, founded by former OpenAI researchers, maintains a diversified, multi-cloud infrastructure strategy. Its models are currently trained on a mix of Nvidia GPUs, Amazon's Trainium 2 chips, and Google's TPUs, a deliberate approach intended to optimize for cost, performance, and energy efficiency while mitigating vendor lock-in.
Google's custom-designed TPUs, developed jointly with Broadcom, are engineered specifically for large-scale training and inference workloads, offering better power efficiency than general-purpose graphics chips. Kurian noted that Anthropic's decision to expand its TPU usage reflects the strong performance and efficiency of the company's seventh-generation accelerator, codenamed "Ironwood." The transaction may strengthen Google's competitive position in the cloud infrastructure market, where it seeks to leverage its niche expertise in AI-specific computing against rivals. As enterprise demand for high-performance hardware escalates, TPU contracts are expected to become a key driver of revenue growth for Google Cloud.