Anthropic signs multi-year Google Cloud deal for up to one million TPUs

AI firm Anthropic has sealed a multi-year cloud deal with Google, granting it access to up to one million of Google’s proprietary AI accelerators, the Tensor Processing Units (TPUs). The commitment, announced Thursday, represents one of the largest single compute procurement deals in the AI industry and deepens the alliance between the two firms. The TPU rollout is scheduled to commence later this year and will scale through 2026, primarily augmenting Anthropic’s training clusters in the US. Payment terms, including any equity components or usage-based structures, have not been disclosed.

“Anthropic’s choice to significantly expand its usage of TPUs reflects the strong price-performance and efficiency its teams have seen with TPUs for several years,” said Thomas Kurian, CEO of Google Cloud. “We are continuing to innovate and drive further efficiencies and increased capacity of our TPUs, building on our already mature AI accelerator portfolio, including our seventh generation TPU, Ironwood.”

“Anthropic and Google have a longstanding partnership and this latest expansion will help us continue to grow the compute we need to define the frontier of AI,” Krishna Rao, Anthropic’s CFO, said in an official statement.

The scale of the transaction, estimated to be worth tens of billions of dollars, is projected to deliver more than one gigawatt of computing capacity to Anthropic’s infrastructure by 2026. Industry analysts indicate that establishing a facility of this magnitude could require an overall investment of approximately $50 billion, with chip infrastructure constituting the majority of the expenditure. By tapping Google’s vast TPU infrastructure, Anthropic gains both scale and diversification, insulating itself from overreliance on Amazon Web Services while maintaining flexibility across multiple vendors. The additional capacity is expected to accelerate training cycles for its Claude models and future-generation systems, while lowering costs compared to GPU-based setups.

The agreement builds upon the partnership that first began in 2023, when Google made a minority investment in Anthropic and began hosting its Claude language models on Google Cloud. Anthropic, founded by former OpenAI researchers, maintains a diversified, or multi-cloud, infrastructure strategy. Its models are currently trained on a mix of Nvidia GPUs, Amazon’s Trainium 2 chips, and Google’s TPUs, a deliberate approach intended to optimize for cost, performance, and energy efficiency while mitigating vendor lock-in.

Google’s custom-designed TPUs, jointly developed with Broadcom, are engineered specifically to run large-scale training and inference workloads with greater power efficiency than general-purpose graphics chips; the current seventh-generation accelerator is codenamed “Ironwood.” The transaction may strengthen Google’s competitive position in the cloud infrastructure market, where it seeks to leverage its expertise in AI-specific computing against rivals. As enterprise demand for high-performance hardware escalates, TPU contracts are expected to become a key driver of revenue growth for Google Cloud.

The Tech Portal is published by Blue Box Media Private Limited. Our investors have no influence over our reporting.