Amazon deepened its partnership with Anthropic on Monday, committing a fresh $5 billion to the AI company and bringing the tech giant’s total investment to $13 billion. The cash infusion is tied to a massive cloud‑services commitment: Anthropic will allocate more than $100 billion to Amazon Web Services (AWS) over the next ten years. The agreement gives Anthropic access to up to 5 gigawatts of new computing capacity, a scale sufficient to train and run the Claude series of large language models.
At the core of the deal are Amazon’s custom‑designed chips. The agreement specifically covers the Trainium family—versions 2 through 4—although the Trainium‑4 silicon has yet to enter production. Amazon’s most recent AI accelerator, Trainium‑3, hit the market in December, and Anthropic secured the option to purchase capacity on any future Amazon chips as they become available. The arrangement also leverages Graviton, Amazon’s low‑power CPU, to balance workloads across the cloud infrastructure.
The partnership echoes a similar deal Amazon signed with OpenAI in February, when the e‑commerce giant joined a $110 billion funding round for the ChatGPT creator, contributing $50 billion. Both agreements blend cash investment with long‑term cloud services, effectively tying the AI firms’ compute needs to Amazon’s infrastructure. By locking in Anthropic’s spending, Amazon aims to ensure that a growing share of generative‑AI workloads run on AWS, bolstering its position against rivals such as Microsoft Azure and Google Cloud.
Industry observers note that the deal could influence Anthropic’s valuation. Venture‑capital firms have reportedly floated a funding round that could value the company at $800 billion or higher, though no formal offer has been announced. The new Amazon investment, combined with the cloud‑spending pledge, may set a benchmark for future AI‑cloud partnerships, where hardware access and compute capacity become as valuable as outright cash.
Amazon’s strategy reflects a broader shift toward integrating AI hardware and services. Trainium chips, positioned as competitors to Nvidia’s accelerators, aim to deliver high‑performance training at lower cost. By offering Anthropic a pipeline of current and future chips, Amazon hopes to capture a loyal customer base that will remain dependent on its cloud ecosystem for the foreseeable future.
For Anthropic, the commitment secures a reliable, high‑capacity compute platform essential for advancing Claude’s capabilities. The company’s leadership indicated that the AWS partnership will enable faster model iteration and broader deployment of its AI products across enterprise and consumer markets.
Both parties stand to benefit: Amazon gains a steady revenue stream and a showcase client for its AI hardware, while Anthropic receives the compute horsepower needed to stay competitive in the rapidly evolving generative‑AI landscape.