Amazon Develops AI Chips to Reduce Dependency on Nvidia

Amazon is taking significant steps to reduce its reliance on Nvidia by developing its own artificial intelligence (AI) chips. The tech giant has a history of designing custom processors, and this latest initiative builds on its 2015 acquisition of chip designer Annapurna Labs. Amazon’s custom silicon includes the Graviton processors for general data center workloads and the Trainium chips designed specifically for training large language models.

The latest generation of Trainium, Trainium2, announced in 2023, is optimized for AI workloads, making it a key component in Amazon’s efforts to meet the growing demands of AI applications. The chips are already being used by companies such as Anthropic, the Amazon-backed AI company behind the Claude models and a competitor to OpenAI. The development of these custom chips, led by Annapurna Labs, is aimed not only at reducing dependency on Nvidia but also at cutting the costs of buying third-party processors from suppliers such as AMD and Intel.

This move is central to Amazon’s broader strategy of controlling its hardware supply chain and lowering costs while strengthening its position in the rapidly growing AI sector. By designing its own chips, Amazon aims to achieve greater cost efficiency and performance optimization for its AI-driven services.