Amazon Unveils AI Chip Ramping Up Competition with Nvidia

Amazon trailers lined up and waiting (Dan Kitwood/Getty)

Amazon is preparing to launch its latest AI chip, Trainium 2, as it looks to boost returns on its multibillion-dollar investments in semiconductors and decrease reliance on market leader Nvidia.

Ars Technica reports that Amazon’s cloud computing division, Amazon Web Services (AWS), is making significant investments in custom chips to improve efficiency in its data centers and reduce costs for both the company and its customers. Leading this effort is Annapurna Labs, an Austin-based chip startup acquired by Amazon in 2015 for $350 million.

Annapurna’s upcoming Trainium 2 chip, part of a line of AI chips designed for training large models, is set to be announced next month. The chip is already being tested by several companies, including AI firm Anthropic, which has secured $4 billion in backing from Amazon, as well as Databricks, Deutsche Telekom, Ricoh, and Stockmark.

AWS and Annapurna’s primary goal is to challenge Nvidia, the dominant player in the AI processor market. Dave Brown, vice president of compute and networking services at AWS, emphasized the importance of providing an alternative to Nvidia while still maintaining compatibility. Amazon claims that its Inferentia line of AI chips is already 40 percent cheaper to run for generating responses from AI models, which can translate into significant savings for customers with large-scale deployments.

Amazon’s capital spending is expected to reach around $75 billion in 2024, with the majority allocated to technology infrastructure. This represents a substantial increase from the $48.4 billion spent in 2023. The surge in spending is part of an ongoing AI investment spree among major cloud providers like Microsoft and Google.

Designing their own data center chips allows these companies to lay the foundation for anticipated AI growth. As Daniel Newman of The Futurum Group notes, the move toward vertically integrated chip technology stacks is driven by the desire for lower production costs, higher margins, greater availability, and more control.

Annapurna’s approach involves building everything from the ground up, including the silicon wafer, server racks, and proprietary software and architecture. Rami Sinno, Annapurna’s director of engineering, emphasizes the complexity and scale of their operation, stating that few companies can match their capabilities.

Despite their efforts, AWS and Annapurna have yet to significantly impact Nvidia’s dominance in the AI infrastructure market. In its second fiscal quarter of 2024, Nvidia reported $26.3 billion in revenue from AI data center chip sales, matching Amazon’s entire AWS division revenue for the same period. Only a small fraction of AWS revenue can be attributed to customers running AI workloads on Annapurna’s infrastructure.

While Amazon avoids direct performance comparisons with Nvidia and does not submit its chips for independent benchmarks, chip consultant Patrick Moorhead is confident in the company’s claimed fourfold performance increase from Trainium 1 to Trainium 2. However, he notes that performance figures may matter less than simply offering customers more choice in the market.

Read more at Ars Technica here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.

Authored by Lucas Nolan via Breitbart November 13th 2024