Amazon (NASDAQ: AMZN) CEO Andy Jassy isn’t being subtle about where he thinks the AI chip market is headed. In his 2025 annual letter to shareholders (1) published Thursday, Jassy made his most pointed comments yet about Amazon Web Services’ growing challenge to Nvidia’s dominance in AI semiconductors, framing the company’s custom Trainium chips (2) as a cost-effective alternative to Nvidia’s GPUs.
“Virtually all AI thus far has been done on Nvidia chips, but a new shift has started,” Jassy wrote. “We have a strong partnership with Nvidia, will always have customers who choose to run Nvidia, and we will continue to make AWS the best place to run Nvidia. However, customers want better price-performance. We’ve seen this movie before.”
The “movie” Jassy is referencing here is AWS’s experience displacing Intel in cloud computing. “In the CPU space, virtually all of the workloads ran on Intel chips until we invented Graviton in 2018,” he wrote. “Graviton, which has up to 40% better price-performance than other x86 processors, is now used expansively by 98% of the top 1,000 EC2 customers.”
“The same story arc is unfolding in AI,” he added.
Amazon declined Moneywise's request to comment further on the matter.
The AI chip business: by the numbers
Jassy laid out the Trainium roadmap in his annual letter. Amazon’s second-generation AI chip, Trainium2, delivered about 30% better price-performance than comparable GPUs, he wrote, and “has largely sold out.” Trainium3, which began shipping at the start of 2026, offers another 30-40% improvement over its predecessor and is nearly fully subscribed. And a significant portion of Trainium4 — still about 18 months from broad availability — has already been reserved by customers.
On the whole, Amazon’s custom chips business — which includes Graviton, Trainium, and its Nitro networking cards — has hit an annual revenue run rate above $20 billion. That figure has doubled from the $10 billion Amazon disclosed alongside its Q4 2025 earnings (3), and the business is growing at triple-digit percentages year over year, according to Amazon’s CEO.
“If our chips business was a stand-alone business, and sold chips produced this year to AWS and other third parties (as other leading chips companies do), our annual run rate would be ~$50 billion,” Jassy wrote.
For context: Nvidia reported record full-year revenue (4) of $215.9 billion for its fiscal year ending January 2026, with data center sales climbing 75% year-on-year to $62.3 billion in Q4 alone.
At scale, Jassy wrote, Trainium is expected to save Amazon “tens of billions of capex dollars per year, and provide several hundred basis points of operating margin advantage versus relying on others’ chips for inference.” Amazon Bedrock (5), AWS’s fast-growing inference service, already “runs most of its inference on Trainium.”
“Our chips business is on fire,” Jassy wrote, adding it “changes the economics for AWS, and will be much larger than most think.”
Who’s actually using Amazon’s AI chips?
The strongest evidence for Jassy’s claims may be the customer list. Anthropic, the AI safety company behind Claude, is the most prominent Trainium customer. Amazon said in late 2025 (6) that Anthropic is using 500,000 Trainium2 chips as part of Project Rainier, a massive AI compute cluster spread across multiple data centers, and that its models would scale to more than 1 million Trainium2 chips for training and inference.
According to a TechCrunch (7) tour of Amazon’s Trainium lab in Austin published last month, 1.4 million Trainium chips are now deployed across all three generations, and Anthropic’s Claude runs on more than 1 million of the Trainium2 chips. OpenAI, too, has signed on: its over-$100 billion commitment to AWS (8), which Jassy referenced in the letter, makes Amazon the exclusive infrastructure provider for parts of OpenAI’s workloads. Apple is reportedly testing the chips as well, according to TechCrunch.
The relationship with Anthropic goes beyond a typical cloud customer arrangement. Anthropic engineers have had direct input into the chip’s instruction set architecture, and Amazon agreed to open up its instruction set — removing a pain point Anthropic’s engineers had experienced with Nvidia GPUs, whose instruction set Nvidia keeps closed to prevent competitors from seeing it, Semafor (9) reported last month.
Nvidia’s position — and its challengers
Despite Jassy’s enthusiasm for Amazon’s chips business, Nvidia’s position in AI remains considerably stronger. Nvidia holds an 81% market share by revenue for data center chips, according to IDC research (10), and its CEO Jensen Huang recently projected $1 trillion in cumulative AI chip revenue through 2027, according to Bloomberg (11). Nvidia’s Blackwell Ultra chips also remain the clear performance leader on several benchmarks.
But the competition is stiffening, and Amazon is far from alone: almost every major cloud provider is building custom AI silicon. Google has been refining its Tensor Processing Units for a decade and is the furthest along among hyperscalers, Bernstein semiconductor analyst Stacy Rasgon told CNBC (12) in November. Microsoft unveiled its second-generation custom AI chip, the Maia 200, in January; the company claims it delivers three times the performance of Amazon’s latest Trainium on certain benchmarks, according to GeekWire (13). Meta is scaling its own MTIA chips, and ChatGPT maker OpenAI is working with Broadcom on custom silicon (14) expected by late 2026.
So, where does all this competition leave Nvidia? Its position remains formidable, but its share of the AI accelerator market is expected to decline from 87% in 2024 to around 75% in 2026. Nvidia’s overall revenue should keep growing, since the total market is expanding faster than any share loss. But that is essentially Jassy’s point in the letter: Amazon doesn’t need to “beat” Nvidia to win. It just needs Trainium to be good enough, and cheap enough, to capture a meaningful share of a rapidly expanding market.
“There’s so much demand for our chips that it’s quite possible we’ll sell racks of them to third parties in the future,” Jassy wrote.
Article Sources
We rely only on vetted sources and credible third-party reporting. For details, see our editorial ethics and guidelines.
Amazon (1); AWS Trainium (2); Yahoo Finance (3); Nvidia Newsroom (4); AWS Bedrock (5); Amazon (6); TechCrunch (7); OpenAI (8); Semafor (9); CNN (10); Bloomberg (11); CNBC (12); GeekWire (13); OpenAI (14)
Dave Smith is the VP of content and editor-in-chief at Moneywise and Money.ca. His work has also been published in Fortune, Business Insider, Newsweek, ABC News, and USA Today. He holds a degree from the University of Maryland and lives in Toronto.
