Amazon’s chip business could be worth $50 billion, Jassy says, and he hints it may sell them externally


In short: Andy Jassy’s annual letter to shareholders, published on 9 April 2026, reveals that Amazon’s custom chip business, covering Graviton, Trainium, and Nitro, generates more than $20 billion in annualised revenue growing at triple-digit rates year-on-year. Sold on the open market the way Nvidia’s chips are, Jassy says, the business would be generating roughly $50 billion in annual revenue. He also signals that Amazon may begin selling those chips directly to third parties, and defends the company’s $200 billion capital expenditure plan for 2026 as grounded in committed customer demand rather than speculation.

“Not on a hunch”: the $200 billion bet

Jassy opened the letter’s financial argument with a direct rebuttal of the scepticism that has surrounded Amazon’s capital commitments. “We’re not investing approximately $200 billion in capex in 2026 on a hunch,” he wrote. “We’re not going to be conservative in how we play this. We’re investing to be the meaningful leader, and our future business, operating income, and free cash flow will be much larger because of it.” The context for that claim is a company that saw its free cash flow fall from $38 billion to $11 billion last year, driven by a $50.7 billion increase in capital spending, the bulk of it committed to AI infrastructure.

The defence rests on customer commitments already in place. Of the capex Amazon expects to deploy in 2026, Jassy said a substantial portion already has customer backing, citing as one example OpenAI’s commitment of more than $100 billion to AWS. That commitment, which expanded an existing $38 billion seven-year partnership struck in November 2025, also includes OpenAI consuming approximately two gigawatts of Trainium capacity through AWS infrastructure. SoftBank, which holds a majority stake in OpenAI and has been financing its infrastructure build through mechanisms including a $40 billion bridge loan, is in effect underwriting part of the demand that Jassy now points to as validation of his capex stance.

A $50 billion chip business hiding in plain sight

Amazon’s custom silicon programme spans three product lines. Graviton is a custom CPU that Jassy says delivers more than 40% better price-performance than comparable x86 processors, the market that Intel and AMD dominate. It is now used by 98% of the top 1,000 EC2 customers, a figure that reflects a shift in the economics of cloud compute that has been underway for several years. Demand is sufficiently intense that two large AWS customers asked whether they could purchase all available Graviton capacity for 2026. Amazon declined.

Trainium is the AI training and inference accelerator that represents Amazon’s most direct response to Nvidia. Trainium2, which Jassy says offers roughly 30% better price-performance than comparable GPU alternatives, has largely sold out. Trainium3, which began shipping in early 2026 and offers a further 30 to 40% improvement in price-performance over Trainium2, is nearly fully subscribed, with Uber among the companies that have moved workloads onto it. Trainium4, still approximately 18 months from broad availability and featuring interoperability with Nvidia’s NVLink Fusion interconnect technology, has already been significantly reserved. Nitro, the custom network and security chip that underpins AWS’s virtualisation layer, completes the three-chip portfolio. Together, Jassy says the three lines produce more than $20 billion in annualised revenue, growing at triple-digit percentage rates year-on-year. “If we were a standalone chip company,” he writes, “our chips would be generating over $50 billion in annual revenue.” The business currently exists entirely within AWS; customers access Trainium and Graviton through EC2 instances rather than buying chips directly.

At scale, Jassy argues, Trainium will “save us tens of billions of capex dollars per year, and provide several hundred basis points of operating margin advantage versus relying on others’ chips for inference.” That claim is central to the investment thesis underpinning the $200 billion capex programme: custom silicon is not only a competitive differentiator but a structural cost advantage that compounds over time as the ratio of inference to training in AI workloads continues to rise.

The Nvidia relationship, and the “new shift”

Jassy is careful in how he frames the competitive dynamic with Nvidia. “We have a strong partnership with NVIDIA, will always have customers who choose to run NVIDIA,” he writes, while also asserting that “virtually all AI thus far has been done on NVIDIA chips, but a new shift has started.” Customers, he says, “want better price-performance.” Nvidia, which reported revenue of $68.1 billion in the fourth quarter of 2025, a 73% year-on-year increase, entered 2026 from a position of market dominance that Amazon’s custom silicon is chipping away at from within the AWS customer base rather than in any broader merchant market. Trainium4’s incorporation of NVLink Fusion means Amazon is also building in a bridge rather than a wall: customers can combine Trainium accelerators with Nvidia GPUs within the same system, preserving optionality for enterprises that have invested heavily in Nvidia’s software stack.

The letter’s most consequential signal on chips, however, may be a single sentence about the future: “There’s so much demand for our chips that it’s quite possible we’ll sell racks of them to third parties in the future.” Amazon currently monetises its custom silicon exclusively through EC2 compute services. Selling chips directly would represent a structural shift in its competitive posture, placing it in the merchant silicon market alongside Nvidia and AMD, and allowing the economics of the chip business to be assessed independently of the cloud revenue it currently underpins.

Bedrock, Amazon Leo, and the broader picture

The shareholder letter situates the chip business within a wider AI infrastructure thesis. Amazon Bedrock, the managed service through which AWS customers access foundation models including Amazon’s own Nova family, processed more tokens in Q1 2026 than in all prior periods combined, with inference volumes “nearly doubling month-over-month” in March. AWS’s AI revenue run rate crossed $15 billion in Q1 2026, a figure Jassy contextualises by noting it represents growth roughly 260 times faster than AWS experienced at a comparable stage of its development.

Jassy also uses the letter to frame Amazon’s satellite internet service, Amazon Leo, as a competitive counterpart to SpaceX’s Starlink; the service has already secured contracts with Delta Air Lines, JetBlue, AT&T, Vodafone, and NASA. The satellite and chip disclosures share an underlying argument: that Amazon is building infrastructure at a scale and across categories that most observers have not fully priced in. The legal scrutiny that has begun to attach itself to Amazon’s AI products, including a proposed class action over the training data used for Nova Reel, represents one category of risk that the letter does not address. The year 2025 established AI infrastructure as the central capital allocation question for the technology industry, and Jassy’s letter is, in part, an argument that Amazon arrived at the right answer earlier and more decisively than the market has yet recognised.
