OpenAI’s deal with AWS cements Amazon as the AI era’s infrastructure kingmaker. But it also exposes how dangerously centralized and power-hungry the race for intelligence has become.
So here’s the thing: OpenAI has signed a $38 billion deal to use Amazon’s infrastructure. Yes, billions: the agreement grants OpenAI access to AWS data centers and hundreds of thousands of Nvidia chips.
At first glance, this is the kind of muscle move that says, “We mean business in AI.” But dig a little, and you see something both bold and a little worrisome.
Bold because scaling frontier AI does, in fact, demand massive, reliable compute. OpenAI’s own CEO says this partnership “strengthens the broad compute ecosystem that will power this next era.”
Good: push the bounds, build the backbone. But what about the worrisome part?
At the same time, OpenAI says it has committed to 30 gigawatts of computing capacity, enough to power about 25 million U.S. homes.
Now, compare that to revenue: OpenAI reportedly brings in around $13 billion annually (publicly, at least), yet has committed to roughly $1.4 trillion in infrastructure spending, more than a hundred times its yearly revenue.
Let the imbalance sink in. If you’re backing an AI war machine, you’d better have a war budget, or the cash flow won’t hold.
And then there’s Amazon. By taking on this deal, Amazon essentially becomes the backbone: the pipes, the powerhouse. AWS is now deeply entwined with one of the most ambitious AI players. That’s smart for Amazon, no doubt. But for the broader ecosystem? This centralization raises vital questions about power, risk, and lock-in.
In short, OpenAI’s move is ambitious and deserves respect. But it may also be a staggering bet on a future where compute equals dominance. And AWS? They’re playing infrastructure kingmaker. The risk falls not just on these companies, but on the whole tech ecosystem:
When one deal holds this much sway, who watches the watcher?


