Alibaba Cloud to Build Hyperscale Computing Center in Shanghai’s Jinshan District

Alibaba signed a strategic cooperation agreement with the Jinshan District government in Shanghai on March 9 to build what it is calling one of the largest intelligent computing hubs in East China.

The facility will run on Alibaba’s in-house Zhenwu chips, developed by its T-Head semiconductor unit, and will form part of a full-stack domestic computing infrastructure that China has been quietly assembling for years while the West debated whether its AI models were sentient.

The announcement is significant for several reasons that go beyond the obvious. Alibaba has already committed $69 billion to AI infrastructure investment over the next three years. This facility in Jinshan builds on a project that began in 2021, backed by 40 billion yuan. The Zhenwu chip, which has now shipped in the hundreds of thousands of units, has overtaken Cambricon Technologies to become one of China’s leading domestically developed AI processors. The chip geopolitics here are their own story, but that is not the story we want to tell today.

The story we want to tell is about the electricity.

Every large language model query, every image generation, every AI-assisted search, every training run that produces the models the world is now integrating into healthcare, education, finance and public administration, all of it runs on power. Enormous, continuous, non-negotiable amounts of it. China’s total installed IT load in hyperscale data centers is projected to more than double between now and 2031, from just over 5,000 megawatts to nearly 12,000 megawatts. That is not a rounding error. That is the energy consumption of a medium-sized country being added to the grid in service of keeping AI running.

Alibaba describes the Jinshan facility as a benchmark for green and energy-efficient computing infrastructure. The company’s earlier Hangzhou data center demonstrated genuine innovation, deploying one of the world’s largest server clusters submerged in liquid coolant, cutting energy consumption by more than 70 percent and achieving a power usage effectiveness (PUE) rating approaching 1.0, the theoretical floor at which every watt delivered to the facility reaches the computing hardware. These are not empty claims. The engineering behind them is real and the results are measurable.
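To make the metric concrete: PUE is total facility power divided by the power reaching the IT equipment. A minimal sketch, assuming a hypothetical air-cooled baseline with 50 percent overhead and reading the article’s 70 percent reduction as applying to that cooling overhead:

```python
def pue(it_kw: float, overhead_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return (it_kw + overhead_kw) / it_kw

# Illustrative numbers only: 1,000 kW of servers, 500 kW of cooling
# and other overhead under conventional air cooling.
baseline = pue(1_000, 500)           # PUE 1.50
immersion = pue(1_000, 500 * 0.30)   # overhead cut 70% -> PUE 1.15
print(f"air-cooled PUE {baseline:.2f} -> immersion PUE {immersion:.2f}")
```

Under these assumed numbers, immersion cooling moves the facility from 1.50 to 1.15, most of the way toward the 1.0 floor. The closer to 1.0, the less room is left for further efficiency gains, which is part of why scale, not efficiency, now dominates the totals.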

But efficiency and scale are pulling in opposite directions. You can make each unit of compute greener and still have the aggregate energy demand grow faster than any efficiency gain can offset, which is precisely what is happening across the global AI infrastructure buildout. The industry calls this the rebound effect. It is the same phenomenon that made fuel-efficient cars more affordable to drive, which caused people to drive more, which meant total fuel consumption went up anyway. More efficient AI infrastructure makes AI cheaper to deploy, which accelerates deployment, which increases total energy demand.
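The rebound logic above can be put in toy numbers. Everything here is hypothetical: efficiency halves the energy per query, but cheaper queries drive usage up threefold, so total demand rises anyway:

```python
# Hypothetical baseline figures, chosen only to illustrate the mechanism.
energy_per_query_wh = 3.0
queries_per_day = 1_000_000_000

before = energy_per_query_wh * queries_per_day / 1e9                # GWh/day
after = (energy_per_query_wh / 2) * (queries_per_day * 3) / 1e9     # GWh/day

# Per-query energy halved, yet total consumption grew 50 percent.
print(f"before: {before:.1f} GWh/day, after efficiency gain: {after:.1f} GWh/day")
```

A 2x efficiency gain offset by a 3x demand increase leaves total consumption 1.5x higher. Whether real demand elasticity is that steep is an open question, but the buildout figures above suggest the industry is betting it will be at least that.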

China’s response to this, at the policy level, has been the Eastern Data Western Computing program, which channels new data center capacity toward the country’s renewable-rich western provinces. Seventy percent of new capacity is being directed there. It is a structurally sound approach to the geography of clean energy, and it is still not sufficient on its own to absorb what the AI expansion is demanding.

The broader conversation about AI’s energy footprint rarely makes it into the announcements. Hyperscale computing center launches are written in the language of capacity, capability, and sovereign technology. The electricity required to run them appears in sustainability reports, in footnotes, in targets set for dates that are far enough away to require no immediate discomfort.

We think that gap between the announcement language and the physical reality it represents deserves to be named. The computing infrastructure being built right now, by Alibaba in Shanghai, by Google and Microsoft and Amazon across the United States, by the Gulf states with their sovereign AI ambitions, is not neutral infrastructure. It is a long-term energy commitment made on behalf of populations who have not been asked whether they understand the terms.

Alibaba’s liquid cooling is genuinely better than what came before. The Jinshan facility will almost certainly be more efficient than the one it is expanding. That is not the problem. The problem is that the industry’s definition of progress is measured in capability added per watt consumed, when the more honest measure would be total watts consumed per year and what is generating them.

The AI race has a power bill. We are all paying it, and the invoice has not yet arrived in full.
