AI data centers must support much higher power densities than traditional data centers: Nvidia’s GB200 NVL72 systems, for instance, are estimated to draw up to 120kW per rack, roughly ten times what classic computing infrastructure consumes.
On top of that, the AI-enabled data center will need liquid cooling, advanced networking, and more sophisticated infrastructure management software.
And AWS isn’t the only cloud service provider ramping up its investments in AI-enabled data centers. Rival cloud service providers are all upgrading existing facilities or opening new ones to capture a larger chunk of business from developers and users of large language models (LLMs).
Earlier this year, Microsoft President Brad Smith said the company is on track to invest nearly $80 billion to build out AI-enabled data centers this fiscal year.
The majority of the $75 billion in capital expenditure Google plans to make this year will go toward technical infrastructure, including servers and data centers, Google CFO Anat Ashkenazi said on the company’s earnings call this week.
Based on the investment numbers the three major cloud service providers have disclosed over the last week, AWS is leading the pack by around $20 billion to $25 billion, and is also running ahead of analysts’ forecasts for cloud infrastructure spending.