Researcher Jesse Dodge did some back-of-the-napkin math on the amount of energy AI chatbots use.
“One query to ChatGPT uses approximately as much electricity as could light one light bulb for about 20 minutes,” he says. “So, you can imagine with millions of people using something like that every day, that adds up to a really large amount of electricity.”
He’s a senior research analyst at the Allen Institute for AI and has been studying how artificial intelligence consumes energy. To generate its answers, AI uses far more power than traditional internet services, like search queries or cloud storage. According to a report by Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity as a Google search query.
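For rough context, here is a minimal sketch of the arithmetic behind those comparisons. The bulb wattage and the daily query volume are illustrative assumptions, not figures reported by Dodge or Goldman Sachs; the per-query ratio simply follows the “nearly 10 times” estimate above.

```python
# Back-of-the-envelope sketch of the energy comparison described above.
# BULB_WATTS and QUERIES_PER_DAY are illustrative assumptions, not reported figures.

BULB_WATTS = 10                # assume a ~10 W LED bulb
MINUTES_LIT = 20               # "light one light bulb for about 20 minutes"
QUERIES_PER_DAY = 10_000_000   # hypothetical daily chatbot query volume

wh_per_chatbot_query = BULB_WATTS * MINUTES_LIT / 60   # ~3.3 Wh per query
wh_per_search_query = wh_per_chatbot_query / 10        # "nearly 10 times" less

daily_mwh = wh_per_chatbot_query * QUERIES_PER_DAY / 1_000_000
print(f"~{wh_per_chatbot_query:.1f} Wh per chatbot query")
print(f"~{wh_per_search_query:.2f} Wh per search query")
print(f"~{daily_mwh:.0f} MWh per day at {QUERIES_PER_DAY:,} queries/day")
```

At those assumed numbers, millions of daily queries add up to tens of megawatt-hours every day, which is the “really large amount of electricity” Dodge is pointing to.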
And as AI gets more sophisticated, it needs more energy. In the U.S., a majority of that energy comes from burning fossil fuels like coal and gas, which are primary drivers of climate change.
Most companies working on AI, including ChatGPT maker OpenAI, don’t disclose their emissions. But last week, Google released a new sustainability report with a glimpse at this data. Deep within the 86-page report, Google said its greenhouse gas emissions were 48% higher last year than in 2019. It attributed that surge to its data center energy consumption and supply chain emissions.
“As we further integrate AI into our products, reducing emissions may be challenging,” the report reads.
Google declined an interview with NPR.
“Bigger and bigger data centers all the way up to supercomputers”
Google has the goal of reaching net-zero emissions by 2030. Since 2007, the company had said its operations were carbon neutral because of the carbon offsets it bought to match its emissions.
But starting in 2023, Google wrote in its sustainability report that it was no longer “maintaining operational carbon neutrality.” The company says it’s still pushing for its net-zero goal in 2030.
“Google’s real motivation here is to build the best AI systems that they can,” Dodge says. “And they’re willing to pour a ton of resources into that, including things like training AI systems on bigger and bigger data centers all the way up to supercomputers, which incurs a tremendous amount of electricity consumption and therefore CO2 emissions.”
Microsoft has taken its climate pledge one step further than Google, saying it will be carbon negative by 2030. But it too is facing setbacks because of its focus on AI. In its sustainability report released in May, Microsoft said its emissions have grown 29% since 2020 due to the construction of more data centers that are “designed and optimized to support AI workloads.”
“The infrastructure and electricity needed for these technologies create new challenges for meeting sustainability commitments across the tech sector,” the report reads.
A company spokesperson declined to comment further.
AI’s deep thirst for energy
AI requires computing power from thousands of servers housed in data centers, and those data centers need massive amounts of electricity to meet that demand.
Northern Virginia has become a hub for the burgeoning data center industry. By 2030, the data centers in that corner of the state will need enough energy to power the equivalent of 6 million homes, according to The Washington Post.
The thirst for electricity nationwide has become so intense that plans to decommission several coal plants have been delayed, according to another report by The Washington Post.
“There’s a whole material infrastructure that needs to be built to support AI,” says Alex Hanna, the director of research at the Distributed AI Research Institute. She previously worked on Google’s Ethical AI team but left the company in 2022 over its handling of a research paper that highlighted the environmental costs of AI.
Hanna says the data center boom will continue to grow “as long as there are these organizations that are committed to going whole hog on AI.”
Goldman Sachs has researched the expected growth of data centers in the U.S. and estimates they’ll be using 8% of total power in the country by 2030, up from 3% in 2022. Company analysts say “the proliferation of AI technology, and the data centers necessary to feed it” will drive a surge in power demand “the likes of which hasn’t been seen in a generation.”
Currently, there are more than 7,000 data centers worldwide, according to Bloomberg. That’s up from 3,600 in 2015. Bloomberg estimates that, combined, these data centers consume about as much electricity per year as the entire country of Italy.
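To put those percentages and counts on a common footing, here is an illustrative calculation. The totals for U.S. electricity use and Italy’s annual consumption are rough outside assumptions added for scale, not figures from the article or the cited reports.

```python
# Illustrative scale check for the Goldman Sachs and Bloomberg figures above.
# US_TOTAL_TWH and ITALY_TWH are rough assumed values, not reported in the article.

US_TOTAL_TWH = 4_000    # assumed annual U.S. electricity use, ~4,000 TWh
ITALY_TWH = 300         # assumed annual electricity use of Italy, ~300 TWh
DATA_CENTERS = 7_000    # Bloomberg's worldwide data center count

us_dc_2022 = 0.03 * US_TOTAL_TWH                 # ~120 TWh at 3% of U.S. power
us_dc_2030 = 0.08 * US_TOTAL_TWH                 # ~320 TWh at 8% of U.S. power
avg_gwh_each = ITALY_TWH * 1_000 / DATA_CENTERS  # ~43 GWh per facility per year

print(f"U.S. data centers, 2022 (3%): ~{us_dc_2022:.0f} TWh/yr")
print(f"U.S. data centers, 2030 (8%): ~{us_dc_2030:.0f} TWh/yr")
print(f"Average worldwide data center: ~{avg_gwh_each:.0f} GWh/yr")
```

Under those assumptions, the projected jump from 3% to 8% of U.S. power would mean roughly 200 terawatt-hours of additional annual demand, a sizable fraction of what the entire country of Italy uses in a year.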
“AI-first” world
All major tech companies are going full throttle on AI. Alphabet CEO Sundar Pichai has dubbed Google an “AI-first” company. Over the last few months, the company released its Gemini chatbot to the world and added its AI Overviews tool to Google Search. Facebook parent Meta has added chatbots to several of its products. And Apple announced a partnership with OpenAI last month to bring AI to its Siri digital assistant.
During their first-quarter earnings calls, all of these companies said they were investing billions of dollars in AI.
Google said it spent $12 billion on capital expenditures in that quarter alone, which was “driven overwhelmingly” by investments in data centers to fuel its AI endeavors. The company said it expects to keep up that same level of spending throughout the year.
Hanna, the AI researcher, says the environmental costs of artificial intelligence are only going to get worse unless there’s serious intervention.
“There’s a lot of people out there that talk about existential risk around AI, about a rogue thing that somehow gets control of nuclear weapons or whatever,” Hanna says. “That’s not the real existential risk. We have an existential crisis right now. It’s called climate change, and AI is palpably making it worse.”
Editor’s note: Google and Microsoft are among NPR’s financial supporters.