“Our partnership with OpenAI is expected to deliver tens of billions of dollars in revenue for AMD while accelerating OpenAI’s AI infrastructure build out,” AMD chief financial officer Jean Hu said.
The agreement comes days after OpenAI’s CEO, Sam Altman, revealed a plan to build artificial intelligence data centres worldwide that could consume as much as 10 gigawatts of power, bringing the company’s total announced commitments to roughly 17 gigawatts.
Ten gigawatts is comparable to New York City’s peak summer demand and roughly double the five gigawatts San Diego drew during its 2024 heatwave.
Experts have also likened OpenAI’s projected energy consumption to that of entire countries.
“Ten gigawatts is more than the peak power demand in Switzerland or Portugal. Seventeen gigawatts is like powering both countries together,” Cornell University professor Fengqi You said.
Andrew Chien, a computer science professor at the University of Chicago, told Fortune that computing’s energy use is set to be on par with what “the whole economy consumes”.
“I’ve been a computer scientist for 40 years, and for most of that time, computing was the tiniest piece of our economy’s power use.
“A year and a half ago, they were talking about five gigawatts.
“Now they’ve upped the ante to 10, 15, even 17.
“There’s an ongoing escalation.”
“There are some good uses of the technology. But a lot of the time it’s being used to replace human labour without genuine gain for humans,” she said.
“Additionally, more generative AI doesn’t necessarily mean more accurate or safer technology.”
Altman recently claimed ChatGPT has surpassed 800 million weekly active users.
“More than 800 million people use ChatGPT every week, and we process over 6 billion tokens per minute on the API.”
“Thanks to all of you, AI has gone from something people play with to something people build with every day,” he said.