OpenAI CEO Addresses Energy Consumption Concerns, Compares AI Training to Human Development



Sam Altman, the CEO of OpenAI, has recently addressed the ongoing debate about the energy demands of artificial intelligence, suggesting that the energy required to train AI is not as disproportionate as some might think. He likened it to the energy investment needed to educate a human being.

Altman, who appeared at a Q&A session hosted by newspaper The Indian Express this week, pushed back on concerns about AI's energy demands by drawing a comparison between artificial intelligence and human development.

Altman highlighted a common misconception in the dialogue surrounding AI’s energy consumption. “People often compare the energy needed to train an AI model with the energy a human expends during a single inference query,” he explained. “However, they overlook the substantial energy it takes to educate a human being over time.”

OpenAI CEO Sam Altman has downplayed concerns about AI’s energy cost, arguing it takes a lot of energy to train a human too. (AP)

He elaborated on the human side of the comparison:

“It takes like 20 years of life and all of the food you eat during that time before you get smart. 

“Not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.

“Then you took whatever you took.”

According to US research, a single ChatGPT query consumes nearly 10 times as much energy as a typical Google search, while it takes about half a litre of water to process 20 to 50 queries.
The world’s data centres consumed more electricity than all of Australia in 2022. (Getty)

Generative AI models require massive computational power to train and run.

Data centres, buildings packed with computer servers, churn through huge amounts of electricity all over the globe.

Data centres consumed about 460 terawatt-hours (TWh) in 2022 alone, according to the International Energy Agency (IEA).

All of Australia consumed less than 300TWh that same year.
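Put in relative terms, the two figures above can be compared with a quick back-of-envelope check (an illustrative sketch using only the numbers quoted in this article; Australia's sub-300TWh usage is treated as an upper bound):

```python
# Figures quoted above: IEA estimate for global data centres vs
# Australia's total electricity consumption, both for 2022.
data_centres_twh = 460
australia_twh = 300  # article says "less than 300TWh", so this is an upper bound

ratio = data_centres_twh / australia_twh
print(f"Data centres used at least {ratio:.1f}x Australia's "
      f"total electricity consumption in 2022")
```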

Altman argued concerns about water usage were "totally fake" but conceded "we used to do evaporative cooling in data centres".

"Now we don't do that," he said.

“You see these things on the internet where [a post says] ‘don’t use ChatGPT, it’s 17 gallons of water for each query’ or whatever.

“This is completely untrue, totally insane, no connection to reality.

“What is fair though is the energy consumption, not per query, but in total because the world is now using so much AI.

“It is real and we need to move towards nuclear or wind and solar very quickly.”
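Taking the figures quoted in this article at face value, the gap between the research estimate (half a litre per 20 to 50 queries) and the viral "17 gallons per query" claim Altman disputes can be checked with simple arithmetic (a sketch; the gallon-to-millilitre conversion is the only constant not taken from the article):

```python
# Water per query, using the research figure of ~0.5 L per 20-50 queries.
litres = 0.5
low_queries, high_queries = 20, 50
per_query_ml_high = litres / low_queries * 1000   # worst case, in millilitres
per_query_ml_low = litres / high_queries * 1000   # best case, in millilitres
print(f"~{per_query_ml_low:.0f}-{per_query_ml_high:.0f} mL of water per query")

# Compare with the viral "17 gallons per query" claim.
gallons_claim_ml = 17 * 3785.41                   # 1 US gallon ~= 3785.41 mL
ratio = gallons_claim_ml / per_query_ml_high
print(f"17 gallons is ~{gallons_claim_ml:,.0f} mL, roughly "
      f"{ratio:,.0f}x even the high-end research estimate")
```

On these numbers, the viral claim overstates per-query water use by more than three orders of magnitude, which is the gap Altman is objecting to.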

