In a recent blog post, OpenAI CEO Sam Altman said that an average ChatGPT query uses approximately 0.000085 gallons of water, which works out to roughly one-fifteenth of a teaspoon. The figure appeared in a broader post laying out his expectations for how artificial intelligence will transform everyday life.
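As a quick sanity check on that conversion (treating the figure as US gallons, at the standard 768 teaspoons per gallon), the arithmetic works out roughly as follows:

```python
# Quick check of the water figure, assuming US gallons
# (1 US gallon = 128 fl oz * 6 tsp per fl oz = 768 teaspoons).
GALLONS_PER_QUERY = 0.000085
TEASPOONS_PER_GALLON = 768

teaspoons = GALLONS_PER_QUERY * TEASPOONS_PER_GALLON
print(f"{teaspoons:.3f} teaspoons per query")        # ~0.065
print(f"about 1/{round(1 / teaspoons)} of a teaspoon")  # about 1/15
```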
Altman also cited the energy consumed by a typical ChatGPT query, putting it at about 0.34 watt-hours. For perspective, he compared that to what an oven draws in a little over one second, or what a high-efficiency lightbulb uses in a couple of minutes. He further suggested that the cost of intelligence may eventually converge to near the cost of electricity. OpenAI has not yet responded to questions about how it arrived at these figures.
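To see how the 0.34 watt-hour figure maps onto those comparisons, the back-of-the-envelope check below assumes a roughly 1,200 W oven element and a 10 W LED bulb; those wattages are typical values, not numbers from Altman's post:

```python
# Back-of-the-envelope check on the 0.34 Wh figure.
# Oven and bulb wattages are assumed typical values,
# not figures taken from Altman's post.
QUERY_WH = 0.34

OVEN_WATTS = 1200  # assumed mid-size electric oven element
LED_WATTS = 10     # assumed high-efficiency LED bulb

oven_seconds = QUERY_WH / OVEN_WATTS * 3600  # Wh -> seconds at that draw
led_minutes = QUERY_WH / LED_WATTS * 60      # Wh -> minutes at that draw

print(f"Oven: ~{oven_seconds:.1f} s")        # ~1.0 s
print(f"LED bulb: ~{led_minutes:.1f} min")   # ~2.0 min
```

At those assumed wattages, the numbers line up with Altman's comparisons: about a second of oven time, or about two minutes of LED light.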
As the AI industry grows, companies are facing more questions about how much energy their technologies consume. Experts have predicted that AI could surpass Bitcoin mining in power consumption by the end of this year. A report last year from The Washington Post, produced with researchers, found that generating a 100-word email with an AI chatbot powered by GPT-4 could require slightly more than a bottle of water. The study also found that water consumption varies depending on where the data center handling the request is located.