Introduction to the Problem
The problem with determining the energy consumption of AI models, such as those powering chatbots, has been that the companies behind them, like Google, OpenAI, and Microsoft, were unwilling to share this information. Researchers who study the impact of AI on energy grids have compared this to trying to measure a car's fuel efficiency without being able to drive it, relying instead on guesses based on rumors about its engine size and the sound it makes.
A Shift Towards Transparency
However, after an article published in May highlighted this issue, something unexpected happened. In June, OpenAI’s Sam Altman shared that an average ChatGPT query uses 0.34 watt-hours of energy. Following this, in July, the French AI startup Mistral released an estimate of its AI model’s emissions, though not a direct per-query energy figure. Then, in August, Google revealed that answering a question with its Gemini model uses about 0.24 watt-hours of energy. The figures from Google and OpenAI are similar to earlier estimates made for medium-size AI models.
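To put these per-query figures in perspective, a short back-of-the-envelope calculation shows how watt-hours per query scale with volume. The per-query numbers below come from the companies' disclosures; the daily query count is a purely hypothetical assumption for illustration, since neither company has published one.

```python
# Back-of-the-envelope: scale per-query energy to a daily total.
# Per-query figures are the companies' disclosed numbers; the query
# volume is a hypothetical assumption, not a published statistic.
WH_PER_QUERY_CHATGPT = 0.34      # OpenAI's reported average, watt-hours
WH_PER_QUERY_GEMINI = 0.24      # Google's reported median, watt-hours
QUERIES_PER_DAY = 1_000_000_000  # hypothetical volume for illustration

def daily_energy_kwh(wh_per_query: float, queries: int) -> float:
    """Convert per-query watt-hours into a daily total in kilowatt-hours."""
    return wh_per_query * queries / 1000.0

print(daily_energy_kwh(WH_PER_QUERY_CHATGPT, QUERIES_PER_DAY))  # 340000.0 kWh/day
print(daily_energy_kwh(WH_PER_QUERY_GEMINI, QUERIES_PER_DAY))   # 240000.0 kWh/day
```

The point of the sketch is that even fractions of a watt-hour add up quickly at scale, which is why researchers want the underlying measurements rather than a single headline number.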
The Limitations of the Published Figures
Despite this newfound transparency, the figures published by tech companies have significant limitations. OpenAI’s number was shared in a blog post without detailed technical explanation, leaving many questions unanswered: which specific model it refers to, how the energy use was measured, and how much that measurement varies. Google’s figure refers to the median energy use per query, which does not account for more energy-intensive responses, such as those requiring complex reasoning or producing long outputs.
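The gap between a median figure and overall energy use is easy to illustrate: in a skewed workload where most queries are cheap but a few are expensive, the median stays flat while the mean (and the total) is pulled up by the tail. The per-query values below are invented purely for illustration, not drawn from any company's data.

```python
import statistics

# Hypothetical per-query energy in watt-hours: nine cheap queries plus
# one expensive long-reasoning query. Values are invented for illustration.
queries_wh = [0.2] * 9 + [5.0]

median = statistics.median(queries_wh)  # 0.2  -- unaffected by the outlier
mean = statistics.mean(queries_wh)      # 0.68 -- pulled up by the outlier

print(median, mean)
```

A median of 0.2 Wh here understates the average cost per query by more than a factor of three, which is why a single median number says little about total consumption.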
The Need for More Comprehensive Data
The numbers provided by these companies only pertain to interactions with chatbots and do not consider other ways in which people are increasingly relying on generative AI, such as video and image generation. According to Sasha Luccioni, AI and climate lead at Hugging Face, there is a need for data on different types of AI applications and how they compare in terms of energy consumption. As AI expands into more areas, such as video and image processing, understanding the energy footprint of these applications becomes crucial.
Conclusion
While the recent disclosures by Google and OpenAI represent a step towards transparency regarding the energy consumption of AI models, there is still much to be uncovered. The current figures are limited in scope, applying only to chat-based interactions and lacking in detail. For a comprehensive understanding of AI’s impact on the environment, more detailed and expansive data are needed, covering various types of AI applications and their energy usage patterns.
FAQs
- Q: Why is it hard to measure the energy consumption of AI models?
  A: It’s challenging because the companies behind these models, like Google and OpenAI, were initially not willing to share this information, making it difficult for researchers to accurately measure or estimate the energy consumption.
- Q: What did OpenAI and Google reveal about their AI models’ energy consumption?
  A: OpenAI shared that an average ChatGPT query uses 0.34 watt-hours of energy, and Google revealed that its Gemini model uses about 0.24 watt-hours of energy per query.
- Q: Why are the published figures considered limited?
  A: The figures are limited because they lack detailed technical explanations, refer only to median energy use, and do not account for more energy-intensive queries or other types of AI applications like video and image generation.
- Q: What is needed for a better understanding of AI’s environmental impact?
  A: More comprehensive data that includes various types of AI applications, detailed measurements of energy consumption, and an understanding of how these models’ energy use varies under different conditions.