
Report: ChatGPT consumes less energy than household appliances
A new study by the nonprofit research institute Epoch AI challenges the widespread belief that ChatGPT is an energy hog. According to the analysis, OpenAI’s popular chatbot is significantly more energy-efficient than previously thought.
The study found that an average query to ChatGPT using the latest GPT-4o model consumes only about 0.3 watt-hours of energy. That is one-tenth of the previously cited estimate of 3 watt-hours per query, a figure widely quoted in the media and often compared to the energy consumption of ten Google searches.
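To put those figures in perspective, here is a quick back-of-the-envelope calculation. The per-query values come from the article; the appliance wattages (a 10 W LED bulb, a 1 kW microwave) are illustrative assumptions, not numbers from the Epoch AI report:

```python
# Per-query energy figures cited in the article (watt-hours)
WH_PER_QUERY_NEW = 0.3   # Epoch AI estimate for GPT-4o
WH_PER_QUERY_OLD = 3.0   # older, widely cited estimate

# How much lower is the new estimate?
ratio = WH_PER_QUERY_OLD / WH_PER_QUERY_NEW

# Assumed typical power draws (illustrative, not from the report)
LED_BULB_W = 10          # a 10 W LED bulb
MICROWAVE_W = 1000       # a 1 kW microwave oven

# How long each appliance could run on the energy of one query
minutes_led = WH_PER_QUERY_NEW / LED_BULB_W * 60
seconds_microwave = WH_PER_QUERY_NEW / MICROWAVE_W * 3600

print(f"New estimate is {ratio:.0f}x lower than the old one")
print(f"0.3 Wh runs a 10 W LED bulb for about {minutes_led:.1f} minutes")
print(f"0.3 Wh runs a 1 kW microwave for about {seconds_microwave:.1f} seconds")
```

Under these assumed wattages, one query corresponds to roughly two minutes of a small LED bulb or about one second of a microwave, which is the kind of comparison the report draws against household appliances.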
“Energy consumption is actually insignificant compared to using regular household appliances, heating or cooling a home, or taking a car trip,” explained Joshua Yu, the Epoch AI data analyst who conducted the study. According to him, earlier estimates were based on outdated data and assumed that OpenAI runs its models on less efficient, older-generation chips.
The discussion about AI energy consumption remains relevant against the backdrop of rapid expansion of AI companies’ infrastructure. Just a week ago, more than 100 organizations published an open letter calling on the AI industry and regulators to ensure that new data centers don’t deplete natural resources and don’t force utilities to rely on non-renewable energy sources.
This research could become an important argument in the debate over artificial intelligence’s environmental impact, showing that modern AI systems can be far more energy-efficient than assumed. Even so, the industry’s overall environmental footprint remains a subject of close scrutiny from environmentalists and regulators.