AI startup OpenAI recently revealed that its groundbreaking chatbot, ChatGPT, has reached 200 million weekly active users, double the number it reported last fall. This rapid growth has made ChatGPT one of the fastest-growing and most popular apps of all time. Beyond answering basic questions, this powerful tool can compose entire essays in seconds, engage in human-like conversations, solve complex math problems, translate texts, and even write code. However, ChatGPT consumes a lot of energy in the process, reportedly nearly ten times more than a traditional Google search.
With this in mind, the team at BestBrokers decided to put into perspective the considerable amount of electric power OpenAI’s chatbot uses to respond to prompts on a yearly basis. We also looked at what that would cost at the average U.S. commercial electricity rate of $0.131 per kWh as of June 2024. After running the numbers, it turned out that for answering questions alone, ChatGPT consumes a whopping 453.6 million kilowatt-hours on average every year, which amounts to an expenditure of $59.4 million.
I’ll ChatGPT it
When you submit a query to an AI model like ChatGPT, the process is known as inference. This involves passing your query through the model’s parameters to generate a response. Estimating the energy required for that process is a bit tricky, however, as it depends on many variables, such as query length, number of users, and how efficiently the model runs. In most cases, this information is kept private. Still, we can make some rough calculations to get a sense of the amount of electricity that will be needed.
Apparently, each time you ask ChatGPT a question, it uses about 0.0029 kilowatt-hours of electricity. This is nearly ten times more than the energy needed for a typical Google search, which consumes about 0.0003 kilowatt-hours per query, according to the Electric Power Research Institute (EPRI).
On August 29, 2024, OpenAI announced that ChatGPT now boasts 200 million weekly active users, double the number CEO Sam Altman reported last November. If we assume that each user asks around 15 questions weekly, keeping in mind that some users engage only occasionally while others ask dozens of questions daily, this adds up to roughly 3 billion queries per week. Spread across the days of the week, that works out to about 428.6 million user prompts per day, consuming a staggering 1.2 million kilowatt-hours of energy. For context, the average U.S. household uses about 29 kilowatt-hours of electricity per day, meaning ChatGPT consumes nearly 43,000 times as much power as a typical household each day.
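The daily figures above can be reproduced with a few lines of Python. Note that the constants are the article's own estimates (200 million weekly users, an assumed 15 queries per user per week, EPRI's 0.0029 kWh per query), not measured values:

```python
# Rough estimate of ChatGPT's daily query volume and energy use.
WEEKLY_USERS = 200_000_000          # OpenAI's reported weekly active users
QUERIES_PER_USER_PER_WEEK = 15      # assumption made in this article
KWH_PER_QUERY = 0.0029              # EPRI's per-query estimate

weekly_queries = WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK  # 3 billion
daily_queries = weekly_queries / 7                         # ~428.6 million
daily_kwh = daily_queries * KWH_PER_QUERY                  # ~1.24 million kWh

HOUSEHOLD_DAILY_KWH = 29  # average U.S. household, per the EIA figure cited
print(f"{daily_queries / 1e6:.1f} million prompts per day")
print(f"{daily_kwh / 1e6:.2f} million kWh per day")
print(f"{daily_kwh / HOUSEHOLD_DAILY_KWH:,.0f}x a typical household")
```

Running this prints roughly 428.6 million prompts and 1.24 million kWh per day, i.e. close to 43,000 households' worth of daily electricity.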
Over the course of a year, ChatGPT’s total energy consumption works out to 453.6 million kilowatt-hours, an amount enough to respond to about 156.4 billion prompts. At the average U.S. commercial electricity rate of $0.131 per kWh as of June 2024, this translates to an annual electricity cost of about $59.4 million, with each query costing approximately $0.00038. Additionally, as OpenAI continues to enhance its chatbots with new features like audio, video, and image generation, as seen in the GPT-4o model, these figures could climb even higher. OpenAI has also developed a new series of reasoning AI models, including the recently released “OpenAI o1”. These models are designed to spend more time “thinking” before responding, similar to how humans approach complex problems. This enhanced reasoning ability allows them to tackle more difficult tasks but will likely lead to higher costs and greater energy consumption.
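Annualising those daily numbers gives the headline figures. This sketch carries forward the ~1.24 million kWh/day estimate and the article's rate assumptions:

```python
# Yearly energy, total prompts served, and the resulting electricity bill.
DAILY_KWH = 1_242_857           # daily estimate derived above
KWH_PER_QUERY = 0.0029          # EPRI per-query estimate
RATE_USD_PER_KWH = 0.131        # average U.S. commercial rate, June 2024

yearly_kwh = DAILY_KWH * 365                   # ~453.6 million kWh
yearly_queries = yearly_kwh / KWH_PER_QUERY    # ~156.4 billion prompts
yearly_cost = yearly_kwh * RATE_USD_PER_KWH    # ~$59.4 million
cost_per_query = yearly_cost / yearly_queries  # ~$0.00038
```

Note that the per-query cost is simply the per-query energy times the rate (0.0029 kWh x $0.131/kWh), so it does not depend on the user-count assumptions at all.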
Putting ChatGPT’s massive energy usage for handling user prompts into perspective
By the end of 2023, the U.S. Department of Energy reported approximately 3.3 million electric cars on the road in the United States. According to the EV Database, the average electric car consumes 0.191 kilowatt-hours per kilometer and can travel about 379 kilometers on a full charge, which means it has a battery that can store roughly 72.4 kilowatt-hours of energy. Therefore, we can calculate that charging all the electric vehicles in the U.S. once would require 238.8 million kilowatt-hours. If we take the total energy ChatGPT uses in a year to generate responses and divide it by that number, it turns out we could fully charge every electric vehicle in the U.S. about twice.
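The EV comparison is a straightforward chain of multiplications, sketched here with the DOE and EV Database figures the article cites:

```python
# How many times could ChatGPT's yearly energy charge the U.S. EV fleet?
EV_COUNT = 3_300_000             # U.S. electric cars, end of 2023 (DOE)
KWH_PER_KM = 0.191               # average EV consumption (EV Database)
RANGE_KM = 379                   # average full-charge range (EV Database)
CHATGPT_YEARLY_KWH = 453_600_000

battery_kwh = KWH_PER_KM * RANGE_KM          # ~72.4 kWh per battery
fleet_charge_kwh = EV_COUNT * battery_kwh    # ~238.9 million kWh per fleet charge
full_charges = CHATGPT_YEARLY_KWH / fleet_charge_kwh  # ~1.9, i.e. about twice
```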
Now, let’s look at this from a household perspective. According to the Energy Information Administration (EIA), the average home in the United States consumes about 10,500 kilowatt-hours of electricity per year. That means the energy ChatGPT uses each year to handle requests could power 43,204 U.S. homes for an entire year. While that’s only about 0.03% of the 131 million households in the country, as per the latest Census data, it’s still a massive amount of energy, especially when you consider that the U.S. ranks third globally in terms of the number of households.
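The household comparison follows the same pattern; the small difference from the article's 43,204 figure comes from rounding the yearly total to 453.6 million kWh:

```python
# How many average U.S. homes could run for a year on ChatGPT's energy?
CHATGPT_YEARLY_KWH = 453_600_000
HOME_YEARLY_KWH = 10_500          # average U.S. home per year (EIA)
US_HOUSEHOLDS = 131_000_000       # latest Census figure cited above

homes_powered = CHATGPT_YEARLY_KWH // HOME_YEARLY_KWH  # ~43,200 homes
share_of_households = homes_powered / US_HOUSEHOLDS    # ~0.03%
```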
Think about smartphones next. The latest iPhone 15, for example, has a battery capacity of 12.98 watt-hours, so charging it once a day for a whole year adds up to about 4.74 kilowatt-hours. If we compare this to ChatGPT’s yearly energy-use figure, we’d find that the chatbot could fully charge 95.7 million iPhones every single day, for an entire year. That’s an incredible number of phones kept powered up, simply from the energy ChatGPT uses to generate responses. Furthermore, according to the Carbon Trust, a UK-based non-profit dedicated to helping businesses cut carbon emissions, one hour of video streaming in Europe consumes about 188 watt-hours of energy. In contrast, ChatGPT spitting out information for just one hour uses approximately 51.8 million watt-hours. That means to match the energy consumption of ChatGPT’s hourly operation, you’d need to stream video for a whopping 275,456 hours.
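Both phone and streaming comparisons can be checked in a few lines; the slight gaps versus the article's quoted figures again come from rounding the yearly total:

```python
# iPhone charges per day and streaming hours per hour of ChatGPT operation.
CHATGPT_YEARLY_KWH = 453_600_000
IPHONE_BATTERY_KWH = 0.01298      # 12.98 Wh battery
STREAM_WH_PER_HOUR = 188          # video streaming in Europe (Carbon Trust)

daily_kwh = CHATGPT_YEARLY_KWH / 365            # ~1.24 million kWh/day
phones_per_day = daily_kwh / IPHONE_BATTERY_KWH # ~95.7 million iPhones
hourly_wh = CHATGPT_YEARLY_KWH / 8760 * 1000    # ~51.8 million Wh/hour
stream_hours = hourly_wh / STREAM_WH_PER_HOUR   # ~275 thousand hours
```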
And here’s where it gets even more interesting. Using the latest EIA data, we identified twenty-four countries and territories, including Gibraltar, Grenada, Dominica, Haiti, and Saint Lucia, which separately consume less energy annually than ChatGPT uses solely to process user queries. In fact, based on our calculations, ChatGPT’s yearly energy usage to handle prompts could power Ireland or Serbia, each consuming approximately 31 billion kilowatt-hours per year, for five days and eight hours. For nations with higher energy demands, like Finland or Belgium, where the annual electricity consumption is roughly 80 billion kilowatt-hours, it would keep the lights on for two days and two hours.
Even energy-intensive countries such as the United Kingdom and Italy, where annual consumption ranges between 287 and 298 billion kilowatt-hours, could stay powered for up to fourteen hours on ChatGPT’s yearly energy usage for responding to prompts. Amazingly, ChatGPT’s yearly energy consumption to answer questions could even keep Canada powered for seven hours or provide electricity for the entire United States for one hour.
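The country comparisons all follow one formula: divide ChatGPT's yearly total by the country's hourly draw (its annual consumption spread over 8,760 hours). A sketch using the approximate EIA consumption figures quoted above:

```python
def runtime_on_chatgpt_budget(annual_country_kwh, chatgpt_kwh=453_600_000):
    """How long a country's grid could run on ChatGPT's yearly query energy."""
    hours = chatgpt_kwh / (annual_country_kwh / 8760)  # country's hourly draw
    return divmod(round(hours), 24)                    # (days, hours)

# Approximate annual consumption in kWh, per the EIA figures cited above.
for name, kwh in {"Ireland": 31e9, "Finland": 80e9,
                  "United Kingdom": 287e9}.items():
    days, hours = runtime_on_chatgpt_budget(kwh)
    print(f"{name}: {days} days, {hours} hours")
```

This reproduces the figures in the text: 5 days and 8 hours for Ireland, 2 days and 2 hours for Finland, and about 14 hours for the United Kingdom.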
The Training Costs of ChatGPT
Training large language models (LLMs) is a highly energy-intensive process as well. During this phase, the AI model “learns” by analysing large amounts of data and examples, which can take anywhere from a few minutes to several months depending on data volume and model complexity. Throughout training, the CPUs and GPUs, the chips that churn through these large datasets, run nonstop, consuming significant amounts of energy.
For instance, training OpenAI’s GPT-3, with its 175 billion parameters, took about 34 days and used roughly 1,287,000 kilowatt-hours of electricity. But as models evolve and become more complex, their energy demand increases as well. Training the GPT-4 model, with over 1 trillion parameters, consumed around 62,318,800 kilowatt-hours of electricity over 100 days, roughly 48 times as much as GPT-3.
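The jump is even starker per day of training, since GPT-4's run was also three times longer. A quick check on the figures above:

```python
# Total and per-day training energy for GPT-3 vs GPT-4 (figures cited above).
GPT3_KWH, GPT3_DAYS = 1_287_000, 34
GPT4_KWH, GPT4_DAYS = 62_318_800, 100

total_ratio = GPT4_KWH / GPT3_KWH    # ~48x more energy overall
gpt3_daily = GPT3_KWH / GPT3_DAYS    # ~37,900 kWh per training day
gpt4_daily = GPT4_KWH / GPT4_DAYS    # ~623,000 kWh per training day
daily_ratio = gpt4_daily / gpt3_daily  # ~16x more energy per day
```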
In fact, it is the new training method that truly sets the o1 model apart from its predecessors. It was developed using a new optimisation algorithm and a dataset specifically tailored for the model. Unlike previous GPT models that mimic patterns from their training data, o1 was trained to solve problems on its own using reinforcement learning, a technique that teaches the system through rewards and penalties. It then uses a “chain of thought” to process queries, similar to how humans tackle problems step-by-step. This approach allows o1 to refine its thinking process, experiment with different strategies, and learn from its mistakes.
Methodology
To estimate the number of queries ChatGPT handles each week, the team at BestBrokers began with its 200 million weekly users and assumed that, on average, each user asks the chatbot 15 questions per week. Bear in mind that this is a rough assumption: some users may not engage daily, while others might ask dozens of questions every day, so the actual number of queries could be much higher.
With this estimate in hand, we calculated the weekly electricity consumption of ChatGPT by multiplying the total number of weekly queries by the estimated energy usage per prompt, which is 0.0029 kWh, according to the Electric Power Research Institute (EPRI). From there, we extrapolated the energy consumption on a monthly and yearly basis. To determine the associated costs of this electricity usage, we applied the average U.S. commercial electricity rate of $0.131 per kWh as of June 2024.
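The whole methodology condenses into one small function, with the article's assumptions exposed as parameters so readers can swap in their own estimates:

```python
def chatgpt_energy_estimate(users=200_000_000, queries_per_user_per_week=15,
                            kwh_per_query=0.0029, rate_usd_per_kwh=0.131):
    """End-to-end version of the estimate described in this methodology:
    weekly queries -> daily queries -> yearly energy -> yearly cost."""
    weekly_queries = users * queries_per_user_per_week
    yearly_kwh = weekly_queries / 7 * 365 * kwh_per_query
    return yearly_kwh, yearly_kwh * rate_usd_per_kwh

yearly_kwh, yearly_cost = chatgpt_energy_estimate()
# ~453.6 million kWh and ~$59.4 million with the defaults above
```

Doubling the assumed queries per user, for example, simply doubles both outputs, which shows how sensitive the headline numbers are to that one assumption.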