AI’s Power Demand: Calculating ChatGPT’s electricity consumption for handling over 365 billion user queries every year

AI startup OpenAI recently revealed that its groundbreaking chatbot, ChatGPT, has reached 300 million weekly active users, double the number it had in September 2023. This rapid growth has made ChatGPT one of the fastest-growing and most popular apps of all time. Beyond answering basic questions, this powerful tool can compose entire essays in seconds, engage in human-like conversations, generate images, solve complex math problems, translate texts, and even write code. However, ChatGPT consumes a lot of energy in the process, reportedly nearly ten times as much as a traditional Google search.

With this in mind, the team at BestBrokers decided to put into perspective the considerable amount of electric power OpenAI’s chatbot uses to respond to prompts on a yearly basis. We also looked at what that would cost at the average U.S. commercial electricity rate of $0.132 per kWh as of October 2024 (the latest rates published by the U.S. Energy Information Administration). After running the numbers, it turned out that for answering questions alone, ChatGPT consumes a whopping 1.059 billion kilowatt-hours on average every year, which amounts to an expenditure of $139.7 million.

I’ll ChatGPT it

When you submit a query to a trained AI model like ChatGPT, the process is known as inference. This involves passing your query through the model’s parameters, which allows the model to spot a pattern and generate a response from the available data. Estimating the energy required for that process is a bit tricky, however, as it depends on many variables, such as query length, number of users, and how efficiently the model runs. In most cases, this information is kept private. Still, we can make some rough calculations to get a sense of the amount of electricity that will be needed.

Apparently, each time you ask ChatGPT a question, it uses about 0.0029 kilowatt-hours of electricity. This is nearly ten times more than the energy needed for a typical Google search, which consumes about 0.0003 kilowatt-hours per query, according to the Electric Power Research Institute (EPRI).

On December 4, 2024, OpenAI posted on X that ChatGPT now boasted 300 million weekly active users, double the number reported by CEO Sam Altman in September 2023. Moreover, these 300 million users are generating 1 billion queries per day. To process these queries and generate answers, the chatbot uses roughly 2.9 million kilowatt-hours of energy every day. For context, the average U.S. household uses about 29 kilowatt-hours of electricity per day. This means ChatGPT uses as much electricity in a day as about 100,000 typical households.
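
The daily figures are easy to check with a quick back-of-the-envelope sketch, using the per-query and per-household numbers cited above:

```python
# Back-of-the-envelope check of ChatGPT's daily inference energy use.
QUERIES_PER_DAY = 1_000_000_000      # OpenAI, December 2024
KWH_PER_QUERY = 0.0029               # EPRI estimate
HOUSEHOLD_KWH_PER_DAY = 29           # average U.S. household

daily_kwh = QUERIES_PER_DAY * KWH_PER_QUERY
household_multiple = daily_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"{daily_kwh:,.0f} kWh per day")           # ~2.9 million kWh
print(f"{household_multiple:,.0f} households")   # ~100,000 households
```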

Over the course of a year, ChatGPT’s total energy consumption works out to 1,058.5 GWh, enough to respond to about 365 billion prompts. This massive use of electricity is equivalent to the power consumption of a small country: according to the U.S. Energy Information Administration, Barbados consumed around a thousand gigawatt-hours in 2022, just under what ChatGPT uses to process questions and generate answers.

At the average U.S. commercial electricity rate of $0.132 per kWh as of October 2024, this translates to an annual electricity cost of about $139.72 million, with each query costing approximately $0.00038. Additionally, as OpenAI continues to enhance its chatbots with new features like audio, video, and image generation, as seen in the GPT-4o model, these figures could climb even higher. OpenAI has also developed a new series of reasoning AI models, including the recently released “OpenAI o1”. These models are designed to spend more time “thinking” before responding, similar to how humans approach complex problems. This enhanced reasoning ability allows them to tackle more difficult tasks but will likely lead to higher costs and greater energy consumption.
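
The annual totals follow directly from the daily figure and the average commercial rate quoted above; a minimal sketch:

```python
# Annual energy and cost of answering prompts, from the figures in the text.
daily_kwh = 1_000_000_000 * 0.0029            # 2.9 million kWh per day
annual_kwh = daily_kwh * 365                  # ~1.0585 billion kWh (1,058.5 GWh)
rate_usd_per_kwh = 0.132                      # U.S. commercial average, Oct 2024

annual_cost = annual_kwh * rate_usd_per_kwh   # ~$139.72 million per year
cost_per_query = 0.0029 * rate_usd_per_kwh    # ~$0.00038 per query
```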

Putting ChatGPT’s massive energy usage for handling user prompts into perspective

By the end of 2023, the U.S. Department of Energy reported approximately 3.3 million electric cars on the roads in the United States. According to the EV database, the average electric car consumes 0.189 kilowatt-hours per kilometer and can travel about 378 kilometers on a full charge, which means it has a battery that can store roughly 71.4 kilowatt-hours of energy. Therefore, we can calculate that charging all the electric vehicles in the U.S. once would require 235.6 million kilowatt-hours. If we take the total energy ChatGPT uses in a year to generate responses and divide it by that number, it turns out we could fully charge every electric vehicle in the U.S. about 4.5 times.
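
A short sketch of the EV comparison, using the per-kilometer, range, and fleet figures quoted above (the article rounds the battery size to 71.4 kWh, so the fleet total lands near its 235.6 million kWh figure):

```python
# Could ChatGPT's annual inference energy charge every U.S. electric car?
EV_COUNT = 3_300_000                  # U.S. DOE, end of 2023
BATTERY_KWH = 0.189 * 378             # ~71.4 kWh per full charge

fleet_charge_kwh = EV_COUNT * BATTERY_KWH               # ~236 million kWh
full_fleet_charges = 1_058_500_000 / fleet_charge_kwh   # ~4.5 times
```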

Now, let’s look at this from a household perspective. According to the Energy Information Administration (EIA), the average home in the United States consumes about 10,500 kilowatt-hours of electricity per year. That means the energy ChatGPT uses each year to handle requests could power 100,810 U.S. homes for an entire year. While that’s only about 0.08% of the 131 million households in the country, as per the latest Census data, it’s still a massive amount of energy, especially when you consider that the U.S. ranks third globally in terms of the number of households.

Think about smartphones next. The iPhone 15, for example, has a battery capacity of 12.98 watt-hours, so charging it once a day for a whole year takes about 4.74 kilowatt-hours. Compare this to ChatGPT’s yearly energy-use figure, and we find that the electricity for the chatbot could fully charge 223.4 million iPhones every single day, for an entire year. That’s an incredible number of phones kept powered up, simply from the energy ChatGPT uses to generate responses.
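
The smartphone figure follows from the battery capacity quoted above:

```python
# How many iPhones could be charged daily for a year on that energy?
IPHONE15_BATTERY_WH = 12.98
yearly_charges_kwh = IPHONE15_BATTERY_WH * 365 / 1000   # ~4.74 kWh per phone
phones = 1_058_500_000 / yearly_charges_kwh             # ~223.4 million phones
```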

Furthermore, according to the Carbon Trust, which is a UK-based non-profit dedicated to helping businesses cut carbon emissions, one hour of video streaming in Europe consumes about 188 watt-hours of energy. In contrast, ChatGPT spitting out information for just one hour uses approximately 120.83 million watt-hours. That means to match the energy consumption of ChatGPT’s hourly operation, you’d need to stream video for a whopping 642,730 hours, which is more than 73 years.
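
The streaming comparison can be reproduced from the hourly figures above:

```python
# Video streaming hours equivalent to one hour of ChatGPT operation.
chatgpt_wh_per_hour = 2_900_000 * 1000 / 24     # ~120.83 million Wh
STREAMING_WH_PER_HOUR = 188                     # Carbon Trust, Europe

streaming_hours = chatgpt_wh_per_hour / STREAMING_WH_PER_HOUR   # ~642,730 h
streaming_years = streaming_hours / (24 * 365)                  # ~73 years
```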

And here’s where it gets even more interesting. Using the latest EIA data, we identified 40 countries and territories, including Barbados, Fiji, Greenland, Gibraltar, Dominica, and Haiti, each of which consumes less energy annually than ChatGPT uses solely to process user queries. In fact, based on our calculations, ChatGPT’s yearly energy usage to handle prompts could power Switzerland or Singapore, each consuming approximately 57 billion kilowatt-hours per year, for nearly a week (163 hours, to be exact). For nations with higher energy demands, like Finland or Belgium, where the annual electricity consumption is roughly 80 billion kilowatt-hours, it would keep the lights on for nearly five days (about 116 hours).

Even energy-intensive countries such as the United Kingdom and Italy, where annual consumption ranges between 287 and 298 billion kilowatt-hours, could stay powered for up to 32 hours on ChatGPT’s yearly energy usage for responding to prompts. Amazingly, ChatGPT’s yearly energy consumption to answer questions could even keep Canada powered for seventeen hours or provide electricity for the entire United States for two hours.
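
All of these country comparisons use the same formula: divide ChatGPT’s annual consumption by the country’s average hourly consumption. A small helper makes that explicit (the 57 and 287 billion kWh figures are the ones quoted above):

```python
CHATGPT_ANNUAL_KWH = 1_058_500_000
HOURS_PER_YEAR = 8_760

def hours_powered(country_annual_kwh: float) -> float:
    """Hours a country could run on ChatGPT's annual query energy."""
    return CHATGPT_ANNUAL_KWH / (country_annual_kwh / HOURS_PER_YEAR)

switzerland_hours = hours_powered(57e9)    # ~163 hours, nearly a week
uk_hours = hours_powered(287e9)            # ~32 hours
```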

The Training Costs of ChatGPT

Training large language models (LLMs) is a highly energy-intensive process as well. During this phase, the AI model “learns” by analysing large amounts of data and examples. This can take anywhere from a few minutes to several months depending on data volume and model complexity. Throughout the training, the CPUs and GPUs (the processors that crunch the training data) run nonstop, consuming significant amounts of energy.

For instance, training OpenAI’s GPT-3, with its 175 billion parameters, took about 34 days and used roughly 1,287,000 kilowatt-hours of electricity. But as models evolve and become more complex, their energy demand increases as well. Training the GPT-4 model, with over 1 trillion parameters, consumed around 62,318,800 kilowatt-hours of electricity over 100 days, 48 times more than GPT-3.
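
The generation-over-generation jump is easy to verify from the two figures above:

```python
# Training energy: GPT-4 vs GPT-3, per the figures in the text.
GPT3_TRAINING_KWH = 1_287_000      # ~34 days of training
GPT4_TRAINING_KWH = 62_318_800     # ~100 days of training

ratio = GPT4_TRAINING_KWH / GPT3_TRAINING_KWH   # ~48x more energy
```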

In fact, it is the new training method that truly sets the o1 model apart from its predecessors. It was developed using a new optimisation algorithm and a dataset specifically tailored for the model. Unlike previous GPT models that mimic patterns from their training data, o1 was trained to solve problems on its own using reinforcement learning, a technique that teaches the system through rewards and penalties. It then uses a “chain of thought” to process queries, similar to how humans tackle problems step-by-step. This approach allows o1 to refine its thinking process, experiment with different strategies, and learn from its mistakes.

Estimating OpenAI’s Profits from Paid Subscriptions

While the 1 billion daily user queries may cost OpenAI close to $140 million per year, this is nothing compared to the company’s revenues from paid subscriptions. In September, OpenAI COO Brad Lightcap said that ChatGPT had exceeded 11 million paying subscribers. Around a million of those were business-oriented plans, i.e. Enterprise and Team users, while the rest, roughly 10 million people, were paying for the Plus plan.

On December 5, the new $200/month Pro plan was made available, which gave users unlimited access to the o1 and o1-mini models, GPT-4o, and Advanced Voice. It also included the o1 pro mode model, as well as unlimited use of the Sora AI video creator and higher rate limits on API calls. While the $200 monthly price may seem excessive for most users, it is probably a great deal for software developers or users who create videos using the (almost) unlimited AI capabilities under the Pro plan.

What came as a surprise early in 2025 was Sam Altman admitting in a post on X that the company was losing money on the Pro subscriptions. It turns out that people were using the Pro perks more than the company had expected. Now, OpenAI has not disclosed the number of Pro subscriptions or how much the services included in the plan actually cost them. So, how much is OpenAI profiting from paid subscriptions?

Based on past statistics (1 million Business subscribers with 75% of those being Team users as of September 2024, plus 10 million Plus subscribers), we estimate that the number of paying subscribers in January 2025 is around 11.11 million. Considering the hefty price of the Pro plan and the fact that it was launched in December of last year, we believe there are roughly 10 thousand Pro users, generating $2 million in monthly revenue for the company. Here is a breakdown of what OpenAI might be making from paid subscriptions as of January 10, 2025:

  • 250 thousand Enterprise users paying $60/month
  • 800 thousand Team users paying $25/month
  • 10 thousand Pro users paying $200/month
  • 10.05 million Plus users paying $20/month

This means that the company makes roughly $238 million from paid subscriptions every month or around $2.856 billion per year. To put things in perspective, this is more than 20 times the estimated annual electricity costs from processing user queries. The annual energy costs ($139.7 million) could easily be covered with less than a month’s worth of revenue. These are, of course, rough estimates based on publicly available information; very little has been confirmed by the company.
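
The revenue estimate is just the subscriber breakdown above multiplied out (the subscriber counts are the article’s estimates, not confirmed figures):

```python
# Estimated monthly subscription revenue from the tier breakdown above.
tiers = {
    "Enterprise": (250_000, 60),     # (subscribers, USD per month)
    "Team":       (800_000, 25),
    "Pro":        (10_000, 200),
    "Plus":       (10_050_000, 20),
}

monthly_revenue = sum(users * price for users, price in tiers.values())
annual_revenue = monthly_revenue * 12

print(f"${monthly_revenue:,} per month")   # $238,000,000 per month
print(f"${annual_revenue:,} per year")     # $2,856,000,000 per year
```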

Bear in mind that the estimations for the number of paying subscribers are quite conservative, including for the Pro users. The actual electricity costs, on the other hand, are likely much lower than our estimate, as the company almost certainly generates some of its own renewable energy, such as with photovoltaics, and buys electricity from the grid at preferential rates.

Methodology

To estimate the number of queries ChatGPT handles, the team at BestBrokers started from OpenAI’s announced figures: 300 million weekly active users generating 1 billion queries per day. Bear in mind that usage varies widely; some users may not engage daily, while others ask dozens of questions every day, so the actual number of queries could be even higher.

With this figure in hand, we calculated the daily electricity consumption of ChatGPT by multiplying the total number of queries by the estimated energy usage per prompt, which is 0.0029 kWh, according to the Electric Power Research Institute (EPRI). From there, we extrapolated the energy consumption on a monthly and yearly basis. To determine the associated costs of this electricity usage, we applied the average U.S. commercial electricity rate of $0.132 per kWh as of October 2024.
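
The whole methodology reduces to a few lines, reproducing the article’s headline numbers:

```python
# End-to-end reproduction of the article's headline figures.
queries_per_day = 1_000_000_000           # OpenAI, December 2024
kwh_per_query = 0.0029                    # EPRI estimate
rate_usd_per_kwh = 0.132                  # U.S. commercial average, Oct 2024

daily_kwh = queries_per_day * kwh_per_query      # 2.9 million kWh per day
annual_kwh = daily_kwh * 365                     # 1,058.5 GWh per year
annual_cost_usd = annual_kwh * rate_usd_per_kwh  # ~$139.7 million per year
```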