AI’s Power Demand: Calculating ChatGPT’s electricity consumption for handling over 78 billion user queries every year

Nearly two years have passed since OpenAI launched its groundbreaking text-generating AI chatbot ChatGPT, which quickly became the fastest-growing consumer internet app in history, hitting an estimated 100 million monthly users within just two months. Beyond answering basic questions, this powerful tool can compose entire essays in seconds, engage in human-like conversations, solve complex math problems, translate texts, and even write code. However, ChatGPT consumes a lot of energy in the process, reportedly nearly ten times more than traditional Google searches.

With this in mind, the team at BestBrokers decided to put into perspective just how much electric power OpenAI’s chatbot uses each year to respond to prompts, and what that would cost at the average U.S. commercial electricity rate per kWh as of June 2024. After running the numbers, it turned out that for answering questions alone, ChatGPT consumes a whopping 226.82 million kilowatt-hours on average every year, which amounts to an electricity bill of $29.71 million.

I’ll ChatGPT it

When you submit a query to an AI model like ChatGPT, the process is known as inference: your query is passed through the model’s parameters to generate a response. Estimating the energy required for that process is tricky, however, as it depends on many variables, such as query length, number of users, and how efficiently the model runs, and in most cases this information is kept private. Still, we can make some rough calculations to get a sense of how much electricity is needed.

According to the Electric Power Research Institute (EPRI), each time you ask ChatGPT a question, it uses about 0.0029 kilowatt-hours of electricity. This is nearly ten times the energy needed for a typical Google search, which consumes about 0.0003 kilowatt-hours per query.

In November 2023, OpenAI CEO Sam Altman reported that 100 million users worldwide were interacting with ChatGPT each week. If we assume that each user asks around 15 questions weekly, keeping in mind that some users engage only occasionally while others ask dozens of questions daily, this adds up to roughly 1.5 billion queries per week. Spread out over the days of the week, this results in more than 214 million daily requests, consuming over half a million kilowatt-hours of energy. For context, the average U.S. household uses about 29 kilowatt-hours of electricity per day, which means ChatGPT consumes roughly 21,600 times as much power each day as a typical household.
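For readers who want to check the arithmetic, here is a minimal Python sketch of the step above. The constants are the figures quoted in this article; the household average is the EIA’s 10,500 kWh per year spread over 365 days.

```python
# Back-of-the-envelope reproduction of the daily figures above.
WEEKLY_USERS = 100_000_000          # Altman, November 2023
QUERIES_PER_USER_PER_WEEK = 15      # this article's assumption
KWH_PER_QUERY = 0.0029              # EPRI estimate

weekly_queries = WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK   # 1.5 billion
daily_queries = weekly_queries / 7                          # ~214 million
daily_kwh = daily_queries * KWH_PER_QUERY                   # ~621,429 kWh

HOUSEHOLD_KWH_PER_DAY = 10_500 / 365   # EIA annual average, ~28.8 kWh/day

print(f"Daily queries: {daily_queries:,.0f}")
print(f"Daily energy:  {daily_kwh:,.0f} kWh")
print(f"Household multiple: {daily_kwh / HOUSEHOLD_KWH_PER_DAY:,.0f}x")  # ~21,600x
```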

Over the course of a year, ChatGPT’s total energy consumption works out to 226,821,615 kilowatt-hours, enough to respond to about 78 billion prompts. At the average U.S. commercial electricity rate of $0.131 per kWh as of June 2024, this translates to about $29,713,632 spent on electricity annually, or roughly $0.00038 per query. And as new versions of ChatGPT arrive and features like audio and video generation come into play, these figures could climb even higher.
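Annualising the same sketch gives the headline numbers (small differences from the article’s figures are down to rounding):

```python
daily_queries = 1_500_000_000 / 7   # from the weekly estimate above
KWH_PER_QUERY = 0.0029              # EPRI estimate
RATE_USD_PER_KWH = 0.131            # avg. U.S. commercial rate, June 2024

annual_queries = daily_queries * 365            # ~78.2 billion prompts
annual_kwh = annual_queries * KWH_PER_QUERY     # ~226.8 million kWh
annual_cost = annual_kwh * RATE_USD_PER_KWH     # ~$29.7 million

print(f"Annual prompts: {annual_queries:,.0f}")
print(f"Annual energy:  {annual_kwh:,.0f} kWh")
print(f"Annual cost:    ${annual_cost:,.0f}")
print(f"Cost per query: ${annual_cost / annual_queries:.5f}")   # ~$0.00038
```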

Putting ChatGPT’s massive energy usage for handling user prompts into perspective

According to EV Database, the average electric car consumes 0.191 kilowatt-hours per kilometre and can travel about 379 kilometres on a full charge, which implies a battery that stores roughly 72.4 kilowatt-hours of energy. If we take all the energy ChatGPT uses in a year to generate responses and divide it by the capacity of an average EV battery, it turns out that it could fully charge 3,133,371 electric vehicles. That’s nearly 95% of the 3.3 million electric cars on the road in the United States at the end of 2023, as reported by the U.S. Department of Energy.
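The EV comparison boils down to two divisions; here is a quick sketch using the figures above:

```python
ANNUAL_KWH = 226_821_615        # ChatGPT's estimated yearly consumption
KWH_PER_KM = 0.191              # average EV efficiency (EV Database)
RANGE_KM = 379                  # average range on a full charge

battery_kwh = KWH_PER_KM * RANGE_KM         # ~72.4 kWh
full_charges = ANNUAL_KWH / battery_kwh     # ~3.13 million full charges
US_EVS_END_2023 = 3_300_000                 # U.S. Department of Energy

print(f"Full charges: {full_charges:,.0f}")
print(f"Share of U.S. EV fleet: {full_charges / US_EVS_END_2023:.0%}")  # ~95%
```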

Now, let’s look at this from a household perspective. According to the Energy Information Administration (EIA), the average home in the United States consumes about 10,500 kilowatt-hours of electricity per year. That means the energy ChatGPT uses each year to handle requests could power 21,602 U.S. homes for an entire year. While that’s only about 0.02% of the 131 million households in the country, as per the latest Census data, it’s still a massive amount of energy, especially when you consider that the U.S. ranks third globally in terms of the number of households.
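The household comparison works the same way:

```python
ANNUAL_KWH = 226_821_615
HOME_KWH_PER_YEAR = 10_500      # EIA average U.S. household
US_HOUSEHOLDS = 131_000_000     # latest Census figure

homes_powered = ANNUAL_KWH / HOME_KWH_PER_YEAR   # ~21,602 homes
print(f"Homes powered for a year: {homes_powered:,.0f}")
print(f"Share of U.S. households: {homes_powered / US_HOUSEHOLDS:.2%}")  # ~0.02%
```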

Think about smartphones next. The latest iPhone 15, for example, has a battery capacity of 12.98 watt-hours, which works out to about 4.74 kilowatt-hours if you charge it from empty every day for a whole year. Compare this to ChatGPT’s yearly energy-use figure and you’ll find that the chatbot could fully charge 47.9 million iPhones every single day, for an entire year. That’s an incredible number of phones kept powered up, simply from the energy ChatGPT uses to generate responses. Furthermore, according to the Carbon Trust, a UK-based non-profit dedicated to helping businesses cut carbon emissions, one hour of video streaming in Europe consumes about 188 watt-hours of energy. ChatGPT, in contrast, burns through approximately 25.9 million watt-hours in a single hour of operation. To match that hourly consumption, you’d need to stream video for a whopping 137,728 hours.
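Both the phone and the streaming comparisons follow from the same annual total:

```python
ANNUAL_KWH = 226_821_615
IPHONE_BATTERY_WH = 12.98       # iPhone 15 battery capacity

iphone_year_kwh = IPHONE_BATTERY_WH * 365 / 1000    # ~4.74 kWh per phone per year
iphones = ANNUAL_KWH / iphone_year_kwh              # ~47.9 million phones

hourly_wh = ANNUAL_KWH / (365 * 24) * 1000          # ~25.9 million Wh per hour
STREAMING_WH_PER_HOUR = 188                         # Carbon Trust, Europe
streaming_hours = hourly_wh / STREAMING_WH_PER_HOUR # ~137,700 hours

print(f"iPhones charged daily for a year: {iphones:,.0f}")
print(f"Streaming hours per ChatGPT hour: {streaming_hours:,.0f}")
```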

And here’s where it gets even more interesting: ChatGPT’s yearly energy consumption surpasses the entire electricity usage of twelve individual countries and territories, including Gibraltar, Grenada, Dominica, Samoa, and the British Virgin Islands, each using around 200 million kilowatt-hours per year, according to the latest EIA data. Additionally, according to our calculations, the energy ChatGPT uses in a year to answer user queries could power all of Finland or Belgium for an entire day. In Ireland, it would keep the lights on and the country running for over two days.
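To sanity-check the country comparison, here is the same division with rough national consumption figures plugged in. Note that the annual totals below are round-number assumptions for illustration, not values taken from the EIA data cited above:

```python
ANNUAL_GWH = 226_821_615 / 1_000_000    # ChatGPT's yearly total, ~226.8 GWh

# Approximate annual electricity consumption in TWh (assumed for illustration).
COUNTRY_TWH = {"Finland": 80, "Belgium": 81, "Ireland": 30}

for country, twh in COUNTRY_TWH.items():
    daily_gwh = twh * 1_000 / 365
    print(f"{country}: ~{ANNUAL_GWH / daily_gwh:.1f} days")   # ~1, ~1, ~2.8
```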

The Training Costs of ChatGPT

Training large language models (LLMs) is a highly energy-intensive process as well. During this phase, the AI model “learns” by analysing large amounts of data and examples, which can take anywhere from a few minutes to several months depending on data volume and model complexity. Throughout the training, the CPUs and GPUs, the processors that churn through these large datasets, run nonstop, consuming significant amounts of energy.

For instance, training OpenAI’s GPT-3, with its 175 billion parameters, took about 34 days and used roughly 1,287,000 kilowatt-hours of electricity. But as models evolve and become more complex, their energy demands grow as well. Training the GPT-4 model, with over 1 trillion parameters, consumed around 62,318,800 kilowatt-hours of electricity over 100 days, roughly 48 times as much as GPT-3.
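That ratio, plus what a single training run would cost at the commercial rate used elsewhere in this article (an illustrative calculation on our part, not a figure from OpenAI), checks out as follows:

```python
GPT3_TRAINING_KWH = 1_287_000       # ~34 days, 175B parameters
GPT4_TRAINING_KWH = 62_318_800      # ~100 days, >1T parameters
RATE_USD_PER_KWH = 0.131            # avg. U.S. commercial rate, June 2024

print(f"GPT-4 vs GPT-3: ~{GPT4_TRAINING_KWH / GPT3_TRAINING_KWH:.0f}x")   # ~48x
# Hypothetical electricity bill for one GPT-4 training run at that rate:
print(f"~${GPT4_TRAINING_KWH * RATE_USD_PER_KWH:,.0f}")                   # ~$8.2 million
```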

Methodology

To estimate the number of queries ChatGPT handles each week, the team at BestBrokers began with its 100 million weekly users. We assumed that, on average, each user asks the chatbot 15 questions per week. This is a rough assumption: while some users may not engage daily, others ask dozens of questions every day, so the actual number of queries could be much higher.

With this estimate in hand, we calculated the weekly electricity consumption of ChatGPT by multiplying the total number of weekly queries by the estimated energy usage per prompt, which is 0.0029 kWh, according to the Electric Power Research Institute (EPRI). From there, we extrapolated the energy consumption on a monthly and yearly basis. To determine the associated costs of this electricity usage, we applied the average U.S. commercial electricity rate of $0.131 per kWh as of June 2024.
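Putting the whole methodology together, here is a self-contained sketch of the estimate; it uses 365/7 weeks per year, which reproduces the article’s figure of roughly 226.8 million kWh up to rounding:

```python
def estimate_chatgpt_energy(weekly_users=100_000_000,
                            queries_per_user_per_week=15,
                            kwh_per_query=0.0029,      # EPRI
                            usd_per_kwh=0.131):        # June 2024 commercial rate
    """End-to-end version of the estimate described in this methodology."""
    weekly_queries = weekly_users * queries_per_user_per_week
    weekly_kwh = weekly_queries * kwh_per_query
    yearly_kwh = weekly_kwh * 365 / 7          # extrapolate to a full year
    return {
        "weekly_queries": weekly_queries,
        "weekly_kwh": weekly_kwh,
        "monthly_kwh": yearly_kwh / 12,
        "yearly_kwh": yearly_kwh,                      # ~226.8 million kWh
        "yearly_cost_usd": yearly_kwh * usd_per_kwh,   # ~$29.7 million
    }

print(estimate_chatgpt_energy())
```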