Training and running AI requires significant computing power, and with it, vast amounts of electricity.

The rise of artificial intelligence (AI) is accelerating the continuous expansion and upgrading of data centers, leading to a corresponding surge in electricity consumption, with the demand for immense power only expected to grow.

Large AI models are the product of "big data + big computing power + strong algorithms". How much electricity does it take to train one?

Take the large language model GPT-3 developed by OpenAI, which has 175 billion parameters. To train it, researchers ran 1,024 GPUs continuously for about a month.

Mosharaf Chowdhury, associate professor of electrical engineering and computer science at the University of Michigan, estimated that training GPT-3 consumed about 1,287 megawatt-hours (1 megawatt-hour equals 1,000 kilowatt-hours) of electricity. This figure is equivalent to the electricity consumption of an average American household over 120 years.
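The 120-year comparison can be checked with back-of-the-envelope arithmetic. The average-household figure used below (roughly 10,700 kWh per year, in line with U.S. averages) is an assumption for illustration, not a number from the article:

```python
# Back-of-the-envelope check of the GPT-3 training comparison.
# Assumption (not from the article): an average U.S. household
# uses roughly 10,700 kWh of electricity per year.
training_mwh = 1_287                    # reported GPT-3 training consumption, MWh
training_kwh = training_mwh * 1_000     # 1 MWh = 1,000 kWh
household_kwh_per_year = 10_700         # assumed average U.S. household

years = training_kwh / household_kwh_per_year
print(f"{years:.0f} years")             # ≈ 120 years
```

The arithmetic only works if the training figure is in megawatt-hours; stated in terawatt-hours it would be off by a factor of a million.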

GPT-3 was introduced four years ago, and the scale of large language models has since grown rapidly. GPT-4, launched in 2023, is reported to have about 1.76 trillion parameters, ten times that of GPT-3. GPT-5, expected to be released next year, is anticipated to be faster and more capable at language processing, with estimated training energy consumption that is more substantial still.

Training, however, is only the initial, short-term phase; usage is a long-term process. As applications spread and user numbers grow, energy consumption will keep accumulating.

A report released in January of this year by the Paris-based International Energy Agency (IEA) stated that a single ChatGPT request consumes 2.9 watt-hours on average, enough to light a 60-watt bulb for about three minutes. That is nearly 10 times the average energy consumption of a Google search.

ChatGPT reportedly handles about 200 million requests per day, consuming more than 500,000 kilowatt-hours of electricity daily, equivalent to the daily electricity consumption of 17,000 average American households. Over a 365-day year, that comes to 182.5 million kilowatt-hours.
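These figures follow directly from the article's own inputs, as a quick sketch shows:

```python
# Reproduce the article's ChatGPT energy figures from its own inputs.
requests_per_day = 200_000_000
wh_per_request = 2.9                       # IEA estimate, watt-hours per request

daily_kwh = requests_per_day * wh_per_request / 1_000
annual_kwh = 500_000 * 365                 # using the article's rounded daily figure

print(f"daily:  {daily_kwh:,.0f} kWh")     # 580,000 kWh ("over 500,000")
print(f"annual: {annual_kwh:,.0f} kWh")    # 182,500,000 kWh
```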

Google performs about 9 billion searches per day. If Google extensively integrates generative AI into searches, the IEA estimates an additional need of 10 terawatt-hours of electricity per year (1 terawatt-hour equals 1 billion kilowatt-hours).

Data centers are the foundational infrastructure for AI, providing the computing resources, storage capacity, and network bandwidth needed for AI applications to run and develop efficiently.

Additionally, data centers must provide powerful cooling systems to maintain appropriate temperatures as thousands of servers and chips run continuously day and night, generating a significant amount of heat. Therefore, the electricity consumption of data centers themselves is quite substantial.

According to the IEA report, global data center electricity usage in 2022 was estimated at 460 terawatt-hours, nearly 2% of the world's total electricity demand. This consumption is not solely due to AI, however: cryptocurrency mining accounted for nearly a quarter of it, about 110 terawatt-hours in 2022.

The report indicates that data centers' power consumption is dominated by two processes: computing accounts for 40% and cooling for another 40%, with the remaining 20% going to other IT equipment.
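Applying that split to the 2022 total gives a sense of the absolute scale. This is purely illustrative, since the split and the total come from different parts of the report:

```python
# Illustrative only: apply the report's 40/40/20 split
# to the 2022 data center total of 460 TWh.
total_twh = 460
shares = {"computing": 0.40, "cooling": 0.40, "other IT equipment": 0.20}

for process, share in shares.items():
    print(f"{process}: {total_twh * share:.0f} TWh")
# computing and cooling each come to roughly 184 TWh,
# other IT equipment to roughly 92 TWh
```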

The IEA estimates that by 2026, total data center electricity usage could double to 1,000 terawatt-hours, roughly Japan's annual electricity consumption. The increase over 2022 could be equivalent at minimum to the annual electricity consumption of Sweden, and at maximum to that of Germany.

There were approximately 3,600 data centers globally in 2015, climbing to nearly 8,000 in 2021. According to statistics from the information technology website Brightlio, the number had reached 10,978 by the end of 2023 and is believed to be rising still.

The United States currently has 5,388 data centers, the most of any country. John Ketchum, CEO of NextEra Energy Inc., the world's largest wind and solar energy developer, projects that U.S. electricity demand will grow by 40% over the next twenty years, compared with 9% growth over the past two decades.

Ketchum pointed out that the main reason for the surge in demand is data centers. When asked why data centers suddenly consume so much electricity, his direct response was, “It’s AI.”

Countries worldwide are now riding the wave of "sovereign AI". Driven by differences in languages and scripts, economic development, national security needs, and the broader U.S.-China competition, countries across Asia, the Middle East, Europe, and the Americas are investing in their own AI computing facilities. Jensen Huang, founder and CEO of NVIDIA, said at the World Government Summit in Dubai in February that every country needs its own intelligence infrastructure.

The battle for global AI supremacy may come down to which countries possess enough data centers and power to support the technology. This means the energy demand of artificial intelligence and data centers will only continue to grow, with no ceiling in sight.

The world's largest cloud service providers, Amazon, Microsoft, and Google, have all announced goals of powering their data centers entirely with green energy. All three companies say they are researching ways to use less power or to balance grid demand more effectively, through measures such as improving chip and server efficiency and reducing cooling demands.

Some tech leaders believe that the key to adapting to the new situation lies in energy breakthroughs. Sam Altman, CEO of OpenAI, stated at the World Economic Forum Annual Meeting in Davos earlier this year that future iterations of AI technology would require energy breakthroughs, as the energy consumed by artificial intelligence technology will far exceed expectations. “Without breakthroughs, achieving this goal will be impossible,” he said. “It motivates us to increase our investment in nuclear fusion.”

Last month, TerraPower, an energy company founded by Microsoft co-founder Bill Gates, began constructing a next-generation nuclear power plant in Wyoming in the United States. The reactor uses sodium instead of water for cooling, a design Gates believes will "completely change" power generation.

The sustainability of artificial intelligence development is a growing concern, and energy is a crucial element of it.