
Industry Focus: How much power is consumed to train ChatGPT?


AI has re-emerged as a hot topic in the tech industry and is expected to revolutionize trillion-dollar industries from retail to medicine. But every new chatbot or image generator is built using a great deal of electricity, which means the technology could be responsible for substantial greenhouse gas emissions that contribute to global warming.

Microsoft, Google and ChatGPT maker OpenAI all use the cloud, which relies on thousands of chips in massive data center servers around the world to train AI algorithms called models and analyze data to help those algorithms “learn” how to perform tasks. The success of ChatGPT has led other companies to launch their own AI systems and chatbots or to develop products that use large AI models.

AI uses more energy than other forms of computing, and training a single model can consume more electricity than 100 U.S. households use in an entire year. Yet even as the AI industry grows rapidly, it is not transparent enough for anyone to know exactly how much electricity AI uses or how much carbon it emits. Carbon emissions can also vary widely depending on the type of power plant supplying the electricity: a data center drawing on coal- or natural gas-fired power will clearly account for far higher emissions than one supported by solar or wind.

While researchers have tallied the carbon emissions from creating individual models, and some companies have provided data on their energy use, no one has produced an overall estimate of the technology's total electricity consumption. Sasha Luccioni, a researcher at Hugging Face, whose BLOOM model is a competitor to OpenAI's GPT-3, wrote a paper quantifying BLOOM's carbon emissions. Luccioni has also sought to estimate the emissions of OpenAI's chatbot ChatGPT based on the limited set of publicly available data.

Improving transparency

Luccioni and other researchers say there needs to be greater transparency about AI models' power use and emissions. With that information, governments and companies could weigh whether it is worthwhile to use GPT-3 or other large models for goals such as studying cancer treatments or protecting indigenous languages.

Greater transparency may also bring more scrutiny, and the cryptocurrency industry offers a foretaste of what could be to come. Bitcoin has been criticized for consuming enormous amounts of electricity, as much each year as Argentina, according to the Cambridge Bitcoin Electricity Consumption Index. That insatiable demand for power prompted New York State to pass a two-year moratorium on licensing cryptocurrency miners who power their operations with fossil fuels.

GPT-3 is a single general-purpose AI program that generates language and can be put to many different uses. A research paper published in 2021 showed that training GPT-3 consumed 1.287 gigawatt-hours of electricity, roughly what 120 U.S. households use in a year. The training also produced 502 tons of carbon, equivalent to the annual emissions of 110 U.S. cars. And that is the cost of training just one program, or "model".
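As a sanity check, the household comparison follows directly from the reported energy figure. The sketch below assumes an average U.S. household consumption of roughly 10,600 kWh per year (an assumption on our part, approximating the EIA average; it is not a figure from the paper):

```python
# Back-of-the-envelope check on the GPT-3 training figure cited above.
# ASSUMPTION: ~10,600 kWh/year average U.S. household consumption
# (approximate EIA average); the 1.287 GWh figure is from the 2021 paper.

TRAINING_ENERGY_KWH = 1.287e6        # 1.287 gigawatt-hours, in kWh
HOUSEHOLD_KWH_PER_YEAR = 10_600      # assumed average U.S. household

households = TRAINING_ENERGY_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent to about {households:.0f} U.S. households for a year")
# -> about 121, consistent with the article's "about 120"
```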

While the upfront cost of powering the training of an AI model is significant, researchers have found that in some cases it amounts to only about 40% of the electricity the model consumes over its lifetime, with the rest going to actually running it. In addition, AI models are getting bigger: OpenAI's GPT-3 uses 175 billion parameters, or variables, compared with its predecessor's 1.5 billion.
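Taken at face value, those two numbers imply a rough lifetime total for a GPT-3-scale model. This is purely an illustrative calculation from the article's own figures, not a published estimate:

```python
# Illustrative only: if training is ~40% of lifetime consumption,
# the GPT-3 training figure implies the following lifetime total.
TRAINING_KWH = 1.287e6     # GPT-3 training energy (from the 2021 paper)
TRAINING_SHARE = 0.40      # training's assumed share of lifetime power

lifetime_kwh = TRAINING_KWH / TRAINING_SHARE
inference_kwh = lifetime_kwh - TRAINING_KWH
print(f"Implied lifetime: {lifetime_kwh / 1e6:.1f} GWh "
      f"({inference_kwh / 1e6:.1f} GWh of it spent serving the model)")
# -> ~3.2 GWh lifetime, ~1.9 GWh for inference
```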

OpenAI is already working on GPT-4, and models must be retrained periodically to keep their knowledge of current events up to date. "If you don't retrain a model, it won't even know what COVID-19 is," says Emma Strubell, a professor at Carnegie Mellon University and one of the first researchers to study AI's energy problem.

Another relative measure comes from Google, where researchers found that AI training accounts for 10 to 15 percent of the company's total electricity use, which was 18.3 terawatt-hours in 2021. That implies Google's AI uses about 2.3 terawatt-hours of electricity a year, roughly equivalent to the electricity used by all the households in Atlanta in one year.
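The 2.3 terawatt-hour figure is simply the midpoint of Google's stated range applied to its 2021 total, as the quick check below shows (the only inputs are the two numbers reported above):

```python
# The AI figure follows from Google's own numbers: 10-15% of 18.3 TWh.
GOOGLE_TOTAL_TWH = 18.3
low, high = 0.10 * GOOGLE_TOTAL_TWH, 0.15 * GOOGLE_TOTAL_TWH
midpoint = (low + high) / 2
print(f"AI share: {low:.1f}-{high:.1f} TWh, midpoint ~{midpoint:.1f} TWh")
# -> 1.8-2.7 TWh, midpoint ~2.3 TWh
```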

Tech Giants Make Net-Zero Commitments

Even as AI models grow larger in many cases, AI companies are also working to run them more efficiently. The largest U.S. cloud computing companies, such as Microsoft, Google and Amazon, have all made carbon-reduction or net-zero commitments. Google said in a statement that it will be net-zero across all of its operations by 2030, with the goal of running its offices and data centers entirely on carbon-free energy. Google is also using AI to improve the energy efficiency of its data centers, with the technology directly controlling the cooling systems in those facilities.

OpenAI also cited its work to improve the efficiency of the application programming interface for ChatGPT, which has helped customers reduce both electricity use and costs. "We take our responsibility to stop and reverse climate change very seriously, and we've done a lot of thinking about how to maximize our computing power," an OpenAI spokesperson said. "OpenAI runs on Azure, and we're working closely with the Microsoft team to improve the efficiency of running large language models and reduce carbon emissions."

Microsoft noted that the company is buying renewable energy and taking other steps to meet its previously announced goal of achieving net zero emissions by 2030. In a statement, Microsoft said, “As part of our commitment to creating a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI, while working to improve the efficiency of large systems for training and applications.”

Roy Schwartz, a professor at the Hebrew University of Jerusalem, worked with a team at Microsoft to measure the carbon footprint of a large AI model. He said, “Obviously, these companies don’t want to disclose what model they’re using and how much carbon it emits.”

There are ways to make AI run more efficiently. Because AI training can be done at any time, developers and data centers can schedule training for hours when electricity is cheaper or in surplus, making their operations greener, says Ben Hertz-Shargel of energy consulting firm Wood Mackenzie. Operating this way can then be marketed as a selling point, showing that they are environmentally conscious.
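To make the idea concrete, here is a minimal sketch of that kind of carbon-aware scheduling: given an hourly forecast of grid carbon intensity, pick the contiguous window with the lowest average intensity and start the training job then. The forecast values and the four-hour job length are hypothetical; in practice the forecast would come from a grid-data provider.

```python
# Minimal sketch of carbon-aware job scheduling. The hourly carbon
# intensities (gCO2/kWh) below are hypothetical illustration data.

def best_start_hour(intensity, job_hours):
    """Return the start hour minimizing the job's total carbon intensity."""
    starts = range(len(intensity) - job_hours + 1)
    return min(starts, key=lambda s: sum(intensity[s:s + job_hours]))

forecast = [420, 410, 390, 250, 180, 160, 170, 240, 380, 430, 450, 440]
start = best_start_hour(forecast, job_hours=4)
print(f"Start the training job at hour {start}")
# -> hour 4, covering the low-carbon dip (180, 160, 170, 240)
```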

Chips consume a lot of power

Most data centers use graphics processing units (GPUs) to train AI models, and these components are among the most power-hungry made by the chip industry. A report released earlier this month by Morgan Stanley analysts said large models require tens of thousands of GPUs, with training cycles ranging from a few weeks to a few months.

One of the bigger mysteries in AI is the total carbon emissions associated with the chips used. NVIDIA, the largest maker of GPUs, said that for AI tasks its chips complete work faster and are more energy-efficient overall.

"Using a GPU to accelerate AI is faster and more efficient than using a CPU," NVIDIA said in a statement. "Energy efficiency can typically be increased by a factor of 20 for certain AI workloads, and up to 300 times for the large language models essential to generative AI."

Luccioni said that while NVIDIA has disclosed data on its direct and indirect energy-related emissions, the company has not revealed further details. She believes that if NVIDIA shared that information, we might find that GPUs consume about as much electricity as a small country, "which could be maddening!"
