Carbon footprint: how big the emissions of artificial intelligence models really are


Complex language AIs, known as large language models (LLMs), have a dirty secret: they require huge amounts of energy to train and run. How much energy they really need, and how big the CO2 footprint of these models actually is, remains difficult to say.

The AI start-up Hugging Face believes it has now found a way to calculate this more accurately. The method: estimate the emissions that occur over the model's entire life cycle, not just during its training phase. This could be an important step towards more realistic CO2 footprint data from tech companies for their AI products.

In any case, the timing is auspicious: experts have long called on the AI industry to better assess the environmental impact of AI and its applications. Hugging Face's work is available as a preprint but has not yet been reviewed by independent experts. To test their new approach, the Hugging Face researchers first estimated the total emissions of their own BLOOM language model, which was released earlier this year. They added up many different parameters: the amount of energy used to train the model on a supercomputer, the energy required to manufacture and maintain the supercomputer's hardware, and the energy consumed by BLOOM after its deployment.
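To make this life-cycle framing concrete, here is a minimal back-of-envelope sketch in Python. It is not the method from the Hugging Face paper; the simple additive structure and all parameter names are illustrative assumptions, and the example numbers are merely chosen to land near the roughly 25-tonne training figure reported below, assuming a low-carbon grid.

    # Back-of-envelope life-cycle estimate (illustrative, not the paper's method).
    # All parameter names and input values are hypothetical placeholders.
    def lifecycle_emissions_kg(training_kwh: float,
                               grid_kg_co2_per_kwh: float,
                               embodied_hardware_kg: float,
                               deployment_kg_per_day: float,
                               days_deployed: int) -> float:
        """Sum training, embodied-hardware and deployment emissions in kg CO2eq."""
        training = training_kwh * grid_kg_co2_per_kwh        # operational training emissions
        deployment = deployment_kg_per_day * days_deployed   # ongoing inference emissions
        return training + embodied_hardware_kg + deployment

    # Made-up inputs: 433,000 kWh of training on a ~0.057 kg CO2/kWh grid,
    # plus assumed hardware and deployment terms.
    print(lifecycle_emissions_kg(433_000, 0.057, 11_000, 19, 100))

The point of such a sketch is only that the training term is one of several: the embodied-hardware and deployment terms can rival or exceed it.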

For the latter, the researchers used a software tool called CodeCarbon, which tracked BLOOM's carbon emissions in real time over 18 days. Based on this data, BLOOM's training alone resulted in an estimated 25 tonnes of CO2 emissions. This number doubles once the emissions from manufacturing the computer hardware, building the computing infrastructure and running BLOOM in regular operation are taken into account.
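CodeCarbon is an open-source Python package; a minimal usage sketch might look like the following. The project name and the workload function are placeholders, and this is not the exact configuration used for BLOOM.

    # Minimal CodeCarbon sketch (pip install codecarbon).
    # Project name and workload are placeholders, not the BLOOM setup.
    from codecarbon import EmissionsTracker

    tracker = EmissionsTracker(project_name="llm-inference-demo")
    tracker.start()
    run_model_inference()           # hypothetical workload function
    emissions_kg = tracker.stop()   # estimated kg CO2eq for the tracked span
    print(f"Estimated emissions: {emissions_kg:.3f} kg CO2eq")

The tracker estimates emissions from measured energy draw and the carbon intensity of the local power grid, which is why the same workload scores very differently depending on where it runs.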

Fifty tonnes of CO2 are equivalent to roughly 60 flights between London and New York. That is a lot, but other LLMs of the same size could emit far more. BLOOM was trained on a French supercomputer that runs mostly on nuclear power, which produces no direct carbon emissions. Models trained in China, Australia or parts of the US, where the energy mix relies more heavily on fossil fuels, are likely to be more polluting, at least in terms of greenhouse gases. After BLOOM's launch, Hugging Face estimated that using the model releases around 19 kilograms of CO2 per day. This is equivalent to driving about 86 kilometers in an average new American car.
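The car comparison is easy to sanity-check. Assuming an average new US car emits roughly 220 g of CO2 per kilometer (an assumption in line with EPA fleet-average figures, not a number from the article), 19 kg per day works out to about 86 km:

    # Sanity check of the car equivalence (the ~220 g/km figure is an assumption).
    daily_emissions_kg = 19           # BLOOM inference, per the article
    car_g_co2_per_km = 220            # assumed average for a new US car
    km_equivalent = daily_emissions_kg * 1000 / car_g_co2_per_km
    print(f"{km_equivalent:.0f} km")  # prints ~86 km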

For comparison: OpenAI's GPT-3 and Meta's OPT are estimated to have emitted more than 500 tonnes and 75 tonnes of CO2 respectively during training, with GPT-3's enormous emissions partly explained by the fact that it was trained on older, less efficient hardware. In both cases, the numbers are based on external estimates or, in Meta's case, on data the company itself published.

Partly because there is as yet no standardized method for measuring the CO2 emissions of AI models, the Hugging Face researchers want to provide a new impetus: "Our goal was to go beyond emissions during training and instead consider a larger part of the life cycle. With this, we want to help the AI community get a better idea of the impact on the environment and how we can reduce it," says Sasha Luccioni, lead author of the study.

The study could provide much-needed clarity on how large the CO2 footprint of large language models really is, says Lynn Kaack, an assistant professor of computer science and public policy at the Hertie School in Berlin, who was not involved in the Hugging Face research. Kaack says she was surprised by the size of the estimated life-cycle emissions. However, more needs to be done to understand the real-world environmental impact of large language models.

According to estimates, the global technology sector is responsible for 1.8 to 3.9 percent of global greenhouse gas emissions. Although only a fraction of these emissions comes from artificial intelligence and machine learning, AI's carbon footprint is still remarkably high for a single area within the tech industry.

With a better understanding of how much energy AI systems actually consume, companies and developers can decide what trade-offs to make between emissions and cost, Luccioni says. This could encourage more efficient approaches, such as fine-tuning existing models, instead of pushing ever newer and ever larger ones.

The authors of the paper therefore hope that companies and researchers will in future think more about how to develop large language models in ways that keep their CO2 footprint limited, says Sylvain Viguier, co-author of the Hugging Face paper and director of applications at the semiconductor company Graphcore.

The study's findings are "a wake-up call for anyone using these types of models, especially large technology companies," says David Rolnick, an assistant professor of computer science at McGill University and at the Mila AI Institute in Quebec, who was not involved in the study. "The environmental impacts of artificial intelligence are not inevitable. Rather, they are based on our decisions about how we use these algorithms and which ones we use," says Rolnick.
