Here's what Should be Prioritized as the Main Threat of AI

post by APAP12 · 2020-09-10T16:22:37.305Z · EA · GW · 6 comments

Most of the emphasis on the existential risk of AI is placed on worker replacement and eventual AGI. However, in the short term I see a bigger threat from AI: its GHG emissions.

In this paper, Strubell et al. (2019) outline the hidden cost of machine learning (from inception through training and fine-tuning) and found that emissions for one model are about 360 tCO2. That is not an insignificant amount. For comparison, the entire lifetime of a car, including fuel, produces about 70 tCO2.

The AI community is aware of this: Mila Labs, for example, hosts an online tool on its website to calculate the carbon intensity of building and training ML models. As companies rush to incorporate AI into their businesses, emissions could balloon if the energy sources powering AI aren't clean.
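Calculators like this typically estimate emissions from hardware power draw, training time, datacenter overhead (PUE), and the local grid's carbon intensity. A minimal sketch of that kind of calculation (the function name and every input number below are hypothetical placeholders, not the actual tool's implementation):

```python
def training_emissions_kg(gpu_power_kw, num_gpus, hours, pue, grid_kg_co2_per_kwh):
    """Rough estimate: energy drawn by the GPUs (kWh), scaled up by
    datacenter overhead (PUE), times the grid's carbon intensity."""
    energy_kwh = gpu_power_kw * num_gpus * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

# Hypothetical run: 8 GPUs at 0.3 kW each for 240 hours, PUE of 1.58,
# grid intensity 0.45 kgCO2/kWh (all placeholder values).
print(round(training_emissions_kg(0.3, 8, 240, 1.58, 0.45), 1))  # kg of CO2
```

The structure makes the post's point concrete: the same training job can have very different emissions depending on the grid powering the datacenter.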

To me, for the next decade at least, the threat of AI contributing to climate change should be prioritized over concerns about AI governance.

Please let me know what your thoughts are on the subject!


Comments sorted by top scores.

comment by G Gordon Worley III (gworley3) · 2020-09-11T00:39:24.835Z · EA(p) · GW(p)

It seems unclear to me that the level of CO2 emissions from one model being greater than one car necessarily implies that AI is likely to have an outsized impact on climate change. I think there are some missing calculations here: the number of models, the number of cars, how much additional marginal CO2 is being created that isn't already accounted for by other segments, and how much marginal impact on climate change is to be expected from the additional CO2 from AI models. With that in hand, we could potentially assess how much additional short-term climate risk AI poses.
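A back-of-envelope version of the missing calculation might look like the following. Every input number here is a made-up placeholder (only the two per-unit figures come from the post); the point is the structure of the comparison, not the values:

```python
# Compare aggregate annual emissions, not per-unit emissions.
models_trained_per_year = 10_000     # large models trained worldwide (placeholder guess)
tco2_per_model = 360                 # the post's per-model figure

cars_in_use = 1_000_000_000          # order-of-magnitude placeholder
tco2_per_car_lifetime = 70           # the post's per-car figure
car_lifetime_years = 15              # assumed

ai_tco2_per_year = models_trained_per_year * tco2_per_model
car_tco2_per_year = cars_in_use * tco2_per_car_lifetime / car_lifetime_years

print(f"AI:   {ai_tco2_per_year:.2e} tCO2/yr")
print(f"Cars: {car_tco2_per_year:.2e} tCO2/yr")
print(f"AI as a share of car emissions: {ai_tco2_per_year / car_tco2_per_year:.2%}")
```

With these placeholder inputs the aggregate AI figure comes out far below the aggregate car figure, which illustrates why the per-unit comparison alone doesn't settle the question either way.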

comment by James_Banks · 2020-09-10T17:24:43.466Z · EA(p) · GW(p)

It looks like some people downvoted you, and my guess is that it may have to do with the title of the post. It makes a strong claim, but it's also not as informative as it could be: it doesn't mention anything to do with climate change or GHGs, for instance.

comment by James_Banks · 2020-09-10T16:59:29.162Z · EA(p) · GW(p)

Similarly, one could be concerned that the rapid economic growth that AI is expected to bring about could cause a lot of GHG emissions unless somehow we (or they) figure out how to use clean energy instead.

comment by Larks · 2020-09-10T16:49:18.147Z · EA(p) · GW(p)
In this paper

I think you may have forgotten to add a hyperlink?

comment by APAP12 · 2020-09-14T12:45:44.650Z · EA(p) · GW(p)

Yes, my apologies! I've added the necessary corrections.

comment by Larks · 2020-09-14T13:59:00.366Z · EA(p) · GW(p)


In this paper, Strubell & al (2019) outline the hidden cost of machine learning (from inception to training and fine tuning) and found emissions for 1 model is about 360 tCo2.

The highest estimate they find is for Neural Architecture Search, which they estimated as emitting 313 tons of CO2 after training for over 30 years of GPU-time. This suggests to me that they're using an inappropriate hardware choice! Additionally, the work they reference - here - does not seem to be the sort of work you'd expect to see widely used. Cars emit a lot of CO2 because everyone has one; most people have no need to search for new transformer architectures. The answers from one search could presumably be used for many applications.

Most of the models they train produce dramatically lower estimates.

I also don't really understand how their estimates for renewable generation for the cloud companies are so low. Amazon say they were 50% renewable in 2018, but the paper only gives them 18% credit, and Google say they are CO2 neutral now. It makes sense that they should look quite efficient, given that cloud datacenters are often located near geothermal or similar power sources. This 18% is based on a Greenpeace report which I do not really trust.

Finally, I found this unintentionally very funny:

Academic researchers need equitable access to computation resources.
Recent advances in available compute come at a high price not attainable to all who desire access. ... . Limiting this style of research to industry labs hurts the NLP research community in many ways. ... This even more deeply promotes the already problematic “rich get richer” cycle of research funding, where groups that are already successful and thus well-funded tend to receive more funding due to their existing accomplishments. Third, the prohibitive start-up cost of building in-house resources forces resource-poor groups to rely on cloud compute services such as AWS, Google Cloud and Microsoft Azure.
While these services provide valuable, flexible, and often relatively environmentally friendly compute resources ...

This whole paragraph is totally different to the rest of the paper. It appears in the conclusion section, but it isn't really a conclusion from anything in the main body - it appears the authors simply wanted to share some left-wing opinions at the end. But this 'conclusion' is exactly backwards: if training models is bad for the environment, it is good to prevent too many people from doing it! And if cloud computing is more environmentally friendly than buying your own GPU, it is good that people are forced into using it!

Overall, this paper was not very convincing that training models will be a significant driver of climate change. And there are compelling reasons to be less worried about climate change than about AGI. So I don't think the post makes a convincing case that the main AI risk concern is its secondary effect on climate change.