Machine Learning Times
Good News About the Carbon Footprint of Machine Learning Training

 
Originally published in Google AI Blog, Feb 15, 2022.

Machine learning (ML) has become prominent in information technology, which has led some to raise concerns about the rising computational cost of ML and, in particular, its carbon footprint, i.e., total greenhouse gas emissions. While these assertions rightfully elevated the discussion around carbon emissions in ML, they also highlight the need for accurate data to assess the true carbon footprint, which can help identify strategies to mitigate carbon emissions in ML.

In “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink”, accepted for publication in IEEE Computer, we focus on operational carbon emissions — i.e., the energy cost of operating ML hardware, including data center overheads — from training of natural language processing (NLP) models and investigate best practices that could reduce the carbon footprint. We demonstrate four key practices that reduce the carbon (and energy) footprint of ML workloads by large margins, which we have employed to help keep ML under 15% of Google’s total energy use.

The 4Ms: Best Practices to Reduce Energy and Carbon Footprints

We identified four best practices that reduce energy and carbon emissions significantly — we call these the “4Ms” — all of which are being used at Google today and are available to anyone using Google Cloud services.

  • Model. Selecting efficient ML model architectures, such as sparse models, can advance ML quality while reducing computation by 3x–10x.
  • Machine. Using processors and systems optimized for ML training, versus general-purpose processors, can improve performance and energy efficiency by 2x–5x.
  • Mechanization. Computing in the cloud rather than on premises reduces energy usage, and therefore emissions, by 1.4x–2x. Cloud-based data centers are new, custom-designed warehouses built for energy efficiency at the scale of 50,000 servers, resulting in very good power usage effectiveness (PUE). On-premises data centers are often older and smaller, and thus cannot amortize the cost of new energy-efficient cooling and power distribution systems.
  • Map Optimization. Moreover, the cloud lets customers pick the location with the cleanest energy, further reducing the gross carbon footprint by 5x–10x. While one might worry that map optimization could lead to the greenest locations quickly reaching maximum capacity, user demand for efficient data centers will result in continued advancement in green data center design and deployment.
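Because each of the 4Ms targets a different part of the stack, their savings multiply rather than add. As a rough back-of-the-envelope sketch (the combined figures below are computed from the ranges quoted above, not taken from the article):

```python
# Compound the lower and upper bounds of the 4M reduction factors
# to estimate the combined effect of applying all four practices.
factors = {
    "Model (efficient architectures)": (3, 10),
    "Machine (ML-optimized hardware)": (2, 5),
    "Mechanization (cloud data centers)": (1.4, 2),
    "Map (low-carbon locations)": (5, 10),
}

low = high = 1.0
for name, (lo, hi) in factors.items():
    low *= lo
    high *= hi

print(f"Combined reduction: {low:.0f}x to {high:.0f}x")
# → Combined reduction: 42x to 1000x
```

Even under the most conservative bounds, compounding the four practices yields a reduction of well over an order of magnitude, which is why applying all of the 4Ms together matters more than any single one.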

To continue reading this article, click here.
