The use of artificial intelligence (AI) and machine learning to drive innovation across all industries has increased significantly in recent years. Indeed, the proliferation of data science applications, from genome sequencing for better disease diagnosis and prevention, to advances in leading-edge engineering for autonomous driving, to climate modelling to combat climate change, has led to an exponential demand for High Performance Computing (HPC).
AI for sustainability is one of the most promising new fields of study: a recent report by PwC and Microsoft found that using AI for environmental applications in four key sectors could reduce global greenhouse gas emissions by 4% within just 10 years. Recent efforts include the international non-profit organisation Global Fishing Watch using AI and satellite data to prevent overfishing, and wind companies using AI to make each turbine's blades produce more electricity per rotation by incorporating real-time weather and operational data.
Power hungry AI
But alongside worries about AI bias or human jobs being replaced by machines, concerns about the environmental impact of AI itself should be at the fore. The enormous computing power needed for machine learning and deep learning applications equates to markedly high energy consumption. The carbon cost of AI becomes even more substantial when you add the energy required to keep equipment cool and prevent overheating – at least 40% of all energy consumed in a conventional data centre goes towards cooling.
As a result, the carbon emissions from training one deep learning model for machine translation can be up to five times greater than the amount generated over the entire lifetime of a car, according to 2019 research from the University of Massachusetts. It is somewhat ironic that the very climate models built to predict and prevent climate change carry the hidden carbon footprint of the energy-hungry AI systems that run them.
Improving AI efficiency
AI researchers can reduce their carbon footprint at the outset by considering efficiency as well as accuracy when undertaking research activity. This means weighing up whether massive models, colossal datasets, or copious experiments are strictly necessary, compared to the results achieved.
Businesses utilising AI for innovation can do their part by ensuring that the data centres powering their AI applications are truly sustainable, or, in the case of AI-as-a-service, that their cloud service provider's green credentials amount to more than just lip service.
The servers that train AI models can be comfortably housed in data centres powered by renewable energy sources, such as in Iceland, where only renewable hydroelectric and geothermal power supplies the grid. In a year-round cool climate like Iceland's, natural air cooling of powerful AI servers minimises energy usage. More than 80% of computer hardware doesn't need to be near the end user, and wherever possible it should be located in the most energy-efficient data centres, those that rely on abundant renewable energy and environmentally friendly cooling.
At the recent UN Climate Change Conference COP25 in Madrid, COP26 president-designate Claire O'Neill challenged both businesses and governments to move from rhetoric to reality. Given the exceptionally high power demands of AI research and development, now is the time to turn ambition into action and make the change to sustainable AI. This begins not only with AI researchers, but also with the businesses driving AI innovation themselves.
Tate Cantrell is CTO at London-based data centre owner and operator Verne Global