A recent Intergovernmental Panel on Climate Change (IPCC) report has made it very clear that drastic, immediate cuts to greenhouse gas emissions are needed to limit global warming to 1.5 °C. In the absence of a technological silver bullet, this requires rapid changes at unprecedented scale across all sectors of the global economy. Just as the climate clock is ticking, breakthroughs in machine learning (ML) algorithms and robotic control have turned artificial intelligence (AI) into a powerful new agent of change. The wide-ranging potential impacts of the AI revolution on both society and the environment are well explored by scholars such as Daniel Susskind and Peter Dauvergne. Today, the world's five most valuable companies are tech firms with a strong focus on AI, and governments around the world are investing heavily in AI industrial strategy.
General-purpose machine learning algorithms are proliferating rapidly in scientific research areas relevant to climate change mitigation. Novel automated processes in materials science could prove a game changer in renewable energy research, accelerating the search for efficient solar panel and battery substrates. Machine learning also plays a key role in analysing the effects of climate change on biological ecosystems by making sense of vast amounts of species data. Autonomous drones and submersibles are poised to play a crucial role in emissions monitoring; in the fight against illegal logging, fishing and poaching; and even in pest control – one example is RangerBot, an autonomous underwater robot capable of administering precision toxin injections to coral-eating crown-of-thorns starfish.
Climate science, the scientific discipline dedicated to understanding the climate system and projecting its future, is profiting heavily from the AI revolution. Indeed, many machine learning methods have already found successful application in the field. For example, deep learning and Bayesian modelling, two particularly hot subareas of machine learning, are used in tasks such as the automated classification of cloud formations in satellite images and the prediction of daily precipitation rates during the Indian monsoon. However, it is in the subfield of climate modelling that climate scientists hope to unleash the real power of AI.
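For a flavour of what the first of these tasks involves, here is a minimal sketch of a convolutional image classifier in Python (using the PyTorch library). The class labels, patch size and random stand-in data are purely illustrative assumptions, not drawn from any real satellite archive:

```python
# A minimal sketch of the cloud-classification task: a small convolutional
# network assigning satellite image patches to cloud-formation classes.
import torch
import torch.nn as nn

NUM_CLASSES = 4  # e.g. cirrus / cumulus / stratus / clear -- illustrative only

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, NUM_CLASSES),  # assumes 64x64-pixel input patches
)

# Stand-in batch: 8 random RGB patches of 64x64 pixels with random labels,
# in place of a real labelled satellite-image dataset.
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, NUM_CLASSES, (8,))

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
opt.zero_grad()
loss = nn.functional.cross_entropy(model(images), labels)
loss.backward()
opt.step()
print(f"one training step done, loss = {loss.item():.3f}")
```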
Long-term climate simulations constitute the backbone of IPCC reports and, in turn, inform the decisions of policymakers worldwide. They are used in attribution studies to determine whether heat waves, droughts or other extreme weather events have been made significantly more likely by anthropogenic climate change, which opens pathways for damage litigation claims against fossil fuel companies. Crucial to this application, and to the wider success of climate modelling, are both accuracy and sufficient resolution at the regional level.
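The arithmetic behind such attribution studies is simple in outline: the probability of the event in a model ensemble with anthropogenic forcing is compared to its probability in a counterfactual ensemble without it. A back-of-envelope sketch in Python, with invented ensemble counts:

```python
# Back-of-envelope event attribution, assuming two model ensembles: one with
# anthropogenic forcing ("actual") and one without ("counterfactual").
# The ensemble counts below are invented for illustration.

def risk_ratio(p_actual: float, p_counterfactual: float) -> float:
    """How much more likely the event is in the actual climate."""
    return p_actual / p_counterfactual

# Say a heat wave of a given magnitude occurs in 30 of 1000 simulated years
# with anthropogenic forcing, but only in 10 of 1000 without it.
p1 = 30 / 1000
p0 = 10 / 1000

rr = risk_ratio(p1, p0)   # 3.0: the event is three times more likely
far = 1.0 - 1.0 / rr      # fraction of attributable risk, here ~0.67
print(f"risk ratio = {rr:.1f}, FAR = {far:.2f}")
```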
For decades, model forecasts routinely improved thanks to increasingly sophisticated hardware, which allowed ever more complex physical processes to be included. Unfortunately, despite millions of hours of effort, the past decade has seen little reduction in systematic errors. Many suspect that this impasse can only be overcome by radically increasing model resolution: current general circulation models have grid sizes of about a hundred kilometres, but some major non-linear effects related to ocean and atmospheric circulation, sea ice and cloud formation can only be adequately captured at a resolution of about one kilometre. This would require processing speeds that exceed today's by several orders of magnitude. Exascale computing infrastructure may provide part of the solution: academic consortia, such as the EU's flagship project Extreme Earth, are set to provide computing infrastructure allowing substantially finer grid resolutions over the next decade. But even such expensive brute-force projects will not go the whole way.
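A rough calculation shows where those orders of magnitude come from. Refining the horizontal grid by a factor r multiplies the number of grid columns by r², and numerical stability (the CFL condition) forces the time step to shrink by roughly another factor r. A back-of-envelope sketch, ignoring vertical refinement and memory costs:

```python
# Rough cost of refining a global model from ~100 km to ~1 km grid spacing,
# under the common assumption that cost scales with the number of horizontal
# grid columns (factor r^2) and that the time step must shrink in proportion
# to the grid spacing (another factor r, via the CFL stability condition).
r = 100          # refinement factor: 100 km -> 1 km
columns = r**2   # 10,000x more grid columns
timesteps = r    # 100x more time steps for the same simulated period
print(f"~{columns * timesteps:.0e}x more computation")  # ~1e+06
```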
AI is poised to come to the rescue in two different ways. One approach tries to re-express small-scale process simulations so that they can exploit specialised high-throughput hardware originally designed for AI; this entails solving the underlying equations in a way that is computationally similar to training deep neural networks. The alternative approach is to train machine learning algorithms to emulate the outputs of small-scale simulations directly. This is conceptually simpler, but it is difficult to guarantee that such black-box models will generalise well to unseen situations. Researchers from the University of Oxford are mainly involved with the first approach, while a rival group at Caltech, supported by influential philanthropist and Microsoft co-founder Paul Allen, is advancing the second. It is not yet clear whether either team will be successful, but the community is cautiously optimistic that AI will eventually speed up simulations by about a factor of ten. First conclusions will be drawn at a workshop hosted by Corpus Christi College in early September 2019.
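To illustrate the second approach, and its generalisation caveat, here is a toy emulator in Python (PyTorch): a small neural network is trained to reproduce a stand-in "expensive" sub-grid process model, then probed outside its training regime. Everything here, including the stand-in physics, is a hypothetical sketch rather than any group's actual method:

```python
# Toy emulator: train a small neural network to reproduce the output of an
# "expensive" sub-grid process model, then probe it outside the training range.
import torch
import torch.nn as nn

def expensive_subgrid_model(state: torch.Tensor) -> torch.Tensor:
    # Stand-in for a costly high-resolution process simulation (e.g.
    # convection), mapping a coarse-grid state to a sub-grid tendency.
    return torch.sin(3.0 * state) * torch.exp(-state ** 2)

emulator = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(emulator.parameters(), lr=1e-3)

# Training data only covers states in [-2, 2] -- the "seen" climate.
x_train = torch.linspace(-2.0, 2.0, 512).unsqueeze(1)
y_train = expensive_subgrid_model(x_train)

for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(emulator(x_train), y_train)
    loss.backward()
    opt.step()

x_in = torch.tensor([[0.5]])   # inside the training regime
x_out = torch.tensor([[4.0]])  # outside it, e.g. a much warmer climate
with torch.no_grad():
    print("in-range: ", emulator(x_in).item(),
          " truth:", expensive_subgrid_model(x_in).item())
    print("out-of-range:", emulator(x_out).item(),
          " truth:", expensive_subgrid_model(x_out).item())
```

Inside the training interval the network matches the reference model closely at a fraction of the cost; outside it, its output is essentially arbitrary – exactly the failure mode that worries climate modellers.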
Some major tech companies operate hardware infrastructure at considerable scale, but do not usually make it accessible to the climate modelling community. A rare counterexample is Amazon's provision of computing infrastructure to the Met Office. Sometimes tech companies release large commercial datasets for climate change research; Google recently released Maps and Waze transportation data to facilitate urban carbon footprinting. However, most AI technology leaders, such as Google DeepMind, Facebook and OpenAI, concentrate their research on marketable areas such as protein folding or healthcare, or on other traditionally prestigious fields like computer gameplay. Even though accurate predictions of regional climate change are of immense value to many groups, it is generally difficult to construct profitable business cases in this resource-intensive domain. This may help explain why climate science research is rarely cited as a focus area by tech startups and academic AI communities.
It has been suggested that increasing tech firms' interest in climate change-related research might ultimately require additional economic incentives, possibly through carefully balanced public-private partnerships. As for the impact of individual researchers, many tech firms and research laboratories are known to accommodate employee preferences on research directions to some extent. Alternatively, large international research consortia, similar in scale to CERN, might be able to provide the critical mass of initial investment required. Eschewing academic siloing in favour of a worldwide meteorological collaboration was in fact first suggested by none other than Albert Einstein in the 1920s.
Successful application of generic ML algorithms to scientific research requires both a comprehensive understanding of the methods' applicability and experience in fine-tuning approaches to the data and domain knowledge at hand. Much of this methodology involves transferable skills. "Better access to pre-existing ML expertise could significantly accelerate climate science. Sometimes it feels like one field has a hammer, the other field has a nail, and they're not talking to each other," says Thomas Hornigold, Oxford DPhil student in Atmospheric Physics and Dynamics.
A recent workshop, "Machine learning in Climate Science", hosted by the University of Oxford, aimed to facilitate knowledge exchange within the climate science community; however, it was not widely advertised among machine learners. Similarly, while other interdisciplinary fields, including healthcare and biomedical imaging, regularly host dedicated workshops at major AI conferences such as NeurIPS, related climate science workshops are often confined to climate informatics conferences. A particularly impactful way of engaging with the AI community is illustrated by a step taken by a Californian philanthropic medical foundation in 2015.
The nonprofit sponsored a competition on the renowned online data science platform Kaggle, in the hope of advancing the detection of a variety of eye diseases from retina scans – diseases that even skilled ophthalmologists have difficulty diagnosing, particularly in their early stages. Using retina scans from a free retinography screening platform, almost 700 competing teams set out to predict the occurrence of the widespread disease diabetic retinopathy. The competition, the resulting online forum discussions and the contestants' code submissions generated meaningful science and attracted considerable attention in the machine learning community.
As the climate clock continues to tick, AI has become a pillar of hope in our quest to understand, and perhaps even mitigate, climate change. Were its treasure chests of datasets and models to be opened, the climate change research community might profit far more from the considerable knowledge and resources available to AI researchers in academia and industry.
Image Credit: NOAA