Machine Learning Experts Issue Call To Arms For Climate Focus


Over the past several months, CleanTechnica has been publishing a series of articles on the application of machine learning to clean technologies. The series has explored not only machine learning basics, through the whimsically ominous device of a machine-learning-driven robotic velociraptor, but also applications of the technology in coastal elevation studies, commercial rooftop solar panel placement, waste stream sorting, global tree carbon capture potential, concentrated solar power optimization, and water quality management.

Image by author with graphics courtesy of NASA and Oak Ridge National Laboratory

But the global machine learning community isn’t just applying the technology ad hoc; its members are calling on one another to pay attention to several major areas of high value for climate change. In a November 2019 paper, Tackling Climate Change with Machine Learning, almost two dozen machine learning researchers from North America and Europe issued a call to arms with a battle plan for the use of machine learning to address key climate change solutions. Included among them is an éminence grise of machine learning, Yoshua Bengio, one of the trio of researchers who jointly won the million-dollar Turing Award for their efforts in the space over the past two decades. (Bengio is apparently something of a celebrity in Montreal, based on conversations late in 2019 with a Quebec provincial government, industry, and academic committee I’d been invited to speak to about autonomous vehicle sensors and their implications.)

The researchers hail from a who’s who of institutions as well, including Carnegie Mellon, ETH Zürich, Element AI, Université de Montréal, Harvard University, the Mercator Research Institute on Global Commons and Climate Change, Technische Universität Berlin, the Massachusetts Institute of Technology, Cornell, Stanford University, DeepMind, Google AI, Microsoft Research, and the University of Pennsylvania. These are just the researchers’ primary affiliations, as deep experts commonly hold both a major academic and a corporate affiliation.

The abstract is crisp, clear and to the point:

“Climate change is one of the greatest challenges facing humanity, and we, as machine learning experts, may wonder how we can help. Here we describe how machine learning can be a powerful tool in reducing greenhouse gas emissions and helping society adapt to a changing climate. From smart grids to disaster management, we identify high impact problems where existing gaps can be filled by machine learning, in collaboration with other fields. Our recommendations encompass exciting research questions as well as promising business opportunities. We call on the machine learning community to join the global effort against climate change.”

The paper isn’t just for the machine learning community. The authors are explicitly targeting a bigger group that includes researchers and engineers, entrepreneurs and investors, corporate leaders, and local and national governments. One key takeaway is that this is a paper to bring to your elected representatives, any angels and VCs you know, and the CEOs of the firms you work for and with.

A point that they make for entrepreneurs and investors is one that I’ve been pushing for months with my investment and agency contacts as we discuss business opportunities: machine learning has become an exploitable technology with a large class of solutions that no longer require substantial research and risk, only the application of the technology to problems that are often intractable without it. I’m still pursuing coastal floating log identification in BC, for example, where drifting logs cost millions in lost log revenue annually and create substantial floating hazards for float planes and boats, and where a solution would be easily extensible globally.

For the public sector, they highlight intelligent transportation systems, techniques for automatically assessing the energy consumption of buildings in cities, and tools for improving disaster management. The coastal digital elevation study I wrote about in 2019 fits directly into this space. This week I was speaking with Chris Wiesinger, CEO of GeoSim Cities, and David Clement, the machine learning expert with whom I co-authored the Plastic Dinosaur series of explainers. We were talking about how to use machine learning techniques to leverage GeoSim’s high-precision, lidar-sourced 3D model of a large portion of Vancouver into building emissions heat maps, projections of ground-level envelope study data onto formerly unstudied buildings, and traffic simulation around new construction such as the instantly iconic Vancouver House.
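To make that envelope projection idea concrete, here is a minimal sketch of the pattern: train a regressor on buildings that have had envelope audits, then predict for buildings that haven’t. Everything here is a hypothetical placeholder — the feature set, the synthetic data, and the heat-loss target are invented for illustration and are not GeoSim’s actual dataset or pipeline.

```python
# Sketch: project envelope-audit results onto unaudited buildings.
# All data below is synthetic; real inputs would come from audits plus
# attributes derived from a lidar-sourced 3D city model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Hypothetical features per building: year built, floor area, window-to-wall
# ratio, storeys (all normalized to 0-1 here for simplicity).
n_audited, n_unaudited = 300, 5000
X_audited = rng.random((n_audited, 4))
# Synthetic stand-in for measured envelope heat loss from the audits.
heat_loss = X_audited @ np.array([2.0, 1.5, 3.0, 0.5]) + rng.normal(0, 0.1, n_audited)

X_unaudited = rng.random((n_unaudited, 4))

model = GradientBoostingRegressor()
model.fit(X_audited, heat_loss)

# Predicted heat loss for buildings that have never been audited, ready to be
# rendered as an emissions heat map over the 3D model.
predicted = model.predict(X_unaudited)
print(predicted[:5])
```

A gradient-boosted regressor is only one reasonable default; the same pattern works with any supervised model, and the real value lies in the building attributes a high-precision 3D city model can supply as features.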

A fundamental organizing principle of the paper is to start with the domains. The authors split the problem space into 13 domains: electricity systems, transportation, buildings and cities, industry, farms and forests, finance, and more. Regardless of your academic, business, or general interest, they have likely covered it.

Machine learning isn’t a single magic box. It’s a set of technologies and techniques, including causal inference, computer vision, natural language processing, transfer learning, uncertainty quantification, and others. The authors mapped these approaches against the problem domains, thinking through which are likely to be most productive in each. It’s a rich intersection document that should be on the desks of policymakers, corporate leaders, and investors.

To pull one example out, in the transportation section the authors discuss modal shift, i.e., moving passengers out of single-occupant cars and into a rich set of other transportation modes, including walking, biking, rideable electric vehicles, transit, and vehicle sharing. They call out causal inference, computer vision, time-series analysis, and uncertainty quantification as the key technologies and techniques for research and deployment in this area.

To the best of the authors’ ability, they have denoted the solutions within the spaces as having immediate applicability, longer-term applicability and/or uncertainty of impact. This is a strategic document laying out near-term value, long-term efforts, and areas of potential that need research. It should be core to shaping machine learning agendas for the coming decade.

And it comes with a call for collaboration. For those interested in specific areas or more broadly in the topic, they’ve created a website to assist, climatechange.ai. They’ve had a series of events at major conferences throughout 2019, including NeurIPS 2019 in Vancouver, a conference that David Clement attended.

To dig into just one portion of the 111-page PDF, the electricity section discusses low-data regions.

“While ML methods have often been applied to grids with widespread sensors, system operators in many countries do not collect or share system data. Although these data availability practices may evolve, it may meanwhile be beneficial to use ML techniques such as transfer learning to translate insights from high-data to low-data settings (especially since all electric grids share the same underlying system physics).”

This is a core capability of machine learning. As the coastal digital elevation model study showed, if you have high-resolution data for a subset of the globe and lower-resolution data for the rest, you can substantially improve the quality of the global data through machine learning techniques alone. Similarly, you can project emissivity with reasonable certainty onto buildings that haven’t had envelope audits using data from buildings that have.
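As a hedged illustration of the transfer learning idea the authors describe, the sketch below pre-trains a small network on a data-rich setting and then fine-tunes only its final layer on a handful of samples from a data-poor one. The sample counts, eight input features, and network shape are arbitrary assumptions, and the data is synthetic; the point is the pattern of reusing learned structure, not any specific grid model.

```python
# Minimal transfer-learning sketch: pre-train on a data-rich region,
# fine-tune only the final layer on a small low-data sample.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-ins: 1,000 labelled samples from a well-instrumented
# grid, only 50 from a sparsely instrumented one; 8 features each.
x_rich, y_rich = torch.randn(1000, 8), torch.randn(1000, 1)
x_poor, y_poor = torch.randn(50, 8), torch.randn(50, 1)

model = nn.Sequential(
    nn.Linear(8, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

def train(params, x, y, epochs):
    opt = torch.optim.Adam(params, lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

# 1. Learn shared structure from the data-rich setting.
train(model.parameters(), x_rich, y_rich, epochs=200)

# 2. Freeze everything except the final layer, then adapt it to the
#    data-poor setting with the few samples available.
for p in model.parameters():
    p.requires_grad = False
for p in model[-1].parameters():
    p.requires_grad = True
train(model[-1].parameters(), x_poor, y_poor, epochs=100)
```

Freezing the shared layers is the simplest form of fine-tuning; the paper’s observation that all electric grids share the same underlying physics is what makes such shared structure plausible in the first place.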

This is the most superficial of overviews of a deep and broad paper. It’s a must-read for anyone engaged in climate mitigation or adaptation, whether the head of a global urban planning and design practice, an entrepreneurial investor, a government-funded economic growth agency, or a researcher. I’ve already forwarded the paper to people I know in all of these spaces. It will undoubtedly inform further articles in CleanTechnica’s series on cleantech and machine learning, as well as the assembled report on the subject.



Michael Barnard

is a climate futurist, strategist, and author. He spends his time projecting scenarios for decarbonization 40-80 years into the future. He assists multi-billion dollar investment funds and firms, executives, boards, and startups to pick wisely today. He is founder and Chief Strategist of TFIE Strategy Inc and a member of the Advisory Board of electric aviation startup FLIMAX. He hosts the Redefining Energy - Tech podcast (https://shorturl.at/tuEF5), a part of the award-winning Redefining Energy team.
