How big data can help us fight climate change faster
An aerial view of Brazil's Atibainha dam, part of the Cantareira reservoir, during a drought. Image: REUTERS/Nacho Doce
A new report from the UN’s Intergovernmental Panel on Climate Change (IPCC) warns that “rapid, far-reaching and unprecedented changes in all aspects of society” are required to limit global warming to 1.5°C and avoid an irreversible natural tipping point.
By 2030, carbon dioxide emissions caused by humans need to fall by about 45% from 2010 levels, reaching ‘net zero’ around 2050, says the report. Remaining emissions would need to be balanced by removing CO2 from the atmosphere, for example through reforestation and improved land management.
By analyzing large data sets, or big data, we know that our planet lost tree cover equivalent to 40 football fields per minute last year. We also know that more than a quarter of global tree cover loss between 2001 and 2015 was associated with commodity-driven deforestation.
Big data — whether historical or real-time — can also help us tackle the problem, for example by locating harmful emissions or identifying pressure points along the supply chain. This transformative change in data capabilities is an example of what the World Economic Forum refers to as the Fourth Industrial Revolution (4IR).
When California Governor Jerry Brown announced in September that the US state would be launching “its own damn satellite” to monitor the effects of climate change, his promise was bold: an initiative to help governments, businesses and landowners to pinpoint — and stop — destructive emissions “with unprecedented precision, on a scale that’s never been done before”.
Launched against the backdrop of the Trump administration’s withdrawal from the Paris Agreement, the initiative sees California take local action on a global problem in the absence of federal leadership. Working with the San Francisco-based Earth imaging company Planet Labs, the state will develop and eventually launch a satellite capable of detecting the “point source” of climate pollutants, monitoring leaks and other anomalies at specific locations.
The satellite, an initiative of the California Air Resources Board, will complement project partner Environmental Defense Fund’s MethaneSAT, scheduled for launch in 2021. MethaneSAT will provide broader, more frequent coverage, quantifying emissions roughly every four days from oil and gas fields that account for at least 80% of global output.
Individually, the projects will generate important data on harmful emissions. Combined, the data sets pack a bigger punch. For instance, if MethaneSAT waves a red flag about an emissions spike in a given field, the California instrument would then zero in on specific facilities and pinpoint the larger sources. EDF says it is “like having two camera lenses — wide angle and telephoto” that together produce a more complete picture of the methane problem.
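To make the “two lenses” idea concrete, here is a minimal sketch in Python of how a coarse, area-level reading might flag an anomaly that then triggers a closer look at facility-level readings in the same area. The field names, baseline and threshold are invented for illustration; they are not EDF or California Air Resources Board parameters.

```python
# Hypothetical sketch of the "wide angle plus telephoto" idea: an area-level
# reading flags an anomaly, and only then are facility-level readings in that
# area inspected to rank likely sources. All names and numbers are illustrative.

from dataclasses import dataclass

@dataclass
class AreaReading:          # e.g. one wide-coverage pass over a producing field
    area_id: str
    methane_ppb: float      # area-averaged methane concentration

@dataclass
class FacilityReading:      # e.g. one point-source observation within that area
    area_id: str
    facility_id: str
    methane_ppb: float

BASELINE_PPB = 1900.0       # illustrative background level
SPIKE_THRESHOLD_PPB = 50.0  # illustrative anomaly threshold above baseline

def flag_areas(area_readings):
    """Return areas whose average reading exceeds the baseline by the threshold."""
    return {r.area_id for r in area_readings
            if r.methane_ppb - BASELINE_PPB > SPIKE_THRESHOLD_PPB}

def rank_sources(flagged_areas, facility_readings, top_n=3):
    """Within flagged areas, rank facilities by their excess over baseline."""
    candidates = [r for r in facility_readings if r.area_id in flagged_areas]
    candidates.sort(key=lambda r: r.methane_ppb - BASELINE_PPB, reverse=True)
    return candidates[:top_n]

if __name__ == "__main__":
    areas = [AreaReading("field-A", 1985.0), AreaReading("field-B", 1910.0)]
    facilities = [
        FacilityReading("field-A", "compressor-1", 2100.0),
        FacilityReading("field-A", "wellpad-7", 1960.0),
        FacilityReading("field-B", "wellpad-2", 1915.0),
    ]
    flagged = flag_areas(areas)                  # wide-angle pass flags field-A
    for r in rank_sources(flagged, facilities):  # telephoto pass ranks its facilities
        print(r.facility_id, round(r.methane_ppb - BASELINE_PPB, 1), "ppb above baseline")
```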
The California Air Resources Board and Planet Labs will also work with the Environmental Defense Fund, among others, to set up a new Climate Data Partnership: a common platform for reporting data that will enable governments, businesses, landowners and others to pursue more targeted mitigation measures worldwide.
Tom Ingersoll, who is leading EDF’s MethaneSAT project, suggests thinking of these separate but complementary data projects “as a set of overlapping circles, like the Olympic rings”.
He says: “Multiple methods of assessing methane emissions lead to a more complete and actionable set of insights than any single method can by itself.”
Bringing solutions into focus
Combining satellite imagery with artificial intelligence to monitor forests and land use, providing the ‘where, why, when and who’, was among the solutions discussed at a 4IR event held at the recent Global Climate Action Summit in San Francisco.
Kavita Prakash-Mani, practice leader for markets and food at WWF, says data from projects such as Eyes on the Forest, which investigates deforestation and land grabs in Indonesia, needs to be combined with other information to create the full picture. “We need technology to monitor why we are losing forests, looking at traceability through citizen sites and data on where food is coming from,” she says.
One example of this is the Trase platform, which connects independent data sources to reveal the trade flows for commodities such as beef, soy and palm oil, which are responsible for an estimated two thirds of tropical deforestation. The aim is to bring transparency to global supply chains, using publicly available data to map in detail the links from consumer countries, via trading companies, to the places of production.
Using existing data such as customs records, trade contracts, tax registrations, production figures and shipping records, Trase pieces together a bigger picture of how exports are linked to agricultural conditions (including specific environmental and social risks) in the places where they are produced. This enables companies, governments and others to better understand the risks and identify opportunities for more sustainable production.
By 2021, Trase aims to map the trade of over 70% of the commodities that pose a major risk to forests. Clément Suavat, lead developer at Trase, says combining different data sets “allows you to connect landscapes, to quickly identify sustainability risks”. A mash-up of supply chain information and shipping data, for example, can help a business pinpoint exactly where to make changes that will have an impact on its climate goals.
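As a hypothetical illustration of that kind of mash-up, the short Python sketch below joins invented shipment records to an invented production-region risk indicator, so a buyer can see which exporters and commodities carry the most estimated deforestation exposure. None of the figures or column names come from Trase; they are placeholders.

```python
# Illustrative "mash-up": join shipment records to a risk indicator for the
# region each shipment came from, then aggregate exposure by exporter.
# All data and column names below are invented for the example.

import pandas as pd

shipments = pd.DataFrame({
    "exporter":      ["TraderCo", "TraderCo", "AgriExport"],
    "commodity":     ["soy", "soy", "beef"],
    "origin_region": ["Region-X", "Region-Y", "Region-X"],
    "volume_tonnes": [12000, 8000, 3000],
})

region_risk = pd.DataFrame({
    "origin_region": ["Region-X", "Region-Y"],
    "deforestation_ha_per_kt": [4.2, 0.3],   # illustrative risk indicator
})

# Link each shipment to the risk profile of the place it came from.
linked = shipments.merge(region_risk, on="origin_region", how="left")

# Estimate deforestation exposure per shipment and aggregate by exporter.
linked["est_deforestation_ha"] = (
    linked["volume_tonnes"] / 1000 * linked["deforestation_ha_per_kt"]
)
exposure = (
    linked.groupby(["exporter", "commodity"])["est_deforestation_ha"]
    .sum()
    .sort_values(ascending=False)
)
print(exposure)
```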
Sharpening the picture
Alongside Orbital Insight, which uses geospatial analytics to support the Global Forest Watch monitoring initiative, the Trase and Planet Labs projects are just a few examples of a fast-growing field: using technology for climate change transparency.
Most recently, the US climate change think tank Woods Hole Research Center has been using a satellite-based tool to create a new global carbon monitoring map. The dataset behind the launch of The Carbon Source comes from the Woods Hole Carbon Monitoring System, which employs an innovative time-series approach to measuring changes (losses and gains) in aboveground carbon density across tropical America, Africa and Asia.
Woods Hole claims the approach is “poised to transform how the world measures and tracks changes in forest carbon” and its vision is to use the data “to tell a near real-time story” about the state and vulnerability of land-based carbon from the Arctic to the Tropics.
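As a rough illustration of how a time-series approach of this kind can work, the sketch below takes three invented annual maps of aboveground carbon density and computes per-pixel losses, gains and net change. It shows the general idea only; it is not the Woods Hole Carbon Monitoring System’s method, and all of the numbers are made up.

```python
# Minimal sketch: from yearly aboveground carbon density maps of the same area,
# compute per-pixel losses, gains and net change over the period.
# The values are invented; this is not the Woods Hole algorithm.

import numpy as np

# Three annual carbon-density rasters (tonnes of carbon per hectare), 2x2 pixels.
density_by_year = np.array([
    [[120.0, 95.0],
     [ 60.0, 40.0]],   # year 1
    [[118.0, 96.0],
     [ 45.0, 41.0]],   # year 2
    [[117.0, 98.0],
     [ 30.0, 43.0]],   # year 3
])

# Year-over-year change per pixel.
deltas = np.diff(density_by_year, axis=0)

losses = np.where(deltas < 0, -deltas, 0.0).sum(axis=0)  # total carbon lost per pixel
gains  = np.where(deltas > 0,  deltas, 0.0).sum(axis=0)  # total carbon gained per pixel
net    = density_by_year[-1] - density_by_year[0]        # net change over the period

print("losses (t C/ha):\n", losses)
print("gains  (t C/ha):\n", gains)
print("net change (t C/ha):\n", net)
```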
Dominic Waughray, head of the World Economic Forum’s Centre for Global Public Goods, predicts a “huge transformation” of the environmental community: “Think about how difficult it’s been to tackle climate change, to track emissions. And then think about the possibilities of large-scale transparency of data, and what that would mean to be a company or a civil society organization, finding that information and doing something with it.”