AI Meets Satellite Imagery: A New Strategy for Monitoring Climate Change

Project Team

Team photo: Team members gather at an escape room in Durham for team building.

Team profile by Song Young Oh, Kushagra Ghosh and Anushka Srinivasan

As extreme weather events make headlines with growing frequency – scorching heat waves, devastating floods and rising polar temperatures – it’s clear that climate change is a global crisis. This urgent situation calls for innovative solutions, and one promising tool in the fight against climate change is the use of satellite imagery in conjunction with advanced artificial intelligence (AI) technologies.

Our team, comprising six undergraduate students and three master’s students from diverse academic backgrounds, has come together to tackle this challenge. Our project seeks to provide automated tracking of climate change causes and their consequences using state-of-the-art AI techniques. We believe that more efficient tracking will enable more informed policymaking, allowing for efficient and effective responses to climate-related events.

However, working with satellite imagery presents a significant obstacle: collecting labeled data is costly due to the sheer volume and varied resolutions of the imagery. This complicates the use of AI models. Specifically, we face two major challenges: the scarcity of high-resolution training data and the limited scope of most algorithms, which typically focus on specific locales. Since climate change monitoring requires a global perspective, we needed to find a way to extend the applicability of these algorithms so they would work for a variety of regions.

To address these challenges, our team has devised a novel approach that leverages embedding space – a numerical representation in which similar inputs are mapped to nearby points – to overcome the limitations posed by scarce labeled data in satellite imagery. In particular, we use zero-shot learning, a technique that aligns the embedding spaces of textual and visual data, allowing models to recognize intrinsic relationships even when direct labeled examples are sparse or entirely absent.
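The core idea can be sketched in a few lines. In this hypothetical example (the function name, labels and embedding values are illustrative, not from our actual pipeline), we assume a CLIP-style model has already mapped an image and a set of text labels into a shared embedding space; classification then reduces to picking the label whose text embedding has the highest cosine similarity to the image embedding:

```python
import numpy as np

def zero_shot_classify(image_embedding, label_embeddings, labels):
    """Return the label whose text embedding is most similar (cosine) to the image embedding."""
    img = image_embedding / np.linalg.norm(image_embedding)
    txt = label_embeddings / np.linalg.norm(label_embeddings, axis=1, keepdims=True)
    sims = txt @ img  # cosine similarities, one per candidate label
    return labels[int(np.argmax(sims))]

# Toy 4-dimensional embeddings for illustration; a real language-image model
# such as CLIP produces much higher-dimensional vectors (e.g. 512-d).
labels = ["forest", "flooded area", "urban"]
label_embs = np.array([[0.9, 0.1, 0.0, 0.1],
                       [0.1, 0.9, 0.2, 0.0],
                       [0.0, 0.2, 0.9, 0.1]])
image_emb = np.array([0.15, 0.85, 0.25, 0.05])  # closest to the "flooded area" direction
print(zero_shot_classify(image_emb, label_embs, labels))  # → flooded area
```

Because the label set is just text, the same image embedding can be compared against any new set of class names without retraining – which is what makes the approach attractive when labeled satellite imagery is scarce.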

In our experiments, we have worked with a variety of cutting-edge language-image models. Notably, these models were not trained on the satellite imagery used for our tests, allowing us to evaluate their ability to classify unseen data. The results have been impressive, with an average accuracy rate of nearly 80% across multiple datasets of satellite imagery. This high accuracy underscores the potential of zero-shot learning as a cost-effective solution for analyzing remote sensing imagery.

Our team is excited about extending this approach to new domains, including additional image resolutions and object categories. By doing so, we hope to increase the efficiency of climate change monitoring and pave the way for more informed decisions that can have a global impact. We plan to carry out further research over the summer, ultimately publishing a paper to share our insights with the wider community. We are confident that our work will significantly contribute to the ongoing efforts to combat climate change. Together, we can make a positive difference for our planet!


Tracking Climate Change Using Satellites and Artificial Intelligence

Poster by Kushagra Ghosh, Malini Kamlani, Muaz Bin Kashif, Alexander Van Lanschot, Darui Lu, Evan Ma, Song Young Oh, Morgan Pruchniewski, Anushka Srinivasan, Kyle Bradbury and Jordan Malof

Research poster.