Over the last four decades, populations of African elephants and white and black rhinos have plummeted across the continent. This decline is a major concern for conservationists, who rely on wildlife surveys to track changes in population numbers and species movement. However, traditional survey methods using human counts in planes can be expensive, labour-intensive, and produce inconsistent results.
Connected Conservation Foundation (CCF) has led a cross-sector collaboration to explore whether very-high-resolution (VHR) satellite imagery and AI could offer a feasible alternative method of wildlife surveying in savannah environments.
Using the Airbus Foundation's 30 cm satellite imagery from Pléiades Neo, a team of data scientists and field partners tested whether AI and human detection of species in satellite images could replace traditional survey methods.
Three approaches for the detection and identification of animals on satellite images were trialled:
01. The human eye using field-based expertise and insight.
02. Computer vision and open-source (pre-trained) neural network models.
03. A bespoke convolutional neural network model, trained on a set of synthetic animal images.
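Approaches 02 and 03 both involve running a neural network over scenes far larger than a model's input size, so the imagery is typically chipped into overlapping tiles before inference. As an illustrative sketch only (not the study's actual pipeline, and with hypothetical tile and overlap sizes), the tiling step might look like:

```python
import numpy as np

def tile_image(image, tile_size=512, overlap=64):
    """Split a large scene into overlapping tiles for per-tile inference.

    The overlap reduces the chance that an animal sitting on a tile
    boundary is cut in half and missed by the detector.
    """
    step = tile_size - overlap
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            # Edge tiles may be smaller than tile_size.
            tiles.append(((y, x), image[y:y + tile_size, x:x + tile_size]))
    return tiles

# At 30 cm/pixel, a 2048 x 2048 pixel scene covers roughly 600 m x 600 m.
scene = np.zeros((2048, 2048, 3), dtype=np.uint8)
tiles = tile_image(scene)
print(len(tiles))  # 25 tiles (a 5 x 5 grid with 64 px overlap)
```

Each tile would then be passed to the pre-trained or bespoke model, and per-tile detections mapped back to scene coordinates via the stored offsets.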
This novel study compared the results of AI and human detections in these three approaches with on-the-ground sightings reported by field teams around the time of satellite passover.
Whilst some results showed promise, conservation teams at NRT and Madikwe Game Reserve concluded that neither AI nor human detection of species on 30 cm satellite imagery was accurate enough to replace traditional wildlife survey techniques.
This was especially true in situations where:
- Different species are interspersed in the same area (elephants/rhinos, wildebeest/buffalo), e.g. at watering holes.
- Juveniles of one species mix with adults of another; for example, a juvenile elephant can be hard to distinguish from an adult buffalo or rhino.
- Animals are bunched close together, so their shadows merge into one ‘blob’ and counting individual animals is not possible.
Head over to our blog for in-depth results, challenges and insights. We also highlight areas of future research needed to harness the new possibilities of using VHR satellite imagery and AI for biodiversity monitoring applications.
One promising use-case
Whilst the prototype and study revealed that AI may not yet be capable of providing exact counts of animals in VHR satellite imagery, the results showed it can produce a heatmap of potential sightings, acting as a "where to look" indicator for human review of an image. Scanning satellite imagery with the human eye is incredibly time-consuming and requires exceptional diligence. Consulting an AI model to highlight areas of potential detection could save time and help us better understand and protect vulnerable animal populations.
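One simple way to build such a "where to look" map is to bin the model's detections into a coarse grid and sum their confidence scores, so reviewers can start with the hottest cells. This is a minimal sketch under assumed inputs (the detection tuples, cell size, and scores below are hypothetical, not taken from the study):

```python
import numpy as np

def detection_heatmap(detections, scene_shape, cell=256):
    """Aggregate detections into a coarse 'where to look' grid.

    detections: list of (row, col, confidence) tuples in scene pixel
    coordinates -- placeholder output of any detector.
    """
    rows = -(-scene_shape[0] // cell)  # ceiling division
    cols = -(-scene_shape[1] // cell)
    heat = np.zeros((rows, cols))
    for r, c, conf in detections:
        heat[r // cell, c // cell] += conf
    return heat

# Hypothetical low-confidence sightings clustered near one location.
dets = [(100, 120, 0.4), (130, 150, 0.5), (1600, 1700, 0.2)]
heat = detection_heatmap(dets, (2048, 2048))
print(heat.shape)  # (8, 8)
```

A human analyst would then inspect the grid cells in descending order of accumulated score, rather than scanning the full scene uniformly.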
Building on this work, CCF and Airbus Foundation are now seeking to identify the species and situations around the world where 50 cm and 30 cm resolution satellite imagery can be of value, especially in homogeneous landscapes or seascapes, where greater variability in accuracy can be tolerated.
Use cases could include monitoring colonies or herds of animals in hard-to-reach environments, particularly where the objects of interest contrast strongly with their background environment.
Airbus Foundation and the Connected Conservation Foundation recently partnered to launch the Satellites for Biodiversity Award, a program that will provide funding for projects and access to very high-resolution satellite data to monitor and manage threatened species and habitats.
The application period has now closed. Winners will be announced in April – so stay tuned to find out who they are!