discussion / AI for Conservation / 27 July 2017

MIT's SLOOP: machine learning (ML) animal image recognition

Joining others working in this space, such as IBEIS and Dr. Frederic Maire at Queensland University of Technology, MIT has a program of its own for machine-assisted identification of individual animals from images. From the MIT SLOOP site:

What is SLOOP?
SLOOP is a pattern retrieval engine for Animal Biometrics that uses cloud computing, machine learning and crowd sourcing to greatly improve the study of animal movement and behavior.

In order for scientists to get a handle on issues such as genetic variation, dispersal, diversity and movement of a species, an accurate track or capture history of individuals is needed. Historically, such counts involved a laborious process of comparing hundreds of images, often obtained by remote camera. Scientists and students spent thousands of hours poring over these images in order to identify individual animals, time arguably better spent elsewhere.

SLOOP applies pattern recognition to these same images. It then challenges online crowds to sort and identify a small proportion of potential matches. SLOOP learns from the citizen scientist’s skill to match images of individual animals more accurately and far faster than by previous methods. It is an early example of a system where the human uses the machine to accomplish a task more efficiently and the machine learns from sparse human input to get better at what it does.
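The workflow described above can be sketched roughly as: the machine ranks candidate matches for each new image, only a small slice of the highest-ranked pairs goes to human review, and confirmed pairs become training signal. The toy Python below is just my illustration of that loop; the feature extractor, the ask_crowd stand-in and the 10% review fraction are all hypothetical placeholders, not SLOOP's actual pipeline.

```python
# Rough sketch of a SLOOP-style human-in-the-loop matching loop (illustrative
# only, not MIT's code). Features and the crowd step are stubbed out.
import numpy as np

def extract_features(image_id):
    """Stand-in for pattern-based features (e.g. a salamander's markings).
    Toy images of the same individual get correlated vectors."""
    individual = image_id.split("_")[0]
    base = np.random.default_rng(abs(hash(individual)) % (2**32)).normal(size=64)
    noise = np.random.default_rng(abs(hash(image_id)) % (2**32)).normal(size=64)
    return base + 0.3 * noise

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def ask_crowd(query_id, candidate_id):
    """Placeholder for the crowd-sourced 'same animal?' judgement."""
    return query_id.split("_")[0] == candidate_id.split("_")[0]  # toy ground truth

def match(query_id, gallery_ids, review_fraction=0.1):
    """Rank the gallery by similarity and send only the top slice to humans."""
    q = extract_features(query_id)
    ranked = sorted(gallery_ids,
                    key=lambda g: similarity(q, extract_features(g)),
                    reverse=True)
    n_review = max(1, int(len(ranked) * review_fraction))
    for candidate_id in ranked[:n_review]:
        if ask_crowd(query_id, candidate_id):
            return candidate_id   # confirmed match; would feed back into training
    return None                   # no confirmed match in the reviewed slice

gallery = [f"animal{i}_sighting{j}" for i in range(20) for j in range(3)]
print(match("animal7_new", gallery))
```

The interesting part in the SLOOP description is the feedback loop: each crowd-confirmed pair is exactly the kind of sparse human input the ranker can learn from, so the fraction of pairs that need human review should shrink over time.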

SLOOP was originally designed for the marbled salamander of Western Massachusetts. Scientists around the globe rapidly understood its application to other species. Thus far, SLOOP has been adapted for use with Grand and Otago skinks in New Zealand and is in the process of being extended to many species.




It looks like they haven't updated for a couple of years. Does anyone know if the project is still active, or whether they are moving to a different framework like TensorFlow?