Autonomous Camera Traps for Insects / Feed

Camera trapping for insects is becoming a reality using advances in camera, AI, and autonomous systems technologies. This group discusses the latest advances, shares experiences, and offers a space for anyone interested in the technology, from beginners to experts.


Metadata standards for Automated Insect Camera Traps

Has anyone else watched this webinar from GBIF introducing the data model for camera trap data? I wonder if this is something we could easily adopt or adapt for our sorts of camera traps?


I did attend the webinar and had a strong feeling that this standard will be well supported and taken up in the camera trapping community! I would also love to hear if someone has tried to use it.

I've added this to the main camera trap thread as it would be good to get thoughts from those folk too.


Cameras - pros and cons

So, what makes a good camera for an autonomous camera trap for insects? We use a web camera in our system, which seems to work well a lot of the time; it produces...


Hi Valentin, sorry for the long delay! I forgot to set up notifications for messages.


Yes, the PICT guide was relatively straightforward to follow - we are not Python experts either but could follow along well enough. The one thing we still need to sort out is capping the data storage, as filling it up crashes the units. There are no instructions per se for that, but I'm sure we can work it out. So I would say yes, it's certainly worth a try! We had a few students putting together the cameras as well, along with our support, so it's always encouraging if instructions work in that scenario too.
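We ended up scripting the storage cap ourselves. A minimal sketch of the idea (the function name and quota here are made up - adjust the path and limit for your own units):

```python
import os

def cap_storage(directory, max_bytes):
    """Delete the oldest files in `directory` until the total size is at or
    below `max_bytes`, so the disk never fills up and crashes the unit.
    Returns the list of deleted file paths."""
    files = [
        os.path.join(directory, name)
        for name in os.listdir(directory)
        if os.path.isfile(os.path.join(directory, name))
    ]
    files.sort(key=os.path.getmtime)  # oldest first
    total = sum(os.path.getsize(f) for f in files)
    deleted = []
    while files and total > max_bytes:
        oldest = files.pop(0)
        total -= os.path.getsize(oldest)
        os.remove(oldest)
        deleted.append(oldest)
    return deleted
```

Running something like this from cron before each capture session, pointed at the image output folder, should keep the card from filling.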


VIAME is awesome - you can actually forgo the hefty computing requirements by using the cloud-based version. Super cool option!

Hi there! I'm actually in the process of procuring another set of camera trap hardware for our group and would be interested if you have the full set of documentation for putting the hardware together. We have the PICT cameras but had some problems getting Raspberry Pis, so I was hoping to find an alternative. But if there isn't one then of course, no problem! If easier, my email is [email protected] Thanks!

Hi Liz, unfortunately you will still need a Raspberry Pi as the host for the OAK-1 camera to reproduce our hardware setup. It's also possible to use another Linux-based system (e.g. a Raspberry Pi alternative), but I didn't test this myself, and the setup process will differ from our documentation (and probably won't be as straightforward). I'm planning to publish the documentation website in the coming weeks, but I can already send you detailed information about putting the hardware together if you are still interested.


Easy-RIDER project Workshop IV: Pollinator monitoring recording

In case you missed our webinar on Pollinator monitoring, here is the recording.

We had presentations from three teams presenting their work on designing automated monitoring tools for flower-visiting insects, different ways of creating datasets for training machine learning algorithms for insect identification, and how these new technologies can be integrated into traditional monitoring schemes. The talks were followed by a discussion session.


Project introductions and updates

Tell us about your project! If you are just starting out with autonomous camera traps for insects, or if you are a seasoned expert, this is the place to share your...


Machine Learning for Automated Monitoring of Moths

We are an interdisciplinary team focused on building a centralized platform and library of machine learning tools for moth camera trap systems. This work is being carried out by David Rolnick (McGill, Mila), Aditya Jain (University of Toronto, Mila), Maxim Larrivée (Montreal Insectarium), Fagner Cunha (Federal University of Amazonas), and Michael Bunsen (eButterfly), in collaboration with teams at UKCEH, Vermont Centre for Ecostudies, Aarhus University, and elsewhere.

We are building a software architecture whose central theme is to be agnostic to trap hardware (camera type, background sheet color, lighting conditions, and geography of operation)  and which requires minimal expert labeling of moth species to train the machine learning system. Our current updates are the following:

  • A pipeline for training a moth species classifier for any region using open-source labeled data from GBIF (this will soon be released on GitHub)
  • A binary classifier that distinguishes moths from non-moths
  • Modules for localization and tracking of moths

Our ongoing work focuses on using unsupervised/semi-supervised methods to improve classification using unlabelled trap data. The figure below shows our localization and classification pipeline:


We have tested our software on data collected from traps deployed in Quebec, Canada and Vermont, USA. We will soon deploy them on data from Denmark and the UK through our partners.
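Purely as an illustration of how the stages described above might chain together (all function and field names here are hypothetical, not the project's actual API):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    frame_id: int
    box: tuple                     # (x, y, w, h) in pixels
    is_moth: bool = False
    species: Optional[str] = None
    track_id: Optional[int] = None

def run_pipeline(frames, localize, is_moth, classify, assign_tracks):
    """Chain the stages: localize candidate objects, filter moths from
    non-moths, classify the moths by species, then link detections
    across frames into tracks."""
    detections = []
    for i, frame in enumerate(frames):
        for box in localize(frame):            # object localization
            det = Detection(frame_id=i, box=box)
            det.is_moth = is_moth(frame, box)  # binary moth / non-moth
            if det.is_moth:
                det.species = classify(frame, box)  # regional species classifier
            detections.append(det)
    assign_tracks(detections)                  # tracking across frames
    return detections
```

Because each stage is just a callable, swapping in a different localizer or a classifier trained for a different region leaves the rest of the pipeline untouched - which is the sense in which the design is hardware- and geography-agnostic.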


Hi Maximillian

I would also like to give the setup a try, if that would be OK? Could you email [email protected]?

We would want to use it for dung beetles, using a dung-based bait - do you think it would be suitable in that context?


Andy Gray


Multi-point insect pollination monitoring using computer vision for agriculture

We are an interdisciplinary team from Monash University and RMIT University, Melbourne, Australia. The core-team members are Alan Dorin (Monash University), Adrian Dyer (RMIT University/Monash University), Malika Ratnayake (Monash University), Chathurika Amarathunga (Monash University) and Asaduz Zaman (Monash University). Our research mainly focuses on designing computer vision-based systems for automated pollination monitoring in agriculture. 

Last year, we developed a computer vision system for pollination monitoring in an industrial strawberry farm. It comprised (1) Edge computing-based multi-point video recording, (2) Automated multispecies insect tracking and (3) Insect behavioural analysis. 

  1. Edge computing-based multi-point video recording

We used Raspberry Pi-based units as camera traps to record videos of pollinators across multiple monitoring stations on a strawberry farm. We used a Raspberry Pi 4 with a Raspberry Pi camera v2 to record continuous videos of insects at 1920 × 1080 resolution at 30 fps. Camera units were powered using 20000 mAh battery banks and mounted on a combination of tripods and monopods. [More info: pub3]

  2. Automated multispecies insect tracking

We developed an automated multispecies tracking algorithm to extract insect tracks from recorded videos. Our algorithm (Hybrid Detection and Tracking - HyDaT) uses a hybrid detection model consisting of a deep learning-based object detection model (YOLO) and a background subtraction-based detection model (KNN background subtractor) to track insects. Here, the YOLO model is used to detect insects at their first appearance in the frame and identify their species. Background subtraction is used to detect the insect's position in all other frames, provided there are not multiple detections in the foreground. If the environment is too dynamic and background subtraction cannot accurately identify the insect's position, the YOLO model is used for detection. This enables tracking of unmarked, free-flying insects amid changes in the environment. In addition, the YOLO model is used to detect and track flowers, enabling insect-flower interaction monitoring. [More info: pub1, pub2, pub3]
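The switching logic between the two detectors can be sketched roughly like this (illustrative only - the function names are placeholders for the actual YOLO and KNN background-subtractor calls):

```python
def hybrid_detect(frame, seen_before, yolo_detect, bg_subtract):
    """One step of a hybrid detection scheme: the deep model handles first
    appearances (and species ID); cheap background subtraction handles
    subsequent frames, falling back to the deep model whenever the
    foreground is ambiguous."""
    if not seen_before:
        # First appearance: detect the insect and identify its species
        return yolo_detect(frame)
    foreground = bg_subtract(frame)
    if len(foreground) == 1:
        # A single clean foreground blob: background subtraction suffices
        return foreground[0]
    # Dynamic scene or multiple blobs: fall back to the deep detector
    return yolo_detect(frame)
```

The payoff is speed: the expensive deep model only runs on first appearances and ambiguous frames, while the cheap subtractor carries the rest of the track.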

  3. Insect behavioural analysis

We analysed the pollination behaviour of four insect types (Honeybee, Syrphidae, Lepidoptera and Vespidae) using the insect and flower tracks extracted using the methods described above. This analysis involved insect and flower-visit counting across the study area. In addition, we used the extracted data to analyse the contribution made by each insect type towards strawberry pollination. We also curated a dataset of over 2300 insect trajectories of four insect types and YOLOv4-annotated images. [More info: pub3]


[1] Tracking individual honeybees among wildflower clusters with computer vision-facilitated pollinator monitoring [Code][Dataset]

[2] Towards Computer Vision and Deep Learning Facilitated Pollination Monitoring for Agriculture [Code][Dataset]

[3] Spatial Monitoring and Insect Behavioural Analysis Using Computer Vision for Precision Pollination [Code][Dataset]

[4] Towards precision apiculture: Traditional and technological insect monitoring methods in strawberry and raspberry crop polytunnels tell different pollination stories [Dataset]

For more information, contact me on Twitter or send me a message.


Implementation of video surveillance to quantify the predation rate

Hello everyone, First of all, thank you for all the information on your great website. My name is Julien Péters and I am a PhD student at the University of Liège (Belgium). For my...



We are having this problem too, and it might be worthy of its own thread! The lack of Raspberry Pis is a big problem and we are currently looking into alternatives. We haven't found one yet, but if we do I will let you know. @Max_Sitt might have some suggested alternatives for his system?

Hi Julien,


we are working with the Luxonis OAK-1, which can run lightweight detection models (e.g. YOLOv5n/s) directly on-device. However, you will still need a host; for outdoor deployment a Raspberry Pi (e.g. Zero 2 W) is perfect. For testing you could also use another Linux-based system as the host device, or just connect it to, e.g., your notebook. You can find more info in the Luxonis Docs.


Regarding the Raspberry Pi availability, this blog post from Jeff Geerling probably sums up the current situation pretty well. I hope in Q1 2023 the situation will get better, but at the moment nobody really knows for sure.


Welcome to the Autonomous camera traps for insects group!

Hello and welcome to the Autonomous Camera Traps for Insects group :). In this group we will be discussing the use of autonomous camera traps as a tool for long-term remote...


Hi folks! Great to have this group online! I am actually a marine ecologist specializing in bioacoustics, but I recently joined a pollinator monitoring group at Cal State San Marcos and am helping with the automated camera aspect of the project. We are using the PICT guide and recently migrated to using VIAME for insect track detection (well, we are just trying it out now). Very exciting work; I look forward to learning from you all! Has anyone else worked with PICT?

- Liz Ferguson

Hi Tom

I am a farmer in Devon researching silvopasture with several research organisations. I have just landed a farmer-led grant to research dung beetles using camera traps and AI. I am collaborating with Rothamsted.


Do you have any tips on sourcing the right camera trap please?



Hi there!

I am a field biologist and research technician working on ecosystem monitoring at the Zackenberg research station in Northeast Greenland. For the last couple of years we have been cooperating with Toke Høye and have deployed his timelapse cameras on Dryas flowers to monitor pollinator activity, but also to compare with our nearby flower phenology monitoring plots. We have also done yellow pitfall trap monitoring in plots and have a 25-year time series. We are considering testing camera-based methods as well, and I am happy to see many folks working along those lines. Hopefully I can get some inspiration here and we can start testing it out.

Cheers, Lars





Workshop IV: Pollinator monitoring

This workshop is part of a series of online meetings to share experiences around the globe using automated technology (Cameras + AI) to monitor moths and other nocturnal insects.

This sounds amazing and I have advertised it among my colleagues. Unfortunately I most probably will not be able to attend; it would be great if you could provide the recording...

Most interesting images / sightings 'caught on camera'

A thread for people to upload the most interesting or unusual sightings recorded by their traps. To get the ball rolling, here's a coy-looking crow...


No - the trap was in their path and they just walked through it. I've now moved it to a place they can't reach. The biggest threat to the moths is from pied currawongs. I schedule the trap so it shuts off at least two hours before sunrise to try to stop them feasting on the larger insects.

At first, I was finding wings below the screen in the morning when I put our units out. So I put a game camera on the units to see what was feeding and when. I found three bird species, likely three individuals, and quickly discovered the unit made a good bird feeder - Song Sparrow (most frequent), House Wren, and this Tufted Titmouse. I changed my units to turn off about 1.5 hours before dawn and that worked! Nearly all the moths left the scene before the birds came to visit.

My most prized camera trap image - a hummingbird caught on camera!



Identify animal from Image

I am thankful to the members of the WILDLABS network for giving us the right information to enable us to plan our bioacoustics solution implementation. It seems to be on track as of now...


Hi Jitendra.

If they are still images, many people are using MegaDetector to analyze their images. I'm not sure how it will do at species classification, but it can tell you whether there are images of interest in the batch. Others here can probably give you more detailed instructions on how to use it to batch-process camera trap images.
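For example, once you have MegaDetector's batch-output JSON, pulling out the images worth reviewing is only a few lines. A sketch, assuming the standard batch-output format where category "1" means "animal" (the function name and confidence threshold are my own choices):

```python
def images_of_interest(md_output, min_conf=0.2):
    """Return the file names from a MegaDetector batch-output dict that
    contain at least one animal detection above `min_conf`."""
    keep = []
    for entry in md_output.get("images", []):
        for det in entry.get("detections", []) or []:
            if det["category"] == "1" and det["conf"] >= min_conf:
                keep.append(entry["file"])
                break  # one confident animal is enough to flag the image
    return keep
```

You would load the JSON that MegaDetector's batch script writes and pass the parsed dict in; tune `min_conf` to trade off missed animals against false triggers.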


Have you considered creating a Kaggle competition? If you already have lots of images, and some that have been labelled, this could be a good way to get people working on a solution.
