Group

Autonomous Camera Traps for Insects / Feed

Camera trapping for insects is becoming a reality using advances in camera, AI, and autonomous systems technologies. This group discusses the latest advances, shares experiences, and offers a space for anyone interested in the technology, from beginners to experts.

discussion

Mothbox + Mothbeam Update 4: Prepping for Deployment

It's been mega busy at dinalab here in Panama as Kitty, @Hubertszcz, and I prepped for his big field deployment in western Panama. We finalized our designs...

1 0

We did some more testing with the Mothbeam in the forest. It's the height of dry season right now, so not many moths came out, but the Mothbeam shone super bright and attracted a whole bunch of really tiny things that swarmed a lot, and some nocturnal bees.

You could also see the Mothbeam's aura from far away in the forest! So that was impressive!

I also tested out attaching a 12V USB booster cable to the Mothbeam, and it works nicely! So you can attach regular USB 5V battery packs to the Mothbeam as well!

See full post
discussion

Underwater camera trap - call for early users

Hi! The CAMPHIBIAN project aims to develop an underwater camera trap primarily targeting amphibians such as newts, but co-occurring taxa are recorded as well, such as frogs, grass...

7 4

Many thanks for your contribution to the survey! We are now summarizing the list of early users and doing our best to offer a newtcam to everyone in due time.

All the best!

Xavier

See full post
discussion

Testing Raspberry Pi cameras: Results

So, we (mainly @albags) have done some tests to compare the camera we currently use in the AMI-trap with the range of cameras that are available for the Pi. I said in a thread...

9 0

And finally for now, the object detectors are wrapped by a Python websocket network wrapper to make it easy for the system to use different types of object detectors. It usually takes me about half a day to write a new Python wrapper for a new object detector type. You just need to wrap in the network connection and make it conform to the YOLO way of expressing the hits, i.e. the JSON format that YOLO outputs with bounding boxes, class names and confidence levels.
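To give a feel for the general shape of such a wrapper, here is a minimal sketch (my own, not the poster's actual code) using the standard `websockets` package: a server receives one frame per message, runs a placeholder detector, and replies with YOLO-style JSON. The port, the stand-in detector, and the exact JSON keys are assumptions.

```python
# Minimal sketch of wrapping an object detector behind a websocket so the
# wider system can talk to any detector the same way. Requires a recent
# "websockets" version (single-argument handler). All names are placeholders.
import asyncio
import json

import websockets  # pip install websockets


def run_detector(image_bytes: bytes) -> list:
    """Stand-in for any detector backend; returns YOLO-style hits."""
    # A real wrapper would decode image_bytes and run the model here.
    return [{"class_name": "person", "confidence": 0.87,
             "bbox": {"x1": 120, "y1": 80, "x2": 310, "y2": 420}}]


async def handle(websocket):
    # Each incoming message is one captured frame; reply with detections as JSON.
    async for message in websocket:
        await websocket.send(json.dumps({"detections": run_detector(message)}))


async def main():
    async with websockets.serve(handle, "0.0.0.0", 8765):
        await asyncio.Future()  # serve forever


if __name__ == "__main__":
    asyncio.run(main())
```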

What's more, you can even use multiple object detector models in different parts of a single captured image, and you can cascade the logic, for example to require multiple object detectors to match, or to choose between different object detectors.
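As a toy illustration of that cascade idea (again just a sketch with made-up field names, not the project's logic), one could require two detectors to agree on a class inside a region of the frame before counting it as a hit:

```python
# Only count a hit when two different detectors both report the same class
# overlapping a given region. Field names follow the sketch above.
def boxes_overlap(a: dict, b: dict) -> bool:
    """Axis-aligned bounding-box overlap test."""
    return not (a["x2"] < b["x1"] or b["x2"] < a["x1"] or
                a["y2"] < b["y1"] or b["y2"] < a["y1"])


def cascade_match(hits_a: list, hits_b: list, class_name: str, region: dict,
                  min_conf: float = 0.5) -> bool:
    """True only if both detectors found class_name inside region."""
    def found(hits):
        return any(h["class_name"] == class_name and h["confidence"] >= min_conf
                   and boxes_overlap(h["bbox"], region) for h in hits)
    return found(hits_a) and found(hits_b)
```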

It's the perfect anti-poaching system (If I say so myself :) )

Hey @kimhendrikse , thanks for all these details. I just caught up. I like your approach of supporting multiple object detectors and using the python websockets wrapper! Is your code available somewhere?

Yep, here:

Currently it only installs on older Jetsons; in the coming weeks I'll finish the install code for current Jetsons.


Technically speaking, if you were an IT specialist you could even make it work in WSL2 Ubuntu on Windows, but I haven't published instructions for that. If you were even more of a specialist you wouldn't need WSL2 either. One day I'll publish instructions for that once I've done it. Though it would be slow unless the Windows machine had an NVIDIA GPU and you got PyTorch working with it.

See full post
discussion

Update 3: Cheap Automated Mothbox

Wanted to share a final set of updates on the work we have been cranking on here in Gamboa, Panama, making this inexpensive, portable night insect surveyor! There's been a lot of...

3 5

This looks amazing! I'm currently working with hastatus bats up in Bocas; it would be really interesting to utilize some of these near foraging sites. Be sure to post again when you post the final documentation on GitHub!

 

Also, Gamboa......dang I miss that little slice of heaven...

 

Super cool work Andrew!

 

Best,

Travis

Great work! I very much look forward to trying out the MothBeam light. That's going to be a huge help in making moth monitoring more accessible.

And well done digging into the picamera2 library to reduce the amount of time the light needs to be on while taking a photo. That is a super annoying issue!
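For anyone curious, one way that saving could look with picamera2 is sketched below; this is a generic sketch of mine under assumed settings (GPIO pin, exposure values, filename), not the actual Mothbox code. The idea is to fix exposure and gain up front so there is no auto-exposure settling while lit, and only switch the light on around the capture itself.

```python
# Keep the light's on-time short: manual exposure, then light on only for capture.
# The GPIO pin, exposure values, and filename are placeholder assumptions.
from time import sleep

from gpiozero import LED
from picamera2 import Picamera2

light = LED(17)                    # hypothetical GPIO pin driving the LED board

picam2 = Picamera2()
picam2.configure(picam2.create_still_configuration())
picam2.set_controls({"AeEnable": False, "ExposureTime": 20000, "AnalogueGain": 2.0})
picam2.start()
sleep(1)                           # let the camera pipeline settle, light still off

light.on()
picam2.capture_file("mothbox_frame.jpg")
light.off()

picam2.stop()
```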

See full post
discussion

Project introductions and updates

Tell us about your project! If you are just starting out with autonomous camera traps for insects, or if you are a seasoned expert, this is the place to share your...

28 1

Hi all! I'm part of a Pollinator Monitoring Program at California State University, San Marcos which was started by a colleague lecturer of mine who was interested in learning more about the efficacy of pollinator gardens. It grew to include comparing local natural habitat of the Coastal Sage Scrub and I was initially brought on board to assist with data analysis, data management, etc. We then pivoted to the idea of using camera traps and AI for insect detection in place of the in-person monitoring approach (for increasing data and adding a cool tech angle to the effort, given it is of interest to local community partners that have pollinator gardens). 

The group heavily involves students as researchers, and they are instrumental to the projects. We have settled on a combination of video footage and development of deep neural networks using the cloud-hosted video track detection tool VIAME (developed by Kitware for NOAA Fisheries, originally for fish track detection). Students built our first two PICTs (low-cost camera traps) and annotated the data from our pilot study, for which we are now starting network development. Here's a cool pic of the easy-to-use interface that students use when annotating data:

Figure 1: VIAME software demonstrating annotation of the track of an insect in the video (red box). Annotations are done manually to develop a neural network for automated processing.

The goal of the group's camera trap team is to develop a neural network that can track insect pollinators associated with a wide variety of plants, and to use this information to collect large datasets to better understand pollinator occurrence and activity within local habitats. This ultimately relates to native habitat health and can be used for long-term tracking of changes in the ecosystem, with the idea that knowledge of pollinators may inform resource and conservation managers, as well as local organizations, in their land use practices. We ultimately are interested in working with the Kitware folks further to not only develop a robust network (and share it broadly, of course!), but also to customize the data extraction from automated tracks to include automated species/species-group identification and information on interaction rates by those pollinators. We would love any suggestions for appropriate funding opportunities to apply to, as well as any information or suggestions regarding the PICT camera or suggestions on methods. We are looking to include night-time data collection at some point as well and are aware that near infrared is advised, but would appreciate any thoughts or advice on that avenue too.

We will of course post when we have more results and look forward to hearing more about all the interesting projects happening in this space!

Cheers, 
Liz Ferguson 

Hi, indeed as Tom mentioned, I am working here in Vermont on moth monitoring using machines with Tom and others. We have a network going from here into Canada. Would love to catch up with you soon. I am away until late April, but would love to connect after that!

See full post
discussion

Update 2: Cheap Automated Mothbox

Here's an update on the work we are doing building the cheap open source Mothbox for @Hubertszcz. External design: in a lot of wilderness tool development...

3 1

Hi Andrew,

thanks for sharing your development process so openly, that's really cool and boosts creative thinking for the readers too! :)

Regarding a solution for Raspberry Pi power management: we are using the PiJuice Zero pHAT in combination with a Raspberry Pi Zero 2 W in our insect camera trap. There are also other versions available, e.g. for RPi 4 (more info: PiJuice GitHub). From my experience the PiJuice works mostly great and is super easy to install and set up. Downsides are the price and the lack of further software updates/development. It would be really interesting if you could compare one of their HATs to the products from Waveshare. Another possible solution would be a product from UUGear. I have the Witty Pi 4 L3V7 lying around, but couldn't really test and compare it to the PiJuice HAT yet.

Is there a reason why you are using the Raspberry Pi 4? From what I understand about your use case, the RPi Zero 2 W or even RPi Zero should give enough computing power and require a lot less power. Also, they are smaller and would be easier to integrate into your box (and generate less heat).

I'm excited for the next updates to see in which direction you will be moving forward with your Mothbox!

Best,

Max

Thanks a lot for this detailed update on your project! It looks great!

See full post
discussion

Cheap Automated Mothbox

Hi everyone! @Hubertszcz has a biodiversity monitoring project in Panama, and we have been working on quick and dirty, ultra low cost, high quality insect monitoring. We built a...

9 1

I'm looking into writing a sketch for the esp32-cam that can detect pixel changes and take a photo, wish me luck.

One question, does it even need motion detection? What about taking a photo every 5 seconds and sorting the photos afterwards?

It depends on which scientists you talk to. I am in favor of just doing a timelapse and doing a post-processing sort afterwards. There's not much reason I can see for such motion fidelity. For the box I am making we are doing exactly that, though maybe a photo every minute or so.
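For what it's worth, the post-processing sort could be as simple as the sketch below, which compares consecutive timelapse frames with OpenCV and copies only those where enough pixels changed. Folder names and thresholds are placeholder assumptions, not part of the Mothbox project.

```python
# Timelapse plus post-processing sort: keep frames with enough pixel change.
import shutil
from pathlib import Path

import cv2
import numpy as np

SRC = Path("timelapse")            # captured frames, sorted by filename
DST = Path("frames_with_motion")
DST.mkdir(exist_ok=True)

CHANGED_FRACTION = 0.01            # keep frames where >1% of pixels changed

prev = None
for frame_path in sorted(SRC.glob("*.jpg")):
    gray = cv2.cvtColor(cv2.imread(str(frame_path)), cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise
    if prev is not None:
        diff = cv2.absdiff(gray, prev)
        if np.count_nonzero(diff > 25) / diff.size > CHANGED_FRACTION:
            shutil.copy(frame_path, DST / frame_path.name)
    prev = gray
```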

See full post
discussion

Metadata standards for Automated Insect Camera Traps

Have others watched this webinar from GBIF introducing the data model for camera trap data? I wonder if this is something we can easily adopt or adapt for our sorts of camera traps?

5 3

Yes. I think this is really the way to go!

Here is another metadata initiative to be aware of. OGC has been developing a standard for describing training datasets for AI/ML image recognition and labeling. The review phase is over and it will become a new standard in the next few weeks. We should consider its adoption when we develop our own training image collections.

See full post
discussion

Q&A: UK NERC £3.6m AI (image) for Biodiversity Funding Call - ask your questions here

In our last Variety Hour, Simon Gardner, Head of Digital Environment at NERC, popped in to share more about their open £3.6m funding call supporting innovation in tools for...

1 1

This is super cool! @Hubertszcz, @briannajohns, several others, and I are all working towards some big biodiversity monitoring projects for a large conservation project here in Panama. The conservation project is happening already, but Hubert starts on the ground work in January, and I'm working on a V3 of our open source automated insect monitoring box to have ready for him by then.

I guess my main question would be whether this funding call is appropriate for (or interested in) this type of project, and what types of assistance are possible through this type of funding (researchers? design time? materials? laboratory or field construction?)

See full post
discussion

Best Material for Moth Lighting?

Maybe folks have discussed this before? But does anyone have an "optimal" material for moth lighting? My guess is the best material would be something with high reflectivity and...

1 0

Plasticky substances like polyester can be slippery, so I imagine that's why cotton is most often used. White is good for color correction, while still reflecting light pretty well. When I've had the option I've chosen high thread count cotton sheets, so the background is smoothest and even the tiniest arthropods are on a flat background, not within contours of threads. The main problem with cotton is mildew and discoloration.

 

That being said, I haven't actually done proper tests with different materials. Maybe a little side project once standardized light traps are a thing? 

See full post
discussion

360 Camera for Marine Monitoring

Hi all, I'm trying to set up a low-cost 360 camera for underwater use. The main criteria are: 1. It needs to run for 1 week, with 3 x 2-hour intervals of recording per day,...

4 1

Hi Sol,

For my research on fish, I had to put together a low-cost camera that could record video for several weeks. Here is the design I came up with:

At the time of the paper, I was able to record video for ~12 hours a day at 10 fps and for up to 14 days. With new SD cards now, it is pushed to 21 days. It costs about 600 USD if you build it yourself. If you don't want to make it yourself, there is a company selling it now, but it is much more expensive. The FOV is 110 degrees, so not the 360 that you need, but I think there are ways to make it work (e.g. with a servo motor).

Happy to chat if you decide to go this route and/or want to brainstorm ideas.

Cheers,

Xavier 

Hi Xavier, this is fantastic! Thanks for sharing, the time frame is really impressive and really in line with what we're looking for. I'll send you a message.

Cheers,

Sol

See full post
discussion

Insect camera traps for phototactic insects and diurnal pollinating insects

Hello, we developed an automated camera trap for phototactic insects a few years ago and are planning on further developing our system to also assess diurnal pollinating...

13 0

Hi @abra_ash , @MaximilianPink, @Sarita , @Lars_Holst_Hansen

I'm looking to train a very compact (TinyML) model for flying pollinator detection on a static background. I hope a network small enough for microcontroller hardware will prove useful for measuring plant-pollinator interactions in the field. 

Presently, I'm gathering a dataset for training using a basic motion-triggered video-capture program on a Raspberry Pi. This forms a very crude insect camera trap.
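For readers who want a starting point, a crude motion-triggered capture loop with picamera2 might look like the sketch below; this is my own generic version with placeholder thresholds, resolutions, and filenames, not Ross's actual program. It watches a low-resolution stream for frame-to-frame changes and records the main stream while something is moving.

```python
# Generic motion-triggered video capture sketch with picamera2.
import time

import numpy as np
from picamera2 import Picamera2
from picamera2.encoders import H264Encoder
from picamera2.outputs import FileOutput

picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(
    main={"size": (1280, 720), "format": "RGB888"},
    lores={"size": (320, 240), "format": "YUV420"},
))
encoder = H264Encoder(bitrate=5_000_000)
picam2.start()

prev = None
recording = False
last_motion = 0.0

while True:
    # Y (luminance) plane of the low-resolution stream, as a 2-D array.
    cur = picam2.capture_buffer("lores")[: 320 * 240].reshape(240, 320)
    if prev is not None:
        # Mean absolute difference between consecutive frames.
        diff = np.abs(np.subtract(cur, prev, dtype=np.int16)).mean()
        if diff > 3:                                  # placeholder threshold
            last_motion = time.time()
            if not recording:
                encoder.output = FileOutput(f"motion_{int(last_motion)}.h264")
                picam2.start_encoder(encoder)
                recording = True
        elif recording and time.time() - last_motion > 5:
            picam2.stop_encoder()                     # stop 5 s after motion ends
            recording = False
    prev = cur
```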

I'm wondering if anyone has any insights on how I might attract pollinators into my camera's field of view? I've done some very elementary reading on bee optical vision and am currently trying the following:

Purple and yellow artificial flowers are placed on a green background; the centers of the flowers are lightly painted with a UV (365 nm) coat.

A sugar paste is added to each flower. 

The system is deployed in an inner-city garden (outside my flat), and I regularly see bees attending the flowers nearby. 

Here's a picture of the field of view: 

Does anyone have ideas for how I might maximise insect attraction? I'm particularly interested in what @abra_ash and @tom_august might have to say: are optical methods enough or do we need to add pheromone lures?

Thanks in advance!

Best, 

Ross

 

Hi Ross, 

Where exactly did you put the UV paint? Was it on the petals or the actual middle of the flowers? 

I would recommend switching from sugar paste to sugar water and maybe putting a little hole in the centre for a nectary. Adding scent would make the flowers more attractive, but trying to attract bees is difficult since they very obviously prefer real flowers to artificial ones. I would recommend getting the essential oil linalool, since it is a component of scented nectar, and adding a small amount of it to the sugar water. Please let us know if the changes make any difference!

Kind Regards, 

Abra

 

See full post
discussion

Welcome to the Autonomous camera traps for insects group!

Hello and welcome to the Autonomous Camera Traps for Insects group :). In this group we will be discussing the use of autonomous camera traps as a tool for long-term remote...

16 6

Hi Peter,

EcoAssist looks really cool! It's great that you combined every step for custom model training and deployment into one application. I will take a deeper look at it asap.

Regarding YOLOv5 insect detection models:

  • Bjerge et al. (2023) released a dataset with annotated insects on complex background together with three YOLOv5 models at Zenodo.
  • For a DIY camera trap for automated insect monitoring, we published a dataset with annotated insects on homogeneous background (flower platform) at Roboflow Universe and at Zenodo. The available models that are trained on this dataset are converted to .blob format for deployment on the Luxonis OAK cameras. If you are interested, I could train a YOLOv5 model with your preferred parameters and input size and send it to you in PyTorch format (and/or ONNX for CPU inference) to include in your application. Of course you can also use the dataset to train the model on your own.

Best,
Max
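As a rough illustration of Max's note about sharing models in PyTorch format (and/or ONNX for CPU inference), a sketch along these lines should work with the standard ultralytics/yolov5 hub loader; the weight and image filenames below are placeholders, not his released models.

```python
# CPU inference with a custom YOLOv5 model in PyTorch format.
# "insect_yolov5s.pt" and "flower_platform.jpg" are placeholder filenames.
import torch

model = torch.hub.load("ultralytics/yolov5", "custom", path="insect_yolov5s.pt")
model.conf = 0.25                          # confidence threshold

results = model("flower_platform.jpg")     # accepts a path, URL, PIL image or array
detections = results.pandas().xyxy[0]      # one row per detection
print(detections[["xmin", "ymin", "xmax", "ymax", "confidence", "name"]])
```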

Greetings, everyone! I'm thrilled to join this wonderful community. I work as a postdoctoral researcher at MeBioS KU Leuven, having recently completed my PhD on "Optical insect identification using Artificial Intelligence". While our lab primarily focuses on agricultural applications, we're also eager to explore biodiversity projects for insect population estimation, which provides crucial insights into our environment's overall health.

Our team has been developing imaging systems that leverage Raspberry Pis, various camera models, and sticky traps to efficiently identify insects. My expertise lies in computer science and machine learning, and I specialize in building AI classification models for images and wingbeat signals. I've worked as a PhD researcher in a neurophysiology lab in the past, as well as a data scientist at an applied AI company. You can find more about me by checking my website or my LinkedIn.

Recently, I've created a user-friendly web app (Streamlit, served via FastAPI on AWS) that helps entomology experts annotate insect detections to improve our model's predictions. You can find some examples of this work here: [link1] and [link2]. And lastly, for anyone interested in tiling large images for object detection or segmentation purposes in a fast and efficient way, please check my open-source library "plakakia".

I'm truly excited to learn from and collaborate with fellow members of this forum, and I wish you all the best with your work!

Yannis Kalfas
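To give a flavour of the image tiling Yannis mentions (this is a generic toy sketch of the idea, not the plakakia API), one can cut a large image into overlapping tiles, run the detector on each tile, and shift the resulting boxes back into full-image coordinates:

```python
# Toy tiling helper: overlapping tiles so a fixed-input-size detector can
# cover a large image. Tile size and overlap are placeholder values.
import numpy as np


def tile_image(image: np.ndarray, tile: int = 640, overlap: int = 64):
    """Yield (x_offset, y_offset, tile_array) covering the whole image."""
    h, w = image.shape[:2]
    step = tile - overlap
    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            y2, x2 = min(y + tile, h), min(x + tile, w)
            yield x, y, image[y:y2, x:x2]


# Example usage with a hypothetical detect() returning (x1, y1, x2, y2, conf, cls):
# full = cv2.imread("sticky_trap.jpg")
# for x0, y0, patch in tile_image(full):
#     for x1, y1, x2, y2, conf, cls in detect(patch):
#         print(x1 + x0, y1 + y0, x2 + x0, y2 + y0, conf, cls)
```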

See full post