Group

Autonomous Camera Traps for Insects / Feed

Camera trapping for insects is becoming a reality using advances in camera, AI, and autonomous systems technologies. This group discusses the latest advances, shares experiences, and offers a space for anyone interested in the technology, from beginners to experts.

discussion

Update 3: Cheap Automated Mothbox

Wanted to share a final set of updates on the work we have been cranking on here in Gamboa, Panama making this inexpensive portable night insect surveyor! There's been a lot of...

3 5

This looks amazing! I'm currently working with hastatus bats up in Bocas; it would be really interesting to utilize some of these near foraging sites. Be sure to post again when you post the final documentation on GitHub!

 

Also, Gamboa......dang I miss that little slice of heaven...

 

Super cool work Andrew!

 

Best,

Travis

Great work! I very much look forward to trying out the MothBeam light. That's going to be a huge help in making moth monitoring more accessible.

And well done digging into the picamera2 library to reduce the amount of time the light needs to be on while taking a photo. That is a super annoying issue!
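For anyone wanting to try the same trick, the idea of powering the light only around the moment of capture can be sketched like this. This is just a rough Python sketch, not the Mothbox's actual code; the GPIO pin, settle time, and filename are assumptions:

```python
import time

def capture_with_brief_light(camera, light, settle_s=0.2, filename="capture.jpg"):
    """Switch the light on only for the moment of capture, then off again.

    `camera` needs a capture_file() method (e.g. a started Picamera2 instance);
    `light` needs on()/off() methods (e.g. a gpiozero LED on some GPIO pin).
    """
    light.on()
    try:
        time.sleep(settle_s)        # brief pause so exposure adapts to the light
        camera.capture_file(filename)
    finally:
        light.off()                 # the light never stays on, even on errors

# On the Pi itself you would pass real hardware objects, e.g.:
#   from picamera2 import Picamera2
#   from gpiozero import LED
#   cam = Picamera2()
#   cam.configure(cam.create_still_configuration())
#   cam.start()
#   capture_with_brief_light(cam, LED(17), filename="moth.jpg")
```

The try/finally guarantees the light switches off even if a capture fails, which matters for battery budgets on unattended deployments.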

See full post
discussion

Project introductions and updates

Tell us about your project! If you are just starting out with autonomous camera traps for insects, or if you are a seasoned expert, this is the place to share your...

28 1

Hi all! I'm part of a Pollinator Monitoring Program at California State University, San Marcos, which was started by a fellow lecturer of mine who was interested in learning more about the efficacy of pollinator gardens. It grew to include comparisons with the local natural habitat of the Coastal Sage Scrub, and I was initially brought on board to assist with data analysis, data management, etc. We then pivoted to the idea of using camera traps and AI for insect detection in place of the in-person monitoring approach (to increase the amount of data and add a cool tech angle to the effort, given it is of interest to local community partners that have pollinator gardens).

The group heavily involves students as researchers, and they are instrumental to the projects. We have settled on a combination of video footage and the development of deep neural networks using the cloud-hosted video track detection tool VIAME (developed by Kitware for NOAA Fisheries, originally for fish track detection). Students built our first two PICTs (low-cost camera traps) and annotated the data from our pilot study, which we are now starting to use for network development. Here's a cool pic of the easy-to-use interface that students use when annotating data:


Figure 1: VIAME software demonstrating annotation of the track of an insect in the video (red box). Annotations are done manually to develop a neural network for automated processing.

The goal of the group's camera trap team is to develop a neural network that can track insect pollinators associated with a wide variety of plants, and to use this information to collect large datasets to better understand pollinator occurrence and activity within local habitats. This ultimately relates to native habitat health and can be used for long-term tracking of changes in the ecosystem, with the idea that knowledge of pollinators may inform resource and conservation managers, as well as local organizations, in their land use practices. We are ultimately interested in working further with the Kitware folks, not only to develop a robust network (and share it broadly, of course!), but also to customize the data extraction from automated tracks to include automated species/species-group identification and information on interaction rates by those pollinators. We would love any suggestions for appropriate proposals to apply to, as well as any information or suggestions regarding the PICT camera or methods in general. We are also looking to include nighttime data collection at some point, and are aware that near-infrared is advised, but would appreciate any thoughts or advice on that avenue as well.

We will of course post when we have more results and look forward to hearing more about all the interesting projects happening in this space!

Cheers, 
Liz Ferguson 

Hi! Indeed, as Tom mentioned, I am working here in Vermont with Tom and others on moth monitoring using machines. We have a network going from here into Canada with other collaborators. Would love to catch up with you soon. I am away until late April, but would love to connect after that!

See full post
discussion

Update 2: Cheap Automated Mothbox

Here's an update of the work we are doing building the cheap open source mothbox for @Hubertszcz External design In a lot of wilderness tool development...

3 1

Hi Andrew,

Thanks for sharing your development process so openly; that's really cool and boosts creative thinking for readers too! :)

Regarding a solution for Raspberry Pi power management: we are using the PiJuice Zero pHAT in combination with a Raspberry Pi Zero 2 W in our insect camera trap. Other versions are also available, e.g. for the RPi 4 (more info: PiJuice GitHub). In my experience the PiJuice mostly works great and is super easy to install and set up. Downsides are the price and the lack of further software updates/development. It would be really interesting if you could compare one of their HATs to the products from Waveshare. Another possible solution would be a product from UUGear. I have the Witty Pi 4 L3V7 lying around, but haven't really been able to test and compare it to the PiJuice HAT yet.

Is there a reason why you are using the Raspberry Pi 4? From what I understand about your use case, the RPi Zero 2 W or even the RPi Zero should give enough computing power and require a lot less energy. They are also smaller, would be easier to integrate into your box, and generate less heat.

I'm excited for the next updates to see in which direction you will be moving forward with your Mothbox!

Best,

Max

Thanks a lot for this detailed update on your project! It looks great!

See full post
discussion

Cheap Automated Mothbox

Hi everyone! @Hubertszcz has a biodiversity monitoring project in Panama, and we have been working on quick and dirty, ultra low cost high quality insect monitoring. We built a...

9 1

I'm looking into writing a sketch for the esp32-cam that can detect pixel changes and take a photo, wish me luck.

One question, does it even need motion detection? What about taking a photo every 5 seconds and sorting the photos afterwards?

It depends on which scientists you talk to. I am in favor of just doing a timelapse and sorting in post-processing afterwards. There's not much reason I can see for such motion fidelity. For the box I am making we are doing exactly that, though maybe a photo every minute or so.
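For anyone weighing motion detection against a plain timelapse, the pixel-change trigger usually boils down to simple frame differencing. Here is a rough Python sketch of that logic (the thresholds are assumptions to tune per deployment; an ESP32 sketch would implement the same steps in C):

```python
import numpy as np

MOTION_THRESHOLD = 25        # per-pixel grey-level change that counts as movement
MIN_CHANGED_FRACTION = 0.01  # trigger when more than 1% of pixels changed

def motion_detected(prev_frame, frame,
                    threshold=MOTION_THRESHOLD,
                    min_fraction=MIN_CHANGED_FRACTION):
    """Return True when enough pixels changed between two greyscale frames."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > threshold)
    return changed / diff.size >= min_fraction
```

In a capture loop you would compare each new frame against the previous one and only save a photo when this returns True; a timelapse skips the comparison entirely and saves every frame.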

See full post
discussion

Metadata standards for Automated Insect Camera Traps

Have others watched this webinar from GBIF introducing the data model for camera trap data? I wonder if this is something we can easily adopt/adapt for our sorts of camera traps?

5 3

Yes. I think this is really the way to go!

Here is another metadata initiative to be aware of: OGC has been developing a standard for describing training datasets for AI/ML image recognition and labeling. The review phase is over, and it will become a new standard in the next few weeks. We should consider adopting it when we develop our own training image collections.

See full post
discussion

Q&A: UK NERC £3.6m AI (image) for Biodiversity Funding Call - ask your questions here

In our last Variety Hour, Simon Gardner, Head of Digital Environment at NERC, popped in to share more about their open £3.6m funding call supporting innovation in tools for...

1 1

This is super cool! @Hubertszcz, @briannajohns, several others, and I are all working towards some big biodiversity monitoring efforts for a large conservation project here in Panama. The conservation project is happening already, but Hubert starts on the ground work in January, and I'm working on a V3 of our open-source automated insect monitoring box to have ready for him by then.

 

I guess my main question is whether this funding call is appropriate for this type of project, and what types of assistance are possible through this type of funding (researchers? design time? materials? laboratory/field construction?)

See full post
discussion

Best Material for Moth Lighting?

Maybe folks have discussed this before, but does anyone have an "optimal" material for moth lighting? My guess is the best material would be something with high reflectivity and...

1 0

Plasticky substances like polyester can be slippery, so I imagine that's why cotton is most often used. White is good for color correction while still reflecting light pretty well. When I've had the option, I've chosen high-thread-count cotton sheets, so the background is smoothest and even the tiniest arthropods sit on a flat background, not within the contours of threads. The main problem with cotton is mildew and discoloration.

 

That being said, I haven't actually done proper tests with different materials. Maybe a little side project once standardized light traps are a thing? 

See full post
Link

Improving the generalization capability of YOLOv5 on remote sensed insect trap images with data augmentation

Interesting new methods to help improve insect detection

"...this paper proposes three previously unused data augmentation approaches (gamma correction, bilateral filtering, and bit-plane slicing) which artificially enrich the training..."

0
discussion

360 Camera for Marine Monitoring

Hi all, I'm trying to set up a low-cost 360 camera for underwater use. The main criteria are: 1. It needs to run for 1 week, with three 2-hour recording intervals per day,...

4 1

Hi Sol,

For my research on fish, I had to put together a low-cost camera that could record video for several weeks. Here is the design I came up with

At the time of the paper, I was able to record video for ~12 hours a day at 10 fps and for up to 14 days. With new SD cards now, it is pushed to 21 days. It costs about 600 USD if you build it yourself. If you don't want to make it yourself, there is a company selling it now, but it is much more expensive. The FOV is 110 degrees, so not the 360 that you need, but I think there are ways to make it work (e.g. with the servo motor). 

Happy to chat if you decide to go this route and/or want to brainstorm ideas.

Cheers,

Xavier 
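For anyone budgeting a similar build, the days-per-SD-card trade-off is quick arithmetic. A sketch with purely illustrative numbers (the bitrate, card size, and recording hours below are assumptions, not the actual figures from the paper):

```python
# Illustrative assumptions -- substitute your own hardware figures.
CARD_GB = 512        # SD card capacity in GB
BITRATE_MBPS = 5     # compressed video bitrate in Mbit/s
HOURS_PER_DAY = 6    # e.g. three 2-hour recording windows per day

# Mbit/s -> GB per recording day (8 bits per byte, 1000 MB per GB here)
gb_per_day = BITRATE_MBPS / 8 * 3600 * HOURS_PER_DAY / 1000
days_per_card = CARD_GB / gb_per_day

print(f"{gb_per_day:.1f} GB/day -> about {days_per_card:.0f} days per card")
```

Lowering the frame rate or resolution shrinks the bitrate and stretches the same card further, which is why recording windows and fps are the main tuning knobs for long deployments.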

Hi Xavier, this is fantastic! Thanks for sharing, the time frame is really impressive and really in line with what we're looking for. I'll send you a message.

Cheers,

Sol

See full post
discussion

Insect camera traps for phototactic insects and diurnal pollinating insects

Hello, we developed an automated camera trap for phototactic insects a few years ago and are planning on further developing our system to also assess diurnal pollinating...

13 0

Hi @abra_ash , @MaximilianPink, @Sarita , @Lars_Holst_Hansen

I'm looking to train a very compact (TinyML) model for flying pollinator detection on a static background. I hope a network small enough for microcontroller hardware will prove useful for measuring plant-pollinator interactions in the field. 

Presently, I'm gathering a dataset for training using a basic motion-triggered video-capture program on a Raspberry Pi. This forms a very crude insect camera trap.

I'm wondering if anyone has any insights on how I might attract pollinators into my camera's field of view. I've done some very elementary reading on bee optical vision and am currently trying the following:

Purple and yellow artificial flowers are placed on a green background; the centres of the flowers are lightly painted with a UV (365 nm) coat.

A sugar paste is added to each flower. 

The system is deployed in an inner-city garden (outside my flat), and I regularly see bees attending the flowers nearby. 

Here's a picture of the field of view: 

Does anyone have ideas for how I might maximise insect attraction? I'm particularly interested in what @abra_ash and @tom_august might have to say: are optical methods enough, or do we need to add pheromone lures?

Thanks in advance!

Best, 

Ross

 

Hi Ross, 

Where exactly did you put the UV paint? Was it on the petals or the actual middle of the flowers? 

I would recommend switching from sugar paste to sugar water, and maybe putting a little hole in the centre for a nectary. Adding scent would make the flowers more attractive, but attracting bees is difficult since they very obviously prefer real flowers to artificial ones. I would recommend getting the essential oil linalool, since it is a component of scented nectar, and adding a small amount of it to the sugar water. Please let us know if the changes make any difference!

Kind Regards, 

Abra

 

See full post
discussion

Welcome to the Autonomous camera traps for insects group!

Hello and welcome to the Autonomous Camera Traps for Insects group :). In this group we will be discussing the use of autonomous camera traps as a tool for long-term remote...

16 6

Hi Peter,

EcoAssist looks really cool! It's great that you combined every step for custom model training and deployment into one application. I will take a deeper look at it asap.

Regarding YOLOv5 insect detection models:

  • Bjerge et al. (2023) released a dataset with annotated insects on complex background together with three YOLOv5 models at Zenodo.
  • For a DIY camera trap for automated insect monitoring, we published a dataset with annotated insects on homogeneous background (flower platform) at Roboflow Universe and at Zenodo. The available models that are trained on this dataset are converted to .blob format for deployment on the Luxonis OAK cameras. If you are interested, I could train a YOLOv5 model with your preferred parameters and input size and send it to you in PyTorch format (and/or ONNX for CPU inference) to include in your application. Of course you can also use the dataset to train the model on your own.

Best,
Max

Greetings, everyone! I'm thrilled to join this wonderful community. I work as a postdoctoral researcher at MeBioS, KU Leuven, having recently completed my PhD on "Optical insect identification using Artificial Intelligence". While our lab primarily focuses on agricultural applications, we're also eager to explore biodiversity projects for insect population estimation, which provides crucial insights into our environment's overall health.

Our team has been developing imaging systems that leverage Raspberry Pis, various camera models, and sticky traps to efficiently identify insects. My expertise lies in computer science and machine learning, and I specialize in building AI classification models for images and wingbeat signals. I've worked as a PhD researcher at a neurophysiology lab in the past, as well as a data scientist at an applied AI company. You can find more about me by checking my website or my LinkedIn.

Recently, I've created a user-friendly web app (Streamlit), hosted on AWS (FastAPI), that helps entomology experts annotate insect detections to improve our model's predictions. You can find some examples of this work here: [link1] and [link2]. And lastly, for anyone interested in tiling large images for object detection or segmentation purposes in a fast and efficient way, please check my open-source library "plakakia".

I'm truly excited to learn from and collaborate with fellow members of this forum, and I wish you all the best with your work!

Yannis Kalfas
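For readers new to tiling, the core idea is to cut a large image into overlapping patches while remembering each patch's offset, so detections can be mapped back to full-image coordinates. A minimal NumPy sketch of the general technique (this is not plakakia's actual API; the tile size and overlap are arbitrary choices):

```python
import numpy as np

def tile_image(image, tile_size=640, overlap=64):
    """Split an image array into overlapping square tiles.

    Returns a list of (tile, (y, x)) pairs, where (y, x) is the tile's
    top-left corner in the original image -- needed to translate each
    tile's detections back into full-image coordinates.
    """
    def starts(length):
        step = tile_size - overlap
        positions = list(range(0, max(length - tile_size, 0) + 1, step))
        if positions[-1] + tile_size < length:  # make sure the far edge is covered
            positions.append(length - tile_size)
        return positions

    return [(image[y:y + tile_size, x:x + tile_size], (y, x))
            for y in starts(image.shape[0])
            for x in starts(image.shape[1])]

# Example: a 1000x1000 image with 640-pixel tiles yields 4 overlapping tiles.
tiles = tile_image(np.zeros((1000, 1000, 3), dtype=np.uint8))
```

The overlap ensures an insect sitting on a tile boundary appears whole in at least one tile; duplicate detections in the overlap region are typically merged afterwards with non-maximum suppression.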

See full post
discussion

Capture And Identify Flying Insects

Hello everyone! I already found a lot of helpful information on this page, though I am having a hard time finding a system which is confirmed to be able to identify insects that fly...

2 0

This sounds like an interesting challenge. I think depth of focus and shutter speeds are going to be challenging: you'll need a fast shutter speed to get sharp images of insects in flight. Are you interested in species ID, or are you more interested in abundance? Having a backboard on the other side of the hotel would be a good idea to remove background clutter from your images.

Hi there,

I am also trying to get some visuals from wildlife cameras of insects visiting insect hotels. Was wondering if you had gained any further information on which cameras might be used for testing this?

 

See full post
discussion

What is the best light for attracting moths?

We want to upgrade the UV lights on our moth traps. We currently use a UV fluorescent tube, but we are thinking about moving to an LED setup, like the LepiLED or EntoLED. We think...

5 0

We have also thought about these sorts of things. We have chosen to keep the light on continuously for the night, but turn it off before dawn to allow the moths to fly away before predators arrive. 

We are going to be trying out the EntoLEDs and LepiLEDs in Panama in the last two weeks of January, I'll post here on my thoughts.

Would be great to hear more. We found that the LepiLED was great! The Ento mini did not attract as much, but if that were compensated for with many nights of deployment, it would probably work okay.

See full post
discussion

Hack a momentary on-off button 

I have several very bright UV LEDs that I bought for cheap online that are built into nice housings of UV curing lamps and flashlights that can already be automatically powered by...

5 0

 

 

Hi @hikinghack

If I am understanding correctly, you want the UV lights to come on and go off at a certain time, emulating the button push which actually switches them on and off? Is the momentary switch the little button at the top of the image you attached? Is it going to be controlled by a timer or a microcontroller at all? Sorry for all the questions, but I am not 100% clear on exactly what you are after. In the meantime, I've linked to a pretty decent tutorial on the process of hacking a momentary switch with a view to automating it with an Arduino microcontroller board, although it assumes a bit of knowledge of electronics (e.g. MOSFETs/transistors) in certain places.

Alternatively, this tutorial is also good, with good explanations throughout:

If neither of these help, let me know and there might be some easier instructions I can put together. 

All the best,

Rob

Hi Andrew,

If I understand you correctly, you want to turn on the LEDs when USB power is applied. The easiest way I can see to do this is to reroute the red wire to the USB-C VBUS via an appropriate current-limiting resistor. This bypasses all the electronics in your photo.

You could insert the current-limiting resistor in the USB cable for better heat dissipation, or use a DC-DC constant-current source instead of a resistor if power consumption is a concern.
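To make the resistor sizing concrete, here is the Ohm's-law arithmetic with illustrative numbers. The supply voltage, LED forward voltage, and drive current below are assumptions; check your LED's datasheet:

```python
# Illustrative values -- substitute your LED's datasheet figures.
V_BUS = 5.0      # USB VBUS supply voltage (V)
V_FORWARD = 3.4  # typical UV LED forward voltage (V)
I_LED = 0.7      # target drive current (A)

# The resistor drops the voltage difference and dissipates it as heat.
resistance = (V_BUS - V_FORWARD) / I_LED   # Ohm's law: R = V / I
power = (V_BUS - V_FORWARD) * I_LED        # heat in the resistor: P = V * I

print(f"R = {resistance:.1f} ohm, rated for at least {power:.2f} W")
```

That wattage figure is why putting the resistor in the cable (for heat dissipation) or swapping it for a DC-DC constant-current source, as suggested above, is worth considering.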

Further to @htarold's excellent suggestion, you can replace that entire PCB with a simple USB breakout board (e.g. the USB micro board attached below) by removing the red wire and attaching it to VCC on the breakout board, and removing the black wire and attaching it to GND.

See full post
event

Camera traps, AI, and Ecology

3rd International Workshop in Jena, Germany (7. - 8. September 2023)

4 1
Are you ready for the event next week!? 17 talks, 4 sessions, 4 invited speakers - check out the schedule if you want more information:...
See full post
discussion

Who's going to ESA in Portland this year?

So who is going to be at the Ecological Society of America meeting this year in Portland, August 6-11th?It will be my first time at the conference, so I won't know many people...

4 3

Indeed, I'll be there too!  I like to meet new conservation friends with morning runs, so I will likely organize a couple of runs, maybe one right near the conference, and one somewhere in a nearby park where we can look for wildlife.  The latter would probably be at an obscenely early hour, so we can drive somewhere, ideally see elk (there are elk within 25 minutes of Portland!), and still get back in time for the morning sessions.

See full post