Group

Camera Traps / Feed

Looking for a place to discuss camera trap troubleshooting, compare models, collaborate with members working with other technologies like machine learning and bioacoustics, or share and exchange data from your camera trap research? Get involved in our Camera Traps group! All are welcome whether you are new to camera trapping, have expertise from the field to share, or are curious about how your skill sets can help those working with camera traps. 

discussion

Timelapse Infrared Camera Suggestions

I am researching cameras for my thesis project on harbor seals. I need a trail camera that can take infrared images in time-lapse mode. Does anyone have any...

5 0

I doubt there is an off-the-shelf solution; you will likely have to build one. Again, I think the FLIR Leptons could be of value here.

@krasi_georgiev you have worked with Leptons before? Is this something you are able to advise on?

See full post
discussion

Pytorch-Wildlife: A Collaborative Deep Learning Framework for Conservation (v1.0)

Welcome to Pytorch-Wildlife v1.0. At the core of our mission is the desire to create a harmonious space where conservation scientists from all over the globe can unite, share, and...

7 2

Is there any plan for an upcoming workshop or training on Pytorch-Wildlife for non-coders? I was trying to use MDv5 via Pytorch-Wildlife but stopped after getting stuck on the unfamiliar interface. Thanks!

See full post
discussion

Camera trap battery waste

Hi Wildlabs community! I am wondering how you or your country handles battery waste from camera traps, including single-use alkaline and lithium. In Indonesia, there is...

23 0

Thanks for the update @Frank_van_der_Most. I have been curious about the AA Li-ion/LiPo batteries and how they perform. The sudden drop in reported voltage is likely the internal voltage regulator switching off when the internal cell reaches its low-voltage threshold (usually around 3 V) to avoid damaging the cell. Looking now at a discharge curve, they show a constant 1.5 V, then a step down to 1.1 V before dropping to zero. I don't know how you could possibly test these externally to know how much energy they have left until you hit the 1.1 V step.
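To make that point concrete, here is a tiny sketch of why an external voltage reading tells you so little for these regulated cells: the reading sits on the 1.5 V plateau for almost the entire discharge, so voltage alone only distinguishes "somewhere on the plateau" from "almost dead". The cutoffs below are illustrative assumptions based on the curve described above, not measured values.

```python
def battery_state(volts):
    """Coarse state of an AA Li-ion cell with an internal 1.5 V regulator.

    Illustrative thresholds: the regulator holds ~1.5 V until the cell is
    nearly empty, steps down to ~1.1 V, then the output collapses to zero.
    """
    if volts >= 1.4:
        return "plateau (anywhere from full to nearly empty)"
    if volts >= 1.0:
        return "1.1 V step: almost empty, replace soon"
    return "empty / regulator shut down"

for v in (1.5, 1.1, 0.0):
    print(v, "->", battery_state(v))
```

So a camera's battery meter (or a multimeter at the terminals) reports "full" right up until the step, which matches the sudden-death behaviour reported in this thread.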

I've used AA rechargeables almost exclusively for many years now. I try to get rechargeables sourced from Japan (Panasonic Eneloop and Fujitsu), but have also used Eveready and EBL. I've used them in Reconyx, Scoutguard, Loreda and other low-end cameras. One option you could try, if it's in your budget, is solar-powered camera traps. You need one set of rechargeables when you first deploy them but don't need to change the batteries afterwards. I'm trialling the Gardepro model that a local supplier sells. I intend to deploy them high up in trees to monitor nest boxes and tree hollows, so regular access to change batteries and SD cards was going to be difficult.

See full post
discussion

Li-ion rechargeable batteries suddenly drain

I am currently testing a camera trap with rechargeable Li-ion batteries, and until a week ago they were doing much better than the NiMH rechargeable batteries that I used before....

9 0

Hi Frank,

Yes, I agree. There is a halfway-house solution if you take a look at the Energizer Ultimate Lithium range of batteries. They have superior life to alkaline batteries (3.5 Ah in a single AA cell), though they are of course still one-time use. They would also work a little better with your battery monitor, since they degrade gracefully from 1.8 V down to 1.5 V, but then they fall off a cliff ;-)

A multimeter would probably not be enough, unless you have a very fancy one. To get the energy use, you need to be able to integrate the current drawn over time. Something like this: Otii Arc Pro
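For anyone logging their own measurements instead of buying an analyser: if you can sample voltage and current over time (e.g. with a data-logging meter), the integration itself is simple. A rough sketch, with made-up sample data:

```python
def energy_joules(samples):
    """Trapezoidal integration of power over time.

    samples: list of (t_seconds, volts, amps) tuples, sorted by time.
    Returns total energy drawn, in joules.
    """
    total = 0.0
    for (t0, v0, i0), (t1, v1, i1) in zip(samples, samples[1:]):
        p0, p1 = v0 * i0, v1 * i1              # instantaneous power (W)
        total += 0.5 * (p0 + p1) * (t1 - t0)   # trapezoid area (J)
    return total

# Hypothetical one-second samples: a steady 1.5 V at 0.2 A for 10 s
samples = [(t, 1.5, 0.2) for t in range(11)]
print(energy_joules(samples))  # ~3.0 J (1.5 V x 0.2 A x 10 s)
```

The point of a dedicated power analyser is that it does this sampling at very high rates, which matters for camera traps whose current draw spikes briefly on each trigger.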

I don't know how much you paid for these, but Amazon Basics has a line of rechargeable AA batteries, including a high-capacity version that stores 2,400 mAh, which is a little more than the ones you're using. (I've seen even higher capacities from other manufacturers.)

You might also want to consider avoiding batteries with USB ports in the future. It seems to me like just an additional thing that can break, especially if moisture could be an issue. 

Thanks for the link, Amanda. The price of $900 is a bit too steep for me, but at least I now know a bit better what you meant by a power analyser.

I can't remember either what I paid for the batteries, but I try to avoid buying stuff from mr. Bezos, because he is rich enough as far as I am concerned.

The moisture issue slipped my mind when I was in Europe. As far as I remember, I liked the USB ports because I bought the batteries as a test and didn't want to buy a separate charger, as I thought one needed a charger designed for Li-ion batteries. The USB ports made that possible.

See full post
discussion

Tools for automating image augmentation 

Does anyone know of tools to automate image augmentation and manipulation? I wish to train ML image recognition models with images in which the target animal (and false targets)...

11 0

Hi @arky !

Thanks for your reply.

I am running into pytorch/torchvision incompatibility issues when trying to run your script.

Which versions are you using?

Best regards,

Lars

 

@Lars_Holst_Hansen Here is the information you requested. I also run YOLOv8 in multiple remote environments without any issues. Perhaps you'll need to use a virtual environment (venv et al.) or conda to remedy the incompatibility issues.

$ yolo checks
Ultralytics YOLOv8.1.4 🚀 Python-3.10.12 torch-1.13.1+cu117 CUDA:0 (Quadro T2000, 3904MiB)
Setup complete ✅ (16 CPUs, 62.5 GB RAM, 465.0/467.9 GB disk)

OS                  Linux-6.5.0-17-generic-x86_64-with-glibc2.35
Environment         Linux
Python              3.10.12
Install             pip
RAM                 62.54 GB
CPU                 Intel Core(TM) i7-10875H 2.30GHz
CUDA                11.7

matplotlib          ✅ 3.5.1>=3.3.0
numpy               ✅ 1.26.3>=1.22.2
opencv-python       ✅ 4.7.0.72>=4.6.0
pillow              ✅ 10.2.0>=7.1.2
pyyaml              ✅ 6.0.1>=5.3.1
requests            ✅ 2.31.0>=2.23.0
scipy               ✅ 1.11.4>=1.4.1
torch               ✅ 1.13.1>=1.8.0
torchvision         ✅ 0.14.1>=0.9.0
tqdm                ✅ 4.66.1>=4.64.0
psutil              ✅ 5.9.8
py-cpuinfo          ✅ 9.0.0
thop                ✅ 0.1.1-2209072238>=0.1.1
pandas              ✅ 1.5.3>=1.1.4
seaborn             ✅ 0.12.2>=0.11.0
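For what it's worth, torch and torchvision are released in matched pairs, so a quick version sanity check often explains these incompatibility errors. A small sketch below; the pairing table only lists a few commonly published combinations (e.g. torch 1.13 with torchvision 0.14, as in the output above) and is an assumption to verify against the official compatibility matrix before relying on it:

```python
# Commonly published torch <-> torchvision pairings (treat this table as an
# assumption; check it against the official compatibility matrix).
COMPATIBLE = {
    "1.12": "0.13",
    "1.13": "0.14",
    "2.0": "0.15",
    "2.1": "0.16",
}

def minor(version):
    """Reduce a version string like '1.13.1+cu117' to '1.13'."""
    return ".".join(version.split("+")[0].split(".")[:2])

def torch_pair_ok(torch_version, torchvision_version):
    """True if the installed torch/torchvision minor versions are a known pair."""
    return COMPATIBLE.get(minor(torch_version)) == minor(torchvision_version)

print(torch_pair_ok("1.13.1+cu117", "0.14.1"))  # True  - matched pair
print(torch_pair_ok("1.13.1+cu117", "0.15.2"))  # False - mismatched pair
```

If the pair is mismatched, reinstalling pinned matching versions inside a fresh venv (e.g. `pip install torch==1.13.1 torchvision==0.14.1`) usually resolves it.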
See full post
discussion

Funding for Camera Trap Projects

Hello everyone; I'm a current Peace Corps volunteer serving in South America and wanted to start a camera trap program. I am working with a local nonprofit. This idea would use...

5 0

I'm in Paraguay!

I'm looking for any starting points: databases, specific orgs I can apply to, etc.

I have found a nonprofit I've been working with and have found several grants to apply for through my partnership with them, but am obviously looking for more.

See full post
discussion

Conservation Technology for Human-Wildlife Conflict in Non-Protected Areas: Advice on Generating Evidence

Hello, I am interested in human-dominated landscapes around protected areas. In my case study, the local community does not get compensation because they are unable to provide...

2 0

This is an area where my system would do very well.

Also, as you mention areas dominated by humans, there is a high likelihood that there will be enough power there to support this system, which provides very high performance and flexibility, though it comes with a power cost and some monetary cost.

Additionally, its lifeblood is generating alerts and making security and evidence gathering practical and manageable, with its flexible state-management system.

Ping me offline if you would like to have a look at the system.

Hi Amit,

The most important thing is that the livestock owners contact you as soon as possible after finding the carcass. We commonly do two things if they contact us on the same day or just after the livestock was killed:

  1. Use CyberTracker (or similar software) on an Android smart phone to record all tracks, bite marks, feeding pattern and any other relevant signs of the reason for the loss with pictures and GPS coordinates. [BTW, Compensation is a big issue -- What do you do if the livestock was stolen? What do you do if a domestic animal killed the livestock? What if it died from disease or natural causes and was scavenged upon by carnivores afterwards?]
  2. In the case of most cats, they would hide the prey (or just mark it by covering it with grass or branches and urinating in the area). In this case you can put up a camera trap on the carcass to capture the animal when it returns to its kill (Reconyx is good if you can afford it - we use mostly Cuddeback with white flash). This will normally only work if the carcass is fresh (so other predators would not be able to smell it and not know where it is yet), so the camera only has to be up for 3-5 days max.

This is not really high-tech, but can be very useful to not only establish which predator was responsible (or if a predator was responsible), but also to record all the evidence for that.

See full post
discussion

Passionate engineer offering funding and tech solutions pro-bono.

My name is Krasi Georgiev and I run an initiative focused on providing funding and tech solutions for stories with a real-world impact. The main reason is that I am passionate...

2 1

Hi Krasi! Greetings from Brazil!



That's a cool journey you've started! Congratulations. And I felt like theSearchLife resonates with the work I'm involved in around here. In a nutshell, I live at the heart of the largest remaining stretch of Atlantic Forest on the planet, one of the most biodiverse biomes in existence. The subregion where I live is named after and bathed by the "Rio Sagrado" (Sacred River), a magnificent water body with very rich cultural significance to the region (it served as a safe zone for fleeing slaves). Well, the river and the entire bioregion are currently under threat from a truly devastating railroad project which, to say the least, is planned to cut through over 100 water springs!



In the face of that, the local community (myself included) has been mobilizing to raise awareness of the issue and hopefully stop this madness (fueled by strong international forces). One way we've been fighting this is by seeking recognition of the sacred river as an entity with legal rights, which can manifest itself in court against such threats. And to illustrate what this would look like, I've been developing an AI (LLM)-powered avatar for the river, which could serve as its human-relatable voice. An existing prototype of the avatar is available here. It has been fine-tuned with over 20 scientific papers on the Sacred River watershed.



And right now, myself and others are mobilizing to gather the conditions and resources to develop the next version of the avatar, which would include remote-sensing capacities so the avatar is directly connected to the river and can possibly write full scientific reports on its physical properties (e.g. water quality) and the surrounding biodiversity. In fact, three other members of the WildLabs community and I have just applied to the WildLabs Grant program in order to accomplish that. Hopefully the results will be positive.



Finally, it's worth mentioning that our mobilization around providing an expression medium for the river has been multimodal, including the creation of a short film based on theatrical mobilizations we did during a fest dedicated to the river and its surrounding more-than-human communities. You can check that out here:

https://vimeo.com/manage/videos/850179762
Let's chat if any of that catches your interest!

Cheers!

Hi Danilo. You seem very passionate about this initiative, which is a good start.
It is an interesting coincidence that I am starting another project for the coral reefs in the Philippines which also requires water analytics, so I can probably work on both projects at the same time.

Let's have a call and discuss; I will send you a PM with my contact details.

There is a tech glitch and I don't get email notifications from here.

See full post
discussion

Underwater camera trap - call for early users

Hi! The CAMPHIBIAN project aims at developing an underwater camera trap primarily targeting amphibians such as newts, but co-occurring taxa are recorded as well, such as frogs, grass...

7 4

Many thanks for your contribution to the survey! We are now finalizing the list of early users and doing our best to propose a newtcam to everyone in due time.

All the best!

Xavier

See full post
discussion

Jupyter Notebook: Aquatic Computer Vision

Dive Into Underwater Computer Vision Exploration OceanLabs Seychelles is excited to share a Jupyter notebook tailored for those intrigued by the...

3 0

This is quite interesting. Would love to see if we could improve this code using custom models and alternative ways of processing the video stream. 

This definitely seems like the community to do it. I was looking at the thread about wolf detection and it seems like people here are no strangers to image classification. A little overwhelming to be quite honest 😂

While it would be incredible to have a powerful model capable of auto-classifying everything right away and storing all the detected creatures and correlated sensor data straight into a database, I wonder whether, in remote cases where power (and therefore CPU bandwidth), data storage, and network connectivity are at a premium, it would be more valuable just to highlight moments of interest for lab analysis later. Or, if you do have a cellular connection, you could download just those moments of interest and not hours and hours of footage.
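That "moments of interest" idea can be sketched with plain frame differencing, with no trained model needed, which keeps the compute cost tiny on power-limited hardware. The function name and threshold below are just illustrative:

```python
import numpy as np

def frames_of_interest(frames, threshold=10.0):
    """Flag frames that differ noticeably from the previous frame.

    frames: grayscale frames as 0-255 uint8 arrays.
    Returns the indices of frames whose mean absolute difference from the
    previous frame exceeds `threshold` - the 'moments of interest'.
    """
    interesting = []
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(np.int16) - frames[i - 1].astype(np.int16))
        if diff.mean() > threshold:
            interesting.append(i)
    return interesting

# Synthetic demo: 10 mostly static frames, with a bright blob in frame 5
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(10)]
frames[5][20:40, 20:40] = 255  # something moves through the scene
print(frames_of_interest(frames))  # -> [5, 6] (blob appears, then disappears)
```

Only the flagged frames (or short clips around them) would then need to be stored or sent over the cellular link.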

Am working on similar AI challenge at the moment. Hoping to translate my workflow to wolves in future if needed. 

We are all a little overstretched, but if there are no pressing deadlines, it should be possible to explore building an efficient model for object detection and to look at suitable hardware for running the model on the edge.

See full post
discussion

Replacement screen for Bushnell cameratrap

Hi all,I have an arboreal camera trap array using the Bushnell E3 Trophy cam. One of the cameras has suffered damage at the hands of white faced capuchins. The camera trap still...

3 0

Hey Lucy!

You should be able to pick up a small piece of infrared-transmitting plastic online for super cheap that would allow the IR light to pass through but block UV from coming in. It should be possible to glue and seal it using epoxy, which shouldn't damage any electronic components but will ensure weatherproofing.

 

Good luck!

 

Best,

Travis

I have fixed Bushnell TrophyCam IR windows with plastic cut from the bottom of a supermarket fruit package. Any thin, clear plastic will be OK. I stuck it in with silicone, but make sure you get the neutral-cure type that does not emit acetic acid as it sets.

See full post
discussion

Using drones and camtraps to find sloths in the canopy

Recently, I started volunteering for Sloth Conservation Foundation and learned that it is extremely difficult to find sloths in the canopy  because: 1) they hardly move,...

16 1

I was under the impression that "infrared imaging" meant shining an infrared light source onto the subject and capturing primarily the reflected light, as opposed to "thermal imaging", which means capturing the infrared radiation emitted by the subjects themselves. That said, I'm sure some others are much better informed on this subject than me.

 

I took delivery of the DJI Mavic 3 Enterprise Thermal the other day. The shorthand nomenclature used on the controller for the thermal imagery is "IR" (short for infrared), so it is used even in cases where no IR lighting is in play.
See full post
discussion

Need advice - image management and tagging 

Hello Wildlabs, Our botany team is using drones to survey vertical cliffs for rare and endangered plants. It's going well and we have been able to locate and map many new...

6 0

I have no familiarity with Lightroom, but the problem you describe sounds like a pretty typical data storage and lookup issue. This is the kind of problem that many software engineers deal with on a daily basis. In almost every circumstance, this class of problem is solved with a database.

In fact, a potentially useful analysis is that the Lightroom database is not providing the feature set you need.

It seems likely that you are not looking for a software development project, and setting up your own DB would certainly require some effort; but if this is a serious issue for your work, you hope to scale your work up, or you want to bring many other participants into your project, it might make sense to have an information system that better fits your needs.

There are many different databases out there optimized for different sorts of things.  For this I might suggest taking a look at MongoDB with GridFS for a couple of reasons.

  1. It looks like your metadata is in JSON format. Many DBs are JSON-compatible, but Mongo is JSON-native. It is especially good at storing and retrieving JSON data, and its JSON search capabilities are excellent and easy to use. It looks like you could export your data directly from Lightroom into Mongo, so it might actually be pretty easy.
  2. Mongo with the GridFS package is an excellent repository for arbitrarily large image files.
  3. It is straightforward to make a Mongo database accessible via a website.
  4. They are open source (in a manner of speaking) and you can run it for free.

Disclaimer: I used to work for MongoDB.  I don't anymore and I have no vested interest at all, but they make a great product that would really crush this whole class of problem.
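To illustrate the dotted-path queries Mongo runs natively (point 1 above), here is a self-contained sketch. The field names are hypothetical Lightroom-export fields, and the pymongo/GridFS calls in the comment show the rough shape of the real thing rather than a tested workflow:

```python
# With pymongo + GridFS the real calls would look roughly like:
#   fs.put(image_bytes, filename="IMG_0042.jpg", metadata=doc)
#   fs.find_one({"metadata.species": "Brighamia insignis"})
# Below, the same dotted-path matching is shown on plain JSON documents.

def get_path(doc, dotted):
    """Resolve a dotted path like 'gps.lat' inside nested dicts; None if missing."""
    for key in dotted.split("."):
        if not isinstance(doc, dict) or key not in doc:
            return None
        doc = doc[key]
    return doc

def find(docs, query):
    """Mongo-style exact-match filter over a list of JSON documents."""
    return [d for d in docs if all(get_path(d, k) == v for k, v in query.items())]

docs = [
    {"filename": "IMG_0042.jpg", "gps": {"lat": 21.9, "lon": -159.6},
     "species": "Brighamia insignis"},
    {"filename": "IMG_0043.jpg", "gps": {"lat": 21.9, "lon": -159.6},
     "species": "unknown"},
]
print([d["filename"] for d in find(docs, {"species": "Brighamia insignis"})])
# -> ['IMG_0042.jpg']
```

The win over a folder-plus-Lightroom setup is that any field in the exported metadata, at any nesting depth, becomes directly searchable without re-tagging.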

See full post
discussion

Recycled & DIY Remote Monitoring Buoy

Hello everybody, My name is Brett Smith, and I wanted share an open source remote monitoring buoy we have been working on in Seychelles as part of our company named "...

2 1

Hello fellow Brett. Cool project. You mentioned a waterseal testing process. Is there documentation on that?

I don't have anything written up, but I can tell you what parts we used and how we tested.

It's pretty straightforward: we used the M10 Enclosure Vent from Blue Robotics, along with their nipple adapter. Then you can use any cheap handheld brake-bleeder pump to connect to your enclosure. You pull a small vacuum and make sure the pressure holds.

Blue Robotics also has a tutorial video on this.

Let me know if you have any questions or if I can help out.

See full post
discussion

Cheap camera traps with "Timelapse+" mode?

Hi everyone,I have a fairly specific query about camera trap time lapse functionality. I am looking for cheap models that have something similar to Bushnell's "Timelapse+" mode,...

10 0

Thank you @mguins and @NickGardner for your praise and addition. I had not thought of the backup possibility, but it sure is a good point, Michelle. I find it amazing how often one reads about and experiences camtrap malfunction. Even the relatively cheap ones are still quite a lot of money for what is, at the end of the day, a relatively simple piece of electronics in a plastic container.

Frank's idea of using 2 camera traps is inspired!

I've fiddled with cheap camera traps a bit, and some (most?) of them use a low-power, inaccurate timer for the time-lapse function instead of the accurate real-time clock. This is OK for Michelle's purpose, but not for Nick's, as he needs to specify the exact time of day to trigger.

I made this interface to allow a camera trap to be triggered by an external device. To it you could attach, say, a timer programmed to fire at the desired times to cause a capture. A $4 DS3231 RTC module could do the job, after the alarm times have been programmed into it with, for example, an Arduino.

Hi Nick, 

Any update from your project? did you find good price value Camera Traps?

We in Indonesia don't have local suppliers for research-grade camera traps like Bushnell, Browning or Reconyx, so we need to import them and the price inflates a lot even without a distributor. My team and I recently use Chinese models like GardePro or Meidase ($40-60), though we buy them in the US in small quantities when some of our friends travel back to Indonesia. They have more features than a typical Bushnell in the same price range. The images are AI-upscaled, but that doesn't really bother us. So I am curious whether you have found any good camera traps to recommend? Thanks!

Cheers,

Dhanu

See full post
discussion

Using "motion extraction" for animal identification

Hi all, I am no expert in the underlying machine learning models, algorithms, and AI-things used to identify animals in the various computer-vision tools out there (...

6 0

Hi Dhanu,

Our group moved to Wildlife Insights a few years back (for a few reasons but mostly ease of data upload/annotation by multiple users) so I haven't tried EcoAssist. This being said, I will look into it as a pre-WildlifeInsights filter to analyze the tens of thousands of images that get recorded when camera traps start to fail, or get confused with sun spots (which can be common at one of our sites, a south-facing slope with sparse canopy cover).

Thanks for sharing!

 

See full post
discussion

Wildlife Conservation for "Dummies"

Hello WILDLABS community, For individuals newly venturing into the realm of Wildlife Conservation, especially Software Developers, Computer Vision researchers, or...

3 4

Maybe this is obvious, but maybe it's so obvious that you could easily forget to include this in your list of recommendations: encourage them to hang out here on WILDLABS!  I say that in all seriousness: if you get some great responses here and compile them into a list, it would be easy to forget the fact that you came to WILDLABS to get those responses.

I get questions like this frequently, and my recommended entry points are always (1) attend the WILDLABS Variety Hour series, (2) lurk on WILDLABS.net, and (3) if they express a specific interest in AI, lurk on the AI for Conservation Slack.

I usually also recommend that folks visit the Work on Climate Slack and - if they live in a major city - to attend one of the in-person Work on Climate events.  You'll see relatively little conservation talk there, but conservation tech is just a small subset of sustainability tech, and for a new person in the field, if they're interested in environmental sustainability, even if they're a bit more interested in conservation than in other aspects of sustainability, the sheer number of opportunities in non-conservation-related climate tech may help them get their hands dirty more quickly than in conservation specifically, especially if they're looking to make a full-time career transition.  But of course, I'd rather have everyone working on conservation!

Some good overview papers I'd recommend include: 

I'd also encourage you to follow the #tech4wildlife hashtags on social media! 


 

 

I'm also here for this. This is my first comment... I've been lurking for a while.

I have 20 years of professional knowledge in design, with the bulk of that being software design. I also have a keen interest in wildlife. I've never really combined the two; and I'm starting to feel like that is a waste. I have a lot to contribute. The loss of biodiversity is terrifying me. So I’m making a plan that in 2024 I’m going to combine both.

However, if I’m honest with you – I struggle with where to start. There are such vast amounts of information out there I find myself jumping all over the place. A lot of it is highly scientific, which is great – but I do not have a science background.

As suggested by the post title.. a “Wildlife Conservation for Dummies” would be exactly what I am looking for. Because in this case I’m happy to admit I am a complete dummy.

See full post
discussion

Testing Raspberry Pi cameras: Results

So, we (mainly @albags ) have done some tests to compare the camera we currently use in the AMI-trap with the range of cameras that are available for the Pi. I said in a thread...

9 0

And finally for now, the object detectors are wrapped in a Python websocket network wrapper to make it easy for the system to use different types of object detectors. It usually takes me about half a day to write a new Python wrapper for a new object-detector type. You just need to wrap in the network connection and make it conform to the YOLO way of expressing the hits, i.e. the JSON format that YOLO outputs with bounding boxes, class names and confidence levels.
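For anyone curious what "conforming to the YOLO way of expressing the hits" might look like, here is a rough sketch of such a normalizer. The input tuple format is a made-up placeholder for whatever a given detector returns, and the JSON keys follow the bounding-box / class-name / confidence shape described above:

```python
import json

def to_yolo_json(detections, class_names):
    """Normalize a detector's raw output into a YOLO-style JSON record.

    detections: hypothetical list of (x1, y1, x2, y2, class_id, score)
    tuples - adapt the unpacking to each detector you wrap.
    """
    hits = [
        {
            "name": class_names[cls],
            "class": cls,
            "confidence": round(score, 4),
            "box": {"x1": x1, "y1": y1, "x2": x2, "y2": y2},
        }
        for (x1, y1, x2, y2, cls, score) in detections
    ]
    return json.dumps(hits)

print(to_yolo_json([(10, 20, 110, 220, 0, 0.87)], ["person"]))
```

Once every wrapper emits this one shape, the downstream alerting and cascading logic never needs to know which detector produced the hit.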

What's more, you can even use multiple object-detector models on different parts of a single captured image, and you can cascade the logic, for example to require multiple object detectors to match, or to accept a choice from different object detectors.

It's the perfect anti-poaching system (If I say so myself :) )

Hey @kimhendrikse , thanks for all these details. I just caught up. I like your approach of supporting multiple object detectors and using the python websockets wrapper! Is your code available somewhere?

Yep, here:

Currently it only installs on older Jetsons; in the coming weeks I'll finish the install code for current Jetsons.


Technically speaking, if you were an IT specialist you could even make it work in WSL2 Ubuntu on Windows, but I haven't published instructions for that. If you were even more of a specialist you wouldn't need WSL2 either. One day I'll publish instructions for that once I've done it. Though it would be slow unless the Windows machine had an NVIDIA GPU and PyTorch could work with it.

See full post
discussion

Open-Source design guide for a low-cost, long-running aquatic stereo camera

Katie Dunkley's project has been getting a heap of attention in the conservation tech community - she very kindly joined Variety Hour to give us a walkthrough of her Open-Source...

2 0

This is awesome - thanks for sharing Stephanie!! We actually were looking around for a low-cost video camera to augment an MPA monitoring project locally and this looks like a really great option!

 

Cheers, Liz

Thank you for sharing! Super interesting, as we don't see many underwater stereo cameras! We also use Blue Robotics components in our projects and have found them reliable and easy to work with. 

See full post
discussion

Apply to Beta test Instant Detect 2.0

Hi WildLabs,ZSL is looking for Beta testers for Instant Detect 2.0. If you are a conservationist, scientist or wildlife ranger with experience working with innovative...

1 2

Will you accept personal/hobbyist users focused on conservation on their small plots of land (10-100 acres)?

I would, and I know others who would, happily pay more than the official conservationists' rate for the service, which could help to further subsidize the project. (Referring to your statement here: https://wildlabs.net/discussion/instant-detect-20-and-related-cost)

See full post
discussion

Mesh camera trap network?

Does anyone have something to share about wireless camera traps that make use of a mesh-network type of architecture. One such solution, BuckeyeCam allows cameras to route images...

24 0

Hi Sam,

Impressive!  Any chance the LoRa code is open source?  I should like to take a gander.

Thanks

See full post
discussion

Subsea DIY Burnwire for Deep-sea BRUVS

Hello everyone. I'm part of a team working on a low-cost, deep-sea camera (BRUVS) project and we're currently facing challenges with our subsea burnwire release system. We're...

9 0

Yeah, from memory we found it difficult to get the relatively high voltage (~50 VDC) and current (can't remember how much) in a small package, but we had almost no experience back then and gave up fairly quickly. We also found it difficult to get much help from the company, if I remember correctly...

So is the problem waterproofing everything around the nichrome? I picture something like coating the nichrome in high-temp grease (especially where it's in contact with the nylon line, and the line itself) and encapsulating the entire thing in a semi-flexible silicone (so the line can slip out after detachment), with something buoyant to help pull it towards the surface, maybe? Speaking of which, how are tags being recovered (i.e. do they need to pop to the surface)?

Hi Titus,

We've used this design/procedure for many years with our Deep Sea Camera systems, with good reliability. Not off-the-shelf, but not hard to make, and most of the materials come out inexpensive per unit. The most expensive item is the M101 connector ($25 ea), but if you get them with extra length on the cable, you can essentially cut it off at the point where it joins the burn-loop and reuse that connector until it gets too short. You'd also need an F101 connector integrated with your BRUV, connecting with the burnwire and forming the positive side of the circuit, and a ground; our ground connection goes to a large bolt on the frame near the burnwire loop. That connector generally shouldn't need replacement unless it gets damaged.

These burnwires generally break in 3-7 minutes, burning at about 1 A, ~14.5 V. A thinner version of the coated wire could burn faster or require less power.

We do also employ galvanic releases as backups.  I really like redundancy on recovery mechanisms!  The ones we use are made by International Fishing Devices, Inc.  Various distributors sell certain models of their products (i.e. different time durations) but if you contact them directly, they can also make custom duration ones for you.

 

Hi Titus,

I've used latching solenoids as a release in a fresh water application. The product linked to is the one I have used, but has been discontinued (it's been quite a while).  Anyway these little devices hold a plunger in place with a permanent magnet, but release the plunger when a coil is energised that counters the magnet.  The holding force is not great, but more than enough to keep the safety on a mechanical trigger.  The whole device can be potted and sealed (ideally under vacuum to eliminate voids).  When pushing the plunger in to arm the solenoid, there is a definite click when the magnet kicks in, to confirm the locked state.

A similar device is the electropermanent magnet, which doesn't have a plunger; in fact it has no moving parts. You provide the steel piece that this device will release when energised, as with a latching solenoid. It generally has greater holding force than a latching solenoid. I've used these in a seawater application. It's worth noting that there are ferromagnetic stainless steels that can be used here to avoid corrosion.

Thanks,

-harold

See full post
discussion

Thermal cameras for monitoring visitors in highly vulnerable conservation areas

Hi everybody, I'm Alex González, a consultant and researcher in sustainable tourism and conservation. I'm currently consulting for a conservation organisation on the development...

7 0

Hi,



This is a really late answer, but I was new to WILDLABS then. I have a security appliance that uses state-of-the-art AI models and user-defined polygon areas of interest, and generates video alerts of intrusions in typically under a second.

Although it's set up to install automatically on NVIDIA AI-on-the-edge boxes, if your intention is to monitor a great many cameras you could also install it on a desktop with a high-end GPU for very high performance. At home I use a desktop with an RTX 2080 Ti and monitor around 15 cameras plus an (old) thermal imaging camera.

I have also tested a high-end model (YOLOv7) on imagery from a high-end thermal imaging camera, and it works fine as well.

Thermal YOLOv7

Thermal imaging cameras are hellishly expensive, though, and I've found that new, extremely light-sensitive cameras like the HIKvision ColorVu series almost obsolete them for people detection at night, at a fraction of the cost.

If you are interested I'd be happy to show you a demo in a video meeting sometime. I'm pretty sure it would meet all your intrusion-detection and alerting needs.

My project page is

See full post
discussion

Video camera trap analysis help

Hello, I'm a complete newbie so any help would be appreciated. I have never trained any ML models (I have junior/mid experience with Python and R) nor annotated data, but would like to...

8 0

Hi there!, 

You should definitely check out VIAME, which includes a video annotation tool in addition to deep learning neural network training and deployment. It has a user-friendly interface, a publicly available server option that removes the need for a GPU-enabled computer for network training, and an amazing support staff that helps you with your questions. You can also download the VIAME software for local use. The tool was originally developed for marine life annotation, but can be used for any type of video or annotation (we are using it to annotate pollinators in video). Super easy to annotate as well. Worth checking out!

Cheers, 
Liz Ferguson

See full post