Data management and processing tools

Conservation tech work doesn't stop after data is collected in the field. Equally important to success is navigating data management and processing tools. For the many community members who deal with enormous datasets, this group will be an invaluable resource to trade advice, discuss workflows and tools, and share what works for you.


Commercially available connected audio sensors

Hi - can anyone advise if there is a commercially made passive audio recorder that can be powered by solar/battery and has 3G/4G connectivity - ideally with compression on the...



I am not aware of any such connected loggers/recorders but they would be nice. 

The AudioMoths have been revolutionary in providing audio logging at a low cost, but they take a lot of "data muling" (carrying SD cards in and out of the field sites) and swapping of batteries.



Hi Lars, thanks for the response. We are using lots of Song Meter Micros at the moment and they have proved to be resilient. We just need something that doesn't involve going on site regularly to get the data off.

Rainforest Connection's (RFCx) Guardian devices may be of interest. They are solar-powered and have connectivity options for WiFi, GSM and satellite transfer. They've previously been used for detecting, e.g., gunshots or chainsaws (using edge computing) and then sending positive detections/alerts to folks on the ground. RFCx also hosts Arbimon, a free, no-code software platform that facilitates analysis of audio data as well. Happy to chat if you'd like to talk further about it!

See full post

Interview for Technologies in Conservation

Dear Wildlabs community, my name is Nikolas and I am a Master's student from Lisbon. Like many of you, I grew up with a great passion for the wildlife that we are surrounded...


I'd be happy to chat with you if you wanted! My expertise is particularly in passive acoustic monitoring. The Conservation Tech Directory might be useful for you in identifying relevant actors within the space.

My original background is in ecology and conservation, and I am now in the elected leadership of the Gathering for Open Science Hardware, which convenes researchers developing open source tech for science. I am not working on a specific piece of technology right now, but am happy to contribute some higher-level views for your interview if that helps.

See full post

Software Engineer

Join the Arribada Initiative! We have a unique opportunity for a software developer to create mobile / desktop applications and intuitive user interfaces that assist researchers and fieldworkers to conserve wildlife.

See full post

New paper - An evaluation of platforms for processing camera-trap data using artificial intelligence

We review key characteristics of four AI platforms—Conservation AI, MegaDetector, MLWIC2: Machine Learning for Wildlife Image Classification and Wildlife Insights—and two auxiliary platforms—Camelot and Timelapse—that incorporate AI output for processing camera-trap data. We compare their software and programming requirements, AI features, data management tools and output format. We also provide R code and data from our own work to demonstrate how users can evaluate model performance.
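For readers wondering what "evaluating model performance" boils down to in practice: the core of such an evaluation is precision/recall arithmetic over detection counts. A minimal sketch follows (in Python rather than the paper's R, and with made-up counts purely for illustration):

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 from detection confusion counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Hypothetical counts: 90 correct detections, 10 false alarms, 30 missed animals
p, r, f = detection_metrics(tp=90, fp=10, fn=30)
print(round(p, 2), round(r, 2), round(f, 2))  # 0.9 0.75 0.82
```

The trade-off between false alarms (precision) and missed animals (recall) is exactly what differs between the AI platforms the paper compares.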


Otter video help!

Hello everyone! I have an otter (I'm pretty sure) on a couple of videos from two of my trail cameras. Is there ANY way I can clear these videos up at all? I have tried but I...

See full post

Workshop Invite: Building Partnerships between Conservation Tech and the UK Space Sector

Hi everyone, in collaboration with our partners over at the Satellite Applications Catapult, we are hosting an in-person workshop...


Hi Steph,

If it's not too late, I'm very interested in this workshop.



See full post

Senior Software Engineer

Conservify is seeking a hands-on Senior Software Engineer with front end and back end experience developing rich web and mobile applications, and a strong desire to build a best-in-class product that stands out in both...

See full post

Using Average speed trapping to mitigate Wildlife Vehicle Collisions

Help needed to manage average speed trapping data with a view to mitigating Wildlife Vehicle Collisions. With a view to mitigating wildlife vehicle collisions...


Hi Gregory. 
I don't know if it's ethical to record license plate pictures for speed detection. It also sounds like a difficult approach to speed detection. Have you considered using a Doppler-radar-based speed detector? You would likely need to DIY it or collaborate with someone who could assist with designing one, but I think you'd get more reliable data in an automated way.

Just my $0.02.


Hi Gregory, I agree with Akiba - it can be tricky to get ethics approvals for this sort of thing. I've been working on a wildlife warning road signage project for a few years, and the project uses radar-enabled signs to measure speed before and after the sign. Problem is, the signs are expensive. However, we are interested in adding some 'control' or nil-treatment roads to the mix, so we'd like to find a radar speed logging system that we could deploy to mimic the sign radars, along the lines of what @Freaklabs mentioned. A variation on this project perhaps?

Could be worth investigating? This model seems to have reasonable range: 

The other thing that comes to mind is the so-called licence plate camera from Reconyx:

They are limited to vehicles travelling 80 km/h or slower and I think they only take stills, but you could possibly use them in multiple places to record plate data.

How are you measuring speed with cameras by the way? Be very interested to hear more and keen to collaborate on a low(ish) cost radar solution if that's of interest.
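For anyone following along, the arithmetic behind average speed trapping is simple: the same vehicle is detected at two points a known distance apart, and average speed is distance over elapsed time. A minimal sketch, with hypothetical camera positions and timestamps:

```python
from datetime import datetime

def average_speed_kmh(distance_m, t_first, t_second):
    """Average speed (km/h) between two detection points distance_m apart."""
    elapsed = (t_second - t_first).total_seconds()
    return (distance_m / elapsed) * 3.6  # m/s -> km/h

# Hypothetical: same plate seen at camera A, then at camera B 500 m down the road
v = average_speed_kmh(500,
                      datetime(2023, 5, 1, 14, 2, 10),
                      datetime(2023, 5, 1, 14, 2, 40))
print(round(v))  # 60
```

The hard parts in practice are matching the same vehicle across the two detection points and keeping clocks synchronised, not the speed calculation itself.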



See full post

Data mgmt for Passive Acoustic Monitoring best practices?

Hello! I'm running a small passive acoustic monitoring project for terrestrial species, using AudioMoths and Swifts. How do people and organizations manage the ballooning datasets...


Hi Alex--

The first thing I'd suggest you think through is how much data you have vs how much data you are currently working on. If you have data from previous years that you want to store securely and reliably but don't need immediate access to for analysis, that opens up some options. You can compress data using lossless codecs like FLAC, where the compression ratio varies but 50% is a pretty good expectation, and then convert back to WAV if necessary for reanalysis. Compressing with lossy formats like MP3 or OGG saves even more storage space, but you will lose information in ways you wouldn't with FLAC--it depends on your specific needs.
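To put that ~50% figure in perspective, it's easy to estimate what lossless compression buys you. A small sketch (the recording parameters and the 0.5 ratio are illustrative assumptions, not measurements):

```python
def wav_bytes(hours, sample_rate=48000, bit_depth=16, channels=1):
    """Uncompressed PCM (WAV) size in bytes for a recording."""
    return int(hours * 3600 * sample_rate * (bit_depth // 8) * channels)

def flac_estimate(wav_size, ratio=0.5):
    """Rough FLAC size, assuming the ~50% lossless ratio mentioned above."""
    return int(wav_size * ratio)

raw = wav_bytes(hours=100)  # 100 h of mono 48 kHz / 16-bit audio
print(raw / 1e9, "GB raw ->", flac_estimate(raw) / 1e9, "GB as FLAC")
```

Actual FLAC ratios depend heavily on the recordings (quiet dawn-chorus audio compresses much better than broadband noise), so run a test batch on your own data before planning storage around a number.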

I'd also recommend setting up a RAID array (RAID = "Redundant Array of Inexpensive Disks"). This offers some additional security in the event of a drive failure. A lot of folks who do video editing (probably the most similar use case to working with acoustic data without the institutional support of a large company or university IT department) use a local NAS enclosure designed for just this purpose. There are some higher initial startup costs than just buying individual USB hard drives, but that comes with some perks, including additional reliability, and it can be faster to read data depending on the exact drive specs and your local networking setup.

There are also low-cost cloud storage services like Amazon's Glacier. However, getting these set up can be a little bit tricky and they are not particularly responsive (for example, if you upload data to Glacier, it will be very safe, but getting it back if you need to use it again can take a few days depending on the dataset size).
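If you do go the Glacier route, the usual pattern is an S3 lifecycle rule that transitions objects to cold storage once they reach a set age. A sketch of such a rule (the bucket prefix and rule ID are hypothetical; you would apply it with boto3's put_bucket_lifecycle_configuration):

```python
# Bucket prefix and rule ID below are hypothetical; adjust to your layout.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-old-audio",
            "Filter": {"Prefix": "audio/raw/"},
            "Status": "Enabled",
            # Move objects to Glacier once they are a year old
            "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
        }
    ]
}
# Applied (not shown here) via an S3 client, e.g.:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-acoustic-data", LifecycleConfiguration=lifecycle)
print(lifecycle["Rules"][0]["Transitions"][0]["StorageClass"])  # GLACIER
```

The upside is that archiving then happens automatically as data ages, rather than relying on someone remembering to move old recordings by hand.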

Hello Alex,

My information might not be that helpful to you; still, our organisation has an Enterprise license of AWS cloud and we store all our media files (video, pictures, audio, etc.) there. We are also using a media management solution, Piction, through which we upload the files into the S3 bucket; in the process it also captures the file metadata (some of the metadata values need to be entered manually). This is useful for searching the files if someone wants to view or process them later. We will soon decide on a file storage configuration so that old files move to cheap storage like AWS Glacier, which can take up to a week to retrieve files.


Hi Alex,

I'd go much further along the lines that David @dtsavage sets out. Before jumping to implementations, it's better to think through why you want to keep all that data, and for whom. From your question, it appears you have at least three purposes:

1) for yourself, to do your research

2) for others, to re-use

3) for yourself, to have a back-up

For 1) you should do what works best for you.

For 3) use your organization's back-up system, or whatever comes closest to that.

For 2 and 3) As you are indicating yourself: deposit your data at your nation's repository, or at an international one if your nation does not have one. It may be some documentation work (which is what you should do anyway, right?), but then you can stop worrying about holding on to it. Someone else is doing that for you, and they do a much better job - because it is their job. Moreover, by putting your data into a repository you increase the chance that others will actually become aware of all the data you are sitting on. Who is otherwise going to find out, and how, that you have those disks on your desk? Lastly, depositing your data can also serve as a back-up. If you don't want to share it before you've published about it, there is likely the option of depositing under a time embargo, or of depositing while requiring your consent for any re-use.

You ask how many people actually do this. You can find the answer at the repository, but I suggest that what matters most is whether you want to for your own reasons, and whether your funders or your organization's funders require it.

See full post

Shark Lab Data Analyst

This position (at California State University, Long Beach) provides data management & analysis support to Shark Lab research operations including shark tagging, active tracking, receiver data, AUV & UAV data...

See full post

Underwater Fish Datasets from the Mediterranean

Hi Wildlabs, I am Sebastian, the project manager for FishID. We are currently in the last stretch of...


Hi Sebastian,

How about using webcams, like this one in the Adriatic (if it has high enough detail):

If not, it might be worth contacting who are based in Montpellier. They do marine research, (I think including some fish ID machine learning stuff too), so may have done work locally?


Not sure if audio data would be of any use to you, but if so FishSounds has some - I just tried their search function and there is a location tag for 'Mediterranean & Black Sea'. 

You might try reaching out to the folks at Name that Fish, Innovasea, or perhaps an entity on the Fisheries Tech list would have Mediterranean stuff? 


If this is still relevant, you can try reaching out to the Belmaker lab, they do BRUV surveys in the eastern Mediterranean and have hours of video, some of it I believe is annotated. Particularly, Shahar might be helpful, he's the PhD student running point on the project.

See full post