Global Feed

There's always something new happening on WILDLABS. Keep up with the latest from across the community through the Global view, or toggle to My Feed to see curated content from groups you've joined. 

Header image: Laura Kloepper, Ph.D.

article

Introducing Badges: A new WILDLABS feature!

We’re unveiling badges, a new feature that allows you to showcase how you’re involved in WILDLABS. Keep track of engagement, show off your collection, and maybe even engage in some friendly competition.

So exciting, and so fun! What a nice idea to keep the community engaged!
I find it a great idea, but I hope it keeps the badge junkie in me from chasing too many BS additions just to get the badge! How did you arrive at the achievement counts for the...
discussion

AudioMoth energy consumption estimates

Hi All, I'm conducting a biodiversity survey that includes a grid of AudioMoths. I have 53 deployed, with the following schedule: 15 seconds every minute, 4:00-12:00, and 16:00-24:...


Agree with the differences between microSD cards. I tested Sandisk Ultra, Samsung EVOplus, and some no-name cards I found myself in possession of. Unfortunately I did not have a logging device, but I watched what was happening on my multimeter. The µMoth with an Ultra was peaking at over double the current flow of the EVOplus. The no-name card was peaking at about the same as the EVOplus but had a near constant background drain that was not obvious with the other cards.

Apparently, cards of different sizes also use different amounts of power.

It would be great if someone with a data-logging multimeter could conduct a search to find the most power-miserly cards out there, but it would be a constant search, as card specifications are changing all the time. The Samsung EVOplus cards are no longer available, but I have no idea if Samsung has redesigned their cards or just rebranded them. Manufacturers often go for the highest speeds rather than the most efficient card, because most devices use more power on sensors, screens, etc., so you barely notice a bit of surplus power going to the card.

I recently completed a survey using Song Meter Minis, and most units managed two weeks of nocturnal recording using Eneloops and SanDisk cards. But a couple of units had only managed four or five nights, so I redeployed them with fresh batteries, assuming we must have had some badly charged cells. After a week I went back to collect the units, only to discover they had again lasted less than a week. This time I had some spare SM Minis, so I swapped the same cards into those with fresh batteries and redeployed again. After another week I went back and found the same issue. We had sufficient data by then, so I did not redeploy again, but I concluded it was something about the SD cards that was causing the problem all along. If they were mine I would have tested them, but they went back to the owners with sticky labels describing the problem items.

@Hubertszcz you might also consider dropping your sample rate to 32, 16, or even 8 kHz. Do you actually have target species calling at frequencies over 16 kHz? Less data volume means fewer writes to the card. Also bear in mind that many short audio bursts have storage and processing overheads compared with fewer, longer recordings.
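To put rough numbers on the data-volume point, here is a back-of-envelope sketch in Python. It assumes uncompressed 16-bit mono WAV and the schedule from the original post (15 s per minute across 16 recording hours); the 48 kHz starting rate is my assumption, not something stated in the thread.

```python
def daily_wav_bytes(sample_rate_hz, seconds_per_day, bit_depth=16, channels=1):
    """Uncompressed WAV payload per day (ignores the ~44-byte header per file)."""
    return sample_rate_hz * seconds_per_day * (bit_depth // 8) * channels

# Schedule from the post: 15 s every minute, 04:00-12:00 and 16:00-24:00,
# i.e. 16 recording hours per day.
seconds_per_day = 16 * 60 * 15  # 14,400 s of audio per day

for rate in (48_000, 32_000, 16_000, 8_000):
    mb = daily_wav_bytes(rate, seconds_per_day) / 1e6
    print(f"{rate // 1000} kHz: {mb:.0f} MB/day per unit")
```

Halving the sample rate halves both the storage and the number of card writes, which is where the power saving comes from.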

Is there an eco-battery? Well, remember that half of the power you pump into a NiMH battery is lost as heat during charging. With correct charging, most types of rechargeable lithium batteries lose only around 8 to 12% as heat. Also, with lithium I don't think it is the mining as such that has to be destructive, rather the bad practice and corruption around some sources. A bigger concern is some components, such as cobalt and nickel, in those cells. LiFePO4 cells do not raise the same concerns, but they work at a lower voltage and I don't think you will find them in a size to fit an AudioMoth case. Happy to be wrong about that, though.
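As a rough illustration of why charging losses matter, the sketch below takes the poster's round figures at face value (about 50% loss for NiMH, about 10% for lithium chemistries); the 2.0 Ah AA capacity is an illustrative assumption.

```python
def wall_energy_wh(capacity_wh, charge_efficiency):
    """Energy drawn from the charger to put one full charge into a cell."""
    return capacity_wh / charge_efficiency

aa_nimh_wh = 1.2 * 2.0  # 1.2 V x 2.0 Ah = 2.4 Wh stored

print(f"NiMH  @ ~50% efficiency: {wall_energy_wh(aa_nimh_wh, 0.50):.1f} Wh from the wall")
print(f"Li    @ ~90% efficiency: {wall_energy_wh(aa_nimh_wh, 0.90):.1f} Wh from the wall")
```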

Hi Hubert,

There's been some research into which batteries are most effective with ARUs, and there are some results here: 

Battery quality can vary greatly, especially NiMH. Alkaline batteries can largely be recycled, reducing their environmental impact despite being single use. The results above don't take into account SD card variation, but should hopefully be a good indication.

discussion

eDNA from terrestrial plant

Hi everyone. I'm still confused about this. Is it feasible to employ environmental DNA (eDNA) for the detection of two distinct communities (animal and plant) within a single...

discussion

Labelled Terrestrial Acoustic Datasets

Hello all, I'm working with a team to develop an on-animal acoustic monitoring collar. To save power and memory, it will have an on-board machine learning detector and classifier...


Thanks for sharing Kim.

We're using <1 mA while processing, which equates to ~9 Ah over a year of operation. The battery is a Tadiran TL-5920 C 3.6 V lithium cell providing 8.6 Ah, plus we will add a small (optional) solar panel. We also plan to implement a threshold system, in which the system sleeps until the noise level crosses a certain threshold and then wakes up.
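The ~9 Ah-per-year figure can be sanity-checked with a quick sketch. This is a hypothetical calculation, not the project's actual firmware budget, and the 85% usable-capacity derating is my assumption (for passivation, temperature, etc.), not from the post or the datasheet.

```python
def runtime_days(capacity_mah, avg_current_ma, derating=0.85):
    """Days of operation from a primary cell, with a usable-capacity derating."""
    return capacity_mah * derating / avg_current_ma / 24

# Tadiran TL-5920 (C-size Li-SOCl2), 8.6 Ah per the post, at 1 mA average draw
print(f"{runtime_days(8600, 1.0):.0f} days")  # -> 305 days, i.e. ~10 months
```

So at a true 1 mA average the cell alone is just short of a year, which is presumably where the optional solar panel and the wake-on-threshold scheme come in.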

The low-power MCU we are using is https://ambiq.com/apollo4/ which has a built-in low power listening capability.

<1 mA certainly sounds like a breakthrough for this kind of device. I hope you are able to report back with some real-world performance information about your project, @jcturn3. Sounds very promising. Will the device run directly off the optional solar cell, or will you include a capacitor, since you cannot recharge the lithium thionyl chloride cell? I had trouble obtaining the Tadiran TL-5920 cells in Australia (they would send me old SL-2770s, though), so I took a gamble on a couple of brands of Chinese cells (EVE and FANSO), which seemed to do the same job without a hitch. Maybe in the USA you can get Israeli cells more easily than Chinese ones? 

Message me if you think some feeding sounds, snoring, grooming and heart sounds of koalas would be any use for your model training.

Really interesting project, and an interesting chipset you found. With up to around 2 MB of SRAM, that's quite a lot of memory for an ultra-low-power SoC, I think.

While doing your research, it might also be interesting to think about whether there are any other requirements people could have for such a platform, with a view towards more mass usage later. Thanks for sharing.

article

Navigating corporate due diligence in the Voluntary Carbon Market

Emerging trends for Nature-Based Solutions project assessments

Thank you for this article, Cassie. What is the pricing structure for Earth Blox/user/month?
Thanks, Cassie. How much is the annual license? I don't see it anywhere on your site.
discussion

Lion Deterrence

Hello! We are a group of students at UC Berkeley working to design a lion deterrence system that is more affordable and cost-effective for community livestock protection and human...


Hi @rokshanabushra 

So are you looking to replicate something like this?

https://predatorguard.com/products/predator-deterrent-light

This is, in principle at least, fairly simple, as it's really just some red LEDs and a small solar-battery power system. You could buy one of the commercial options and do a teardown (or I can do it if you like, as I'd be interested to find out exactly what they are doing). 

Failing that, I suspect a light-dependent resistor is probably used to switch the lights on at night (i.e. something along these lines: https://www.instructables.com/How-to-Make-LDR-Darkness-Sensor-Circuit-Simple-DIY/).

If you employ some sort of 'blink' or flashing protocol (you could use a 555 timer to keep the costs down), you could save quite a bit of power compared with running the lights constantly. For example, something along these lines: https://www.instructables.com/Adjustable-SingleDual-LED-Flasher-Using-555-Timer-/ You could also add a PIR motion sensor so it only comes on when nearby motion is detected, but of course the cost of building goes up. 
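For anyone sizing the 555 flasher, the standard astable formulas give the blink rate and duty cycle from the two resistors and the timing capacitor. The component values below are illustrative guesses for a roughly 1 Hz blink, not a tested design:

```python
def astable_555(r1_ohm, r2_ohm, c_farad):
    """Approximate frequency (Hz) and high-side duty cycle of a 555 in astable mode.

    Standard formulas: f = 1.44 / ((R1 + 2*R2) * C),
    duty(high) = (R1 + R2) / (R1 + 2*R2).
    """
    freq = 1.44 / ((r1_ohm + 2 * r2_ohm) * c_farad)
    duty = (r1_ohm + r2_ohm) / (r1_ohm + 2 * r2_ohm)
    return freq, duty

# e.g. R1 = 10k, R2 = 68k, C = 10 uF
f, d = astable_555(10_000, 68_000, 10e-6)
print(f"{f:.2f} Hz, {d:.0%} on-time")  # -> 0.99 Hz, 53% on-time
```

Pushing the on-time lower (small R1, large R2 gets you close to 50%; gating the LED on the discharge pin instead lets you go shorter) is what buys the battery life.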

There are also a few off-the-shelf flasher designs that might already be cheap enough to consider (e.g. https://www.ledsales.com.au/index.php?main_page=product_info&cPath=142_143&products_id=2820). This seems like a reasonably good option for low power, although I have no idea how well it actually works... You can also buy LEDs that flash by themselves (e.g. https://www.ledsales.com.au/index.php?main_page=product_info&cPath=148_152_159&products_id=2951).

If you think sounds might also help (e.g. human noises etc.), check out the Boombox from Freaklabs: https://freaklabs.org/technology/boombox/ It should also be possible to add 'eyes' in the form of reflectors, or some kind of LEDs that activate at the same time as the sound. You could contact Akiba or Jacinta about it, as I am sure they'd help if they can: https://freaklabs.org/about/#:~:text=providers%20including%20ARGOS.-,the%20team,-Chris%20%E2%80%98Akiba%E2%80%99%20Wang

Anyway, happy to help if I can and all the best for the project.

Cheers,

Rob

 

Hi Rokshana,

Maybe you can try this product from India called ANIDERS - 

 I think this product would help you a lot. This is their website - 

discussion

Introduction and Potential Networking

Hello everyone! I joined WILDLABS to meet fellow conservationists and network, as I am currently hunting for a job within the realm of wildlife conservation. The aspects that I'm...


Hi Sienna, I'm in the Worcester area and always keep an eye on positions nearby for my students. I recommend adding Mass Audubon to your list (currently searching for a land conservation fellow). Also, many towns search for conservation agents/administrators to assist their conservation commission. Museum of Science, the Boston Aquarium, as well as many smaller science museums / zoos (e.g., Ecotarium in Worcester), are frequently hiring for both full-time and temporary positions. Feel free to contact me in private! 

discussion

Need tips on best practices tracking turtles

Hi, I am working on a project that aims to track the movement of turtles in the Amazon. I would like to get tips mainly on what would be the best equipment regarding...


Hi Gabriel,

Our TagRanger Tags can be used for tracking turtles, and we already have a tracking project commencing soon in South America for ~40 turtles...

https://www.tagranger.com/  

The Tags use LoRaWAN, allowing you to communicate with them in real time. As well as requesting current GPS locations from long distances away (20 km line of sight), you can also use the integrated ranging tools, which give you the distance to your Tag in metres when you get closer.  

Key features:

LoRaWAN (tested > 20km line of sight). Use a 'Finder' which is a handheld gateway or you can also use your own LoRaWAN network.

UWB ranging gives distance (in metres) to the Tag up to 150m away

Hybrid Ranging combines the equivalent of a VHF pinger from a few km away (line of sight) with the UWB ranging when you get closer

Log Download remotely using UWB radio

The Tag can last for very long deployments, depending on how you configure it

Please drop me a line if you are interested in hearing more about this and how we could configure it best for your application.

Craig

discussion

Species ID Needs?

Hi all! New to the WILDLABS space and interested in learning from others about species identification needs in fisheries and wildlife, ranging from monitoring and enforcement to...


Hello Nadia,

A forensic genetic challenge exists when DNA is destroyed by the processes used in manufacturing derivative animal products, preventing law enforcement from identifying protected species. Alternative methods, such as lipid profiles or isotope analyses unique to certain species, may be possible, but they require voucher specimens that may or may not be available, and methods that have not been tested or peer reviewed. Examples below:

  1. Derivative products made from endangered shark squalene (e.g. liver oil capsules).
  2. Derivative products made from lion bone and tiger bone (e.g. lion bone cake and tiger bone wine).

This is a law enforcement issue, and I would like to discuss possible solutions. 
discussion

Successfully integrated deepfaune into video alerting system

Hi all, I've successfully integrated DeepFaune into my full-featured video alerting security system, StalkedByTheState. The yellow box around the image represents the zone of...


As I understand it, DeepFaune's first pass is an object detector based on MegaDetector; @schamaille could explain it exactly. In short, though, its output is standard YOLO-like in terms of properties. From this I use standard OpenCV code to snip out the individual matches and pass them to the second stage, which is a classifier.

My code needs a bit of cleaning up before I can release it, and it also needs to be made more robust for some situations. I'm also waiting to hear whether I got anywhere with the WILDLABS Awards, as that would affect my plans going forward. And this could be anything up till the end of next month, though at a wild guess I'd say next week, at the UN WWD or at the WILDLABS get-together :) Anyone else have any theories?

Also, my code is a little more complex because I abstract the interface to a network-based API.

Finally, I don't want to take the wind out of my own sails; I would like to launch my integration to coincide with the release of the Orin-based version of my StalkedByTheState software, the usage of which I'm trying to promote. Releasing earlier takes some of the oomph out of this.

But maybe we can have a video call sometime and we can have a chat about this?

In the DeepFaune final paper, it's mentioned that the team developed their own detection model based on YOLOv8s, utilizing the cropping information provided by MegaDetectorV5a.

Therefore, for the initial phase, I'm also utilizing the YOLO interface (from Ultralytics) to load the deepfaune-yolov8s_960.pt model and perform the prediction procedure. The results list contains one or more bounding boxes with class ID (animal, person, vehicle) and probability values.

For each object detection, I crop and resize the original image to the area of the bounding box, execute the preprocessImage transformation, and utilize the predictOnBatch method (both from the Classifier class, which loads deepfaune-vit_large_patch14_dinov2.lvd142m.pt in the background) to obtain species-level classification scores for each individual bounding box.

This approach could prove valuable to other users seeking to integrate two-step DeepFaune detection and classification into their pipelines or APIs.
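For readers less familiar with the two-stage pattern, the detect-crop-classify handoff can be sketched with plain Python lists, independent of the actual YOLO and ViT models discussed above (the `crop` and `classify_detections` helpers here are hypothetical illustrations, not part of either codebase):

```python
def crop(image, bbox):
    """Cut a bounding box (x1, y1, x2, y2) out of an image stored as rows of pixels."""
    x1, y1, x2, y2 = bbox
    return [row[x1:x2] for row in image[y1:y2]]

def classify_detections(image, detections, classifier):
    """Second stage: run a species classifier on each detector bounding box.

    `detections` is a list of (bbox, confidence) pairs, as a YOLO-style
    detector would emit; `classifier` is any callable taking a cropped patch.
    """
    return [(bbox, classifier(crop(image, bbox))) for bbox, _conf in detections]

# Toy example: a 4x4 'image' of (row, col) pixels and a stub classifier
img = [[(r, c) for c in range(4)] for r in range(4)]
dets = [((1, 1, 3, 3), 0.9)]
print(classify_detections(img, dets, lambda patch: f"{len(patch)}x{len(patch[0])} patch"))
# -> [((1, 1, 3, 3), '2x2 patch')]
```

In the real pipeline the stub classifier is replaced by the preprocess-then-predict calls on the ViT model, but the data flow is the same.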

Absolutely! I pretty much do the same thing; the resizing step, I think, relates to what I still have to do. Some large images caused my code to crash.

I want to take it one step further, and that's one of the things I want to talk to Microsoft about. I'd like to encourage abstracting the object detection behind the network API approach I developed, as that would mean any new model anyone developed would simply work out of the box with my video alerting software, with no additional work. To that end I need to have a chat to see if they agree on the added value; if so, they could potentially add this wrapper around their code, and all of those models would be available to alert on and to use in simple Python scripts in other people's pipelines.

Anyway. That's the plan.

discussion

WILDLABS Awards - Ask your questions!

About: With $60,000, $30,000, and $10,000 grants available for 14 outstanding projects, the support of engineering and technology talent from Arm (the leading semiconductor design...


Good question, I'd like to know the answer as well. I'm inclined to think that it does mean what you say, but I could be wrong, and it's still a couple of weeks till March :)


To clarify: the answer could affect plans I have for the short term. However, I can also understand a decision to hold the tension right up to the announcement 😀 For myself, I'd guess it's 99.9999% likely that it's as you say. I really couldn't imagine it being any other way.

Hi Kevin and Kim, 

Apologies for the delayed response; however, we were unable to provide specific information until today. We aim to reach out to each applicant in the upcoming weeks or months.

All the best, 

discussion

Canopy Camera Trap for Indonesian Lizards

A colleague here in Panama, Scott Trageser, who runs https://biodiversitygroup.org/ , has an interesting challenge. There's some kind of rare, thought-to-be-extinct monitor lizard...


Hi Andrew! Great to hear your friend Scott is working in Indonesia! I bet he is working in the eastern region, with lots of cool monitor lizards!

I use a Mavic 2 as well for my crocodile research in places with dense canopy, and yes, it was tricky! I would suggest trying the DJI Avata; it may be better suited to this task. Or maybe try equipping propeller guards on the Mavic? 

Would it be possible for the drone to set up a rope first, and then lift the camera up the rope and strap it in your desired place? Just an idea, but in Oz they use drones carrying a long rope to catch crocodiles: once the crocodile is caught, the drone releases the rope and hands over to the people doing the work.

discussion

Calculating Wingbeat Frequency From Accelerometer Data

Does anyone have any experience calculating WBF from ACC data? I'm trying to accomplish this in R. For the most part, I'm getting back pretty accurate numbers when going in to...


Great suggestion! Diving bird studies and their analyses are actually what has helped me get thus far with solving this problem. They happen to have done quite the same thing as I'm trying to do, just with more behaviors added. I believe the study was done with murres and kittiwakes.

 

Best,

Travis

I'm very close to solving the problem. Just waiting for a function to run on a fairly large dataset to see the results. I will share the repository link with you when it gets accomplished!

 

The species I'm working with roosts atop cave ceilings and also drops from there to get airborne!

 

Yes, they are triaxial (Technosmart) and body mounted right on their backs.

 

So far, I have created thresholds for different metrics derived from the accelerometer data. Essentially, I sectioned out a bunch of ACC data where I am positive flight is occurring, and did the same with roosting and with crawling around/scratching (activity while roosting). From there, I plotted the distributions of all the metrics to see which had unique distributions that were significantly different from roosting/activity.

Using those distributions, I created thresholds for the important metrics in which all flight behavior was either above or below a certain value for that metric. This got me to being able to construct a decision tree based on these metrics which had pretty solid accuracy.
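For anyone wanting to replicate this kind of threshold classifier, a minimal sketch of one decision node is below. It is in Python rather than R, the window length and 0.5 g flight threshold are invented for illustration, and the running-mean approach to separating out the static (gravity) component is just one common choice:

```python
from math import sqrt

def vedba(ax, ay, az, window=5):
    """Vectorial dynamic body acceleration: remove the static (gravity)
    component with a running mean, then take the magnitude of the remainder."""
    def dynamic(series):
        half = window // 2
        out = []
        for i in range(len(series)):
            seg = series[max(0, i - half):i + half + 1]
            out.append(series[i] - sum(seg) / len(seg))
        return out
    dx, dy, dz = dynamic(ax), dynamic(ay), dynamic(az)
    return [sqrt(x * x + y * y + z * z) for x, y, z in zip(dx, dy, dz)]

def classify(vedba_value, flight_threshold=0.5):
    """One node of a threshold decision tree: high dynamic movement -> flight."""
    return "flight" if vedba_value > flight_threshold else "roosting/other"

# A flat trace classifies as roosting; an oscillating (flapping-like) one as flight
still = vedba([1.0] * 10, [0.0] * 10, [0.0] * 10)
flap = vedba([1.0, -1.0] * 5, [0.0] * 10, [0.0] * 10)
print(classify(still[2]), classify(flap[2]))
```

A real tree would chain several such nodes over different metrics, as described above, rather than relying on VeDBA alone.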

 

The downside is that a small chunk of flight at the beginning and end of flight bouts is not being included in the behavior classification. I noticed that the WBF during those small chunks is indicative of flight, so I am going to try adding WBF as the last decision in the tree to improve its accuracy.

 

VeDBA is also being calculated and included, and based on the threshold values I have created for flight, it should not matter how high their head is, but rather how low it is, when the x, y, and z thresholds are also met. If that makes sense.

 

Hope I answered most of your questions!

Were you ever able to solve the problem? Interestingly enough, I begin a seal bio-logging study next year!

 

Also, you are correct. The errors were occurring during short-bout flights, along with some spectral leakage, but I may have solved the problem by lowering the window size. I've also corrected for the spectral leakage by creating a separate function that identifies any significant changes in calculated WBF that last < 2 seconds, then counts the number of heave peaks within 1 second. I'm using an FFT for the calculations and am just waiting for a function to run on a larger dataset to see if everything comes out the way I am hoping. Fingers crossed.
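For anyone following along, the core of the FFT approach is finding the dominant frequency of the heave axis within a window. The sketch below is in Python rather than R and uses a brute-force DFT so it needs no libraries; a real pipeline would use a proper FFT, and the 8 Hz / 50 Hz figures are purely illustrative:

```python
from math import sin, cos, pi, sqrt

def dominant_freq_hz(signal, fs):
    """Dominant frequency via brute-force DFT (fine for short windows;
    use numpy/scipy's FFT for real workloads)."""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        re = sum(signal[t] * cos(2 * pi * k * t / n) for t in range(n))
        im = sum(signal[t] * sin(2 * pi * k * t / n) for t in range(n))
        mag = sqrt(re * re + im * im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n  # bin index -> Hz

# Synthetic heave axis: an 8 Hz wingbeat sampled at 50 Hz for 2 s
fs = 50
heave = [sin(2 * pi * 8 * t / fs) for t in range(2 * fs)]
print(dominant_freq_hz(heave, fs))  # -> 8.0
```

The frequency resolution is fs/N, which is why shrinking the window trades leakage problems for coarser WBF estimates, exactly the tension described above.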

 

Best,

Travis

discussion

A Sensor-Based Approach to Studying Animal Behavior in Light Pollution Research

Greetings, I'm Sebastian! I am sharing a project with you that I will need help with on some aspects: "Development of a system for recording animal activity and behavior based on sensors...


Thanks for helping me!

For now, I'll be testing with mains power (220 V). The final prototype needs to be functional using a solar panel. 

The main system will be a Raspberry Pi as the brain of this project, receiving and storing all the measurements from the other devices and controlling the system. The measurement devices will be microcontrollers (I'm still searching for which one to use) with different sensors, such as a lux sensor like the TSL2561 or TSL2591, an AudioMoth to record animal sounds, temperature, and some other measurements.

Synchronization must be effective, because the measurements must share the same time base so that the stored data from the different devices can be grouped by timestamp. So, it will probably be a lot of data per day.
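Once the device clocks agree, grouping by timestamp is straightforward. A minimal sketch (the device names and the 1-second bin resolution are illustrative assumptions):

```python
from collections import defaultdict

def group_by_timestamp(records, resolution_s=1.0):
    """Bucket (device_id, unix_time, value) records into shared time bins so
    measurements from different devices can be lined up side by side."""
    bins = defaultdict(dict)
    for device, t, value in records:
        key = round(t / resolution_s) * resolution_s
        bins[key][device] = value
    return dict(bins)

records = [
    ("lux-01", 1700000000.02, 312.0),
    ("temp-01", 1700000000.40, 18.5),
    ("lux-01", 1700000001.01, 298.0),
]
print(group_by_timestamp(records))
```

The bin resolution should be chosen to match the worst-case clock skew between devices; with tighter synchronization you can use finer bins.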

 

I haven't studied the distance between devices yet; consider about 20 meters between each device.

Unless you are planning on making a mesh network between nodes, the total distance spanning the locations of all the nodes is important to know, not just the distance between adjacent nodes.

If you have a Raspberry Pi as the main master node, then you could install my sbts-aru project as a base: you would get a sub-microsecond master time base by default, as the GPS synchronizes the main system time to typically within 0.1 microseconds, plus SD card corruption resilience due to the in-memory OverlayFS architecture.

If the total distance were 20 m across all nodes, then the approach above could also cover the audio-gathering capability, as the sbts-aru project does audio logging and would be within 20 m of an AudioMoth anyway; then you also have time-synchronized audio. If the distance spans 100 m, for example, and it's only the distance between adjacent nodes that is 20 m, then everything is somewhat different with respect to synchronization.
