
#Tech4Wildlife Friday Roundup - 26th Jan 2018

Hi everyone,

There's been a lot of activity in the conservation tech space this week. While much of it will make its way onto WILDLABS over the coming days and weeks, there is too much great work happening to let it trickle out, so I thought this would be a good chance to experiment with a roundup of the week's top #tech4wildlife posts.

If this is useful for you, I'd really appreciate it if you could let me know below. I'm not promising that I'll do this every week, but if there's a good response I'll make it a regular thing to complement our monthly WILDLABS Digest.

Remember, if you have news, opportunities or a case study you'd like to publish here on WILDLABS, you can send them through to our team at [email protected]. Check out our content guide to find out what we look for in articles.

Okay, so here we go. My top #Tech4Wildlife happenings this week include:

ZSL has been working with Google to trial facial recognition technology to help track elephants in the wild

ZSL's tech lead and WILDLABS steering group member @Sophie+Maxwell talks about this project in a nice technical Google Cloud blog post, and in a more accessible piece in the Evening Standard. As @ollie.wearn shared: 'We've been working with Google for the last yr developing this new platform. It could revolutionize how we identify wildlife images. Early results with 700,000 #cameratrap images are staggeringly good and now testing with > 5 million images. Watch this space.'

Definitely worth a read if you're part of our camera trapping community.
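
If you'd like a feel for what species ID over camera-trap imagery involves, here's a minimal, purely illustrative Python sketch that runs a generic pretrained classifier over a folder of images. To be clear, this is not ZSL's or Google's platform: the model, folder name and everything else here are stand-ins.

```python
# Illustrative only: NOT ZSL's or Google's pipeline. A generic pretrained
# ImageNet model stands in for a real species classifier.
from pathlib import Path

import numpy as np
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, decode_predictions, preprocess_input)
from tensorflow.keras.preprocessing import image

model = MobileNetV2(weights="imagenet")  # stand-in model

def classify(path):
    # Load one camera-trap frame and resize it to the model's input size.
    img = image.load_img(str(path), target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    top = decode_predictions(model.predict(x), top=1)[0][0]
    return top[1], float(top[2])  # (label, confidence)

for frame in sorted(Path("camera_trap_images").glob("*.jpg")):  # hypothetical folder
    label, conf = classify(frame)
    print(f"{frame.name}: {label} ({conf:.2f})")
```

A real deployment would fine-tune on labelled wildlife images rather than rely on generic ImageNet classes, but the inference loop looks much the same.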

Related:

  • ZSL has been busy this month! @Soseccombe has a new case study introducing Instant Detect 2.0: A connected future for conservation

  • The DYNI team (University of Toulon, France) is looking for a master's student to help develop an open-source tool for collaborative audio annotation, aimed at creating bioacoustic datasets. These datasets will be used to improve the team's models for automatic bioacoustic scene analysis (e.g. detection and identification of bird species and marine mammals).

Science’s January cover story: ‘Moving in the Anthropocene: Global reductions in terrestrial mammalian movements’

Until the past century or so, the movement of wild animals was relatively unrestricted, and their travels contributed substantially to ecological processes. As humans have increasingly altered natural habitats, natural animal movements have been restricted. Tucker et al. examined GPS locations for more than 50 species. In general, animal movements were shorter in areas with high human impact, likely owing to changed behaviors and physical limitations. Besides affecting the species themselves, such changes could have wider effects by limiting the movement of nutrients and altering ecological interactions.

Much of the data from this paper was organized through Movebank. Congratulations to all the movement ecologists who contributed to this study. 
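
As a flavour of the kind of measurement behind this result, here's a small sketch computing step lengths (displacements between consecutive GPS fixes) with the haversine formula. The fixes below are invented for illustration; the paper's actual analysis of course goes much further.

```python
# Toy example: step lengths from a (made-up) sequence of GPS fixes.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

fixes = [(51.50, -0.12), (51.52, -0.10), (51.55, -0.07)]  # hypothetical track
steps = [haversine_km(*a, *b) for a, b in zip(fixes, fixes[1:])]
print("step lengths (km):", [round(s, 2) for s in steps])
```

Shorter step lengths in heavily human-modified areas are exactly the pattern Tucker et al. report.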

AudioMoth: Evaluation of a smart open acoustic device for monitoring biodiversity and the environment

The open acoustics team have published an open-access paper in Methods in Ecology and Evolution describing the development and proof-of-concept of their low-cost, small-sized and low-energy acoustic detector, AudioMoth. The device is open-source and programmable, with diverse applications for recording animal calls or human activity at sample rates of up to 384 kHz. The paper briefly outlines two ongoing real-world case studies of large-scale, long-term monitoring of biodiversity and of the exploitation of natural resources. These studies demonstrate AudioMoth's potential to enable a substantial shift away from passive continuous recording by individual devices, towards smart detection by networks of devices flooding large and inaccessible ecosystems.
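
To make the 'smart detection' idea concrete, here's a toy energy-threshold detector that flags loud windows in a WAV recording. This is not AudioMoth firmware, just a sketch of the principle; the filename, window size and threshold are all placeholders.

```python
# Toy sketch of on-device "smart detection": flag windows whose RMS energy
# exceeds a threshold, rather than storing continuous audio.
import wave

import numpy as np

def detect_events(path, window_s=0.1, threshold=0.05):
    with wave.open(path, "rb") as wav:
        rate = wav.getframerate()
        raw = wav.readframes(wav.getnframes())
    # Assumes 16-bit mono audio; scale samples to [-1, 1].
    samples = np.frombuffer(raw, dtype=np.int16) / 32768.0
    win = int(rate * window_s)
    onsets = []
    for start in range(0, len(samples) - win, win):
        rms = np.sqrt(np.mean(samples[start:start + win] ** 2))
        if rms > threshold:
            onsets.append(start / rate)  # onset time in seconds
    return onsets

print(detect_events("recording.wav"))  # hypothetical file
```

Detectors on the real device are more sophisticated than a bare energy threshold, but the trigger-rather-than-record idea is the same.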

There are also nice write-ups about the AudioMoth on both Mongabay and Phys.org. The second AudioMoth group buy has now closed with 101 backers at 339% funded. The 58 members of the first group buy have now received their orders (with much apparent excitement), and we're seeing lots of photos of AudioMoths in the wild as testing and deployments begin.

Cc- @alex_rogers , @Andrew+Hill , @Alasdair 

Related:

  • New paper: Shipley JR, Kapoor J, Dreelin RA, Winkler DW. An open-source sensor-logger for recording vertical movement in free-living organisms. Methods Ecol Evol. 2017;00:1–7. https://doi.org/10.1111/2041-210X.12893. The authors have developed a freely available, open-source, non-invasive datalogger that can measure an animal's altitude or elevation at sampling intervals greater than once per second, storing up to 10,000 total samples depending on programme size. The realized design weighs 370 mg, opening a wealth of opportunities to study animals as small as 10 g without exceeding the 4% rule-of-thumb for the relative mass of telemetry devices. (For a feel for how pressure maps to altitude, see the short sketch after this list.)

  • Oxford University is hosting an eight-week lecture series on Technology Empowered Conservation. You are invited to attend in person if you live near Oxford; otherwise, the lectures will be recorded and posted on WILDLABS. Prof Alex Rogers and Dr Andrew Markham will talk about 'Co-creating open-source conservation technology' in Week 3: 4.15pm, Thursday 1 February 2018.
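
On the Shipley et al. logger a couple of items up: altitude can be approximated from a static pressure reading via the international barometric formula. The sketch below is purely illustrative and not code from the paper.

```python
# Approximate altitude from barometric pressure (international barometric
# formula); illustrative only, not the Shipley et al. implementation.
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude in metres from static pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(f"{pressure_to_altitude_m(900.0):.0f} m")  # ~989 m for 900 hPa
```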

Active localization of VHF-collared animals with drones

Haluk Bayram, Krishna Doddapaneni, Nikolaos Stefas and Volkan Isler study the problem of localizing a tagged animal using the audio output of an off-the-shelf receiver mounted on an autonomous Unmanned Aerial Vehicle (UAV). When tracking VHF radio tags from a drone, it may not be possible to estimate range or bearing directly from the signal, but a local-search algorithm can still drive the UAV toward the target. Read more
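
Here's a simplified sketch of that local-search idea (an illustration, not the authors' actual algorithm): the UAV repeatedly samples signal strength at nearby candidate waypoints and greedily flies toward the strongest reading.

```python
# Simplified local search toward a VHF tag; NOT the paper's algorithm.
import math
import random

TARGET = (120.0, -45.0)  # hidden tag position (metres); unknown to the UAV

def rssi(pos):
    """Simulated received signal strength: decays with distance, plus noise."""
    d = math.dist(pos, TARGET)
    return -20.0 * math.log10(max(d, 1.0)) + random.gauss(0.0, 0.5)

def local_search(start, step=20.0, iters=60):
    pos = start
    for _ in range(iters):
        # Candidates: stay put, or move one step in each of 8 directions.
        candidates = [pos] + [
            (pos[0] + step * math.cos(a), pos[1] + step * math.sin(a))
            for a in (k * math.pi / 4 for k in range(8))
        ]
        pos = max(candidates, key=rssi)  # fly toward the strongest reading
    return pos

print(local_search((0.0, 0.0)))  # ends up near TARGET
```

In the real problem the measurements come from the receiver's audio output rather than a clean signal-strength value, which is exactly why direct range and bearing estimation is hard.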

Before I wrap up, if you haven't seen this yet, it might be of interest: the WWF Wildlife Practice has a small amount of funding available to provide grants of around €10,000 to support key innovative ideas. Find out more here

What #Tech4Wildlife news has caught your attention this week? If you've seen some other news or have your own updates, please share it below!

Stephanie




Thanks Steph,  

I'd appreciate a semi-regular summary document. It might only occasionally throw up a 'new' story for community members (depending on what other media they follow and their library access), but it could be that one story that gives someone the next bright idea for their project (and it's no doubt also appreciated by the developers and contributors).

MJ