Subsea DIY Burnwire for Deep-sea BRUVS
6 December 2023 3:49am
Automatic extraction of temperature/moon phase from camera trap video
29 November 2023 1:15pm
1 December 2023 2:38pm
Do you need to do this for just one project? And do you use the same camera make/model for every deployment? Or at least a finite number of camera makes/models? If the number of camera makes/models you need to worry about is finite, even if it's large, I wouldn't try to solve this for the general case; I would just hard-code the pixel ranges where the temperature/moon information appears for each camera model, so you can crop out the relevant pixels without any fancy processing. From there it won't be trivial, exactly, but you won't need AI.
You may need separate pixel ranges for night/day images for each camera; I've seen cameras that capture video with different aspect ratios at night/day (or, more specifically, different aspect ratios for with-flash and no-flash images). If you need to determine whether an image is grayscale/color (i.e., flash/no-flash), I have a simple heuristic function for this that works pretty well.
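If it helps, a minimal version of such a heuristic might look like the sketch below (my own Python/OpenCV version, not the exact function mentioned above; the thresholds are guesses to tune on your own cameras):

```python
import cv2
import numpy as np

def is_grayscale_frame(image_path, tolerance=5, max_colored_fraction=0.01):
    """Rough heuristic: treat a frame as grayscale (flash/IR) if the R, G and B
    values are nearly identical for almost every pixel."""
    img = cv2.imread(image_path)  # BGR, uint8
    if img is None:
        raise ValueError(f"Could not read {image_path}")
    b, g, r = cv2.split(img.astype(np.int16))
    # Per-pixel spread between the brightest and darkest channel
    spread = np.maximum(np.maximum(abs(b - g), abs(b - r)), abs(g - r))
    colored_fraction = np.mean(spread > tolerance)
    return colored_fraction < max_colored_fraction
```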
Assuming you can manually define the relevant pixel ranges, which should just take a few minutes if it's less than a few dozen camera models, I would extract the first frame of each video to an image, then crop out the temperature/moon pixels.
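For the frame-extraction and cropping step, a sketch along these lines should work (OpenCV again; the camera-model name and pixel ranges below are placeholders you would measure yourself):

```python
import cv2

# Hypothetical per-model crop boxes (x, y, width, height) for the temperature
# and moon-phase regions of the info bar; measure these once per camera model
# (and per day/night layout if the aspect ratio changes).
CROP_BOXES = {
    "ExampleCam-X1": {"temperature": (40, 1040, 120, 40),
                      "moon": (200, 1040, 40, 40)},
}

def first_frame_crops(video_path, camera_model):
    """Grab the first frame of a video and return the cropped info-bar regions."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise ValueError(f"Could not read a frame from {video_path}")
    crops = {}
    for field, (x, y, w, h) in CROP_BOXES[camera_model].items():
        crops[field] = frame[y:y + h, x:x + w]
    return crops
```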
Once you've cropped out the temperature/moon information, for the temperature, I would recommend using PyTesseract (an OCR library) to read the characters. For the moon information... I would either have a small library of images for all the possible moon phases for each model, and match new images against those, or maybe - depending on the exact style they use - you could just, e.g., count the total number of white/dark pixels in that cropped moon image, and have a table that maps "percentage of white pixels" to a moon phase. For all the cameras I've seen with a moon phase icon, this would work fine, and would be less work than a template matching approach.
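Here is a hedged sketch of both ideas, assuming the crops from the previous step; the Tesseract settings, thresholds, and white-pixel-to-phase breakpoints are illustrative only and would need calibrating against your own cameras:

```python
import cv2
import pytesseract

def read_temperature(temp_crop):
    """OCR a temperature crop such as '23C' or '-4F'."""
    gray = cv2.cvtColor(temp_crop, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Single text line, restricted character set
    config = "--psm 7 -c tessedit_char_whitelist=-0123456789CF"
    return pytesseract.image_to_string(binary, config=config).strip()

def read_moon_phase(moon_crop):
    """Map the fraction of bright pixels in the moon icon to a phase label."""
    gray = cv2.cvtColor(moon_crop, cv2.COLOR_BGR2GRAY)
    white_fraction = float((gray > 128).mean())
    if white_fraction < 0.10:
        return "new"
    if white_fraction < 0.40:
        return "crescent"
    if white_fraction < 0.60:
        return "quarter"
    if white_fraction < 0.90:
        return "gibbous"
    return "full"
```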
FYI I recently wrote a function to do datetime extraction from camera trap images (it would work for video frames too), but there I was trying to handle the general case where I couldn't hard-code a pixel range. That task was both easier and harder than what you're doing here: harder because I was trying to make it work for future, unknown cameras, but easier because datetimes are relatively predictable strings, so you know when you find one, compared to, e.g., moon phase icons.
In fact maybe - as others have suggested - extracting the moon phase from pixels is unnecessary if you can extract datetimes (either from pixels or from metadata, if your metadata is reliable).
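If you do go the datetime route, the moon phase can simply be computed rather than read off the image or the internet; for example, with the PyEphem library (one option among several, not something anyone above specified):

```python
import ephem  # pip install ephem

def moon_illumination(dt):
    """Percent of the Moon's disc illuminated at a given UTC datetime.
    Note this gives illumination only; to distinguish waxing from waning,
    compare with the value a day later."""
    moon = ephem.Moon()
    moon.compute(dt)
    return moon.phase  # 0 = new moon, ~100 = full moon

print(moon_illumination("2023/12/06 03:49"))
```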
5 December 2023 10:09pm
camtrapR has a function that does what you want. I haven't used it myself, but it looks straightforward to use and it can run across directories of images:
https://jniedballa.github.io/camtrapR/reference/OCRdataFields.html
Which market-available microphones, accelerometers and GIS sensors for dogs / pets ?
7 September 2023 3:21pm
11 September 2023 4:33pm
Hi Luigi!
You should have a look at the μMoth
developed by @alex_rogers and others from Open Acoustic Devices:

As an alternative audiologger meant to be animal-borne, check out the Audiologger developed by Simon Chamaillé-Jammes @schamaille et al.:

Energy-Efficient Audio Processing at the Edge for Biologging Applications
Biologging refers to the use of animal-borne recording devices to study wildlife behavior. In the case of audio recording, such devices generate large amounts of data over several months, and thus require some level of processing automation for the raw data collected. Academics have widely adopted offline deep-learning-classification algorithms to extract meaningful information from large datasets, mainly using time-frequency signal representations such as spectrograms. Because of the high deployment costs of animal-borne devices, the autonomy/weight ratio remains by far the fundamental concern. Basically, power consumption is addressed using onboard mass storage (no wireless transmission), yet the energy cost associated with data storage activity is far from negligible. In this paper, we evaluate various strategies to reduce the amount of stored data, making the fair assumption that audio will be categorized using a deep-learning classifier at some point of the process. This assumption opens up several scenarios, from straightforward raw audio storage paired with further offline classification on one side, to a fully embedded AI engine on the other side, with embedded audio compression or feature extraction in between. This paper investigates three approaches focusing on data-dimension reduction: (i) traditional inline audio compression, namely ADPCM and MP3, (ii) full deep-learning classification at the edge, and (iii) embedded pre-processing that only computes and stores spectrograms for later offline classification. We characterized each approach in terms of total (sensor + CPU + mass-storage) edge power consumption (i.e., recorder autonomy) and classification accuracy. Our results demonstrate that ADPCM encoding brings 17.6% energy savings compared to the baseline system (i.e., uncompressed raw audio samples). Using such compressed data, a state-of-the-art spectrogram-based classification model still achieves 91.25% accuracy on open speech datasets. Performing inline data-preparation can significantly reduce the amount of stored data allowing for a 19.8% energy saving compared to the baseline system, while still achieving 89% accuracy during classification. These results show that while massive data reduction can be achieved through the use of inline computation of spectrograms, it translates to little benefit on device autonomy when compared to ADPCM encoding, with the added downside of losing original audio information.

This one can also log acceleration and magnetometry! We have recently deployed it on muskoxen in Greenland.
For a GPS tracker, you may want to take a look at the SnapperGPS by @JonasBchrt & @alex_rogers:
As an alternative the i-gotU GPS logger may be of interest:

i-gotU GT-120B GPS / GNSS Data Logger - Water Resistant, 21g only, Managing Large Deployments with Ease (2022 Edition)
(USB / wireless dual interfaces, GPS and QZSS multi-constellation, Windows, Android and iOS compatible.)

Compared to previous models (i.e. the GT-120), which are GPS-only, the GT-120B is a GNSS logger that uses both the GPS and QZSS constellations. It has multi-path detection, which greatly reduces ionospheric error and multi-path effects, so data accuracy is significantly better than the previous GPS models. The GT-120B has USB and wireless dual interfaces, so data can be downloaded either via USB or wirelessly. Rather than the proprietary USB cable of the GT-120, the GT-120B uses a standard micro-USB cable. It can be used as a USB GNSS receiver with 1-10 Hz update rates; when used as a GNSS data logger, the update rate is 1 Hz.

Managing large deployments with ease: the GT-120B comes with mobile and Windows apps that help manage a large number of loggers. You can view all your GT-120B devices on Google Maps from the mobile app; define groups, add loggers to a group and select tracks from it; keep track of the battery and memory status of all devices from your phone; back up device settings and standardise the settings of a batch of devices using the import/export features; enable a password check to protect data from unauthorised downloads; and turn the GT-120B on/off on a predefined schedule.

Battery runtime by GPS log interval:
- 1 sec: -
- 5 sec: 15 hr
- 10 sec: 25 hr
- 15 sec: 60 hr
- 30 sec: 120 hr
- 60 sec: 180 hr
- 60 min: 2 months

Logging configuration options: GPS logging interval configurable from 1 sec to 60 min 59 sec; circular logging; POI; scheduled logging; merging of scheduled waypoints; smart tracking; power-triggered auto-logging.

Technical specs: 44.5 x 28.5 x 13.8 mm, 21.5 g. Wireless connection to mobile phones (Nordic nRF52820 chipset, 20 m range). GPS chipset MTK MT3337 with patch antenna; 22 tracking / 66 acquisition channels, supporting up to 210 PRN channels; GNSS support for GPS & QZSS; SBAS support for WAAS/EGNOS/MSAS/GAGAN; 12 multi-tone active interference cancellers (ISSCC 2011 award); indoor and outdoor multi-path detection and compensation; internal real-time clock (RTC); RTCM ready; NMEA 0183 V3.01 with backward compliance; position update rates up to 10 Hz. Sensitivity: -148 dBm acquisition (cold) / -163 dBm (hot), -165 dBm tracking. Cold start < 35 s, warm start < 34 s, hot start < 1 s. Micro-USB cable (USB 2.0), 380 mAh battery, blue and red LEDs, operating temperature -10 to +50 °C, water resistant, 65,000-waypoint memory, no motion detection. Options include disabling the button, wireless or LED flashing, setting a wireless download password, enabling wireless on a schedule, configuring the wireless broadcast interval and TX power, broadcasting the latest GPS position, and renaming the device; there is no power-saving option above 7 s intervals. Firmware updates are done via the PC software; device configuration and data download work via USB or wirelessly; combined maps are supported; GPS data import format is GPX, and export formats are GPX and CSV.

Software and compatibility: Windows app (Windows 7, 8 and 10), iOS app (iOS 12 and above), Android app (Android 7 and above). The "i-gotU GPS" iOS/Android app connects to the GT-120B wirelessly and supports wireless configuration, wireless data download, battery and memory status checks, and viewing your devices' locations on Google Maps. The "i-gotU GPS" Windows application adds playback of group movement and distance measurement between waypoints or between anchor points.

Package contents: 1 x GPS logger, 1 x USB cable (jelly case and fastening strap not included).
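Since the logger exports GPX and CSV, post-processing the downloaded tracks is straightforward; here is a minimal sketch using the gpxpy library (my choice of tool, not part of the product, and the file path is a placeholder):

```python
import gpxpy  # pip install gpxpy

def load_track_points(gpx_path):
    """Return (time, lat, lon, elevation) tuples from a GT-120B GPX export."""
    with open(gpx_path) as f:
        gpx = gpxpy.parse(f)
    points = []
    for track in gpx.tracks:
        for segment in track.segments:
            for p in segment.points:
                points.append((p.time, p.latitude, p.longitude, p.elevation))
    return points

# points = load_track_points("cat_track_export.gpx")
```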

DIY Instructions
After the two-day acclimation period, with the GPS programmed, insert the GPS unit into the case and track your cat for a 10-day period.

Regarding your question on sampling frequency: we have been using 8 Hz (and 10 Hz for acceleration logging on the Audiologger) for our slow-moving muskoxen. For an animal like a dog, you probably want to sample at a somewhat higher frequency (see the rough storage estimate after the abstract below). This group used 50 Hz in a study of arctic foxes:

Digging into the behaviour of an active hunting predator: arctic fox prey caching events revealed by accelerometry - Movement Ecology
Background Biologging now allows detailed recording of animal movement, thus informing behavioural ecology in ways unthinkable just a few years ago. In particular, combining GPS and accelerometry allows spatially explicit tracking of various behaviours, including predation events in large terrestrial mammalian predators. Specifically, identification of location clusters resulting from prey handling allows efficient location of killing events. For small predators with short prey handling times, however, identifying predation events through technology remains unresolved. We propose that a promising avenue emerges when specific foraging behaviours generate diagnostic acceleration patterns. One such example is the caching behaviour of the arctic fox (Vulpes lagopus), an active hunting predator strongly relying on food storage when living in proximity to bird colonies. Methods We equipped 16 Arctic foxes from Bylot Island (Nunavut, Canada) with GPS and accelerometers, yielding 23 fox-summers of movement data. Accelerometers recorded tri-axial acceleration at 50 Hz while we obtained a sample of simultaneous video recordings of fox behaviour. Multiple supervised machine learning algorithms were tested to classify accelerometry data into 4 behaviours: motionless, running, walking and digging, the latter being associated with food caching. Finally, we assessed the spatio-temporal concordance of fox digging and greater snow goose (Anser caerulescens antlanticus) nesting, to test the ecological relevance of our behavioural classification in a well-known study system dominated by top-down trophic interactions. Results The random forest model yielded the best behavioural classification, with accuracies for each behaviour over 96%. Overall, arctic foxes spent 49% of the time motionless, 34% running, 9% walking, and 8% digging. The probability of digging increased with goose nest density and this result held during both goose egg incubation and brooding periods. Conclusions Accelerometry combined with GPS allowed us to track across space and time a critical foraging behaviour from a small active hunting predator, informing on spatio-temporal distribution of predation risk in an Arctic vertebrate community. Our study opens new possibilities for assessing the foraging behaviour of terrestrial predators, a key step to disentangle the subtle mechanisms structuring many predator–prey interactions and trophic networks.
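As a rough feel for what these sampling rates mean for storage, here is a back-of-envelope sketch assuming uncompressed 16-bit tri-axial samples logged continuously (a real logger's on-board format will differ):

```python
def accel_megabytes_per_day(sample_rate_hz, axes=3, bytes_per_sample=2):
    """Uncompressed accelerometer data volume per day."""
    samples_per_day = sample_rate_hz * 60 * 60 * 24
    return samples_per_day * axes * bytes_per_sample / 1e6

for hz in (8, 10, 50):
    print(f"{hz:>2} Hz ~ {accel_megabytes_per_day(hz):.0f} MB/day")
# 8 Hz ~ 4 MB/day, 10 Hz ~ 5 MB/day, 50 Hz ~ 26 MB/day
```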

5 December 2023 7:46pm
@Lars_Holst_Hansen so sorry to have missed your reply! Thanks so much, I am going to check the links.
Successfully integrated deepfaune into video alerting system
2 December 2023 11:15am
4 December 2023 10:10am
That would be fun. Thanks for your support
5 December 2023 6:46pm
So my vision for using my StalkedByTheState software to deter the wolves away from the sheep can be represented something like this:
I'm pretty sure this is how Rob Appleby described it to me. But I think all the sheep should be inside the fence.
5 December 2023 6:53pm
And here's how I would explain it to government officials in a PowerPoint. I'm making the point that the wolves have been shooed away.
Two-year postdoc in AI and remote sensing for citizen-science pollinator monitoring
4 December 2023 12:21pm
Shark BRUV annotated data needed! ML for automatic BRUV postprocessing
4 December 2023 11:55am
Bird Acoustic Surveys: Comparison with traditional transect methods
6 November 2023 9:32am
22 November 2023 3:24pm
Thank you for sharing this study, I read it with interest! I was wondering, in doing this study did you also get a feel for how these methods compare in terms of time, costs, and required skills? As a practitioner I am still a bit worried about the amount of time required for set-up, maintenance, data management, species identification, and analysis.
4 December 2023 11:36am
Hi Theresa. In comparison to traditional surveys, I think that the time/cost benefits of acoustics are good. Certainly the set-up, maintenance, and data management requirements are minimal. And if there is significant travel time to the site, and the recording period of the acoustic survey is long, then I think the benefits are compounded (i.e. there are economies of scale to acoustics that you don't get with traditional surveys).
Until the last year or two, the data analysis for species identification has been the time-consuming part. However, now that systems such as BirdNET are available, this issue is fairly well dealt with (but still needs a little bit of skill/experience).
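For reference, running BirdNET over a recording is now only a few lines of code; below is a hedged sketch using the birdnetlib wrapper (one of several ways to run it; the file path and site coordinates are placeholders):

```python
from datetime import datetime

from birdnetlib import Recording
from birdnetlib.analyzer import Analyzer

analyzer = Analyzer()  # loads the bundled BirdNET model

recording = Recording(
    analyzer,
    "survey_site_01.wav",        # placeholder path to a field recording
    lat=52.95, lon=-1.15,        # placeholder site coordinates
    date=datetime(2023, 6, 15),  # restricts the candidate species list seasonally
    min_conf=0.5,
)
recording.analyze()
for det in recording.detections:
    print(det["common_name"], det["start_time"], det["confidence"])
```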
A couple of scientific papers have assessed these costs/benefits - I hope these make an interesting read.
Carlos

Cost‐benefit analysis of acoustic recorders as a solution to sampling challenges experienced monitoring cryptic species
The Use of Automated Bioacoustic Recorders to Replace Human Wildlife Surveys: An Example Using Nightjars
To be able to monitor and protect endangered species, we need accurate information on their numbers and where they live. Survey methods using automated bioacoustic recorders offer significant promise, especially for species whose behaviour or ecology reduces their detectability during traditional surveys, such as the European nightjar. In this study we examined the utility of automated bioacoustic recorders and the associated classification software as a way to survey for wildlife, using the nightjar as an example. We compared traditional human surveys with results obtained from bioacoustic recorders. When we compared these two methods using the recordings made at the same time as the human surveys, we found that recorders were better at detecting nightjars. However, in practice fieldworkers are likely to deploy recorders for extended periods to make best use of them. Our comparison of this practical approach with human surveys revealed that recorders were significantly better at detecting nightjars than human surveyors: recorders detected nightjars during 19 of 22 survey periods, while surveyors detected nightjars on only six of these occasions. In addition, there was no correlation between the amount of vocalisation captured by the acoustic recorders and the abundance of nightjars as recorded by human surveyors. The data obtained from the recorders revealed that nightjars were most active just before dawn and just after dusk, and least active during the middle of the night. As a result, we found that recording at both dusk and dawn or only at dawn would give reasonably high levels of detection while significantly reducing recording time, preserving battery life. Our analyses suggest that automated bioacoustic recorders could increase the detection of other species, particularly those that are known to be difficult to detect using traditional survey methods. The accuracy of detection is especially important when the data are used to inform conservation.
Query regarding Biologgers for Freshwater crabs
16 November 2023 4:45am
18 November 2023 10:37pm
My pleasure @Abinesh, and if you have any more questions, don't hesitate to ask. This is a great community with plenty of smart cookies who can help, and I will too if I am able!
All the best for your research.
Rob
1 December 2023 8:54pm
Star-Oddi in Iceland comes to mind, but I'm not 100% sure.
3 December 2023 2:43am
Thank you Thomas, you are absolutely right, but when I mailed them I didn't get a response about the price, shipment and so on! That is why I am trying to find some loggers in India itself.
eDNA Collaborative Microgrants Program!
2 December 2023 8:51pm
This round of microgrants will be awarded in collaboration with miniPCR bio, and each microgrant award will consist of a mini16x thermal cycler, a blueGel electrophoresis system with USB power adaptor, one pipette, and a field carrying case.
What is your favorite package or software for visualizing animal tracking data?
12 September 2023 6:52pm
17 November 2023 1:22pm
Is this just for PIT tags?
21 November 2023 8:56am
Hi Jennifer, Movebank can handle all kinds of location data, metadata and sensors (height, depth, speed, acceleration, IMU, heart rate), and Firetail ships with a module that can directly open Movebank data (from your or other projects) and keep it up to date - most sensors will work out of the box.
If you have any specific questions or projects in mind, don't hesitate to contact me. Cheers, Tobias
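For anyone who wants to pull Movebank data programmatically before visualising it, here is a rough sketch against the public Movebank REST interface (the endpoint and parameters follow the published REST docs, but verify them; the study ID and credentials are placeholders):

```python
import requests

MOVEBANK_URL = "https://www.movebank.org/movebank/service/direct-read"

def fetch_events_csv(study_id, username, password):
    """Download the event (location) table of a Movebank study as CSV text.
    Your account must have access to the study and have accepted its licence terms."""
    params = {
        "entity_type": "event",
        "study_id": study_id,
        "attributes": "timestamp,location_long,location_lat,individual_local_identifier",
    }
    response = requests.get(MOVEBANK_URL, params=params, auth=(username, password))
    response.raise_for_status()
    return response.text

# csv_text = fetch_events_csv(123456789, "my_user", "my_password")
```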
2 December 2023 2:12pm
You might be interested in reaching out to the team at National Audubon Society's Migratory Bird Initiative. They just released the Bird Migration Explorer a year ago and likely have many thoughts and ideas. Erika Knight would likely be a good person to speak with.

Bird Migration Explorer
The Bird Migration Explorer is your guide to the heroic annual journeys made by over 450 bird species, and the challenges they face along the way.

Erika Knight
As a GIS and Data Science Specialist for Audubon's Migratory Bird Initiative, Erika Knight analyzes, synthesizes, and visualizes migration pathways and patterns of bird distribution throughout the...
Is anyone or platform supporting ML for camera trap video processing (id-ing jaguar)?
27 November 2023 10:49am
30 November 2023 8:45am
I can help you with that - please send us some of the videos and we will build a pipeline for it.
aakash at thinkevolvelabs dot com
- Aakash Gupta
Green AI Products — DeCaTron
DeCaTron is an AI-based platform for assessing and simulating the biodiversity of forested regions

1 December 2023 8:40pm
ffmpeg + TrapTagger (camera traps + re-ID) and Whiskerbook (re-ID)?
TrapTagger About
Camera traps are a powerful tool for studying wildlife and are widely used by ecologists around the world. However, these motion-activated cameras can generate immense amounts of data, of which a large proportion is simply empty images generated by the wind and other disturbances. Traditionally, these thousands of images were manually sorted by the ecologists themselves in an error-prone and extremely time-consuming process – often taking longer than the camera-trap survey itself. This presented a massive obstacle to camera-trap-focused research. As such, we wanted to develop a way to make the process drastically shorter, and more accurate. In turn this would allow ecologists to run more camera trap surveys more often, resulting in more complete data, and more statistically robust results across greater and greater areas. Moreover, this would allow ecologists to spend their new-found time on the analysis of the data itself, and make the world a better place.

Whiskerbook
Whiskerbook is a visual database of jaguar encounters and of individually catalogued animals. The library is maintained and used by biologists to collect and analyse jaguar encounter data to learn more about these amazing creatures.
2 December 2023 9:01am
Hi All,
I understand Okala are doing a lot of work with ML and camera traps; you could try contacting them.

Data integration platforms
28 March 2019 9:27am
20 November 2023 8:19pm
Hey Chris - you haven't missed Variety Hour! It's this week!
1 December 2023 8:56pm
Argos has an API
Iridium data arrives either via email or to a server IP
Globalstar (unknown)
If you'd like the web services document for Argos, shoot me an email (tgray at woodsholegroup - dotcom).
Alternative trackers for study of grey parrots movement patterns
3 November 2023 7:31pm
24 November 2023 9:18am
Hi Thomas thank you for your recommendation.
Yeah getting an appropriate tag for the parrots has really been challenging
24 November 2023 9:20am
Hi Rob,
thank you for the recommendations. I will check out both options and choose the one best suited to my study area.
Warm Regards, Benedicta
1 December 2023 8:10pm
Hi Ninying,
One benefit of the Pinpoint tags is that they are user-rechargeable, something pretty much unheard of for satellite tags for decades! If you can recover the tags, you might be able to achieve a larger sample size with fewer tags (less $$) by redeploying the recovered tags - without the costs of having the manufacturer refurbish them.
cheers,
Kyler
Autonomizing Small Mammal Traps
29 November 2023 6:26pm
1 December 2023 2:42pm
The Australians have already invented this for a completely different purpose - if you fill it with hair dye instead of 1080 gel, it will spray a unique pattern of droplets on the target animals. You need a post-spray picture, but that should be doable.
1 December 2023 6:03pm
Thanks for the link, I will check it out!
1 December 2023 7:37pm
One thing to keep in mind is that researchers often want, or at least would like, certain metadata on the tagged animals, such as sex, size, weight, apparent fitness, etc. Without these, the questions you can ask become rather limited. It will also often be highly desirable to take samples such as blood, hair, or other tissue.
In addition, there can be cases where it is better not to tag the animal - if it is not in the right age group, is too small to carry the tag, does not seem to be in good shape, etc.
I think it will take quite an effort to get automated systems (capture robots) to make these decisions to a degree you can trust.
Cheers,
Lars
Remote weather stations
21 March 2018 7:48pm
28 November 2023 2:29pm
While pretty expensive, everyone I know has had good experiences with the HOBO weather stations, and they can be customized if needed. They would still be my rec if you need something ASAP.
30 November 2023 1:30am
Hi Carly, thanks for this. I forgot to mention HOBO in my original post, but I had them down as the main available option. Think it might have to be them, I'll have to check for $ down the back of the sofa...
1 December 2023 4:55pm
I came across this which looks like it might work for you.
Others have mentioned Davis. I used the Davis Vantage Pro2 in a previous life, and the cabled version was about USD 200 IIRC. However, it must operate with a display console, which can take an optional data logger, but the console is intended to be kept indoors. This means providing a weatherproof enclosure for it in the forest, in addition to a mast for the anemometer etc.
My Experience and Takeaways at the 1st Wildlife Scientific Conference
1 December 2023 4:42pm
Taking on the conservation Strides
1 December 2023 4:18pm
Video camera trap analysis help
21 November 2023 7:49am
28 November 2023 2:33pm
Zamba Cloud can handle video!
30 November 2023 8:44am
Hey there - we can help you. Please send us some of the videos and we will build a pipeline for it.
aakash at thinkevolvelabs dot com
- Aakash Gupta
Green AI Products — DeCaTron
DeCaTron is an AI-based platform for assessing and simulating the biodiversity of forested regions

1 December 2023 2:42pm
Hi,
our software Deepfaune works with videos. It is trained on European fauna and tries to put all birds into a 'bird' class, but it includes a detection stage, so at the very least it can filter out 'empty' images or videos. Performance on videos is not as good as it is on images, so I can't promise it will match your expectations. But installation on Windows is easy, so give it a try and see for yourself. It's point-and-click software. You can use it in English, French, Spanish, or Italian.

Conservation of Kikuyu Escarpment Forest
1 December 2023 12:38pm
WILDLABS Awards 2024: Supporting accessible, affordable, and effective innovation for nature
1 December 2023 11:00am
WILDLABS Awards 2024: Further Information
1 December 2023 11:00am
Impacts of Rhino Fence on other Wildlife Species
1 December 2023 9:25am
PhD Position- Nottingham Trent University UK
1 December 2023 7:13am
Into the Underwater Savanna: BRUV Surveys In Seagrass Beds
30 November 2023 6:12pm
Conservation of endangered birds in Kenya
30 November 2023 12:57pm
WILDLABS AWARDS Webinar
30 November 2023 12:17pm
How to Choose a Biologger - Marine animals with Samantha Andrzejaczek, Jessica Rudd and Lucy Hawkes
30 November 2023 12:03pm
Wishlist for kit in a field-based Research Station or tech testing space?
27 November 2023 10:41am
28 November 2023 3:25am
Regine Weckauf over on LinkedIn:
'Little to do with research and tech development, but given how hard it is to attract and retain experienced staff to field based positions, I know it makes a difference how nice the space is. Just because it's the "field", shouldn't mean staff living in basic conditions, regardless of how many times we've been told to see it as a badge of honor. If you have the money, put in nice bathrooms, kitchen, living spaces, and private accommodation. Maybe even a nursery? It creates more local employment opportunities and people genuinely want to visit.'
28 November 2023 2:25pm
Love the idea for in-house gear/supplies! It can be SOO difficult to travel with batteries, electronics parts with airline regs, country policies, etc. and shipping recorders/trail cams/etc. gets VERY (prohibitively) pricey in some countries with customs and taxes. Would be great to have an in-country place to source that kind of equipment.
Housing educational resources related to that tech (in the form of people, print materials, computer tutorials) in-house would be similarly awesome. Particularly/especially in local languages.
Having in-country wet labs as well helps the eDNA/genetics folks, since sample import/export permitting can be (always seems to be?!) a nightmare, so if you can even just do PCR and/or extractions in-country that helps a ton.
In terms of an overall field-station wishlist - honestly, just the promise of continued funding and staff. Every field station I've been to or worked at is in a constant search for enough money to get through the next month/year, because the funding comes in to establish a station but then not to maintain it long-term. It's not sexy for a wishlist per se, but boy is it overlooked and much-needed.
@hikinghack from Dinalab would probably have lots of good insights on this!
29 November 2023 7:59pm
My suggestion would probably be a 3D printer and a soldering station with a stock of common components. With those two things you can solve most problems.
Paving the Way for Women: LoRaWAN Technology in Akagera National Park with Clementine Uwamahoro

29 November 2023 5:22pm
Ekobot WEAI robot - autonomous weeding at farm scale
29 November 2023 12:11am
A little off the typical path for Wildlabs - and probably not the sort of drone people typically think about here 😄 - but I find robotic agriculture very interesting, with the potential to greatly reduce the use of poison and improve effective yields. Anyone working on similar things?
1 December 2023 2:35pm
Hi Lucy
As others have mentioned, camera trap temperature readouts are inaccurate, and you have the additional problem that the camera's temperature can rise by 10 °C if the sun shines on it.
I would also agree with the suggestion of getting the moon phase data off the internet.