Researchers at the UNESCO Chair are working with state-of-the-art technologies for biodiversity monitoring. One technique is called ‘camera trapping’. Think of a standard trail camera that outdoor enthusiasts use to learn which animals visit their gardens. Scientists use these cameras to document the animals that visit their research sites. Most camera-trapping research focuses on warm-blooded animals, whose movement through the landscape triggers the camera. For our application we are interested in the insects that visit flowers. Because insects are small and cold-blooded, they rarely trigger a motion sensor, so we are using time-lapse photography instead of motion-based detection.
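The practical difference is simple: a motion-triggered camera fires only when an animal moves past, whereas a time-lapse camera fires on a fixed clock regardless of what is in front of it. A minimal sketch of such a schedule (the interval, times, and function name here are illustrative, not our actual field setup):

```python
from datetime import datetime, timedelta

def capture_schedule(start, end, interval_s):
    """Return the timestamps at which a time-lapse camera would take a
    picture between `start` and `end`, one every `interval_s` seconds."""
    times = []
    t = start
    step = timedelta(seconds=interval_s)
    while t <= end:
        times.append(t)
        t += step
    return times

# Illustrative example: one photo every 10 minutes over a one-hour survey
start = datetime(2024, 6, 1, 9, 0)
shots = capture_schedule(start, start + timedelta(hours=1), 600)
print(len(shots))  # 7 pictures: 9:00, 9:10, ..., 10:00
```

Because the clock, not the animal, decides when a picture is taken, even a tiny hoverfly sitting motionless on a flower ends up in the dataset.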
Time-lapse photography introduces a number of challenges: most pictures do not actually show any insects! This means we must train an algorithm to decide whether an insect appears in an image and, if so, what kind of insect it is. We use an open-source machine-learning algorithm called YOLO. First, a person annotates each photograph by drawing a box around the insect using a computer programme and labelling what type of insect is in the box (see image). With a large number of annotated pictures – a so-called ‘reference dataset’ – YOLO applies a convolutional neural network to ‘learn’ to identify the type of insect in new pictures.
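For the curious, YOLO stores each drawn box as one line in a plain-text label file: a class index followed by the box centre, width, and height, all normalised to the image size. A small sketch of reading such a file (the insect class names here are illustrative, not our actual species list):

```python
# Each line of a YOLO label file has the form
#   "<class> <x_center> <y_center> <width> <height>"
# with all coordinates normalised to [0, 1] relative to the image dimensions.
CLASS_NAMES = ["honeybee", "hoverfly", "bumblebee"]  # illustrative labels

def parse_yolo_labels(text):
    """Parse YOLO-format annotation lines into (class name, box) tuples."""
    boxes = []
    for line in text.strip().splitlines():
        cls, xc, yc, w, h = line.split()
        box = (float(xc), float(yc), float(w), float(h))
        boxes.append((CLASS_NAMES[int(cls)], box))
    return boxes

sample = "0 0.52 0.48 0.10 0.12\n2 0.20 0.75 0.08 0.09"
for name, box in parse_yolo_labels(sample):
    print(name, box)
```

Each annotated photograph gets one such file, and together these files and images form the reference dataset the network trains on.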
We are using high-powered computers to test state-of-the-art camera-trapping methods. With our reference dataset, we are on the way to automatically identifying some common pollinators in Carinthia. Hopefully the winter flies by so that we can test our system next year.