FOSS4G 2022 academic track

Multi-Sensor Feeder: Automated and Easy-To-Use Animal Monitoring Tool for Citizens
2022-08-24, 11:00–11:30 (Europe/Rome), Room Modulo 3

Environmental changes can have different causes at the local level (e.g. soil sealing) as well as at the global level (e.g. climate change). Detecting these changes and finding patterns in their causes requires collecting broad environmental data, both temporally and spatially. Citizens can play an essential role in collecting these data (Goodchild, 2007). We developed a system that enables citizens to monitor the occurrence and distribution of birds and provides the collected data to the public, so that both researchers and citizens can derive conclusions from them. With our automated approach we want to complement other citizen science solutions such as eBird (Sullivan et al., 2014), where contributors manually report their sightings.

Therefore, we built a prototypical bird feeder equipped with several sensors, together with the infrastructure to process the data collected by the feeder.
The feeder is easy to reproduce at a reasonable price by following an openly available manual. This allows anyone to build the feeder on their own, enabling a wide distribution across many locations. The feeder automatically detects when a bird is visiting, takes an image of the bird, determines the species, and links the observation with environmental data such as temperature or light intensity. All collected data are published on an open access platform we developed. Incorporating other surrounding factors, such as the proximity of the feeder station to the nearest forest or a large street, makes it possible to pursue various questions regarding the occurrence of birds. For instance: How does the immediate environment affect bird abundance? Do sealed surfaces have a negative effect compared to a flowering garden?

The developed weatherproof bird feeder is equipped with multiple sensors. The standard equipment includes a motion sensor to detect whether a bird is currently visiting the feeder, a camera to take images of the birds, a scale to weigh them, and a sensor to measure the ambient temperature and air pressure. In addition to the standard sensors, further sensors were tested with the prototype; they usefully supplement the monitoring but are not strictly necessary for operating the station. A microphone is suited to record bird calls or the surrounding noise in general. A brightness sensor can help to determine whether bird visits correlate with light conditions, and an air pollution sensor (e.g. PM10) can be used to investigate whether air quality influences bird occurrence. Furthermore, the standard camera can be replaced by an infrared camera to capture animals that visit the feeder at night. The station is thus expandable and customizable depending on individual use cases or research questions.
The environmental sensor data is continuously logged and sent to the open access platform; the logging interval can be configured by the user. Once the motion sensor detects movement, the camera starts recording, and the scale and microphone start storing values. Camera, microphone, and scale keep running for as long as movement is detected. Once the movement has ended, a lightweight recognition model checks whether a bird is depicted in the images. If so, all data collected during the movement, including the respective environmental data, is sent as a package to the open access platform.
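This motion-triggered recording loop can be sketched in Python. All names here are illustrative assumptions (the sensors are passed in as callables so the logic can be shown without hardware), not the station's actual code:

```python
from dataclasses import dataclass, field

@dataclass
class MovementPackage:
    """Data collected during one continuous movement event."""
    images: list = field(default_factory=list)
    weights: list = field(default_factory=list)
    audio: list = field(default_factory=list)

def record_movement(motion_detected, capture_image, read_scale, record_audio,
                    contains_bird, max_samples=1000):
    """Collect camera, scale, and microphone data while motion lasts,
    then keep the package only if a lightweight model sees a bird.
    All sensor arguments are hypothetical stand-ins for real hardware."""
    package = MovementPackage()
    samples = 0
    while motion_detected() and samples < max_samples:
        package.images.append(capture_image())
        package.weights.append(read_scale())
        package.audio.append(record_audio())
        samples += 1
    # Discard the package if no image shows a bird.
    if any(contains_bird(img) for img in package.images):
        return package  # would be sent to the platform with environmental data
    return None
```

The recognition callback acting as a gate keeps empty or false-positive movement events (e.g. wind, falling leaves) from ever reaching the platform.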

In order to process the data collected by the station, we have developed various methods and software for data storage, analysis, and sharing. Data processing is done on a centralized server, which can be accessed through a RESTful API and a website. Feeder entities created on the server can receive environmental data as well as movement packages. When a movement is sent, the server determines the number of birds and identifies the species using artificial intelligence. Beyond storage, the server makes the data available to users in two ways. First, the data is downloadable as raw JSON via the API, which enables others to use it for their own research. Second, the data is presented on our website, making it easy for everyone to inspect. Uploads are not limited to our stations: the server also accepts data gathered by other systems. It is further possible to upload images of birds and receive the recognized species in return.
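An external system wanting to contribute data would assemble a JSON body and POST it to the API. The following is a minimal sketch under assumed field names and a placeholder endpoint; the real schema may differ:

```python
import json

# Placeholder base URL, not the project's actual API endpoint.
API_BASE = "https://example.org/api"

def build_observation(feeder_id, species, count, temperature_c, pressure_hpa,
                      image_b64=None):
    """Assemble a JSON body for a movement upload.
    All field names are illustrative assumptions, not the real schema."""
    body = {
        "feeder": feeder_id,
        "species": species,
        "count": count,
        "environment": {
            "temperature": temperature_c,
            "pressure": pressure_hpa,
        },
    }
    if image_b64 is not None:
        body["image"] = image_b64
    return json.dumps(body)

# Sending it would then be a single POST, e.g. with the requests library:
#   requests.post(f"{API_BASE}/feeders/{feeder_id}/movements",
#                 data=build_observation(...),
#                 headers={"Content-Type": "application/json"})
```

Keeping the payload plain JSON is what makes the two-way access described above symmetric: the same structure users download via the API is what stations and third-party systems upload.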

The feeder is designed so that it can be replicated by anyone. The corresponding instructions will be published shortly. The code to run the station and the server is available via GitHub.

Moreover, different options for validating the data, especially the species classification, are implemented. One step is automatic validation based on sensor values and metadata. For instance, if a standard camera recognizes a bird but it is currently night (detected by the light sensor or the time of day), or the scale detects nothing, the observation is discarded. Further validation can come from actual people. An interface is provided that shows users the recorded values, especially the images, together with the automatically recognized species. The depicted data can be validated to find corrupt sensors and to correct mistakes made by the image classification. Additionally, the server-side evaluation of the data is supplemented by a validation of the recognized species: it is checked whether the species can plausibly occur in that geographic region at that time of the year.
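The automatic metadata checks amount to a small plausibility filter. A sketch of that idea, with thresholds and field names chosen purely for illustration:

```python
from datetime import datetime

def is_plausible(observation):
    """Discard observations that contradict sensor metadata:
    a standard (non-infrared) camera cannot see birds at night,
    and a visiting bird should register some weight on the scale.
    Field names and the night window are illustrative assumptions."""
    hour = observation["timestamp"].hour
    is_night = hour < 6 or hour >= 22  # crude night window; a real check
                                       # could use the light sensor instead
    if observation["camera"] == "standard" and is_night:
        return False
    # Light sensor reading of zero also contradicts a daytime-camera sighting.
    if observation["camera"] == "standard" and observation.get("light_lux", 1.0) == 0:
        return False
    # No weight on the scale means no bird was physically present.
    if observation.get("weight_g", 0) <= 0:
        return False
    return True
```

The same pattern extends naturally to the species-level check: a lookup of the recognized species against its known range and season would simply be one more predicate in the chain.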

As next steps we want to conduct workshops with citizens and experts, both for assembling the stations and for evaluating the data and the station itself. A general strength of our approach is that it is easily adaptable to other use cases, especially detecting other animals. For example, with small adaptations the feeder could be used to detect or count mammals such as squirrels, or insects such as butterflies and bees.

Jan Stenkamp successfully completed an apprenticeship as a Geomatiker (geomatics technician) and a Bachelor's degree in Geoinformatics. After working as a GIS expert for administrative authorities, Jan is currently a Master of Science student in Geoinformatics at the University of Münster. He works as a tutor in several Bachelor courses and as a student assistant in the Dist-KISS research project. Jan's research interests include data science, remote sensing, image recognition, and citizen science. For a project in the last two domains he gained funding to develop a multi-sensor tool.

Tom Niers holds a Bachelor's degree in Geoinformatics. He is currently a Master's student at the Institute for Geoinformatics (IfGI), where he also works as a student assistant. As a member of the research project Opening Reproducible Research, he develops ways to improve the reproducibility of research. Through study projects of the Master's program, supported by university funding programs, he has gained particular experience in working with environmental sensors and in applying artificial intelligence to biodiversity monitoring.

Nick Jakuschona holds a Bachelor of Science in Geoinformatics and is currently following the Master of Science program at the Institute for Geoinformatics, University of Münster. During his studies, his work as a student assistant, and an internship at ESRI GmbH, he has gained expertise in collecting and analyzing spatial data. Through his work in the project Opening Reproducible Research he has learned to contribute effectively to a larger project. Through different study projects and his Bachelor's thesis, he is especially experienced in processing images, such as detecting objects in images, training his own image processing models, or enhancing images with further information in Augmented Reality.