The Bio-O-Ton project is developing a new machine-learning approach to biodiversity assessment that combines smartphone sound recordings with satellite imagery. The results will benefit science, society and public authorities alike in implementing biodiversity conservation and will contribute significantly to integrating citizen science into biodiversity monitoring.
Research approach
The aim of this interdisciplinary joint project is to develop and evaluate artificial intelligence (AI) and machine-learning approaches for the effective assessment of local biodiversity. Satellite images and audio recordings captured with mobile devices serve as the basis for the research. The audio data recorded by citizens are intended to support biotope monitoring at high spatial and temporal resolution. Beyond promoting nature conservation and relieving the burden on authorities, the aim is to enable a systematic assessment of biotope types and their condition so that changes and risks to biodiversity can be responded to quickly.
ISOE contributes to the joint project through a series of dialogues with stakeholders from municipal administration, associations and professional nature conservation. Together with these stakeholders, practice-relevant biodiversity indicators and the questions to be posed to the machine-learning tool will be defined and evaluated.
ISOE is also conducting surveys with potential users to learn about the expected efficiency and implications of the new machine-learning approach to biodiversity assessment at the landscape level. The focus here is on how a comprehensive recording and representation of a landscape’s biological diversity can change stakeholders’ assessment and awareness of biodiversity.
Background
To counter the ongoing decline of biodiversity, it is important to develop and test new methodological approaches for spatially and temporally comprehensive biodiversity monitoring. Citizen science already offers numerous opportunities for people from a wide range of backgrounds and with varied levels of expertise to participate in research while expanding their own knowledge. However, the effort required to acquire the taxonomic knowledge needed to survey specific species groups currently limits participation in biodiversity monitoring. As a result, citizen science data are rarely used in research or applied conservation.
While the soundscape is usually perceived only unconsciously as ambient noise, it in fact carries a whole range of information: in nature, sound provides spatial orientation, serves as a means of communication with conspecifics and other species, and conveys information about the structural characteristics and elements of the environment. The project therefore focuses on animals whose sounds are typical of a particular landscape unit – for example birds, amphibians, bats, crickets and grasshoppers. While sound recordings are already established as a means of identifying species, further research is needed into how this information can be used to determine the overall quality of a biotope.
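To illustrate the kind of input such audio-based identification typically works with, the following minimal Python sketch converts a field recording into a log-mel spectrogram, a standard representation fed to acoustic species classifiers. The file name is a placeholder and librosa is simply one commonly used audio library; neither is necessarily part of the project’s actual tooling.

```python
# Minimal sketch: turn a field recording into a log-mel spectrogram,
# the usual input for acoustic species classifiers.
# Assumptions: "meadow_recording.wav" is a placeholder file name and
# librosa is just one commonly used audio library, not project tooling.
import librosa
import numpy as np

# Load the recording at a fixed sampling rate so all clips are comparable.
audio, sr = librosa.load("meadow_recording.wav", sr=22050, mono=True)

# Compute a mel-scaled spectrogram and convert power values to decibels.
mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_fft=2048,
                                     hop_length=512, n_mels=64)
log_mel = librosa.power_to_db(mel, ref=np.max)

# A classifier (e.g. a small convolutional network) would now receive
# fixed-length slices of log_mel and output species probabilities.
print(log_mel.shape)  # (n_mels, number of time frames)
```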
The project aims to create an innovative, scientifically validated and efficient mapping method that enables the quality of local cross-species biodiversity to be assessed. To this end, user-generated audio files are combined with high-resolution satellite images using artificial intelligence and machine learning. The purely data-based method is intended to help quantify and evaluate the cross-species biodiversity of different types of landscape.
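Purely as an illustrative sketch of how two such data sources can be fused (the project’s actual model architecture is not described here), the PyTorch example below encodes an audio spectrogram and a satellite-image patch in separate branches and concatenates the two embeddings before predicting a single biotope-quality score. All layer sizes, input shapes and the scalar output are assumptions made for the illustration.

```python
# Illustrative sketch only: a minimal two-branch fusion model in PyTorch.
# It does NOT reproduce the Bio-O-Ton architecture; shapes, layer sizes
# and the single "biotope quality" output are assumptions for illustration.
import torch
import torch.nn as nn

class AudioSatelliteFusion(nn.Module):
    def __init__(self, embed_dim: int = 64):
        super().__init__()
        # Branch 1: encode a 1-channel log-mel spectrogram.
        self.audio_encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, embed_dim),
        )
        # Branch 2: encode a 4-band satellite patch (e.g. RGB + near infrared).
        self.image_encoder = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, embed_dim),
        )
        # Fusion head: concatenate both embeddings, predict one quality score.
        self.head = nn.Sequential(
            nn.Linear(2 * embed_dim, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, spectrogram: torch.Tensor, patch: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.audio_encoder(spectrogram),
                           self.image_encoder(patch)], dim=1)
        return self.head(fused)

# Dummy forward pass with random tensors just to show the expected shapes.
model = AudioSatelliteFusion()
score = model(torch.randn(2, 1, 64, 128), torch.randn(2, 4, 32, 32))
print(score.shape)  # torch.Size([2, 1])
```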
Research and project partners
- Institute of Photogrammetry and Remote Sensing (IPF) at the Karlsruhe Institute of Technology (KIT), project management
- ci-tec GmbH, Karlsruhe
- Dr. Gisela Wachinger, Pro Re Partizipation und Mediation
Funding
The project “Bio-O-Ton – Biodiversity assessment of biotope types through machine learning based on citizen science sound recordings and satellite images” is funded by the Federal Ministry of Education and Research as part of the funding call “BiodivKI – Artificial Intelligence Methods as a Tool for Biodiversity Research” of the Research Initiative for the Preservation of Biodiversity (FEdA).