What is the ground truth? Reliability of multi-annotator data for audio tagging

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-reviewed

Abstract

Crowdsourcing has become a common approach for annotating large amounts of data. It has the advantage of harnessing a large workforce to produce annotations quickly, but comes with the disadvantage of employing non-expert annotators with different backgrounds. This raises the problem of data reliability, in addition to the general question of how to combine the opinions of multiple annotators in order to estimate the ground truth. This paper presents a study of the annotations and of annotators' reliability for audio tagging. We adapt the use of Krippendorff's alpha and multi-annotator competence estimation (MACE) to a multi-label data scenario, and show how MACE can be used to estimate a candidate ground truth from annotations produced by non-expert users with different levels of expertise and competence.
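
As background for the abstract: Krippendorff's alpha measures inter-annotator agreement as alpha = 1 - D_o / D_e, the ratio of observed to expected disagreement. The Python sketch below implements the nominal variant from scratch and applies one plausible multi-label adaptation, treating every (clip, tag) pair as a binary unit. This is an illustration under our own assumptions, not necessarily the authors' exact formulation, and the clips, tags, and annotator names in the toy data are hypothetical.

    from collections import Counter

    def krippendorff_alpha_nominal(units):
        """Krippendorff's alpha for nominal data.

        units: iterable of lists; each list holds the values the
        annotators assigned to one unit (missing answers omitted).
        """
        # Coincidence matrix: each ordered pair of answers within a
        # unit contributes weight 1 / (m - 1), m = answers in the unit.
        coincidences = Counter()
        for values in units:
            m = len(values)
            if m < 2:
                continue  # a unit needs two answers to be pairable
            for i, c in enumerate(values):
                for j, k in enumerate(values):
                    if i != j:
                        coincidences[(c, k)] += 1.0 / (m - 1)

        # Marginal totals n_c and grand total n of pairable values.
        marginals = Counter()
        for (c, _k), w in coincidences.items():
            marginals[c] += w
        n = sum(marginals.values())
        if n <= 1:
            return 1.0  # degenerate case: nothing to disagree about

        # alpha = 1 - D_o / D_e (observed vs. expected disagreement);
        # for nominal data, disagreement is simply c != k.
        d_o = sum(w for (c, k), w in coincidences.items() if c != k)
        d_e = sum(marginals[c] * marginals[k]
                  for c in marginals for k in marginals if c != k) / (n - 1)
        return 1.0 if d_e == 0 else 1.0 - d_o / d_e

    # Multi-label adaptation: binarise each tag, so every (clip, tag)
    # pair becomes one unit with a 0/1 answer per annotator.
    # Hypothetical toy data: clip -> annotator -> set of chosen tags.
    annotations = {
        "clip1": {"a1": {"dog", "car"}, "a2": {"dog"}, "a3": {"dog", "car"}},
        "clip2": {"a1": {"rain"}, "a2": {"rain", "wind"}, "a3": set()},
    }
    tags = sorted({t for clip in annotations.values()
                   for chosen in clip.values() for t in chosen})
    units = [
        [int(tag in chosen) for chosen in clip.values()]
        for clip in annotations.values()
        for tag in tags
    ]
    print(f"Krippendorff's alpha = {krippendorff_alpha_nominal(units):.3f}")

Alpha is 1 for perfect agreement, around 0 for chance-level agreement, and can go negative for systematic disagreement. MACE itself, which the paper uses to derive a candidate ground truth, is a probabilistic model of per-annotator competence typically fitted with EM and is beyond the scope of this short sketch; likewise, for production use an established implementation such as the krippendorff package on PyPI can replace the hand-rolled function above.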
Original language: English
Title of host publication: 29th European Signal Processing Conference, EUSIPCO 2021
Number of pages: 5
ISBN (Electronic): 978-9-0827-9706-0
DOIs
Publication status: Published - 2021
Publication type: A4 Article in conference proceedings
Event: European Signal Processing Conference - Dublin, Ireland
Duration: 23 Aug 2021 – 27 Aug 2021
https://eusipco2021.org

Conference

Conference: European Signal Processing Conference
Abbreviated title: EUSIPCO
Country/Territory: Ireland
City: Dublin
Period: 23/08/21 – 27/08/21
Internet address: https://eusipco2021.org

Publication forum classification

  • Publication forum level 1
