Presentation slide deck by Anggarda Prameswari for her Master Project Information Sciences: "Elevator Annotator: Local Crowdsourcing on Audio Annotation for The Netherlands Institute for Sound and Vision".
Crowdsourcing and other human computation techniques have proven their use for collecting large numbers of annotations, including in the domain of cultural heritage. Crowdsourcing campaigns are usually run through online tools. Local crowdsourcing is a variant in which annotation activities take place at specific locations related to the task. This study investigates a local crowdsourcing application for audio annotation. We describe a specific use case: annotating archival audio content to enrich its metadata. We developed an on-site platform called "Elevator Annotator", designed as a standalone Raspberry Pi-powered box that can be placed, for example, in an elevator. It features speech recognition software and a button-based UI to communicate with participants. We evaluate the effectiveness of the platform in two different locations and modes of interaction through a local crowdsourcing experiment. The results show that the local crowdsourcing approach achieves an annotation accuracy of 61%, at up to 4 annotations per hour. Given that these results were acquired from a single elevator, this practice can be a promising method of eliciting annotations from on-site participants.
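The button-based interaction described in the abstract can be pictured as a simple annotation loop: play a clip, ask a question, record the participant's button press. The following Python sketch is purely illustrative, not the project's actual implementation; the function name `run_session`, the yes/no/skip answer mapping, and the injected `read_button` callable are all assumptions. On the real Raspberry Pi box, the input would come from physical GPIO buttons (e.g. via the RPi.GPIO library) rather than a callable.

```python
# Hypothetical sketch of a button-driven annotation loop, assumed for
# illustration; not the actual Elevator Annotator code. On the device,
# `read_button` would poll GPIO pins; here it is injected as a callable
# so the loop stays self-contained and testable.

def run_session(clips, read_button):
    """Ask a yes/no question about each clip and record the answer.

    clips: list of clip identifiers to annotate.
    read_button: callable returning 'yes', 'no', or 'skip'.
    Returns a list of (clip_id, answer) annotations.
    """
    annotations = []
    for clip_id in clips:
        # In the real platform, the clip would be played over a speaker
        # and the participant prompted with a question about its content.
        answer = read_button()
        if answer in ("yes", "no"):
            annotations.append((clip_id, answer))
        # Any other input (e.g. 'skip') records nothing for this clip.
    return annotations
```

For example, feeding simulated presses via `presses = iter(["yes", "skip"])` and calling `run_session(["clip1", "clip2"], lambda: next(presses))` yields a single annotation for the first clip, since the second was skipped.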