
Elevator Annotator (MSc Project presentation slides by Anggarda Prameswari)



Presentation slide deck by Anggarda Prameswari for her Master's Project in Information Sciences, "ELEVATOR ANNOTATOR Local Crowdsourcing on Audio Annotation for The Netherlands Institute for Sound and Vision".

http://www.victordeboer.com/wp-content/uploads/2017/07/2589263_ElevatorAnnotator_FinalVersion.pdf

Crowdsourcing and other human computation techniques have proven their use for collecting large numbers of annotations, including in the domain of cultural heritage. Most of the time, crowdsourcing campaigns are done through online tools. Local crowdsourcing is a variant where annotation activities are based at specific locations related to the task. This study investigates a local crowdsourcing application for audio annotation. We describe a specific use case: annotating archival audio content to enrich its metadata. We developed a platform called “Elevator Annotator”, to be used on-site. The platform is designed as a standalone Raspberry Pi-powered box which can be placed, for example, in an on-site elevator. It features speech recognition software and a button-based UI to communicate with participants. We evaluate the effectiveness of the platform in two different locations and modes of interaction through a local crowdsourcing experiment. The results show that the local crowdsourcing approach achieves annotations with 61% accuracy, at up to 4 annotations per hour. Given that these results were acquired from a single elevator, this practice can be a promising method of eliciting annotations from on-site participants.
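To make the platform description above concrete, the following is a minimal, hypothetical sketch of the button-based interaction loop on a Raspberry Pi. The GPIO pin numbers, audio file names, prompt wording, and the use of the gpiozero library and ALSA's aplay are assumptions for illustration only; the actual Elevator Annotator also offers a speech-recognition mode, which is not shown here.

# Hypothetical sketch of a button-based annotation loop on a Raspberry Pi.
# Pin numbers, file names, and the CSV log format are illustrative assumptions.
import csv
import subprocess
import time

from gpiozero import Button  # common Raspberry Pi GPIO library

YES_BUTTON = Button(17)   # assumed wiring: BCM pin 17 = "yes"
NO_BUTTON = Button(27)    # assumed wiring: BCM pin 27 = "no"
FRAGMENTS = ["fragment_01.wav", "fragment_02.wav"]  # placeholder audio fragments


def play(path):
    """Play an audio file through the Pi's audio output via ALSA."""
    subprocess.run(["aplay", path], check=False)


def ask(question_wav, fragment_wav, timeout=15):
    """Play a fragment plus a yes/no question, then wait for a button press."""
    play(fragment_wav)
    play(question_wav)  # e.g. a recorded prompt such as "Did you hear a piano?"
    start = time.time()
    while time.time() - start < timeout:
        if YES_BUTTON.is_pressed:
            return "yes"
        if NO_BUTTON.is_pressed:
            return "no"
        time.sleep(0.05)
    return None  # no answer: the participant may have left the elevator


# Append each answered question to a simple CSV log for later evaluation.
with open("annotations.csv", "a", newline="") as log:
    writer = csv.writer(log)
    for fragment in FRAGMENTS:
        answer = ask("question_piano.wav", fragment)
        if answer is not None:
            writer.writerow([time.time(), fragment, "piano", answer])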

Published in: Education

Elevator Annotator (MSc Project presentation slides by Anggarda Prameswari)

  1. Elevator Annotator Local Crowdsourcing on Audio Annotation for Netherlands Institute for Sound and Vision Anggarda Prameswari 2589263@student.vu.nl Supervisor: Victor de Boer Daily Supervisor: Themistoklis Karavellas
  2. Elevator Annotator What is it about?
  3. ▷ The rise of crowdsourcing practice in the community → Crowdsourcing is seen as an attractive solution for acquiring annotations cheaply and quickly ▷ Local crowdsourcing: → Participants do the given tasks on site while overcoming the challenges and influences posed by the physical environment → Example: Local visitors in a museum help annotate the painting collections What about audio collections? Background
  4. Audio file ▷ An audio file carries multiple pieces of information (metadata) ▷ There are many musical instruments involved in a song ▷ Within a fragment of a few seconds, a different instrument may be playing dominantly
  5. Case study ▷ Netherlands Institute for Sound and Vision → Largest audiovisual archive in the Netherlands → To collect and preserve audiovisual heritage → Make it available to as many users as possible ▷ How? → Enriching the audio collection’s content with additional information about the instruments → Make use of local crowdsourcing with pervasive computing → Experimenting in an elevator (Why?) → It is used frequently and people in it are mostly idle, thus ideal for asking for annotations
  6. What do you hear?
  7. Research Questions “Can local crowdsourcing be used to enrich the audio collection content with improved quality?” ▷ What are the requirements for the crowdsourced annotations with respect to their technical and ethical aspects? ▷ How do different physical locations of the local crowdsourcing affect the result? ▷ How do different types of user interaction in the local crowdsourcing affect the result?
  8. Approach How will it be carried out?
  9. Approach (February - March) ▷ Implementation → Develop a system using Raspberry Pi as the platform
  10. Approach (April - May) ▷ Experiments → Locations: offer different institutional properties → User interfaces: offer different feedback to the user
  11. Approach (continuously - June) ▷ Evaluation → Accuracy of musical instrument identification (see the accuracy sketch after this list) → Willingness of the crowd to participate → Participation statistics ▷ Result → Source code with the prototype → Comprehensive report of the evaluation
  12. Thanks! Any questions?
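As a rough illustration of the evaluation step on slide 11, the sketch below computes annotation accuracy by majority vote per audio fragment against a curated ground-truth label. The data format, the majority-vote scheme, and the function name are assumptions for illustration; the study's own analysis may differ.

# Hypothetical accuracy calculation: majority vote per fragment, compared
# against curated ground-truth labels. Data shapes are illustrative assumptions.
from collections import Counter


def majority_vote_accuracy(answers, ground_truth):
    """Share of fragments whose majority crowd answer matches the curated label."""
    correct = 0
    scored = 0
    for fragment_id, votes in answers.items():
        if not votes or fragment_id not in ground_truth:
            continue  # skip fragments without answers or without a reference label
        majority_label, _ = Counter(votes).most_common(1)[0]
        scored += 1
        correct += int(majority_label == ground_truth[fragment_id])
    return correct / scored if scored else 0.0


# Example: two fragments, three (hypothetical) participant answers each
answers = {"frag01": ["piano", "piano", "guitar"], "frag02": ["violin", "flute", "flute"]}
ground_truth = {"frag01": "piano", "frag02": "violin"}
print(majority_vote_accuracy(answers, ground_truth))  # -> 0.5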
