INSEMTIVES Tutorial ISWC2011 - Session3


  1. Crowdsourcing the annotation of dynamic Web content at seekda
     Elena Simperl, University of Innsbruck, AT
     Markus Rohde, University of Siegen, DE
     ISWC 2010
     www.insemtives.eu
  2. Overview
     • Context
     • Prototyping
     • Participatory Design
     • User Challenges
     • Conclusions
  3. Context
     • Web services portal
       – Crawls for and indexes Web services on the Web
       – Currently more than 28,500 indexed and monitored
     • Problems
       – Services are not annotated or described
       – Limited search results and possibilities
       – Web APIs need to be confirmed by users
     • Goal
       – Obtain more annotations by involving users in the annotation process
       – Validate existing annotations, if any
       – "Catch them & keep them"
  4. Design decisions
     • Different annotation methods exist
       – Keywords/tags
       – Categories
       – Natural language descriptions
       – Lightweight/fully-fledged semantic Web service descriptions (e.g. WSMO-Lite, OWL-S, etc.)
     • Avoid complicated and demanding annotations (limit to tags, categories and NL descriptions)
       – Use lightweight RDF ontologies in the background (e.g. to ease the search); see the sketch below
     • SWS annotations might be integrated in the future
       – Most users are not familiar with SWS
       – Difficult to integrate within the search (diverse frameworks and variants)
       – May hamper performance & usability
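     The slides do not show the underlying data model. As a minimal sketch, assuming Python with rdflib, a tag/category/description annotation kept in a lightweight RDF graph could look like the following; the seekda vocabulary namespace, property names and service URI are invented for illustration, not seekda's actual schema:

        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import RDF, RDFS, DCTERMS

        # Hypothetical lightweight vocabulary; the slides only say
        # "lightweight RDF ontologies", so these names are made up.
        SEEKDA = Namespace("http://example.org/seekda/vocab#")

        g = Graph()
        g.bind("seekda", SEEKDA)

        svc = URIRef("http://example.org/services/weather-forecast")
        g.add((svc, RDF.type, SEEKDA.WebService))
        g.add((svc, RDFS.label, Literal("Weather forecast API")))
        # The three annotation kinds the slide keeps: description, tags, category
        g.add((svc, DCTERMS.description,
               Literal("Returns a 7-day weather forecast for a given city.")))
        g.add((svc, SEEKDA.tag, Literal("weather")))
        g.add((svc, SEEKDA.tag, Literal("forecast")))
        g.add((svc, SEEKDA.category, Literal("Weather")))

        print(g.serialize(format="turtle"))

     Keeping annotations in plain RDF like this is what lets the portal exploit them for search without requiring users to touch a full SWS framework.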
  5. Prototype Creation

  6. Design Recommendations
  7. Participatory Design
     • Involve (end) users in prototyping
     • Users = experts of use/practice
     • Needs assessment -> requirements analysis
     • Exploit users' expertise and creativity in design processes
     • Integrate evaluation into design processes
     • Repeated prototyping cycles
  8. Online Participatory Design
     • Seekda's users are
       – anonymous
       – distributed worldwide
     • Online communication via the website
     • Creating opportunities for online participation
       – Establish an appropriate OPD process design
       – Develop an adequate OPD tool (= dashboard)
  9. Stakeholders' Benefits
     Users' benefit:
     • Design follows users' needs
     • Implement own ideas
     • Insights into technology and development
     Seekda's benefit:
     • Getting direct input from users/customers
     • Focussing on central user requirements
     • Getting to know users/customers
  10. OPD Stakeholders / Process Roles
      • Project owner
        – Initiation, management, coordination
        – Facilitation
      • Research/observer
        – Expert as neutral consultant
      • Technical committee
        – Developers/designers and users
        – Process decisions
      • User committee
  11. OPD Workshop – Procedure
      • In general:
        1. Technical committee chooses a number of features out of the forum discussions
        2. Features are opened for user voting (a minimal tally sketch follows below)
        3. Feature selection, implementation
        4. Collect user feedback
      • Duration per cycle: 6 weeks
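      The slides do not specify the voting rule; a minimal sketch of the shortlist-and-select step, assuming one vote per user (function and feature names are illustrative):

        from collections import Counter

        def run_voting_round(ballots, shortlist_size=5, implement_count=2):
            """Tally user ballots over the committee's feature shortlist
            and pick the features to implement. `ballots` maps a user id
            to the feature that user voted for."""
            tally = Counter(ballots.values())
            shortlist = [feature for feature, _ in tally.most_common(shortlist_size)]
            return shortlist[:implement_count]

        # Example: three users voting over committee-chosen features
        print(run_voting_round({"u1": "better search", "u2": "tag cloud",
                                "u3": "better search"}))

      The defaults mirror the first cycle described below: shortlist the 5 most important features, then implement the 2 most popular.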
  12. OPD Workshop – Cycles
      • 1st cycle
        – Beginning of November: start of the workshop (dashboard, technical committee, introduction)
        – 2 weeks later: identification of the 5 most important features/wishes
        – 1 week later: selection of the 2 most popular features
        – Beginning of December: short tasks for users
        – Mid-December: end of cycle, feedback analysis
        – OPD dashboard improvements
      • 2nd cycle
        – Execution dates: January–March
        – Goals for this cycle
          • Increase motivation
          • Increase activity of participants
          • Focus more on usability/design and incentives
        – Changes
          • Tasks first
          • Split into smaller, sequential parts
          • Explained through screencasts
          • Example: go to the portal, search for xyz, identify redundant elements, most important, …
        – OPD dashboard improvements
  13. PD of the Dashboard
      On-site PD workshop: requirements for the PD dashboard

  14. OPD Description
      Process description for participants + video instruction
  15. OPD Announcement

  16. OPD Introduction
  17. OPD Dashboard (2nd Cycle)
      Improvements: awareness feature and weekly tasks for participants

  18. Feature Selection and Voting
  19. OPD Workshop – Results
      • Numbers
        – ~250 votes
        – ~160 forum posts
        – 15–20 active users
      • User background
        – Web services experts
        – Developers
        – Random visitors
      • Feedback/implementation
        – 18 suggested features
        – 6 concrete features implemented (ongoing)
        – Several usability/design improvements implemented
      • Conclusions & next steps (ongoing)
        – Introduce challenge procedures
        – Ask specifically about guided processes (wizards)
        – Make the OPD workshop accessible directly from the platform
  20. Evaluation (I)
      • Six interviews (~60 min) with participants
        – Experiences
        – General evaluation
        – Critique, improvements
      • Limitations of written communication -> multimedia
      • Performance problems
      • Positive: video instruction
      • Improvement: awareness features/notifications
  21. Evaluation (II)
      • Central features and usability have been improved
      • High-quality feedback from users
      • Improved planning of features/implementation based on early discussion with users
      • (Perceived) assistance/support for developers/designers
      • "Yeah, I think it succeeded. We got a lot of contribution from people […] and it showed this kind of workshop can work. This kind of methods."
  22. Challenge 1: Amazon Mechanical Turk
      Goal: initial annotations for new and undescribed APIs
      Tasks available:
      • Confirm that the document is related to a Web API (yes/no)
      • Provide/improve the description
      • Provide and confirm (bootstrapped) tags/concepts
      • Provide and confirm (bootstrapped) service categories
      • Rate document quality
      Parameters (see the HIT-creation sketch below):
      • Qualification test
      • Min. approval rate per worker
      • Approx. time needed per task
      • Reward
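      The slides list only the parameters, not code, and the tutorial predates today's SDKs. Purely as an illustration, a comparable HIT could be published with Python's boto3 MTurk client as follows; the title, reward, wizard URL and qualification threshold are placeholders, not seekda's actual settings:

        import boto3

        # Sandbox endpoint so experiments cost nothing; drop endpoint_url
        # to publish to the live marketplace.
        mturk = boto3.client(
            "mturk",
            region_name="us-east-1",
            endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
        )

        # ExternalQuestion pointing at a (hypothetical) hosted annotation wizard.
        question_xml = """
        <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
          <ExternalURL>https://example.org/annotation-wizard?api=weather-forecast</ExternalURL>
          <FrameHeight>600</FrameHeight>
        </ExternalQuestion>
        """

        hit = mturk.create_hit(
            Title="Describe and tag a Web API",
            Description="Confirm the page is about a Web API, then add a description, tags and a category.",
            Keywords="web api, annotation, tagging",
            Reward="0.10",                        # Phase 1 reward from the slides
            MaxAssignments=1,
            AssignmentDurationInSeconds=15 * 60,  # approx. time per task
            LifetimeInSeconds=7 * 24 * 3600,
            Question=question_xml,
            QualificationRequirements=[{
                # System qualification: worker's percentage of approved assignments
                "QualificationTypeId": "000000000000000000L0",
                "Comparator": "GreaterThanOrEqualTo",
                "IntegerValues": [95],
            }],
        )
        print(hit["HIT"]["HITId"])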
  23.–26. [Screenshot slides without transcript text]
  27. Challenge 1: MTurk Simple Annotation Wizard
      Phase 1
      1. Setup
         • Initial set of 70 Web APIs
         • No qualification needed
         • Approx. time needed: 15 minutes
         • Reward: $0.10
         • Description + screencast (walkthrough)
      2. Manual evaluation (seekda)
         • Main focus on the description and the yes/no question
         • Determine whether qualification is needed for workers
         • Determine whether the wizard is understandable (usability)
         • Determine whether review tasks are needed
  28. Challenge 1: MTurk Phase 1 Results
      Total: 70 API documents, 23 distinct workers
      Initial question (is the document about a Web API? yes/no):
      • 49 documents correctly annotated (70%)
        – 21 as "yes, this document is about a Web API"
        – 28 as "no, this document is not related to a Web API"
      Description, category, tags:
      • ~15 submissions including all answers (description, category, tags)
        – 4 very good and extensive submissions
        – 8 complete and usable submissions
      Phase 1 problems:
      • Spam (10%–15%)
      • Only a few workers added a category and a description
      • Most workers did not add tags
  29. MTurk: Phase 2 Changes
      • Completion time decreased to 10 minutes
      • Reward increased to $0.20
      • Key questions are mandatory (description, tags, category)
      • Stricter evaluation criteria (see the validation sketch below)
        – e.g. at least 1 category, 2 tags and a meaningful description have to be provided
      • Submitted a batch of 100
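      A minimal sketch of how the stated acceptance criteria could be checked automatically; the submission fields and the word-count proxy for "meaningful" are assumptions, not seekda's actual review code:

        def is_acceptable(submission: dict) -> bool:
            """Accept a submission only if it has at least 1 category,
            2 tags, and a non-trivial free-text description."""
            description = submission.get("description", "").strip()
            return (
                len(submission.get("categories", [])) >= 1
                and len(submission.get("tags", [])) >= 2
                # "meaningful" approximated as a minimum word count
                and len(description.split()) >= 5
            )

        # Example: rejected, because it carries only a single tag
        print(is_acceptable({"categories": ["Weather"], "tags": ["api"],
                             "description": "Returns a 7-day forecast for a city."}))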
  30. MTurk: Phase 2 Results
      • 27 workers (only 1 from the previous batch!)
      • Completion times
        – Min: 10 sec
        – Max: 9 min
      • 10 wrong classifications
        – 5 of them are web pages with high-quality annotations
      • For correct classifications:
        – Mostly annotated with 2 tags
        – Top-level category identified accurately in most cases
        – Mostly meaningful descriptions
        – Over 80% are accurate/satisfying
  31. MTurk: Phase 2 Results
      • Large number of confirmed Web APIs
        – Fed back to the crawler/analysis framework
        – Improves the initial index quality
      • Large number of categorised/tagged services/APIs
        – Fed back to the bootstrapping service
        – Improved search/navigation
      • Detailed service descriptions for many services at once
        – Improved user experience/satisfaction
        – Attract/motivate new visitors to participate
  32. Challenge 2: Mashups
      • Overall goal
        – Create and add mashup(s) using services/Web APIs listed on the seekda portal
        – Annotate the used services and APIs
      • Timeline
        – Duration: 4 weeks
      • Introduction of the task through:
        – Step-by-step guidelines
        – Set of rules
        – Example walkthrough
      • Reward
        – Gadget (Samsung Galaxy S)
  33. Challenge 3: Long-Term Competition
      • Provide annotations – become a top contributor
      • Collect points (a minimal scoring sketch follows below)
        – Changes and/or improvements to annotations
        – New annotations
        – Weighting according to annotation type
      • Rank contributors
      • Reputation is the main award
      • Allow users to donate money for good annotations
        – Donated money will be awarded to the top annotators
      • The more and better the annotations provided…
        – …the higher the reputation
        – …the higher the financial incentive
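      The slides leave the weighting unspecified; a minimal scoring and ranking sketch in which the per-type weights and record fields are invented for illustration:

        # Illustrative weights: the slides say only that points are
        # weighted by annotation type, not what the weights are.
        TYPE_WEIGHTS = {"new_tag": 1, "new_category": 2,
                        "new_description": 5, "improvement": 1}

        def score(annotations):
            """Sum weighted points over a contributor's annotations."""
            return sum(TYPE_WEIGHTS.get(a["type"], 0) for a in annotations)

        def leaderboard(contributions):
            """Rank contributors by total points; `contributions` maps
            a user id to that user's list of annotation records."""
            return sorted(contributions,
                          key=lambda user: score(contributions[user]),
                          reverse=True)

        ranked = leaderboard({
            "alice": [{"type": "new_description"}, {"type": "new_tag"}],
            "bob": [{"type": "improvement"}, {"type": "new_tag"}],
        })
        print(ranked)  # ['alice', 'bob']

      Donations could then be split over the top of this ranking, tying the financial incentive directly to reputation.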
  34. Conclusion
      • Devising motivation methods for annotating Web services is challenging
      • Different possibilities were/are being explored through challenges
        – Mechanical Turk
        – Mashups challenge
        – Long-term competition
      • Users were closely kept in the development loop through OPD
        – Ensures that implemented features are usable
        – Keeps users engaged in a "community"-like way
  35. Questions & Annotations
      Thank you!
      www.insemtives.eu
