UAB 2011 - Seekda Webservices Portal

Published in: Technology, Education

seekda Web Services Portal
Seekda, University of Trento, University of Siegen
Overview

• Web Services Portal
  – Crawls for and indexes Web Services on the Web
  – Currently more than 28,500 indexed and monitored
• Problems
  – Found services are not annotated
  – For many, descriptions are not available
  – Limited search results and possibilities
  – Web APIs need to be confirmed by users
• Target
  – Obtain more annotations by involving users in the annotation process
  – Validate existing annotations, if any
  – "Catch Them & Keep Them"

4/14/11 www.insemtives.eu
Design Decisions

• Different annotation methods exist
  – Keywords/tags
  – Categories
  – Natural language descriptions
  – Lightweight/fully-fledged semantic web service descriptions (e.g. WSMO-Lite, OWL-S, etc.)
• Annotate the K.I.S. way
  – Avoid complicated and demanding annotations (limit to tags, categories and NL descriptions)
• Use lightweight RDF ontologies in the background (e.g. to ease search)
• SWS annotations might be integrated in the future
  – Most users are not familiar with SWS
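The lightweight annotation model described above (tags, a category, and a natural-language description per service) can be sketched as follows; the class and field names are illustrative assumptions, not the portal's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class ServiceAnnotation:
    """Lightweight annotation for one indexed Web Service (hypothetical model)."""
    service_url: str
    tags: set[str] = field(default_factory=set)
    category: str = ""
    description: str = ""

def search(annotations, keyword):
    """Naive keyword search over tags, category, and description."""
    kw = keyword.lower()
    return [a for a in annotations
            if kw in a.category.lower()
            or kw in a.description.lower()
            or any(kw in t.lower() for t in a.tags)]

# Example: two annotated services, found via tag or category keywords
anns = [
    ServiceAnnotation("http://example.org/weather", {"weather", "forecast"},
                      "Meteorology", "Returns a 5-day forecast."),
    ServiceAnnotation("http://example.org/geo", {"geocoding"},
                      "Location", "Resolves addresses to coordinates."),
]
hits = search(anns, "forecast")
```

Even this flat structure is enough to improve search over unannotated services; the RDF ontologies mentioned above would sit behind such a model to relate tags and categories.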
Component Dependency
Phase 1 and 2: Field and domain analysis

• Domain analysis
  – Site visits; semi-structured, qualitative interviews
    • Communication processes
    • Existing usage practices and problems
    • Semantic annotation solutions
  – Tape recording, transcription
  – Data analysis via ex-post categorization
• Focus group discussions
• Usability lab tests
• Expert walkthroughs
Phase 3: Prototype Creation
Phase 4: Prototyping and design analysis

• Design recommendations
  – Usability Design
  – Sociability Design
  – Emotional Design
  – Value Sensitive Design
• OPD Workshop
• Challenges
Phase 4: Design Recommendations
OPD Workshop - Objectives

Classical PD:
• User-centric, cyclic development approach: users provide tips and feedback for/on the prototype itself.
• Problem in the seekda case:
  – Workshops cannot be conducted because users are potentially distributed all over the world.

OPD as a solution for distributed target groups:
OPD Workshop - Procedure

• In general:
  1. Technical committee chooses a number of features out of forum discussions
  2. Features open for user voting
  3. Feature selection, implementation
  4. Collect user feedback
• Duration per cycle: 4-6 weeks
OPD Workshop - Cycles

• 1st Cycle
  – Beginning of November: start of workshop (dashboard, technical committee, introduction)
  – 2 weeks later: identification of the 5 most important features/wishes
  – 1 week later: selection of the 2 most popular features
  – Beginning of December: short tasks for users
  – Mid-December: end of cycle, feedback analysis
• 2nd Cycle
  – Execution dates: January - March
  – Goals for this cycle
    • Increase motivation
    • Increase activity of participants
    • Focus more on usability/design and incentives
  – Changes
    • Tasks first
    • Split into smaller parts, run sequentially
    • Explained through screencasts
    • Example: go to the portal, search for xyz, identify redundant elements, most important, …
  – OPD Dashboard improvements
OPD Workshop - Results

• Numbers
  – ~250 votes
  – ~160 forum posts
  – 15-20 active users
• User background
  – Web Services experts
  – Developers
  – Random visitors
• Feedback/Implementation
  – 18 suggested features
  – 6 concrete features implemented (ongoing)
  – Several implemented usability/design improvements
• Conclusions & next steps (ongoing)
  – Introduce challenge procedures
  – Ask specifically about guided processes (wizards)
  – Integrate the OPD workshop directly into the platform
Challenge 1: Amazon Mechanical Turk

Goal: Initial annotations for new and undescribed APIs

Tasks available:
• Confirm document is related to a Web API (yes/no)
• Provide/improve description
• Provide and confirm (bootstrapped) tags/concepts
• Provide and confirm (bootstrapped) service categories
• Rate document quality

Parameters:
• Qualification test
• Min. approval rate per worker
• Approx. time needed per task
• Reward
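The task parameters above map naturally onto a task (HIT) configuration; a minimal sketch follows, where the keys mirror the slide's parameter list rather than the real Mechanical Turk API field names, and the approval-rate threshold is an assumption:

```python
# Illustrative HIT configuration for the annotation wizard; key names mirror
# the slide's parameters, NOT the actual MTurk API.
hit_config = {
    "title": "Confirm and annotate a Web API document",
    "qualification_test": None,       # Phase 1 required no qualification
    "min_approval_rate": 0.95,        # assumed threshold, not from the slides
    "estimated_minutes": 15,
    "reward_usd": 0.10,
    "assignments_per_document": 1,    # Phase 2 raised this for majority voting
}

def estimated_cost(config, n_documents):
    """Total payout if every assignment is submitted and approved."""
    return config["reward_usd"] * config["assignments_per_document"] * n_documents

cost = estimated_cost(hit_config, 70)  # Phase 1 batch of 70 Web APIs
```

Budgeting this way makes the reward/assignments trade-off explicit before a batch is published.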
Challenge 1: Amazon Mechanical Turk - Simple Annotation Wizard

Phase 1
1. Setup
   • Initial set of 70 Web APIs
   • No qualification needed
   • Approx. time needed: 15 minutes
   • Reward: $0.10
   • Description + screencast (walkthrough)
2. Manual evaluation (seekda)
   • Main focus on description and yes/no question
   • Determine whether qualification is needed for workers
   • Determine whether the wizard is understandable (usability)
   • Determine whether review tasks are needed
Challenge 1: Amazon Mechanical Turk - Phase 1 Results

Total: 70 API documents, 23 distinct workers

Initial question (Is this document about a Web API? Yes/No):
• 49 documents correctly annotated (70%)
  – 21 as "Yes, this document is about a Web API"
  – 28 as "No, this document is not related to a Web API"

Description, category, tags:
• ~15 submissions including all answers (description, category, tags)
  – 4 very good and extensive submissions
  – 8 complete and usable submissions
Challenge 1: Amazon Mechanical Turk - Phase 1 Results (cont.)

Top workers:
• 15 tasks finished (2 very good/complete, 2 complete)
• 10 tasks finished (1 very good, 2 complete)
• 5 tasks finished (2 complete)

Average time needed: 2.5 minutes (min: 9 seconds, max: 13 minutes)

Phase 1 problems:
• Spam (10-15%)
• Only a few workers added a category and description
• Most workers did not add tags
Challenge 1: Amazon Mechanical Turk - Conclusions

• Answers have to be mandatory
  – E.g. "You have to specify/select at least ..."
• Estimated time for completion can be decreased
  – → Increase reward
• Clearly point out in the description what exactly needs to be achieved
  – "In order to be approved, your submission needs to contain at least ..."
  – 2-3 appropriate tags, a short but meaningful description, etc.
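The "answers have to be mandatory" conclusion above amounts to a server-side check that rejects a submission unless it meets the stated minimums (2-3 appropriate tags, a short but meaningful description); a sketch follows, with illustrative thresholds:

```python
MIN_TAGS = 2           # "2-3 appropriate tags"
MIN_DESC_WORDS = 5     # "short but meaningful"; the exact threshold is an assumption

def validate_submission(is_web_api, description, tags, category):
    """Return the reasons a submission would be rejected (empty list = accept)."""
    problems = []
    if is_web_api is None:
        problems.append("yes/no question not answered")
    if len(tags) < MIN_TAGS:
        problems.append(f"at least {MIN_TAGS} tags required")
    if len(description.split()) < MIN_DESC_WORDS:
        problems.append("description too short to be meaningful")
    if not category:
        problems.append("category missing")
    return problems

# A complete submission passes; an empty one collects every rejection reason
ok = validate_submission(True, "Returns current weather for a given city.",
                         ["weather", "rest"], "Meteorology")
bad = validate_submission(None, "nice", [], "")
```

Returning all reasons at once, rather than the first one, lets the wizard show workers exactly what is still missing.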
Challenge 1: Amazon Mechanical Turk - Phase 2

• New set of 100 Web APIs
• No qualification needed
• Approx. time needed: 10 minutes (5 minutes less)
• Reward: $0.20 (+$0.10)
• Improved description and explicit goals
• All questions are mandatory
• Multiple workers per task (majority vote)

Evaluation: automatic
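The automatic evaluation via majority vote across multiple workers can be sketched as a small aggregation over per-document answers; treating ties as "escalate to a human reviewer" is an assumption, not something the slides specify:

```python
from collections import Counter

def majority_vote(answers):
    """Aggregate one document's worker answers; None signals a tie (manual review)."""
    counts = Counter(answers)
    (top, n_top), *rest = counts.most_common()
    if rest and rest[0][1] == n_top:   # the two best answers are tied
        return None
    return top

# Three workers judged whether each document describes a Web API
by_document = {
    "doc-1": ["yes", "yes", "no"],
    "doc-2": ["no", "no", "no"],
    "doc-3": ["yes", "no"],            # tie -> escalate to a human reviewer
}
decisions = {doc: majority_vote(a) for doc, a in by_document.items()}
```

With an odd number of assignments per document, yes/no questions never tie, which is one reason to prefer three workers over two.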
Challenge 1: Amazon Mechanical Turk - Extensions

• Additionally split the wizard into atomic tasks (e.g. description only)
• Iterative tasks
  – Use an existing description as a seed
  – Read descriptions; if you think a description is appropriate, click "Vote"
  – If not appropriate, submit a better description and get a chance to win a $xx bonus
• Pre-selection of documents that need manual (seekda) approval
• Spam prevention
Challenge 1: Amazon Mechanical Turk - Benefits

• Evaluation of usability/design
• Large number of confirmed Web APIs
  – Feeds back to the crawler/analysis framework
  – Improves the initial index quality
• Large number of categorised/tagged services/APIs
  – Feeds back to the bootstrapping service
  – Improved search/navigation
• Detailed service descriptions for many services at once
  – Improved user experience and satisfaction
  – Attracts/motivates new visitors to participate
Challenge 2: Mashups

• Overall goal
  – Create and add mashup(s) using services / Web APIs listed on the seekda portal
  – Annotate the used services and APIs
• Timeline
  – Duration: 4 weeks
• Introduction of the task through:
  – Step-by-step guidelines
  – Set of rules
  – Example walkthrough
• Reward
  – Gadget (Samsung Galaxy S)
Challenge 2: Mashups - Procedure

• Step 1: Registration
• Step 2: Brainstorming
• Step 3: Implementation
• Step 4: Finalizing
• Step 5: Add services and APIs (optional)
• Step 6: Annotate services and APIs
Challenge 2: Mashups - Winner Selection

• Challenge lasts 4 weeks
• Portal users vote for mashups (1 week)
• Winner selection is done by seekda engineers and a group of external experts
• Criteria for evaluation:
  – User voting
  – Overall quality
  – Originality and usefulness
  – Technical details/implementation
  – Annotations of used services/APIs
• Presentation of the top 3 submissions
Challenge 3: Long-Term Competition

• Provide annotations and become a top contributor
• Collect points
  – Changes and/or improvements to annotations
  – New annotations
  – Weighting according to annotation type
• Rank contributors
• Reputation is the main award
• Annotation quality will evolve
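The point scheme above (points per contribution, weighted by annotation type, then a contributor ranking) can be sketched as follows; the weight values are illustrative assumptions, since the talk does not specify them:

```python
# Illustrative per-type weights; the actual weighting was not specified.
WEIGHTS = {"tag": 1, "category": 2, "description": 5, "improvement": 3}

def score(contributions):
    """Sum weighted points over a user's annotation contributions by type."""
    return sum(WEIGHTS[kind] * n for kind, n in contributions.items())

def rank(users):
    """Rank contributors by total points, best first."""
    return sorted(users, key=lambda u: score(users[u]), reverse=True)

users = {
    "alice": {"tag": 10, "description": 2},      # 10*1 + 2*5 = 20 points
    "bob":   {"category": 4, "improvement": 3},  # 4*2 + 3*3 = 17 points
}
leaderboard = rank(users)
```

Weighting descriptions more heavily than tags nudges contributors toward the annotation types that are hardest to obtain, matching the Phase 1 finding that most workers skipped descriptions.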
Challenge 3: Annotations

• Allow users to donate money for good annotations
  → Donated money will be awarded to the top annotators
• The more and better the annotations provided:
  → the higher the reputation
  → the higher the financial incentive
Conclusion

• Devising motivation methods for annotating Web services is challenging
• Different possibilities were/are being explored through challenges
  – Mechanical Turk
  – Mashups Challenge
  – Long-Term Competition
• Users were kept closely in the development loop through OPD
  – Ensures that implemented features are usable
  – Keeps users engaged in a "community"-like way
