
Hands-on 2 (6/6)

ISWC 2010: TUTORIAL: Ten Ways to Make your Semantic App Addictive


  1. Hands-on experiences with incentives and mechanism design
     Roberta Cuel, University of Trento, IT; Markus Rohde, University of Siegen, DE; Germán Toro del Valle, Telefonica I+D, ES
     ISWC 2010, www.insemtives.eu
  2. How to design effective incentives/rules
     1. Analyze the domain
        • What? The working environment, job descriptions, and the organization (tasks, hierarchy, compensation, social relations, communication)
        • How? Qualitative face-to-face interviews and questionnaires; observations with selected individuals; quantitative analysis (data collection)
     1/30/2015 www.insemtives.eu
  3. How to design effective incentives/rules (2)
     2. Identify the preferences and motivations that drive users: concentrate on the everyday uses of those specific users
     3. Formalize the existing reward system: find yourself in the matrix
     4. Design the simplest possible solution that can effectively support those uses: translate it into a small number of alternative, testable hypotheses
     5. Fine-tune the reward system
  4. Fine-tuning incentives with mechanism design: a step-by-step procedure
     • Mimic the situation in the lab: set up the experiment as close to the real-life situation as possible
     • Run the experiment with volunteer subjects, allocating subjects at random
     • Test alternative hypotheses about the effect of incentive schemes on behavior
     • Check differences in outcome: if happy, go to the next slide; otherwise, re-design the hypothesis and run a new trial
  5. Fine-tuning, part II
     • Start adding realism components:
       – Move to real subjects (field test)
       – Move to real tasks (with real subjects)
       – Move to real subjects handling real tasks
       – Move to a real situation (field experiment)
     • During the process you:
       – Lose control over the ability to manipulate variables
       – Gain awareness of the interaction between variables
     • Let’s look at what we are doing with case studies!
  6. Telefonica I+D case study
     • Corporate portal
     • What is the most obvious incentive from an economic point of view?
     • What can we do with a small budget dedicated to incentivizing users?
     • How do we know which system is best for our setting?
  7. Basic experiment: testing two reward/incentive systems
     • Pay per tag: €0.03 per tag added (up to a €3 maximum)
     • Winner-takes-all: the person who adds the highest number of tags/annotations wins €20
     (Participation fee: €5)
     Which would you choose?
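The two payout rules above can be sketched in a few lines of Python. This is an illustrative sketch: the function and parameter names are my own, and the nominal €0.03-per-tag rule shown here ignores the €0.50 rounding visible in the raw-data slide.

```python
def pay_per_tag(n_tags, rate=0.03, cap=3.0, show_up_fee=5.0):
    """Pay-per-tag scheme: EUR 0.03 per tag, capped at EUR 3, plus the EUR 5 participation fee."""
    return show_up_fee + min(n_tags * rate, cap)

def winner_takes_all(tags_by_subject, prize=20.0, show_up_fee=5.0):
    """Winner-takes-all scheme: every subject gets the participation fee;
    the subject with the most tags also wins the EUR 20 prize."""
    winner = max(tags_by_subject, key=tags_by_subject.get)
    return {sid: show_up_fee + (prize if sid == winner else 0.0)
            for sid in tags_by_subject}
```

For example, a subject with 48 tags earns about €6.44 under pay per tag, while under winner-takes-all everyone but the top tagger takes home only the €5 fee.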
  8. The experiment (setting)
     • 36 students, randomly assigned to the two “treatments”
     • Individual task: annotation of images
     • Clear set of instructions
     • Guided training session to give a basic understanding of the annotation tool
     • Clocked 8-minute session (time pressure)
     • Goal: produce the maximum number of tags in the allotted time on a random set of images
  9. The lab
  10. The experiment: screenshots
  11.–14. (screenshots of the annotation tool; no slide text)
  15. Pay per tag                        Winner takes all
      ID   N. tags  Reward  Tot. €      ID   N. tags  Reward  Tot. €
       2     48      1.5     6.5         2     86       0       5
       3     41      1.5     6.5         3     40       0       5
       4     78      2.5     7.5         4     68       0       5
       5     54      2.0     7.0         5     88       0       5
       6     51      2.0     7.0         6     87       0       5
       7     36      1.5     6.5         7     65       0       5
       8     44      1.5     6.5         8     67       0       5
       9     54      2.0     7.0         9     31       0       5
      10     64      2.0     7.0        10     62       0       5
      11     41      1.5     6.5        11     79       0       5
      12     63      2.0     7.0        12     45       0       5
      13     58      2.0     7.0        13     96      20      25
      14     60      2.0     7.0        14     51       0       5
      15     25      1.0     6.0        15     68       0       5
      16     43      1.5     6.5        17     73       0       5
      17     50      1.5     6.5        18     26       0       5
      18     24      1.0     6.0        19     35       0       5
      19     30      1.0     6.0
      20     37      1.5     6.5
      tot   901     31.5   126.5        tot  1067      20     105
      avg  47.42    1.66    6.66        avg  62.76    1.18    6.18
  16. Number of tags
      Pay per tag (N=19):
      – Total tags: 901
      – Max tags: 78
      – Avg. tags: 47.42
      – Avg. € per person: 6.66
      – Avg. € per tag: 0.1404
      – Total: €126.50 (€31.50 flexible compensation)
      Winner takes all (N=17):
      – Total tags: 1067
      – Max tags: 96
      – Avg. tags: 62.76 (a 32% increase!)
      – Avg. € per person: 6.18
      – Avg. € per tag: 0.0984
      – Total: €105 (€20 flexible compensation)
  17. The results: the t-test and F-test are significant
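The slides do not say which t-test variant was run, so as an illustration here is a minimal sketch using Welch's two-sample t statistic (which does not assume equal variances) on the per-subject tag counts copied from the raw-data table on slide 15; the means and the roughly 32% increase match the figures reported on slide 16.

```python
import statistics as st
from math import sqrt

# Per-subject tag counts, copied from the raw-data table (slide 15)
pay_per_tag_counts = [48, 41, 78, 54, 51, 36, 44, 54, 64, 41,
                      63, 58, 60, 25, 43, 50, 24, 30, 37]   # N=19
winner_all_counts = [86, 40, 68, 88, 87, 65, 67, 31, 62, 79,
                     45, 96, 51, 68, 73, 26, 35]            # N=17

def welch_t(a, b):
    """Welch's two-sample t statistic (robust to unequal group variances)."""
    return (st.mean(b) - st.mean(a)) / sqrt(st.variance(a) / len(a)
                                            + st.variance(b) / len(b))

t = welch_t(pay_per_tag_counts, winner_all_counts)
increase = st.mean(winner_all_counts) / st.mean(pay_per_tag_counts) - 1
```

With these data the statistic comes out well above 2, consistent with the slide's claim of significance at conventional levels.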
  18. Tag distribution: interface matters!
      Pay per tag:
      • “nature”: 24 times
      • “snow”: 22 times
      • “green”: 20 times
      • …
      • 134 tags repeated only twice
      • 437 unique tags
      Winner takes all:
      • “green”: 18 times
      • “snow”: 14 times
      • “butterfly”: 13 times
      • …
      • 118 tags repeated only twice
      • 390 unique tags
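Frequency counts like the ones on this slide are a one-liner with `collections.Counter`. The tag stream below is purely hypothetical (the slides report only aggregate frequencies, not the raw tag lists):

```python
from collections import Counter

# Hypothetical tag stream for illustration; not the experiment's real data
tags = ["nature", "snow", "green", "nature", "snow", "butterfly", "nature"]

counts = Counter(tags)
top = counts.most_common(2)                            # most repeated tags first
unique = sorted(t for t, c in counts.items() if c == 1)  # tags seen exactly once
```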
  19. Some biases
      • The students are:
        – Volunteers who are used to participating in experiments
        – Heavy web users and game players
        – Paid to show up
      • Quality of the tags: tagging quality was controlled for; no obvious “mistakes” or “cheating”
  20. Summary of results & next lab steps
      • Basic hypothesis confirmed
      • More work needed:
        – Effort is directed at producing a good (tags) that users do not themselves consume, i.e., do not use to achieve other goals; change the structure of the game so that users can exploit tagging to achieve results (treasure hunt!)
        – Re-run the experiment with the new structure: users now produce tags both to earn money and to use them to perform further tasks
  21. Next steps: Telefonica I+D
      • Replicate the experiment with real users:
        – Main change 1: the task becomes relevant in terms of practical usefulness for users
        – Main change 2: the task has social implications
        – Main change 3: expectations change dramatically (workers vs. students: €5 to participate???)
      • Add realism, mimicking the social structure of the company:
        – Run the experiment with teammates
        – Use real tasks
        – Try alternative pay-for-performance schemes
