
Reviews and awards


Workshop "Weaving Relations of Trust in Crowd Work: Transparency and Reputation across Platforms" co-located with ACM Web Science Conference 2016.




  1. Weaving Relations of Trust in Crowd Work: Transparency and Reputation across Platforms (Reviews and Awards)
  2. Can we crowdsource paper reviews?
  3. We did ● Ask 3 CrowdFlower contributors to read your papers and provide a textual review + recommendation in [-3, 3], and give them $2 ● Ask 3 PC members to read your papers and provide a textual review + recommendation in [-3, 3], and thank them ● Not include quality checks (hence some short reviews) ● Decide paper acceptance informed solely by PC member feedback
  4. Disclaimer ● Very few data points ● No statistical significance! ● Crowd members know the domain ● See the NIPS Experiment: disagreement over 25.9% of papers
  5. Results [chart annotations: True Positives, Best Paper!] ● Pearson correlation: 0.37 ● Disagreement over 22% of papers
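The Results slide compares crowd and PC recommendations via a Pearson correlation (0.37) and a disagreement rate (22% of papers). A minimal sketch of how such numbers can be computed; the per-paper scores below are made up for illustration (the workshop's actual data is not included here), and the accept/reject threshold at 0 is an assumption:

```python
# Hypothetical mean recommendation scores in [-3, 3], one entry per paper.
crowd_means = [2.0, -1.0, 1.5, 0.5, -2.0, 1.0, -0.5, 2.5]
pc_means = [1.5, 0.5, 2.0, -1.0, -1.5, 1.0, -1.0, 3.0]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def disagreement_rate(xs, ys):
    """Fraction of papers where the two groups' mean recommendations
    point in opposite directions (accept vs. reject at threshold 0)."""
    flips = sum(1 for x, y in zip(xs, ys) if (x >= 0) != (y >= 0))
    return flips / len(xs)

print(f"Pearson r: {pearson(crowd_means, pc_means):.2f}")
print(f"Disagreement: {disagreement_rate(crowd_means, pc_means):.0%}")
```

With real review data, each entry would be the mean of the three reviewers' recommendations for one paper; the disagreement definition (sign flip of the mean) is one plausible reading of the slide's statistic.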
  6. Example Feedback from the Crowd ● Positive: ○ “Some of the stats were pretty interesting. I'm actually going to look up some the papers they got their information from. I enjoyed it.” ○ “[...] is just perfect.” ● Recommendations: ○ “Maybe some more explicit examples might be needed on the paper in order to make the point clearer.” ○ “Unfortunately for me the article a lot of obscure technical terms.”
  7. Example Feedback from the Crowd ● Critical: ○ “... also without offending anyone tell them that it is extremely unfair discrimination made the authors as the best tasks are carried up US, ENGLAND, AUSTRALIA and the worst and the less you pay goes to the underdeveloped countries like mine [Argentina] that must be solved is immoral and unjust” ○ “An academic workshop on crowdsourcing should have papers talking about crowdsourcing all over the world and not focused only in one ethnic group, in my opinion that won't be positive.”
  8. Best Paper Award
  9. Best Paper ● Ranked top by both crowd workers and peer reviewers ● Workers said: ○ “[we] must devise novel ways to support trustworthy actions” ○ “Very correct article and the questions raised.” ○ “The article outlines the basic and necessary questions, it's good!” ● PC members said: ○ “it would produce some interesting discussions not only within the TRUSTINCW workshop, but within the greater community as well.”
  10. The best paper award goes to…
  11. On the Improvement of Quality and Reliability of Trust Cues in Micro-task Crowdsourcing (Position paper), Jie Yang and Alessandro Bozzon
  12. ● Workshop web site ● Workshop Twitter hashtag: #trustincrowdwork ● Workshop Twitter account ● Conference Twitter hashtag: #websci16
  13. Agenda
      9:00 - 9:15    Hello! session
      9:15 - 10:30   Invited talk
      10:30 - 10:45  Coffee break
      10:45 - 11:30  Paper presentations (part I: studies on workers)
      11:30 - 12:00  Crowd Statement Marathon
      12:00 - 13:00  Lunch
      13:00 - 14:00  Paper presentations (part II: platforms)
      14:00 - 15:00  Panel & discussions
      15:00 - 15:15  Coffee break
      15:15 - 16:00  Crowdsourced reviews & awards
      16:00 - 16:30  Closing session