
Crowd statement marathon


Workshop "Weaving Relations of Trust in Crowd Work: Transparency and Reputation across Platforms" co-located with ACM Web Science Conference 2016.




  1. Weaving Relations of Trust in Crowd Work: Transparency and Reputation across Platforms
     Crowd Statement Marathon
  2. What do people think about trust in crowd work?
  3. Our call
  4. Results
     ● 14 statements
     ● Topics
       a. Transparency, communication and relationships
       b. Knowing each other, reputation
       c. Misuse, unfairness in crowd work
       d. Who to trust
       e. Using trust measurements
       f. Crowdsourcing in general
  5. Results (II): Communication, transparency and relationships
     ● “Trust in crowdsourcing comes through human relationships even though they are diminished by the technology: the qualities of the exchanges determine the level of trust.” David Martin
     ● “Requesters too often do not trust honest, ethical workers to do quality work.” Rochelle LaPlante
     ● “In order to build and reinforce trust, the final end and intention of the work presented to crowd workers should be shown to them, along with information about the work author and relevant details.” Wilmer Viana
     ● “We need to listen to crowd workers and requesters, who ask openly on the Web about the problems in order to understand them and improve.” Yuan Sun
  6. Results (III): Knowing each other, reputation
     ● “I wish there was a mutual automatic statistics display system built into whatever labor platform, so both requesters and workers could easily see each other's track records (or eliminate requesters/workers below whatever threshold) without having to rely on an outside review site... trust goes both ways, you know?” Laura Nolan
     ● “I am a worker on Mturk. I am always apprehensive when I perform a new HIT. Until I know the requester, I only submit perhaps 2 or 3 HITS and then wait to see if they pay or reject and how long it takes to get paid. I don't like getting screwed by the scammers.” Cheryl Finger
     ● “trust worthy - especially when there is transparency to workers prior experience.” Mike D
  7. Results (IV): Misuse, unfairness in crowd work
     ● “Online work is not profitable in third world countries; there is discrimination. Authors of the works create jobs for countries like the United States, England, Germany; South American countries are left outside. It is grossly unfair discrimination by IP.” ELIAS
     ● “Often when working for academic requesters I feel like there is a lack of trust, so many of them use deception of one sort or another for their research methods.” Kathryn Porto
     ● “I don't think there is trust in crowd work, obviously with the use of scripting and hoarding.” Jeremy Grisham
  8. Results (V): Who to trust
     ● “I trust my fellow workers and trust to help one another to do reliable work.” Amanda Holmes
     Use of trust measurements
     ● “Trust is a metric used to determine the value of the data being collected.” Kevin Dodds
     ● “Trusting the crowd is vital to getting quality results, from initially vetting workers to constantly maintaining communication and QA.” Selene Arrazolo
     Meaning of crowdsourcing
     ● “Overall, crowd work is an important source of income to me; that being said, I worry about the permanence and stability of it.” Mandey Daly
  9. Interview with Saiph Savage, Assistant Professor at West Virginia University
     ● What does “trust” mean for you, in the scenario of paid crowdsourcing?
     ● How would you measure trust in crowd work?
     ● Could you please mention a major problem that hinders relations of trust between crowd workers, requesters and crowdsourcing platforms?
     ● How would you solve the aforementioned problem?
     ● What should, in your opinion, the research community pay more attention to?
     ● Please make a wish (related to trust in crowd work).
     Play video
  10. Thank you!
  11. Do you have a colleague who is becoming interested in crowdsourcing but does not know how to use it effectively and efficiently? Tell them to visit the tutorial “It's Getting Crowded! How to Use Crowdsourcing Effectively for Web Science Research”! Warning: it runs in parallel with our second paper session and the discussion session.
  12. Workshop Web Site
      Workshop Twitter Hashtag: #trustincrowdwork
      Workshop Twitter Account
      Conference Twitter Hashtag: #websci16
  13. Agenda
      9:00 - 9:15   Hello! session
      9:15 - 10:30  Invited talk
      10:30 - 10:45 Coffee break
      10:45 - 11:30 Paper presentations (part I: studies on workers)
      11:30 - 12:00 Crowd Statement Marathon
      12:00 - 13:00 Lunch
      13:00 - 14:00 Paper presentations (part II: platforms)
      14:00 - 15:00 Panel & discussions
      15:00 - 15:15 Coffee break
      15:15 - 16:00 Crowdsourced reviews & awards
      16:00 - 16:30 Closing session