Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem


Microtask crowdsourcing marketplaces encompass a novel learning environment where crowd workers can be thought of as learners, who attempt to learn through their experiences in order to become effective and reap bigger monetary rewards. Prior work has not considered the impact of task clarity, which is a function of the task description and instructions, on worker performance and learning. We hypothesize that unclear tasks tend to have a negative impact on the microtask crowdsourcing ecosystem, by sowing seeds of distrust in the minds of crowd workers. Prior work has shown that workers often complete tasks that they are not comfortable with due to a lack of alternatives. Unclear tasks in such contexts can further demotivate crowd workers, and deteriorate the quality of work produced. However, penalizing the reputation of crowd workers despite a lack of clarity in some tasks is unfair. In this paper, we present results from a study of 100 workers on CrowdFlower, regarding their experience of contributing to piecework in the paid microtask crowdsourcing paradigm. Our findings indicate a definitive presence of unclear tasks in crowdsourcing marketplaces, raising the urgent need for mechanisms to improve task clarity, build trust and foster healthy relationships between requesters and workers.


  1. Crystal Clear or Very Vague? Effects of Task Clarity in the Microtask Crowdsourcing Ecosystem. Ujwal Gadiraju, Web Science 2016, Hannover, Germany, May 22nd, 2016.
  2. Vignettes from Previous Studies (1/2). Sumita from Gujarat, India: "The requester gives the link, but when you open the link it shows that the survey is already closed; but first we accept the task only then we can open the link. [..] Sometimes we don’t get the completion code (at the end of a survey), then we cannot understand, we spent so much time – 1 hour, and wasted our time and we are not getting the code, then how are we going to submit the survey. [..] At that time, we have to return the HIT and write to the requester. Some requesters are very good and give immediate reply – ’sorry this happened, we will see to it’, but some don’t even bother to give us a reply whether it was their fault".
  3. Vignettes from Case Studies (2/2). From an AMT-related forum, general65: "I don’t like it. Another idiot professor who thinks he knows what’s best for the private market. This will only mean the government getting involved and regulating the requester’s which in turn will end up in less pay for us. Someone please tell this idiot professor to stay in the classroom."
  4. Marketplace Dynamics & Reputation. Crowd workers aim to maintain high reputation scores. ● A worker’s good reputation gives access to more tasks.
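The reputation dynamic above can be made concrete with a small sketch. On marketplaces such as AMT, reputation is commonly summarized as an approval rate (approved work divided by submitted work), and requesters often gate tasks on a minimum rate; the 95% threshold below is a hypothetical but typical example, not taken from the slides.

```python
# Illustrative sketch only: reputation as an approval rate, with a
# hypothetical 95% eligibility threshold for reputation-gated tasks.

def approval_rate(approved: int, submitted: int) -> float:
    """Fraction of a worker's submitted assignments that were approved."""
    return approved / submitted if submitted else 1.0

def eligible(approved: int, submitted: int, threshold: float = 0.95) -> bool:
    """A worker sees a gated task only if their approval rate clears the bar."""
    return approval_rate(approved, submitted) >= threshold

# A single rejection can lock a newer worker out of gated tasks:
print(eligible(approved=19, submitted=20))  # 19/20 = 0.95, still eligible
print(eligible(approved=18, submitted=19))  # ~0.947, below the bar
```

This is why, as the later slides discuss, rejections caused by unclear tasks are so damaging: the worker's access to future work shrinks even when the fault lies in the task design.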
  5. Are crowdsourced tasks always CLEAR?
  6. STUDY - Methodology ● Survey deployed on CrowdFlower ● 100 independent workers ● Quality control (highest level workers) ● Attention-check questions ● 3 workers failed to pass at least one attention-check question
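The quality-control step above (dropping workers who fail an attention check) can be sketched as a simple filter; all field names and answers here are hypothetical, for illustration only.

```python
# Minimal sketch of attention-check filtering: discard survey responses
# from workers who fail any attention-check question. Hypothetical fields.

def passed_all_checks(response: dict, attention_checks: dict) -> bool:
    """Keep a response only if every attention-check answer is correct."""
    return all(
        response["answers"].get(question) == expected
        for question, expected in attention_checks.items()
    )

def filter_responses(responses: list, attention_checks: dict) -> list:
    return [r for r in responses if passed_all_checks(r, attention_checks)]

# Example: one of three workers fails the attention check and is dropped.
checks = {"ac1": "strongly agree"}
responses = [
    {"worker_id": "w1", "answers": {"ac1": "strongly agree", "q1": "yes"}},
    {"worker_id": "w2", "answers": {"ac1": "disagree", "q1": "no"}},
    {"worker_id": "w3", "answers": {"ac1": "strongly agree", "q1": "no"}},
]
kept = filter_responses(responses, checks)
print([r["worker_id"] for r in kept])  # ['w1', 'w3']
```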
  7. Experience of Workers ● 36% of workers ⇒ primary source of income ● 32.6% of workers ⇒ crowd work for 3-6 months ● 63.2% of workers ⇒ crowd work for 1-2 years ● 3.2% of workers ⇒ crowd work for 3-5 years ● Remaining workers ⇒ more than 5 years ● 74% completed more than 500 tasks
  8. What factors make tasks unclear?
  9. What factors make tasks unclear? Instructions & Task Description: vague, blank, unclear, inconsistent, imprecise, ambiguous, or poor. Language Used in Description: too many words, high standard of English, broken English, spelling.
  10. Task Clarity & Influence on Worker Performance (1/2) ● 49% of workers claimed up to 30% of tasks were unclear ● 37% of workers claimed that between 31% and 60% of tasks were unclear ● 14% of workers claimed that more than 60% of tasks were unclear
  11. Task Clarity & Influence on Worker Performance (2/2)
  12. How do workers deal with unclear tasks? (1/2) ● 18% of workers used dictionaries or other tools to help them complete 50% of tasks ● 20% of workers used language translators to help them complete over 50% of tasks
  13. How do workers deal with unclear tasks? (2/2)
  14. Conclusions & Future Work ● Workers confront unclear tasks on a regular basis. ● Workers attempt to overcome the difficulties by using external help, such as dictionaries or translators. ● Workers tend to complete unclear tasks despite not understanding the objectives.
  15. WE (workers, requesters, platform) need mechanisms that can ensure task clarity, so that workers are not penalized for suboptimal performance on poorly designed tasks.
  16. Contact Details: gadiraju@l3s.de | Twitter: @UjLaw | http://www.l3s.de/~gadiraju/
  17. More here: https://www.l3s.de/~gadiraju/TrustInCW/
  18. Paid Microtask Crowdsourcing: Crowd Workers, Task Administrators / Requesters
  19. Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work. Qualification Tests to Select a Reliable Crowd ⇒ using a sample or simulating the real task as a qualification test. ● Should requesters pay workers for participating in pre-screening tests? ● Should this pay be proportional to the length & effort required during the pre-screening phase?
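One answer to the proportionality question above can be sketched as a pro-rating rule: pay for the qualification test in proportion to the time it takes relative to the real task. The rule and all numbers below are illustrative assumptions, not a scheme from the slides.

```python
# Hypothetical sketch: pro-rating pre-screening pay by the estimated time
# the qualification test takes relative to the real task it simulates.

def prescreen_pay(task_pay: float, task_minutes: float,
                  prescreen_minutes: float) -> float:
    """Pay for the qualification test, proportional to its length/effort."""
    return round(task_pay * (prescreen_minutes / task_minutes), 2)

# A 2-minute qualification test for a 10-minute, $1.00 task would pay $0.20.
print(prescreen_pay(task_pay=1.00, task_minutes=10, prescreen_minutes=2))
```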
  20. Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work. Impact of Poor Task Design ● Is a requester within her ethical rights to reject work without paying when the task design is poor? ● How can requesters share accountability for poor-quality work? ● Could crowd workers rate the requesters’ task design and clarity of instructions before requesters are allowed to deploy the HITs?
  21. Payment vs. Quality of Work Conundrum: How Requesters Deal With Crowd Work. Reject Work With/Without Paying Workers ● Is a piece of work accepted or rejected? Requesters have complete authority in adjudication. ● Is it fair on the part of workers to accept full pay despite providing sub-optimal work? ● What happens when optimal work is rejected by requesters?
  22. How Workers Deal With Crowd Work. Aspects that Hinder Crowd Workers ● Barriers such as language, technology and poor task design. ● The high variability of task types poses challenges with regard to workers’ familiarity with tasks. ● Where work is rejected unfairly, workers make efforts to get in touch with the platform or the requesters to state their case. This does not always receive a positive response.
  23. How Workers Deal With Crowd Work. Risks Crowd Workers Take ● Work environments may not always be appropriate, and the devices that workers use to complete tasks may not be ergonomically suitable. ● Workers often complete available work in tandem with daily chores, subjecting them to distractions and disrupting their concentration and flow of work. ● Risks and issues generally remain invisible to the requesters, whose focus is usually on drawing out good work from the crowd.
  24. How Workers Deal With Crowd Work. Risks Crowd Workers Take ● Should a worker who uses poorer equipment yet provides high-quality work be awarded a bonus (similar to corporations rewarding employees for working overtime or exhibiting extra effort)? ● Should monetary incentives be, to an extent, a function of socio-economic aspects? Should requesters consider this in their task design?
  25. Non-aggressive Filtering of Crowd Workers in Paid Crowdsourcing Microtasks