Investigating the Influence of Crowdworker Attitudes on Document Annotations
1.
Investigating the Influence of
Crowdworker Attitudes on
Document Annotations
Tim Draws, Nava Tintarev, Ujwal Gadiraju
TU Delft, The Netherlands
t.a.draws@tudelft.nl
timdraws.net
2.
Biases in web search
My work: measuring and mitigating algorithmic and cognitive biases in the context of web search on debated topics
Needed: data sets of search results with viewpoint annotations
[Illustration: a search result list in which most results argue "Yes!" and only a few argue "No!"]
Problem: potential bias in the viewpoint annotations due to crowdworkers' personal attitudes
3.
Biased annotations?
Concern: tendency to annotate in line with personal stance
– Confirmation bias
– False consensus effect
[Illustration: two annotators labeling the same document "Opposing!" vs. "Supporting!"]
4.
This work
RQ: Do crowdworkers have a tendency to label in line with their personal attitude when annotating search results for viewpoints?
Crowdsourcing study to collect viewpoint annotations
Analyzed the relationship between crowdworker attitudes and their annotations
5.
Crowdsourcing viewpoints
• Retrieved search results from Bing for two different debated topics
– Should zoos exist?
– Are social networking sites good for our society?
• Top 50 results for 14 queries per topic
• Set up task on Amazon Mechanical Turk
6.
Viewpoint labels
Should we all be vegan?
-3 Extremely opposing
-2 Opposing
-1 Somewhat opposing
 0 Neutral
+1 Somewhat supporting
+2 Supporting
+3 Extremely supporting
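A minimal sketch of how these labels could be encoded numerically (Python; the dictionary name is illustrative, not from the paper):

# Seven-point viewpoint scale from the slide, encoded as label -> score.
VIEWPOINT_SCALE = {
    "Extremely opposing": -3,
    "Opposing": -2,
    "Somewhat opposing": -1,
    "Neutral": 0,
    "Somewhat supporting": 1,
    "Supporting": 2,
    "Extremely supporting": 3,
}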
7.
Viewpoint annotation task
• Step 1. Instructions; personal knowledge & attitude
• Step 2. Annotate 14 search results on one topic and complete two attention checks
8.
Results
Descriptive
• 717 search result items
• 140 annotators
Spearman correlation analysis
• IV: Crowdworker attitude [-3,3]
• DV: Mean annotation [-3,3]
• ρ = 0.26, p = 0.003
[Scatter plot: mean annotation (y) vs. crowdworker attitude (x)]
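A minimal sketch of this analysis, assuming a long-format table with worker_id, attitude, and annotation columns (file and column names are illustrative):

import pandas as pd
from scipy.stats import spearmanr

# Hypothetical long-format data: one row per (worker, search result) annotation.
df = pd.read_csv("annotations.csv")

# IV: each worker's attitude; DV: the mean of that worker's annotations.
per_worker = df.groupby("worker_id").agg(
    attitude=("attitude", "first"),
    mean_annotation=("annotation", "mean"),
)

rho, p = spearmanr(per_worker["attitude"], per_worker["mean_annotation"])
print(f"rho = {rho:.2f}, p = {p:.3f}")  # slide reports rho = 0.26, p = 0.003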
9.
A difference between topics?
[Scatter plots: mean annotation vs. crowdworker attitude, one panel per topic]
Social Media: ρ = 0.26, p = 0.025
Zoos: ρ = 0.27, p = 0.041
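The per-topic breakdown could be computed by extending the earlier sketch, assuming each worker annotated a single topic recorded in a topic column (an assumption for illustration):

from scipy.stats import spearmanr

# per_worker as in the earlier sketch, with an added "topic" column.
for topic, grp in per_worker.groupby("topic"):
    rho, p = spearmanr(grp["attitude"], grp["mean_annotation"])
    print(f"{topic}: rho = {rho:.2f}, p = {p:.3f}")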
10.
Mild vs. strong attitudes
• Divided crowdworkers into mild- and strong-attitude groups (see the sketch below)
• Mild attitudes: ρ = -0.03, p = 0.829
• Strong attitudes: ρ = 0.26, p = 0.035
[Scale diagram: attitudes -3, -2 and +2, +3 marked as strong; -1, 0, +1 marked as mild]
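A sketch of this split, assuming (from the scale diagram) that attitudes -1, 0, +1 count as mild and -3, -2, +2, +3 as strong:

from scipy.stats import spearmanr

# Assumed cutoff per the scale diagram: |attitude| <= 1 is mild, >= 2 is strong.
groups = {
    "mild": per_worker[per_worker["attitude"].abs() <= 1],
    "strong": per_worker[per_worker["attitude"].abs() >= 2],
}
for name, grp in groups.items():
    rho, p = spearmanr(grp["attitude"], grp["mean_annotation"])
    print(f"{name}: rho = {rho:.2f}, p = {p:.3f}")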
11.
Worker requirements too low?
• Worker requirements
– HIT approval rate > 95%
– Location: United States
• Second study; higher worker requirements
– HIT approval rate > 98%
– Location: US, AUS, NZ, UK, GER, FIN, SWE, NO, CH, AUT
– Master workers
• Compared the 112 items annotated in both studies (see the sketch after this list)
– Only difference between the studies: worker requirements
– Subset of first study: ρ = 0.22, p = 0.022 (n = 114)
– Subset of second study: ρ = 0.06, p = 0.77 (n = 25)
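A sketch of this comparison, assuming two dataframes study1 and study2 in the same long format with an item_id column (all names illustrative):

from scipy.stats import spearmanr

# Restrict both studies to the items they share, then repeat the
# per-worker attitude/annotation correlation within each subset.
shared_items = set(study1["item_id"]) & set(study2["item_id"])

def attitude_rho(df):
    sub = df[df["item_id"].isin(shared_items)]
    per_worker = sub.groupby("worker_id").agg(
        attitude=("attitude", "first"),
        mean_annotation=("annotation", "mean"),
    )
    return spearmanr(per_worker["attitude"], per_worker["mean_annotation"])

print(attitude_rho(study1))  # slide reports rho = 0.22, p = 0.022 (n = 114)
print(attitude_rho(study2))  # slide reports rho = 0.06, p = 0.77 (n = 25)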
12.
Discussion
• Crowdworker attitudes affected viewpoint annotations
– Irrespective of topic
– Workers with stronger opinions and lower qualifications seem to be more prone to bias
• Other considerations
– Influence of self-reported knowledge?
– Asking workers to confirm their sincerity?
– Type of document could also play a role (more ambiguous, more bias?)
• Future work
– What specifically causes the bias?
– Mitigation strategies
13.
Take home and future work
• Cognitive biases can affect crowdworkers' (viewpoint) annotations
– Be aware (test!)
– Design task to make crowdworkers aware of biases
– If possible, remove ambiguous items
Material related to this research available at https://osf.io/kbjgp/.
t.a.draws@tudelft.nl
timdraws.net
Editor's Notes
Introduce myself: second-year PhD (talk: 20 min + 5 min for questions).
My work: viewpoint diversity in search results. To advance this work, we need data sets of search results with viewpoint annotations.
The concern in this task is that viewpoint annotations might be biased. Confirmation bias: looking for confirming evidence in the article. False consensus effect: assuming that everyone thinks like I do.
Categorisation into 7 viewpoints. Task: classify search results into this taxonomy.
Introduced workers to a topic; asked for personal knowledge and stance (7-point Likert). Show what the task looked like + attention check.
Re-state research question / hypothesis here
To mitigate such bias, it is interesting to see why it occurs.
Hard to say, but it looks like worker requirements could play a role here.
Summary. Considerations: other areas. We did not find an effect for knowledge, but similar measures (e.g., confirmation of sincerity) could be interesting. Future work.