Introducing the Cognitive-Biases-in-Crowdsourcing Checklist

Oct. 27, 2021

Editor's Notes

  1. Cognitive biases of crowd workers are an impactful but often neglected type of systematic bias that can reduce the quality of crowdsourced data labels. These cognitive biases are general human tendencies towards irrational decision-making that often occur subconsciously. For example, crowd workers may fall prey to the *anchoring effect* when they are overly influenced by information they encounter first, or to *confirmation bias* when they judge in line with (false) preexisting beliefs. Previous work has shown that a plurality of cognitive biases can occur in different types of crowdsourcing tasks. So why do requesters rarely consider the influence of cognitive biases in the tasks they design? One important reason may be that combating cognitive biases is simply tricky: a large number of different cognitive biases have been identified in the psychological literature, and it is often unclear which specific bias may occur in a given crowdsourcing task. To efficiently document, assess, and mitigate cognitive biases in crowdsourcing, requesters need a practical tool to help them navigate this space.
  2. Proposing such a practical tool is what we aimed to do in this research. As a starting point, we used a checklist proposed by Kahneman, Lovallo, and Sibony (2011) for the context of business psychology. The big advantage of a checklist is that it can reduce a complex space (e.g., cognitive biases) to a considerable degree while retaining useful information. Specifically, the checklist by Kahneman, Lovallo, and Sibony aims to help business decision-makers avoid falling prey to cognitive biases using 12 items. These 12 items cover the vast majority of potential reasoning mistakes by focusing on the most commonly occurring ones and by grouping related biases together (give example). To develop a similar tool for our context, we adapted this checklist to cognitive biases in crowdsourcing; we propose the result in an upcoming paper at HCOMP this year. Our checklist likewise contains 12 items that requesters can consider to identify potential cognitive biases elicited by their crowdsourcing task.
  3. Each of the 12 items covers a specific cognitive bias or family of biases that may occur in the crowdsourcing context. For example, the third item in our checklist concerns *groupthink* or the *bandwagon effect*. (read out loud) Going through each of the 12 items in this format should help requesters efficiently combat cognitive biases in crowdsourcing.
  4. What can requesters do with the information they get from the checklist? Having identified one or more cognitive biases that may affect crowd workers in the task at hand (ideally before collecting the data), the requester may use this information for three purposes. First, they may wish to *measure* the cognitive biases in question to assess whether crowd workers are actually affected by them; this may require adding extra items or metrics to the crowdsourcing task. Second, requesters could *mitigate* the cognitive biases; earlier work has already proposed several solutions for this. Third, requesters may *document* the cognitive biases so that the collected data is put in the right perspective. Pointing out such potential limitations can help others when interpreting results or re-using the data.
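  As a purely hypothetical illustration of the *measure* step (this sketch is not from the paper, and the function name, rating scale, and data are invented), anchoring could be probed by randomly assigning workers to two anchor conditions and comparing their mean judgments; a sizeable gap suggests the anchor influenced the labels:

  ```python
  from statistics import mean

  def anchoring_gap(judgments_low, judgments_high):
      """Difference in mean judgments between workers shown a low
      vs. a high anchor; a large gap hints at an anchoring effect."""
      return mean(judgments_high) - mean(judgments_low)

  # Hypothetical relevance ratings (1-5) from two randomized groups:
  low_anchor = [2, 3, 2, 3, 2]   # workers first saw a low example rating
  high_anchor = [4, 4, 5, 3, 4]  # workers first saw a high example rating

  print(anchoring_gap(low_anchor, high_anchor))  # → 1.6
  ```

  In practice one would also check whether such a gap is larger than expected by chance (e.g., with a standard two-sample test) before concluding that workers were anchored.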
  5. Of course, any checklist is limited. That's why we also maintain a live version of it. In our upcoming HCOMP paper, we illustrate the use of our checklist through a case study. We also present a retrospective analysis of past HCOMP papers, showing that cognitive biases may affect crowd workers in a majority of crowdsourcing tasks but are rarely dealt with. We hope that our proposed checklist can meaningfully contribute to general efforts towards more reliable human-labeled data.