
This Is Not What We Ordered: Exploring Why Biased Search Result Rankings Affect User Attitudes on Debated Topics

Slide 1: Title
WIS (Web Information Systems)
This Is Not What We Ordered: Exploring Why Biased Search Result Rankings Affect User Attitudes on Debated Topics
Tim Draws (1), Nava Tintarev (2), Ujwal Gadiraju (1), Alessandro Bozzon (1), Benjamin Timmermans (3)
(1) Delft University of Technology, (2) Maastricht University, (3) IBM Research
Contact: t.a.draws@tudelft.nl | https://timdraws.net
Slide 2: Bias in Web Search
Search Engine Manipulation Effect (SEME): when search results favor a particular viewpoint, users tend to adopt that viewpoint (attitude change).
Viewpoint scale: Strongly opposing | Opposing | Somewhat opposing | Neutral | Somewhat supporting | Supporting | Strongly supporting
[Figure: example search result list in which most results support one viewpoint]
References: Allam, Schulz, and Nakamoto (2014); Baeza-Yates (2018); Epstein and Robertson (2015); Gao and Shah (2020); Pogacar et al. (2017); White (2013)
Slide 3: Our Study
RQ1. Does the ranking of viewpoint-balanced top-10 search results affect attitude change?
RQ2. Do individual user characteristics (actively open-minded thinking (AOT), user engagement) affect attitude change?
RQ3. Does the interaction between user factors and search result rankings affect attitude change?
RQ4. Does the ranking of viewpoint-balanced top-10 search results affect perceived bias?
Method: pre-registered online between-subjects user study.
Slide 4: Method
We crowdsourced viewpoint annotations for search results on five debated topics and used them to assemble five viewpoint-balanced top-10 SERPs:
1. Are social networking sites good for our society?
2. Should zoos exist?
3. Is cell phone radiation safe?
4. Should bottled water be banned?
5. Is obesity a disease?
We then exposed users to different rankings of these top-10 SERPs and measured attitude change, user factors, and perceived bias.
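The assembly step described above can be sketched in code. This is not the authors' implementation; the viewpoint labels (-3 = strongly opposing to +3 = strongly supporting, matching the scale on slide 2), the example result set, and the `order_serp` function are all illustrative assumptions about how one might order the same viewpoint-balanced ten results under different bias conditions.

```python
# Hypothetical viewpoint labels: -3 (strongly opposing) .. +3 (strongly supporting).
# This balanced set has equally many supporting and opposing results.
balanced_serp = [{"id": i, "viewpoint": v}
                 for i, v in enumerate([-3, -2, -1, 0, 1, 2, 3, -1, 1, 0])]

def order_serp(results, condition):
    """Order the same 10 results according to a hypothetical bias condition.

    'extreme' places all supporting results on top; any other condition
    alternates supporting and non-supporting results so that neither side
    dominates the first ranks. A 'moderate' condition would sit in between.
    """
    if condition == "extreme":
        return sorted(results, key=lambda r: -r["viewpoint"])
    supporting = sorted((r for r in results if r["viewpoint"] > 0),
                        key=lambda r: -r["viewpoint"])
    rest = [r for r in results if r["viewpoint"] <= 0]
    interleaved = []
    while supporting or rest:
        if supporting:
            interleaved.append(supporting.pop(0))
        if rest:
            interleaved.append(rest.pop(0))
    return interleaved

extreme = order_serp(balanced_serp, "extreme")
# In the extreme condition, viewpoints are non-increasing down the ranking.
assert all(a["viewpoint"] >= b["viewpoint"] for a, b in zip(extreme, extreme[1:]))
```

The key property the study relies on is that every condition shows the same ten results, so only their order (not the set of viewpoints) differs between conditions.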
Slides 5-7: Procedure
Framing: "Imagine the government is seeking informed opinions from the population related to a number of debated topics..."
1. Attitude measurement (7-point Likert scale); random assignment of topic, condition, and bias direction.
2. Exploration of the SERP (search behavior logged).
3. Attitude measurement (7-point Likert scale) and surveys: AOT, user engagement, perceived bias.
Slide 8: Results
Descriptives: 391 participants (recruited via Prolific); 70% expressed attitude change.
Hypothesis tests:
- No evidence for a difference in attitude change across conditions (SEME); Bayesian analysis yielded moderate evidence (BF = 8.56) in favor of the null hypothesis (no difference).
- No evidence for individual differences or interactions related to attitude change.
- No evidence for a difference in perceived bias across conditions; Bayesian analysis yielded strong evidence (BF = 31.23) in favor of the null hypothesis (no difference).
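The Bayes factors above quantify evidence for the null (BF01): how much more likely the data are under "no difference" than under the alternative. A minimal sketch of the conventional Jeffreys-style labels follows; the thresholds (1, 3, 10, 30, 100) are a common convention, not taken from the talk, and labels right at a boundary (such as a BF just above 30) vary between sources.

```python
def interpret_bf01(bf01):
    """Map a Bayes factor in favor of the null (BF01) to an evidence label,
    using the common Jeffreys-style thresholds (1, 3, 10, 30, 100)."""
    bounds = [(1, "favors the alternative"),
              (3, "anecdotal"),
              (10, "moderate"),
              (30, "strong"),
              (100, "very strong")]
    for upper, label in bounds:
        if bf01 < upper:
            return label
    return "extreme"

# BF = 8.56 for the attitude-change comparison falls in the "moderate" band.
label = interpret_bf01(8.56)
```

Note that BF01 is simply the reciprocal of the more commonly reported BF10, so BF01 = 8.56 means the data are about 8.5 times more likely under the null.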
Slide 9: Which Cognitive Biases Cause SEME?
Exposure effects: e.g., 5 of 7 search results support veganism (how many results of each viewpoint the user sees).
Order effects: the higher-ranked search results support veganism (where results of each viewpoint appear).
Reference: Azzopardi (2021)
Slide 10: Exploratory Analyses
- Order effects: 77 users (20%) clicked on exactly as many supporting as opposing results; among them, still no evidence for a difference between conditions.
- Exposure effects: positive relationship between the proportion of supporting results clicked and attitude change (r = 0.34, p < 0.001).
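The exposure-effect check above is a plain Pearson correlation between two per-user quantities. The sketch below shows the computation on made-up illustrative data (the arrays are invented for demonstration and are not the study's data); the study itself reports r = 0.34 over 391 participants.

```python
from math import sqrt

def pearson_r(x, y):
    """Plain Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative (invented) per-user values: proportion of supporting results
# clicked, and attitude change measured on the 7-point scale.
prop_supporting = [0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0]
attitude_change = [-1, 0, 0, 1, 1, 2, 2]

r = pearson_r(prop_supporting, attitude_change)
# A positive r here mirrors the direction of the reported exposure effect.
assert r > 0
```

In practice one would use a library routine such as `scipy.stats.pearsonr`, which also returns the p-value the slide reports alongside r.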
Slide 11: Exploratory Analyses (continued)
Difference in attitude change between different SERPs in the extreme-bias condition (t = 2.61, p = 0.01).
[Figure: proportion of clicks per rank (1-10) under the little-, moderate-, and extreme-bias conditions]
Slide 12: Discussion & Conclusion
- We expected to find SEME for viewpoint-balanced top-10 search results, along with individual differences, but found no evidence for either.
- Exploratory analyses suggest that exposure effects may explain SEME better than order effects.
- Implications for fair (re-)ranking metrics and algorithms as well as UI interventions; a potential goal is to make users read more. Future research: confirm exposure effects.
Contact: t.a.draws@tudelft.nl | https://timdraws.net
Preregistration and supplementary material: https://osf.io/6tbvw/
Slide 13: References
• Ahmed Allam, Peter Johannes Schulz, and Kent Nakamoto. 2014. The impact of search engine selection and sorting criteria on vaccination beliefs and attitudes: Two experiments manipulating Google output. Journal of Medical Internet Research 16, 4 (2014), e100. https://doi.org/10.2196/jmir.2642
• Leif Azzopardi. 2021. Cognitive Biases in Search: A Review and Reflection of Cognitive Biases in Information Retrieval. In Proceedings of the 2021 Conference on Human Information Interaction and Retrieval (CHIIR '21). Association for Computing Machinery, New York, NY, USA, 27–37. https://doi.org/10.1145/3406522.3446023
• Ricardo Baeza-Yates. 2018. Bias on the web. Commun. ACM 61, 6 (2018), 54–61. https://doi.org/10.1145/3209581
• Robert Epstein and Ronald E. Robertson. 2015. The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections. Proceedings of the National Academy of Sciences of the United States of America 112, 33 (2015), E4512–E4521. https://doi.org/10.1073/pnas.1419828112
• Ruoyuan Gao and Chirag Shah. 2020. Toward creating a fairer ranking in search engine results. Information Processing and Management 57, 1 (2020), 102138. https://doi.org/10.1016/j.ipm.2019.102138
• Frances A. Pogacar, Amira Ghenai, Mark D. Smucker, and Charles L.A. Clarke. 2017. The Positive and Negative Influence of Search Results on People's Decisions about the Efficacy of Medical Treatments. In Proceedings of the ACM SIGIR International Conference on Theory of Information Retrieval (ICTIR '17). Association for Computing Machinery, New York, NY, USA, 209–216. https://doi.org/10.1145/3121050.3121074
• Ryen W. White. 2013. Beliefs and biases in web search. In Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '13). Association for Computing Machinery, New York, NY, USA, 3–12. https://doi.org/10.1145/2484028.2484053
