Comprehensive Viewpoint Representations for a Deeper Understanding of User Interactions With Debated Topics
CHIIR - March 15th, 2022 - Regensburg, Germany (Virtual)
Tim Draws1, Oana Inel1, Nava Tintarev2, Christian Baden3, Benjamin Timmermans4
t.a.draws@tudelft.nl
@timdr4ws
https://timdraws.net
1Delft University of Technology, 2Maastricht University, 3The Hebrew University of Jerusalem, 4IBM
Viewpoint label, part 1: Stance (ordinal [-3, 3])
Each document receives an ordinal stance label on a 7-point scale ranging from -3 (strongly opposing) through 0 (neutral) to 3 (strongly supporting).
Example tweet: “Pineapple on pizza is the most scandalous thing. It doesn’t taste good, but also it’s just WRONG.” → stance: -3 (strongly opposing).
References: Draws et al. (2021); Rieger et al. (2021)
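To make the scale concrete, here is a minimal sketch (not from the slides) of how the 7-point stance scale could be represented in code. The slide only names the endpoints and the midpoint, so the wording of the intermediate points below is an assumption.

# Illustrative sketch only: the 7-point ordinal stance scale.
# The slide names only -3 (strongly opposing), 0 (neutral), and
# 3 (strongly supporting); the other descriptions are placeholder wordings.
STANCE_SCALE = {
    -3: "strongly opposing",
    -2: "opposing",
    -1: "somewhat opposing",
    0: "neutral",
    1: "somewhat supporting",
    2: "supporting",
    3: "strongly supporting",
}

tweet = ("Pineapple on pizza is the most scandalous thing. "
         "It doesn't taste good, but also it's just WRONG.")
stance = -3  # annotated stance for the example tweet
print(f"Stance {stance}: {STANCE_SCALE[stance]}")  # Stance -3: strongly opposing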
Viewpoint label, part 2: Logic (multi-categorical)
In addition to its stance, each document receives one or more logic labels; stance and logic together form the viewpoint label.
Example tweet: “Pineapple on pizza is the most scandalous thing. It doesn’t taste good, but also it’s just WRONG.” → stance: -3; logics: civic, functional, inspired.
References: Baden and Springer (2014; 2017); Boltanski and Thévenot (2006)
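A viewpoint label thus pairs an ordinal stance with a set of logics. The sketch below is illustrative only (not the authors' code) and shows one way to model this as a data structure; the logic inventory here lists just the three categories visible on the slide.

from dataclasses import dataclass, field

# Only the logics shown on the slide; the full inventory follows
# Baden and Springer (2014; 2017) and is not reproduced here.
KNOWN_LOGICS = {"civic", "functional", "inspired"}

@dataclass
class ViewpointLabel:
    stance: int                                # ordinal, in [-3, 3]
    logics: set = field(default_factory=set)   # multi-categorical

    def __post_init__(self):
        if not -3 <= self.stance <= 3:
            raise ValueError("stance must lie in [-3, 3]")
        if not self.logics <= KNOWN_LOGICS:
            raise ValueError(f"unknown logics: {self.logics - KNOWN_LOGICS}")

# The example tweet from the slide: stance -3 with civic, functional, and inspired logics.
label = ViewpointLabel(stance=-3, logics={"civic", "functional", "inspired"})

Keeping both dimensions in one record makes it straightforward to aggregate stance and logic separately, for example when analyzing viewpoint diversity.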
Summary
• Contributions
– Novel viewpoint label
– Annotated data set of 169 tweets; guidelines for crowdsourcing
– Viewpoint diversity analysis and user study
• Future work
– Obtaining viewpoint labels automatically
– New ways of analyzing online content
t.a.draws@tudelft.nl
@timdr4ws
https://timdraws.net
References
Christian Baden and Nina Springer. 2014. Com(ple)menting the news on the financial crisis: The contribution of news users’ commentary to the diversity of viewpoints in the public debate. European Journal of Communication 29, 5 (Oct. 2014), 529–548. https://doi.org/10.1177/0267323114538724
Christian Baden and Nina Springer. 2017. Conceptualizing viewpoint diversity in news discourse. Journalism 18, 2 (Feb. 2017), 176–194. https://doi.org/10.1177/1464884915605028
Luc Boltanski and Laurent Thévenot. 2006. On Justification: Economies of Worth. Vol. 27. Princeton University Press.
Tim Draws, Nava Tintarev, Ujwal Gadiraju, Alessandro Bozzon, and Benjamin Timmermans. 2021. This Is Not What We Ordered: Exploring Why Biased Search Result Rankings Affect User Attitudes on Debated Topics. In Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. ACM, Virtual Event, Canada, 295–305. https://doi.org/10.1145/3404835.3462851
Dilek Küçük and Fazli Can. 2020. Stance Detection: A Survey. ACM Computing Surveys 53, 1 (2020), 1–37.
Juhi Kulshrestha, Motahhare Eslami, Johnnatan Messias, Muhammad Bilal Zafar, Saptarshi Ghosh, Krishna P. Gummadi, and Karrie Karahalios. 2019. Search bias quantification: investigating political bias in social media and web search. Information Retrieval Journal 22, 1-2 (April 2019), 188–227. https://doi.org/10.1007/s10791-018-9341-2
Saif M. Mohammad, Svetlana Kiritchenko, Parinaz Sobhani, Xiaodan Zhu, and Colin Cherry. 2016. SemEval-2016 Task 6: Detecting Stance in Tweets. In Proceedings of the International Workshop on Semantic Evaluation (SemEval ’16). San Diego, California.
Alisa Rieger, Tim Draws, Mariët Theune, and Nava Tintarev. 2021. This Item Might Reinforce Your Opinion: Obfuscation and Labeling of Search Results to Mitigate Confirmation Bias. In Proceedings of the 32nd ACM Conference on Hypertext and Social Media (Virtual Event, USA) (HT ’21). Association for Computing Machinery, New York, NY, USA, 189–199. https://doi.org/10.1145/3465336.3475101
Images
Photo on first slide by Volodymyr Hryshchenko. Retrieved from https://unsplash.com/photos/V5vqWC9gyEU.