
Fact Checking & Information Retrieval


Talk given August 29, 2018 at the 1st Biannual Conference on Design of Experimental Search & Information Retrieval Systems (DESIRES 2018). Paper: https://www.ischool.utexas.edu/~ml/papers/lease-desires18.pdf



  1. Fact Checking and Information Retrieval. Matt Lease, School of Information, University of Texas at Austin. @mattlease | ml@utexas.edu | Slides: slideshare.net/mattlease
  2. What’s an Information School? “The place where people & technology meet” ~ Wobbrock et al., 2009. “iSchools” now exist at 65 universities around the world: www.ischools.org
  3. Fake News? What News is Real?
  4. “Truthiness” is an Old Problem. “Truthiness is tearing apart our country... It used to be, everyone was entitled to their own opinion, but not their own facts. But that’s not the case anymore.” – Stephen Colbert (Jan. 25, 2006). “You furnish the pictures and I’ll furnish the war.” – William Randolph Hearst (Jan. 25, 1898)
  5. Information Literacy. National Information Literacy Awareness Month, US Presidential Proclamation, October 1, 2009: “Though we may know how to find the information we need, we must also know how to evaluate it. Over the past decade, we have seen a crisis of authenticity emerge. We now live in a world where anyone can publish an opinion or perspective, whether true or not, and have that opinion amplified…”
  6. Journalist Fact Checking
  7. Crowd Fact Checking
  8. Related Work: Automatic Fact Checking (NLP)
  9. Fake News Challenge
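The Fake News Challenge referenced on slide 9 (FNC-1) framed fact checking as stance detection: given a headline and an article body, classify the body's stance toward the headline as agree, disagree, discuss, or unrelated. As a hedged illustration only (my own toy sketch, not the challenge's official baseline; the function name and threshold are invented for this example), the snippet below separates "unrelated" pairs from related ones by TF-IDF cosine similarity:

```python
# Toy illustration of the FNC-1 "unrelated vs. related" split using lexical overlap.
# A real system would go on to classify related pairs as agree / disagree / discuss.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def relatedness_label(headline: str, body: str, threshold: float = 0.1) -> str:
    """Label a (headline, body) pair 'unrelated' if they share little vocabulary."""
    tfidf = TfidfVectorizer(stop_words="english")
    vectors = tfidf.fit_transform([headline, body])
    similarity = cosine_similarity(vectors[0], vectors[1])[0, 0]
    return "related (agree/disagree/discuss)" if similarity >= threshold else "unrelated"

print(relatedness_label(
    "Robert Plant ripped up $800M Led Zeppelin reunion contract",
    "Led Zeppelin's Robert Plant reportedly turned down a $800 million reunion offer."))
```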
  10. (image-only slide)
  11. (image-only slide)
  12. (image-only slide)
  13. Fact Checking & Search
  14. Danny Sullivan, April 7, 2017
  15. (image-only slide)
  16. http://people.mpi-inf.mpg.de/~kpopat/publications/www18_demo.pdf
  17. CredEye
  18. Work at UT Austin
  19. MemeBrowser (2012). Ryu et al., HyperText 2012. http://odyssey.ischool.utexas.edu/mb/
  20. Explainable AI for Fact Checking. Nguyen et al., AAAI’18 & UIST’18. http://fcweb.pythonanywhere.com
  21. Automatic/Hybrid Fact Checking. Nguyen et al., AAAI’18 & UIST’18. http://fcweb.pythonanywhere.com
  22. Explaining Source Reputation. Nguyen et al., AAAI’18 & UIST’18. http://fcweb.pythonanywhere.com
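Slides 20-22 describe the Nguyen et al. system, which pairs claim-level predictions with evidence and source-reputation explanations that users can inspect and adjust. The snippet below is only a rough sketch of that general idea under my own simplifying assumptions (the class, field names, and weighting scheme are invented here, not taken from the paper): each retrieved article's stance toward the claim is aggregated, weighted by its source's reputation, into a single veracity score.

```python
# Illustrative aggregation: reputation-weighted average of per-article stances.
# Exposing stances, reputations, and weights (and letting users override them)
# is what makes such a prediction explainable and "hybrid."
from dataclasses import dataclass

@dataclass
class Evidence:
    source: str
    stance: float      # +1.0 = article supports the claim, -1.0 = refutes it
    reputation: float  # source reputation in [0, 1], learned or user-adjusted

def claim_veracity(evidence: list[Evidence]) -> float:
    """Reputation-weighted average stance; > 0 leans true, < 0 leans false."""
    total_weight = sum(e.reputation for e in evidence)
    if total_weight == 0:
        return 0.0
    return sum(e.stance * e.reputation for e in evidence) / total_weight

print(claim_veracity([
    Evidence("reuters.com",   stance=-0.8, reputation=0.9),
    Evidence("some-blog.net", stance=+0.9, reputation=0.2),
]))  # negative result: the claim leans toward "refuted"
```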
  23. A Few Questions (1 of 2)
    • What can IR bring to fact checking that NLP and ML do not, and what new questions does fact checking raise for IR?
    • Which results should we return, how should we present them, what modes of interaction should we provide, and how should we evaluate success?
    • How might ranking & filtering decisions impact users’ trust in the system and their fears of being manipulated?
    • How can we diversify political (or other forms of) bias to provide diverse perspectives, especially on controversial topics? How should we rank & evaluate results based on such diversity? (One way to operationalize this is sketched below.)
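As a purely illustrative sketch of the last question above (my own example, not a method from the talk; the function, weights, and leaning labels are assumptions), a greedy MMR-style re-ranker can trade relevance off against how heavily each perspective is already represented in the ranking:

```python
# Greedy perspective-diversified re-ranking, MMR-style: at each step pick the
# candidate with the best balance of relevance and novelty of political leaning.
def diversify(candidates, k=10, lam=0.7):
    """candidates: list of (doc_id, relevance, leaning), leaning in {'left', 'center', 'right'}."""
    selected, coverage = [], {}
    pool = list(candidates)
    while pool and len(selected) < k:
        def score(cand):
            _, relevance, leaning = cand
            redundancy = coverage.get(leaning, 0) / (len(selected) or 1)
            return lam * relevance - (1 - lam) * redundancy
        best = max(pool, key=score)
        pool.remove(best)
        selected.append(best)
        coverage[best[2]] = coverage.get(best[2], 0) + 1
    return selected

ranked = [("d1", 0.90, "left"), ("d2", 0.85, "left"), ("d3", 0.80, "right"), ("d4", 0.70, "center")]
print([doc_id for doc_id, _, _ in diversify(ranked, k=3)])  # d1, d3, d4: three leanings covered
```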
  24. A Few Questions (2 of 2)
    • How do we balance giving users search results that match their existing beliefs vs. challenging users with alternative perspectives?
    • Just as people choose news outlets in part due to political leanings, will new vertical search engines rank and filter results to match audience views?
    • How do we frame, measure, and address the potential harm of search results (e.g., “alternative” facts), be they errors or intended diversity? How do we evaluate information gain alongside the potentially varying severity of harm to some number of users? (A naive scoring sketch follows.)
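One naive way to operationalize that last trade-off (again my own illustration, not an established metric; the per-result gain, severity, and affected-fraction inputs are assumed to be available) is to discount each result's information gain by rank and subtract a penalty scaled by harm severity and the fraction of users harmed:

```python
# Harm-penalized, DCG-style scoring of a ranked result list.
import math

def gain_minus_harm(results, harm_weight=1.0):
    """results: list of (gain, harm_severity, affected_fraction), in ranked order."""
    score = 0.0
    for rank, (gain, severity, affected) in enumerate(results, start=1):
        discount = 1.0 / math.log2(rank + 1)  # standard DCG position discount
        score += discount * (gain - harm_weight * severity * affected)
    return score

ranked = [
    (3.0, 0.0, 0.0),  # accurate, harmless result
    (2.0, 2.0, 0.4),  # misleading "alternative fact" that could harm 40% of users
]
print(gain_minus_harm(ranked))
```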
  25. Thank You! Matt Lease | ml@utexas.edu | @mattlease. Slides: slideshare.net/mattlease. Lab: ir.ischool.utexas.edu
  26. (image-only slide)
  27. The Age of “Post-truth Politics”
