Google Tech Talk: Reconsidering Relevance

We've become complacent about relevance. The overwhelming success of web search engines has lulled even information retrieval (IR) researchers into expecting only incremental improvements in relevance in the near future. And beyond web search, there are broad classes of search problems where relevance still feels hopelessly like the pre-Google web.

But even some of the most basic IR questions about relevance are unresolved. We take for granted the very idea that a computer can determine which documents are relevant to a person's needs. And we still rely on two-word queries (on average) to communicate a user's information need. But this approach is a contrivance; in reality, we need to think of information-seeking as a problem of optimizing the communication between people and machines.

We can do better. In fact, there are a variety of ongoing efforts to do so, often under the banners of "interactive information retrieval", "exploratory search", and "human computer information retrieval". In this talk, I'll discuss these initiatives and how they are helping to move "relevance" beyond today's outdated assumptions.


About the Speaker

Daniel Tunkelang is co-founder and Chief Scientist at Endeca, a leading provider of enterprise information access solutions. He leads Endeca's efforts to develop features and capabilities that emphasize user interaction. Daniel has spearheaded the annual Workshops on Human Computer Information Retrieval (HCIR) and is organizing the Industry Track for SIGIR '09. Daniel also publishes The Noisy Channel, a widely read and cited blog that focuses on how people interact with information.

Daniel holds undergraduate degrees in mathematics and computer science from the Massachusetts Institute of Technology, with a minor in psychology. He completed a PhD at Carnegie Mellon University for his work on information visualization. Before joining Endeca, he did stints at the IBM T. J. Watson Research Center and AT&T Labs.

Comments

  • Friends, Countrymen, Googlers, I come to bury relevance, not to praise it. Well, that’s overstating the case. But I am here today to challenge your approach to information access, and more importantly to tease out and question its philosophical underpinnings. I realize that I’m singling you out as Googlers for holding a belief that is far more widely held, but you are the standard bearers of relevance. And you invited me.  Notes: this presentation was delivered at the Google NYC office on 1/7/09. The title is an allusion to Tefko Saracevic’s article, “Relevance Reconsidered”. If you are interested in learning more about the history of relevance I highly recommend his 2007 Lazerow Memorial Lecture on “Relevance in information science” (http://mediabeast.ites.utk.edu/mediasite4/Viewer/?peid=fb8f84cb-9f82-499f-b12c-9a56ab5cf5ba).

Presentation Transcript

  • Reconsidering Relevance: Daniel Tunkelang, Chief Scientist, Endeca
  • howdy!
    • 1988 – 1992
    • 1993 – 1998
    • 1999 -
  • overview
    • what is relevance?
    • what’s wrong with relevance?
    • what are the alternatives?
  • but first let’s set the stage
  • iconic businesses of the 20th and 21st centuries ("I'm Feeling Lucky")
  • process and scale orchestration
  • but there’s a dark side
  • users are satisfied
  • an interesting contrast
    • “Search on the internet is solved. I always find what I need. But why not in the enterprise? Seems like a solution waiting to happen.”
    • - a Fortune 500 CTO
  • the real questions
    • What is “search on the internet” and why is it perceived as a solved problem?
    • What is “search in the enterprise” and why is it perceived as an unsolved problem?
    • And what does this have to do with relevance?
  • easy vs. hard search problems
    • easy: where to buy Ender in Exile?
    • hard: good novel to read on the beach?
    • easy: proof that sorting has an n log n lower bound?
    • hard: algorithm to sort a partially ordered set, given a constant-time comparator?
    • what is relevance?
    • what’s wrong with relevance?
    • what are the alternatives?
  • defining relevance
    • Relevance is defined as a measure of information conveyed by a document relative to a query. It is shown that the relationship between the document and the query, though necessary, is not sufficient to determine relevance.
    • William Goffman, On relevance as a measure, 1964.
  • we need more definitions
  • let’s work top-down
    • information retrieval (IR) = study of the retrieval of information (not data) from a collection of written documents; retrieved documents aim to satisfy a user's information need
  • IR assumes information needs
    • user information need = a natural-language declaration of the user's informational need
    • query = expression of user information need in input language provided by information system
  • relevance drives IR modeling
    • modeling = the study of algorithms used for ranking documents according to a system-assigned likelihood of relevance
    • model = a set of premises and an algorithm for ranking documents with regard to a user query
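
To make the notion of an IR model concrete, here is a minimal sketch of the classic tf-idf ranking scheme named on the next slide. It uses toy documents and plain Python for illustration; it is not the ranking used by Endeca, Google, or any other system discussed in the talk.

```python
# A minimal tf-idf ranking sketch (illustrative only): score each document
# against the query terms and return document indices in ranked order.
# Real engines add length normalization, link analysis (e.g. PageRank), etc.
import math
from collections import Counter

def tf_idf_rank(query, documents):
    """Rank documents (lists of tokens) by the summed tf-idf of the query terms."""
    n_docs = len(documents)
    df = Counter()                      # document frequency of each term
    for doc in documents:
        df.update(set(doc))

    def score(doc):
        tf = Counter(doc)
        total = 0.0
        for term in query:
            if df[term] == 0:
                continue                # term absent from the collection
            idf = math.log(n_docs / df[term])
            total += (tf[term] / len(doc)) * idf
        return total

    return sorted(range(n_docs), key=lambda i: score(documents[i]), reverse=True)

docs = [["good", "novel", "beach"],
        ["sorting", "lower", "bound"],
        ["novel", "algorithm"]]
print(tf_idf_rank(["novel", "beach"], docs))   # [0, 2, 1]: document 0 ranks first
```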
  • a relevance-centric approach: USER has an information need, issues a query, and selects from results; SYSTEM ranks results using an IR model (tf-idf, PageRank)
    • what is relevance?
    • what’s wrong with relevance?
    • what are the alternatives?
  • our first communication problem: information need → query
    • 2 words?
    • natural language?
    • telepathy?
  • and the game of telephone continues: query → rank using IR model
    • cumulative error
    • relevance is subjective
    • what Goffman said
  • and hopefully users feel lucky: rank using IR model → select from results
    • selection bias
    • inefficient channel
    • backup plan?
  • queries are misinterpreted: “Results 1-10 out of about 344,000,000 for ir”
  • ranked lists are inefficient
  • assumptions of relevance-centric approach
    • self-awareness
    • self-expression
    • model knows best
    • answer is a document
    • one-shot query
  • can we do better?
    • what is relevance?
    • what’s wrong with relevance?
    • what are the alternatives?
  • human-computer information retrieval
    • don’t just guess the user’s intent
      • optimize communication
    • increase user responsibility and control
      • require and reward human intellectual effort
    “Toward Human-Computer Information Retrieval”, Gary Marchionini
  • human computer information retrieval
  • a concrete use case
    • Colleague: Hey Daniel! You should check out what this guy Steve Pollitt’s been researching. Sounds right up your alley.
    • Daniel: Sure thing, I’ll look into it.
  • google him!
  • google scholar him?
  • rexa him?
  • getting better
  • hcir-inspired interface
  • tags provide summarization and guidance
  • my information need evolves as i learn
  • hcir – implementing the vision
  • scatter/gather: a search for “star”
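
Scatter/Gather, shown on this slide, browses a collection by repeatedly clustering the current result set and presenting a summary of each cluster so the user can gather the interesting clusters and scatter again. The sketch below is a crude term-overlap grouping with hand-picked seeds and invented documents, not the original Scatter/Gather algorithm.

```python
# Toy Scatter/Gather step: group the current results around seed documents
# by term overlap, then summarize each group by its most frequent terms so
# the user can "gather" interesting groups and scatter again.
# Crude illustration only; real Scatter/Gather chooses seeds automatically.
from collections import Counter

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def scatter(docs, seeds):
    clusters = [[] for _ in seeds]
    for doc in docs:
        best = max(range(len(seeds)), key=lambda i: jaccard(doc, seeds[i]))
        clusters[best].append(doc)
    summaries = [Counter(t for d in c for t in d).most_common(3) for c in clusters]
    return clusters, summaries

docs = [["star", "wars", "film"], ["star", "trek", "film"],
        ["star", "cluster", "astronomy"], ["neutron", "star", "astronomy"]]
clusters, summaries = scatter(docs, seeds=[docs[0], docs[2]])
print(summaries)  # one "film"-flavored cluster and one "astronomy"-flavored cluster
```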
  • faceted search
  • practical considerations
    • which facets to show
    • which facet values to show
    • when to suggest faceted refinement
    • how to automate faceted classification
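
A rough sketch of the mechanics behind these considerations: count how many results in the current set fall under each value of each facet, which is the raw material for deciding which facets and values to surface. The product records and field names below are invented for illustration and do not reflect Endeca's data model.

```python
# Sketch of facet-count computation: for the current result set, count how
# many results fall under each value of each facet so the interface can show
# guidance like "brand > GE (2), Panasonic (1)". Records and field names are
# invented for illustration.
from collections import defaultdict

def facet_counts(results, facet_fields):
    counts = {f: defaultdict(int) for f in facet_fields}
    for record in results:
        for f in facet_fields:
            value = record.get(f)
            if value is not None:
                counts[f][value] += 1
    return {f: dict(c) for f, c in counts.items()}

results = [
    {"name": "Compact Microwave", "brand": "GE", "wattage": "700W"},
    {"name": "Countertop Microwave", "brand": "Panasonic", "wattage": "1100W"},
    {"name": "Over-the-Range Microwave", "brand": "GE", "wattage": "1000W"},
]
print(facet_counts(results, ["brand", "wattage"]))
# {'brand': {'GE': 2, 'Panasonic': 1}, 'wattage': {'700W': 1, '1100W': 1, '1000W': 1}}
```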
  • showing the right facets: microwaves
  • showing the right facets: ceiling fans
  • query-driven clarification before refinement
    • Matching Categories include:
      • Appliances > Small Appliances > Irons & Steamers
      • Appliances > Small Appliances > Microwaves & Steamers
      • Bath > Sauna & Spas > Steamers
      • Kitchen > Bakeware & Cookware > Cookware > Open Stock Pots > Double Boilers & Steamers
      • Kitchen > Small Appliances > Steamers
  • results-driven clarification before refinement (Search: storage)
  • crowd-sourcing to tag documents
  • hcir cheats the precision / recall trade-off
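
For reference, precision is the fraction of retrieved documents that are relevant, and recall is the fraction of relevant documents that are retrieved; a single result-set cutoff typically trades one against the other. A small illustration with invented document IDs:

```python
# Precision and recall of a retrieved set, using invented document IDs.
def precision_recall(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# A broad query retrieves almost everything: high recall, low precision.
print(precision_recall(retrieved=range(300), relevant=[3, 7, 250]))  # (0.01, 1.0)
# A narrow query retrieves only sure bets: high precision, lower recall.
print(precision_recall(retrieved=[3, 7], relevant=[3, 7, 250]))      # (1.0, ≈0.67)
```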
  • set retrieval 2.0
    • set retrieval that responds to queries with
      • overview of the user's current context
      • organized set of options for exploration
    • contextual summaries of document sets
      • optimize system’s communication with user
    • query refinement options
      • optimize user’s communication with system
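
One way to picture such a response is as a data structure that carries a set summary and organized refinement options alongside the ranked list. The shape below is hypothetical, invented for illustration; it is not Endeca's API or any other product's.

```python
# Hypothetical response shape for "set retrieval 2.0": a summary of the
# matching set plus refinement options, in addition to a ranked list.
# All field names and example values are invented for illustration.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Refinement:
    facet: str         # e.g. "Category"
    value: str         # e.g. "Small Appliances > Steamers"
    match_count: int   # how many results remain after this refinement

@dataclass
class SetRetrievalResponse:
    query: str
    total_matches: int
    summary_terms: List[str] = field(default_factory=list)       # contextual summary of the set
    refinements: List[Refinement] = field(default_factory=list)  # organized options for exploration
    top_results: List[Dict] = field(default_factory=list)        # a ranked list is still available

response = SetRetrievalResponse(
    query="steamer",
    total_matches=412,
    summary_terms=["microwave", "bath", "cookware"],
    refinements=[
        Refinement("Category", "Small Appliances > Steamers", 120),
        Refinement("Category", "Sauna & Spas > Steamers", 45),
    ],
)
print(f"{response.total_matches} matches; refine by: "
      f"{', '.join(r.value for r in response.refinements)}")
```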
  • hcir using set retrieval 2.0
    • emphasize set summaries over ranked lists
    • establish a dialog between the user and the data
    • enable exploration and discovery
  • think outside the (search) box
    • relevance-centric search solves many use cases
    • but not some of the most valuable ones
    • support interaction, exploration
    • human-computer information retrieval
  • one more thing …
  • “Google's mission is to organize the world's information and make it universally accessible and useful.”
  • organizer or referee?
  • thank you
    • communication 1.0
    • email: [email_address]
    • communication 2.0
    • blog: http://thenoisychannel.com
    • twitter: http://twitter.com/dtunkelang