Exploring session search
Slides from my presentation at the ECIR 2012 workshop on "Information Retrieval Over Query Sessions" (SIR2012) held in Barcelona, Spain.

Title: Exploring Session Search

Abstract:

Exploratory search is typically characterized by recall-oriented information needs and by uncertainty and evolution of the information need. As searchers interact with the system, their understanding of the topic evolves in response to found information. These two characteristics – uncertainty of information need and the desire to find multiple documents – drive the need to run multiple queries. Furthermore, these queries are not independent of each other because they often retrieve overlapping sets of documents. Yet traditional information retrieval systems often treat searchers’ queries in isolation, ignoring the evolution of a person’s understanding of the information need and the historical coupling among queries.

In this talk, I will describe some interface ideas we're exploring to help people incorporate their search history into their ongoing retrieval and sense-making tasks, and I will touch on some issues related to retrieval algorithms and evaluation.

Usage Rights

© All Rights Reserved

Presentation Transcript

    • Exploring Session Search
      Gene Golovchinsky, FX Palo Alto Laboratory, Inc. (@HCIR_GeneG)
    • Thanks to: Jeremy Pickens, Abdigani Diriye, Tony Dunnigan
    • Exploratory search
      – Interactive information seeking
      – Anomalous state of knowledge
      – Evolving information need
      – Often recall-oriented
    • One Query to Rule Them All
      – No single query satisfies a typical exploratory search information need
      – Search strategies involve many queries
      – Queries return overlapping results
    • Why we’re here
      1. How do we know what’s a session?
      2. How do we help people deal with this complex task?
      3. How do we evaluate systems and algorithms?
    • Warning: THIS TALK CONTAINS EXPLICIT CONTENT
    • Explicit vs. implicit sessions
      Explicit sessions:
      1. We ask the person
      2. We infer it from structural aspects of the search context
      – Task context may provide strong organizing cues
      – For example, genealogical searches are often tied to a person in a family tree
      What about implicit sessions?
    • Implicit session detection is based on implicit assumptions
      How do we detect a session?
      – Time heuristics
      – Client connection heuristics
      – Query similarity heuristics
      What are we assuming?
      – Person works continuously
      – Person does not switch tasks
      – Enough overlap in queries
      How good are these assumptions?
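The time- and similarity-based heuristics above can be sketched as a simple log segmenter. This is an illustrative sketch, not an algorithm from the talk; the 30-minute gap and 0.1 overlap thresholds are assumed values:

```python
from datetime import datetime, timedelta

def term_overlap(q1, q2):
    """Jaccard overlap between the term sets of two queries (0..1)."""
    a, b = set(q1.lower().split()), set(q2.lower().split())
    return len(a & b) / len(a | b) if (a | b) else 0.0

def segment_sessions(log, max_gap=timedelta(minutes=30), min_overlap=0.1):
    """Split a time-ordered query log into sessions.

    Start a new session when the time gap exceeds max_gap AND the new
    query shares too few terms with the previous one. This hard-codes
    exactly the assumptions the slide questions: continuous work, no
    task switching, enough overlap in queries.
    """
    sessions = []
    for ts, query in log:
        if sessions:
            prev_ts, prev_q = sessions[-1][-1]
            if ts - prev_ts > max_gap and term_overlap(prev_q, query) < min_overlap:
                sessions.append([])  # heuristic boundary: stale AND dissimilar
        else:
            sessions.append([])
        sessions[-1].append((ts, query))
    return sessions
```

A task switch that reuses query terms, or a long pause within one task, will defeat these heuristics, which is the slide's point.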
    • Tradeoffs
      Implicit sessions
      – Pros: no explicit user input required
      – Cons: effectiveness relies on precision-oriented information needs and inter-query similarity, i.e., on redundancy; more difficult to connect recurring or ongoing instances of the same information need
      Explicit sessions
      – Pros: accurate; needed for collaboration; durable over time
      – Cons: requires manual input in some cases
    • Dealing with redundancy
    • Strategies
      – Ignore it: the traditional approach
      – Manage redundancy in the UI: Ancestry.com, Querium
      – Increase diversity through scoring: some algorithmic evaluation, but are such interactive systems deployed?
    • COPING WITH REDUNDANCY: Manage redundancy in the UI
    • Some UI examples
      – Google: +1, but no session awareness and no good persistent visual feedback
      – Bing: visible query history, but no help with documents
      – Ancestry.com: flags previously saved records for the current person
      – Querium user interface: variety of document- and query-centric displays
    • Ancestry.com: Query overlap
      How can we help people make sense of search results?
      – What’s new? What’s redundant?
      – What’s useful? What’s not useful?
    • Querium: Filtering by process metadata
      History of interaction during a search can be projected onto current results
    • Querium: Visualizing re-retrieval
      Document-centered retrieval history can be projected onto each search result
      – Indicates “important” documents
      – Indicates new documents
    • Querium: Query-centric view
    • PREVENTING REDUNDANCY: Increasing diversity
    • Some (cor)related metrics
      – Novelty, diversity, redundancy
      – Precision, recall
      The exact relationship is hard to pin down
    • Increasing {diversity} with scoring
      Pros
      – Can incorporate prior explicit and implicit relevance assessments
      – More focused queries may retrieve more pertinent documents at a given cutoff
      Cons
      – Relies on accurate assessment of relevance
      – No way to recover “organic” results, so hard for people to understand the effect of personalization
      [Flow diagram: query and session state feed a black-box ranker; ranked docs are displayed; user feedback loops back until stop]
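One classic way to fold prior relevance assessments into scoring, as the first pro describes, is Rocchio-style query modification. This is a generic textbook sketch, not Querium's actual algorithm; the weights are conventional defaults:

```python
def rocchio(query_vec, rel_docs, nonrel_docs, alpha=1.0, beta=0.75, gamma=0.15):
    """Rocchio query modification over {term: weight} vectors.

    Moves the query toward the centroid of documents the searcher judged
    relevant and away from those judged non-relevant, so the next round
    of scoring reflects the session's feedback so far.
    """
    def centroid(docs):
        c = {}
        for d in docs:
            for term, w in d.items():
                c[term] = c.get(term, 0.0) + w / len(docs)
        return c

    new_q = {t: alpha * w for t, w in query_vec.items()}
    for t, w in centroid(rel_docs).items():
        new_q[t] = new_q.get(t, 0.0) + beta * w
    for t, w in centroid(nonrel_docs).items():
        new_q[t] = new_q.get(t, 0.0) - gamma * w
    # negative weights are conventionally clipped to zero
    return {t: w for t, w in new_q.items() if w > 0}
```

Note how this illustrates the cons as well: the modified query silently replaces the original one, so the searcher never sees the "organic" ranking their literal query would have produced.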
    • Increasing {diversity} with post-processing
      Pros
      – Can recover “organic” results
      – Supports feedback on incorrect inference (e.g., if the user selects a demoted doc)
      – Accommodates shifting information needs better
      – Can be applied interactively
      Cons
      – Limited document set
      [Flow diagram: query produces an “organic” ranking; session state drives a re-ranking step before display; user feedback loops back until stop]
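One concrete post-processing re-ranker in this family is maximal marginal relevance (MMR), which trades the organic score off against similarity to documents already selected or already shown earlier in the session. The slides do not name a specific algorithm, so this is an illustrative sketch:

```python
def mmr_rerank(scored_docs, seen_docs, sim, lam=0.7, k=10):
    """Greedily re-rank an "organic" result list for novelty.

    scored_docs: list of (doc_id, relevance_score) in organic order
    seen_docs:   doc ids already shown earlier in the session
    sim:         sim(doc_a, doc_b) -> similarity in [0, 1]
    lam:         relevance/novelty trade-off (1.0 = keep organic order)
    """
    candidates = dict(scored_docs)
    selected = []
    while candidates and len(selected) < k:
        def mmr_score(doc):
            context = selected + list(seen_docs)
            redundancy = max((sim(doc, d) for d in context), default=0.0)
            return lam * candidates[doc] - (1 - lam) * redundancy
        best = max(candidates, key=mmr_score)
        selected.append(best)
        del candidates[best]
    return selected
```

Because it only reorders the documents already retrieved, the organic ranking is always recoverable (set lam=1.0), and a click on a demoted document can be fed back by lowering its entries' similarity penalty; the "limited document set" con is also visible: nothing outside `scored_docs` can ever surface.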
    • EVALUATION: A holistic approach
    • Vague generalities
      Session-based search must be evaluated as a human-machine system
      – Hard to account for real human behavior through simulations only
      Recall and precision do not tell the whole story
      – Exploratory search is inherently a learning process
      – Effort, knowledge gain, frustration, and serendipity are important
      Look at patterns of interaction that led to discovery
      – Hard to evaluate the marginal contribution of each query due to negative results, learning, and information-need drift
    • Some thoughts on evaluating algorithms
      Small gains in retrieval effectiveness will be swamped by interaction, good or bad
      – Small statistically-significant effects are meaningless in practice
      Evaluation “in the wild” relies on users for ground truth
      – Use post-hoc analysis to test how well algorithms predicted users’ choices
      Look at the system’s ability to help people recognize useful documents
      – How many times was a document retrieved before it was seen?
      – This works for lab and naturalistic studies
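The "retrieved before seen" measure can be computed directly from a query log. The event format here is a hypothetical one, assuming per-query lists of retrieved and viewed document ids:

```python
def retrievals_before_view(events):
    """Count how often each eventually-viewed document was retrieved
    before the query on which the user first viewed it.

    events: time-ordered list of (retrieved_ids, viewed_ids), one per query.
    Returns {doc_id: prior_retrieval_count} for viewed documents only.
    """
    times_retrieved = {}   # doc -> retrievals so far
    counts = {}            # doc -> retrievals before first view
    for retrieved, viewed in events:
        for doc in retrieved:
            if doc in viewed and doc not in counts:
                # first time the user actually saw this document
                counts[doc] = times_retrieved.get(doc, 0)
            times_retrieved[doc] = times_retrieved.get(doc, 0) + 1
    return counts
```

A high count means the system kept surfacing a useful document the interface failed to make noticeable, which is exactly the recognition failure the slide asks about; the measure needs only logged result lists and views, so it works for both lab and naturalistic studies.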
    • In closing…
      – Information needs evolve
      – Queries are approximations
      – Knowledge is uncertain
      Design challenge: help people plan future actions by understanding the present in the context of the past
    • While I have your attention…
      There is a pending proposal to create a StackExchange site for information retrieval. Think of it as Stack Overflow for IR geeks. We need more people to vote and promote.
      http://area51.stackexchange.com/proposals/39142/information-retrieval-and-search-engines
    • Do I still have your attention?
      IIiX 2012: August 21-24, 2012, Nijmegen, The Netherlands; deadline for papers April 9, 2012
      EuroHCIR 2012: same place, August 25; deadline for papers June 22, 2012
      HCIR 2012: The 6th Symposium on Human-Computer Information Retrieval, October 4-5, 2012, Boston, Massachusetts, USA; submission deadline mid-summer; will publish works in progress and archival, full-length papers
    • Image credits
      http://www.flickr.com/photos/torremountain/6831414535/
      http://www.flickr.com/photos/bigtallguy/233176326/
      http://www.flickr.com/photos/77074420@N00/198347900/
      http://www.flickr.com/photos/racatumba/93569705/
      http://www.flickr.com/photos/chrisolson/3595815374/
      http://www.flickr.com/photos/brymo/2813028454/
      http://www.flickr.com/photos/computix/108732248/
      http://www.flickr.com/photos/funadium/913303959/
      http://www.flickr.com/photos/moriza/189890016/
      http://www.flickr.com/photos/uhdigital/6802789537/
    • Hiding unwanted results