Interactivity and Feedback
Gene Golovchinsky
FX Palo Alto Laboratory
@HCIR_GeneG
Thanks to Tony Dunnigan for the drawings
A half-assed analogy
A trivial taxonomy
Some self-serving examples
What I mean by "task"
- Information seeking occurs over time
- Multiple interactions with the system
- Evolving information needs
[Diagram: human-computer interaction loop]
Two kinds of feedback
- Person → System: person trains the system to find documents
- System → Person: system indicates possibilities to guide the person
Person → System
People don't use relevance feedback, right?
They do when suitably motivated.
Two examples:
- Ancestry.com
- Predictive coding
Relevance feedback at Ancestry.com
Search
- People find historical records about specific individuals
- Facts from records are saved to individuals in family trees
Relevance feedback
- Saved facts are automatically incorporated into subsequent queries
- Relevance feedback is inferred from saved records
Many motivated users
- Hundreds of hours of system use
- Lots of interaction
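This inferred-feedback loop can be sketched roughly as follows. The field names and merge policy here are illustrative assumptions, not Ancestry.com's actual implementation:

```python
# Illustrative sketch (not Ancestry.com's code): facts previously saved
# from records act as implicit relevance feedback and are folded into
# the next query. All field names are hypothetical.

def expand_query(base_query, saved_facts):
    """Merge fields from saved facts into a new query, without
    overriding anything the searcher typed explicitly."""
    expanded = dict(base_query)
    for fact in saved_facts:
        for field, value in fact.items():
            expanded.setdefault(field, value)  # keep explicit user input
    return expanded

query = {"surname": "Smith"}
facts = [{"birth_year": 1872, "birth_place": "Ohio"},
         {"surname": "Smith", "residence": "Chicago"}]
print(expand_query(query, facts))
# {'surname': 'Smith', 'birth_year': 1872, 'birth_place': 'Ohio', 'residence': 'Chicago'}
```

The point is that the searcher never issues an explicit "more like this" command; saving a record is the feedback signal.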
Relevance feedback in predictive coding
- Predictive coding is a technique for training a classifier to find relevant documents
- Used in e-discovery to increase accuracy and reduce costs
- Machine learning algorithm is trained through hundreds of relevance judgments; applied to millions of documents
- Big (and getting bigger) business
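A toy sketch of that workflow, with a simple nearest-centroid model standing in for whatever machine-learning algorithm a production e-discovery system would actually use:

```python
# Toy sketch of the predictive-coding loop: a handful of human relevance
# judgments train a classifier that is then applied to the unreviewed
# collection. A nearest-centroid model over bag-of-words counts stands
# in for a real production classifier.
from collections import Counter

def vectorize(text):
    return Counter(text.lower().split())

def centroid(texts):
    total = Counter()
    for t in texts:
        total.update(vectorize(t))
    return {term: count / len(texts) for term, count in total.items()}

def train(judged):
    """judged: list of (document text, is_relevant) pairs from reviewers."""
    relevant = centroid([t for t, r in judged if r])
    irrelevant = centroid([t for t, r in judged if not r])
    return relevant, irrelevant

def similarity(vec, cent):
    return sum(n * cent.get(term, 0.0) for term, n in vec.items())

def predict(model, text):
    relevant, irrelevant = model
    v = vectorize(text)
    return similarity(v, relevant) > similarity(v, irrelevant)

judged = [("contract breach damages", True),
          ("merger contract terms", True),
          ("lunch menu friday", False),
          ("office party friday", False)]
model = train(judged)
print(predict(model, "breach of contract"))   # True
print(predict(model, "friday lunch plans"))   # False
```

Here the Person → System feedback is explicit: each judgment directly reshapes what the classifier will surface next.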
System → Person
System provides hints about potential actions
- Information scent
- Which documents are new
- Which terms are effective
- Ways to expand/reformulate the query
Examples
- Facets indicating numbers of matching documents
- Previously-saved or viewed documents
- History of queries, related queries
- Previews, hints, etc.
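The facet counts in the first example amount to one pass over the result set. A minimal sketch, with a hypothetical document model:

```python
# Sketch: computing the counts shown next to each facet value, so the
# searcher can see how many documents each refinement would match
# before committing to it. The document fields are hypothetical.
from collections import Counter

def facet_counts(results, facet_field):
    counts = Counter()
    for doc in results:
        for value in doc.get(facet_field, []):
            counts[value] += 1
    return counts

docs = [
    {"title": "A", "year": ["2011"], "author": ["Smith"]},
    {"title": "B", "year": ["2011"], "author": ["Jones", "Smith"]},
    {"title": "C", "year": ["2012"], "author": ["Jones"]},
]
print(facet_counts(docs, "year"))  # Counter({'2011': 2, '2012': 1})
```

The counts themselves are the information scent: they tell the searcher where results are concentrated without requiring another query.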
Interacting with the past
[Screenshot: Ancestry.com, highlighting previously-saved records for this person]
Interacting with the past
[Screenshot: Querium, distinguishing newly-retrieved from re-retrieved documents]
Interacting with the present
[Screenshots: mSpace, a commercial faceted browsing UI, and RelationBrowser]
Interacting with the future
Query preview: as the searcher types, shows the distribution of new vs. re-retrieved documents in a search session
Query nudges: as the searcher types, changes halo color to encourage longer queries
[Chart: number of query terms (0-7) by condition: instruction vs. no instruction, halo vs. no halo]
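The query-preview computation reduces to comparing the candidate query's hits against the documents already retrieved this session. A minimal sketch, with hypothetical document IDs:

```python
# Sketch of the query-preview computation: as the searcher types, score
# the candidate query, then partition its hits into documents not yet
# seen this session vs. ones the searcher has already retrieved.

def query_preview(candidate_results, seen_doc_ids):
    """Return counts of new vs. re-retrieved hits for a candidate query."""
    new = sum(1 for d in candidate_results if d not in seen_doc_ids)
    return {"new": new, "re-retrieved": len(candidate_results) - new}

seen = {"d1", "d2", "d3"}                      # retrieved earlier this session
print(query_preview(["d2", "d4", "d5"], seen))  # {'new': 2, 're-retrieved': 1}
```

Showing this breakdown before the query is issued lets the searcher judge whether a reformulation is worth running at all.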
Design challenges
- How do we get people to use relevance feedback?
- How do we help people discover which queries will be effective?
- How do we help people plan?