Report content

Flagged as inappropriate Flag as inappropriate
Flag as inappropriate

Select your reason for flagging this presentation as inappropriate.

  • Full Name Full Name Comment goes here.
    Are you sure you want to
    Your message goes here
Post Comment
Edit your comment

    Presentation Transcript

    • Document Usefulness: Taking Genre to Task. Luanne Freund, University of British Columbia, Vancouver, Canada. HCIR 2011, Mountain View, CA, October 21, 2011
    • Motivation
      • An interest in the non-topical, pragmatic aspects of human information interaction in digital domains
      • What does it mean to say that a document is useful?
      • What is the effect of the motivating task and the document genre in making these assessments?
      • E-Government Domain
        • high need, centred on life events
        • low user expertise
        • complex, diverse, genred communication
    • Methods
      • Experimental User Study – Document Assessment
      • 25 participants: university students, Canadian residents
      • Each participant: 5 tasks, 40 documents (5 tasks × 8 documents)
      • Usefulness assessments & Genre labelling
    • Tasks and Scenarios
      • 20 Situated Work Task Scenarios
      • 5 Types of Information Tasks:
        • Fact-Finding
        • Deciding
        • Doing
        • Learning
        • Problem-Solving
      Example (Doing): An elderly uncle has had a stroke and is now confined to a wheelchair. He and your aunt want to continue to live in their own home, but would like to do some minor renovations to make it wheelchair accessible as well as safer and more convenient for them as they grow older. They have asked you to help them with the project.
    • Documents
      • Canadian federal Web domain (gc.ca)
      • 160 documents, mixed genres
    • Usefulness Assessments
      [Chart: distribution of usefulness ratings (0 to 1.0, grouped Low / Mid / High) by task: Deciding, Problem Solving, Learning, Fact Finding, Doing]
      • Low level of consistency in ratings
      • Intraclass Correlation Coefficients (ICCs)
        • .284 overall
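The reported ICC can be illustrated with a one-way intraclass correlation, ICC(1,1) = (MS_between − MS_within) / (MS_between + (k−1)·MS_within), where targets are documents and each target is rated by k assessors. A minimal sketch; the ratings matrix below is hypothetical, not the study's data:

```python
def icc_oneway(ratings):
    """One-way ICC(1,1). ratings: one row per target (document),
    each row holding k raters' scores for that target."""
    n = len(ratings)            # number of targets
    k = len(ratings[0])         # raters per target
    grand = sum(sum(r) for r in ratings) / (n * k)
    # between-target and within-target mean squares (one-way ANOVA)
    ss_between = k * sum((sum(r) / k - grand) ** 2 for r in ratings)
    ss_within = sum((x - sum(r) / k) ** 2 for r in ratings for x in r)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# hypothetical usefulness ratings: 4 documents x 3 assessors, 0-1 scale
ratings = [[0.8, 0.9, 0.7],
           [0.2, 0.3, 0.1],
           [0.6, 0.4, 0.5],
           [0.9, 0.8, 1.0]]
print(round(icc_oneway(ratings), 3))  # → 0.906, i.e. high agreement
```

An ICC near 1 would indicate assessors rate documents consistently; the study's .284 sits far below that, reflecting the low consistency noted above.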
    • Effect of Task
      • Usefulness scores vary significantly by task
      [Chart: mean usefulness scores by task]
    • Effect of Genre
      • Usefulness varies significantly by genre
        • News & reference material < homepage, guide
    • Effect of Task and Genre
      • Task-Genre Interaction Effect
        • Genres vary in usefulness for Doing, Deciding and Learning tasks.
        • No genre effect for Fact-Finding or Problem-Solving.
    • Genre Identification
      • Challenging task: “[I] didn’t really know how to describe them other than information. They were reports of web site pages with info to me.”
      • Participants chose the same label as the expert for 52% of documents;
      • however, only 25% of all labels matched the expert assessment, due to heavy use of multiple labels.
      • For most genres, the label most commonly applied by assessors as a group matched the expert assessment.
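The gap between per-document agreement (52%) and per-label agreement (25%) follows from assessors applying several labels per document. A hypothetical illustration of the arithmetic; the labels below are invented, not the study's data:

```python
# One expert label per document vs. multi-label assessor choices (hypothetical).
expert = ["guide", "news", "form", "report"]
assessor = [["guide", "faq", "report"],
            ["news", "homepage"],
            ["report"],
            ["report", "guide", "news", "faq"]]

# Per-document agreement: did any of the assessor's labels match the expert's?
doc_hits = sum(e in labels for e, labels in zip(expert, assessor))
doc_rate = doc_hits / len(expert)

# Per-label agreement: what fraction of all applied labels matched the expert?
all_labels = sum(len(labels) for labels in assessor)
label_hits = sum(l == e for e, labels in zip(expert, assessor) for l in labels)
label_rate = label_hits / all_labels

print(doc_rate, round(label_rate, 2))  # → 0.75 0.3
```

The same three correct labels score 75% per document but only 30% per label, because every extra label an assessor applies inflates the denominator of the per-label rate.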
    • Summary
      • People do not agree about usefulness; as searches become more difficult and complex, usefulness scores drop and agreement deteriorates.
      • Genre matters! But not for very simple (Fact-Finding) or very difficult (Problem-Solving) tasks.
      • Genre knowledge is implicit and classification is challenging.
    • Acknowledgements: Thanks to all participants and to the following research assistants at the School of Library, Archival and Information Studies, UBC: Justyna Berzowska, Francesca de Freitas, Amanda Leinberger & Christina Nilsen. This research was funded by a UBC Hampton Grant and a SSHRC Standard Research Grant to the first author.