Document Usefulness: taking Genre to Task
Luanne Freund, University of British Columbia, Vancouver, Canada
HCIR 2011 – Mountain View, CA, October 21, 2011
Motivation
An interest in the non-topical, pragmatic aspects of human information interaction in digital domains.
What does it mean to say that a document is useful? What is the effect of the motivating task and the document genre on these assessments?
E-Government domain: high need, centred on life events, low user expertise, complex, diverse, genred communication.
Methods
Experimental user study – document assessment.
25 participants: university students, Canadian residents.
Each participant: 5 tasks, 40 (5 x 8) documents.
Usefulness assessments and genre labelling.
Tasks and Scenarios
20 situated work task scenarios; 5 types of information tasks: Fact-Finding, Deciding, Doing, Learning, Problem-Solving.
Example scenario (Doing): An elderly uncle has had a stroke and is now confined to a wheelchair. He and your aunt want to continue to live in their own home, but would like to do some minor renovations to make it wheelchair accessible as well as safer and more convenient for them as they grow older. They have asked you to help them with the project.
Documents
Canadian federal Web domain (gc.ca): 160 documents, mixed genres.
Usefulness Assessments
[Chart: mean usefulness ratings (scale 0–1.0, Low/Mid/High) for the Deciding, Problem Solving, Learning, Fact Finding, and Doing tasks]
Low level of consistency in ratings: Intraclass Correlation Coefficients (ICCs) .284 overall.
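The overall agreement figure of .284 is an intraclass correlation coefficient. As an illustration of how such a coefficient can be computed for a documents-by-raters matrix of usefulness scores, the following is a minimal NumPy sketch of the two-way random-effects, absolute-agreement form ICC(2,1); the ratings matrix is invented for illustration and is not the study's data.

```python
import numpy as np

def icc2_1(X):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    X is an (n_targets x k_raters) matrix of ratings.
    """
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)   # per-document means
    col_means = X.mean(axis=0)   # per-rater means

    # Mean squares from the two-way ANOVA decomposition.
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # rows (documents)
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # columns (raters)
    sse = np.sum((X - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical usefulness ratings: 5 documents rated by 3 assessors on a 0-1 scale.
ratings = np.array([
    [0.8, 0.6, 0.9],
    [0.2, 0.4, 0.1],
    [0.5, 0.7, 0.3],
    [0.9, 0.8, 1.0],
    [0.1, 0.3, 0.2],
])
print(round(icc2_1(ratings), 3))
```

The coefficient approaches 1.0 only when raters agree in absolute terms, so a value of .284 indicates that much of the variance in usefulness scores lies between raters rather than between documents.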
Effect of Task
Usefulness scores vary significantly by task.
[Chart: mean usefulness scores by task]
Effect of Genre
Usefulness varies significantly by genre: news & reference material < homepage, guide.
Effect of Task and Genre
Task–genre interaction effect: genres vary in usefulness for Doing, Deciding, and Learning tasks; no genre effect for Fact-Finding or Problem-Solving.
Genre Identification
A challenging task: "[I] didn't really know how to describe them other than information. They were reports of web site pages with info to me."
Participants chose the same label as the expert for 52% of documents, but only 25% of all labels matched the expert assessment, due to heavy use of multiple labels.
For most genres, the label most commonly applied by assessors as a group matched the expert assessment.
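The gap between the 52% document-level figure and the 25% label-level figure arises because assessors often assigned several labels to one document. A small sketch of the two scoring schemes, using invented documents and genre labels (not the study's data):

```python
# Hypothetical genre labels: each assessor may assign multiple labels per document.
expert = {"d1": "guide", "d2": "news", "d3": "homepage"}
assessor = {
    "d1": ["guide", "report", "information"],   # expert label present among three
    "d2": ["news"],                             # exact single-label match
    "d3": ["report", "information"],            # no match
}

# Document-level agreement: did the assessor's label set include the expert label?
doc_matches = sum(expert[d] in labels for d, labels in assessor.items())
doc_agreement = doc_matches / len(expert)

# Label-level agreement: what fraction of all assigned labels equals the expert label?
all_labels = [(d, lab) for d, labels in assessor.items() for lab in labels]
label_matches = sum(expert[d] == lab for d, lab in all_labels)
label_agreement = label_matches / len(all_labels)

print(doc_agreement, label_agreement)
```

Here the assessor "hits" the expert label on 2 of 3 documents (67%) while only 2 of 6 assigned labels match (33%): the more labels applied per document, the further the label-level rate falls below the document-level rate, which is the pattern the study observed.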
Summary
People do not agree about usefulness; as searches become more difficult/complex, usefulness scores drop and agreement deteriorates.
Genre matters! But not for very simple (Fact-Finding) or very difficult (Problem-Solving) tasks.
Genre knowledge is implicit, and classification is challenging.
Acknowledgements
Thanks to all participants and to the following research assistants at the School of Library, Archival and Information Studies, UBC: Justyna Berzowska, Francesca de Freitas, Amanda Leinberger & Christina Nilsen.
This research was funded by a UBC Hampton Grant and a SSHRC Standard Research Grant to the first author.
