Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites

Presentation from the 18th Annual Reference Research Forum (ALA) in Anaheim, CA, June 2012.

1. Apples and Oranges: Lessons From a Usability Study of Two Library FAQ Web Sites
Susan [Gardner] Archambault and Kenneth Simon
2. Loyola Marymount University
• Private Catholic university in Los Angeles, California
• 5,900+ undergraduates and 1,900+ graduate students
• William H. Hannon Library Information Desk open 24/5
3. Research Question
• What is the most effective way to provide access to our Library FAQs?
• A comparison of two products: How Do I? and LibAnswers. Which features do students prefer, and which features lead to better performance?
4. How Do I?
5. LibAnswers
6. Auto-Suggest Feature
7. Related Questions Feature
8. Methodology
• Conducted usability testing with 20 undergraduate students at LMU
• Participants equally represented each class (freshmen through seniors), with a 60:40 ratio of females to males
9. Methodology
• Used a combination of the Performance Test methodology and the Think-Aloud methodology
10. Methodology
• Students were given 10 performance tasks to complete at a computer twice: once using LibAnswers as the starting point, and once using How Do I?
• After each performance task, students were given a questionnaire measuring their satisfaction with the site
11. Performance Task Questions
• How to print in the library from a laptop
• How to request a research consultation
• How long can a graduate student check out a book
• How to search for a book by the author's name
• Where are the library copy machines
• How to tell what books are on reserve for a class
• How to request a book from basement storage
• Where to access CRSPSift software in the library
• Can a Loyola law school student reserve a group study room in advance
• How much does it cost for an undergrad to request a magazine article from another library
12. Satisfaction Scale
13. Methodology
• Audio recorded and computer screen activity captured via "ScreenFlow" screencasting software
14. Additional Questions
• How likely would you be to use each page again?
• What was your favorite aspect of each site?
• What was your least favorite aspect?
• Overall, do you prefer LibAnswers or How Do I?
15. Performance Scoring: Speed
• Start the clock when the person begins searching for the answer to a new question on the home page of the site they are testing
• Stop the clock when they copy the URL with the answer
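For illustration, a minimal sketch of how this timing rule could be automated from timestamped events; the event names and log format are hypothetical assumptions (scoring in the study was done by hand from the ScreenFlow recordings):

    # Hypothetical sketch: per-task speed from timestamped events.
    # Event names ("search_started", "url_copied") are illustrative only.
    def task_speed_seconds(events):
        """Seconds between starting to search and copying the answer URL."""
        start = next(t for t, name in events if name == "search_started")
        stop = next(t for t, name in events if name == "url_copied")
        return stop - start

    # (timestamp_in_seconds, event_name) pairs from one recorded task
    events = [(12.0, "search_started"), (20.5, "page_view"), (58.0, "url_copied")]
    print(task_speed_seconds(events))  # 46.0, as in the sample scoring video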
16. Performance Scoring: Accuracy
Was the answer… (check off the one that applies):
• Completely accurate: found the answer
• On the correct path to the information, but did not go far enough or took a wrong subsequent path
• On the correct page, but did not see the answer (supersedes everything else they tried on other attempts)
• Pointed to a related question under the correct category, but an incorrect page
• Incorrect and off topic
• Gave up: never found an answer
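Because the rater checks off exactly one category, the rubric is a mutually exclusive categorical scale. A sketch of one way to encode it for tallying (the identifier names are our shorthand, not the authors' instrument):

    from enum import Enum

    # Shorthand encoding of the six mutually exclusive accuracy categories
    class Accuracy(Enum):
        COMPLETELY_ACCURATE = "found the answer"
        WRONG_SUBSEQUENT_PATH = "correct path, but stopped short or veered off"
        MISSED_ON_PAGE = "correct page, but did not see the answer"
        RELATED_QUESTION = "related question, correct category, wrong page"
        OFF_TOPIC = "incorrect and off topic"
        GAVE_UP = "never found an answer"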
17. Performance Scoring: Efficiency
• Count the number of times the person made a new attempt, or started down a new path, by returning to the home page *after* a previous attempt away from or on the home page failed
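As a sketch, the counting rule applied to a simplified page-visit log (page names are illustrative; actual scoring was done by watching the recordings):

    # A "wrong path" is counted each time the participant returns to the
    # home page after a failed attempt away from it.
    def count_wrong_paths(page_visits, home="home", answer="answer"):
        wrong = 0
        left_home = False  # has an attempt been started yet?
        for page in page_visits:
            if page == answer:
                break  # task succeeded; stop counting
            if page == home and left_home:
                wrong += 1  # returned home after a failed attempt
            left_home = left_home or page != home
        return wrong

    # One wrong path: home -> dead end -> back to home -> answer
    print(count_wrong_paths(["home", "printing_faq", "home", "answer"]))  # 1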
18. Sample Scoring Video (bit.ly/usabilityvideo)

Site         Speed        Accuracy              Efficiency
How Do I?    46 seconds   Completely accurate   +1 (clicked 1 wrong path)
LibAnswers   36 seconds   Completely accurate   +1 (clicked 1 wrong path)
19. Performance Results

Speed (average, seconds)
LibAnswers   40.55
How Do I?    33.90

Efficiency (total wrong paths)
LibAnswers   30
How Do I?    40
20. Performance Results

Accuracy (of 200 tasks per site)                          LibAnswers    How Do I?
Completely accurate                                       182 (91%)     175 (87.5%)
Correct path, but did not go far enough or
  took a wrong subsequent path                            5 (2.5%)      15 (7.5%)
Correct page, but did not see the answer                  3 (1.5%)      3 (1.5%)
Related question under the correct category,
  but incorrect page                                      6 (3%)        3 (1.5%)
Incorrect and off topic                                   0             3 (1.5%)
Gave up: never found an answer                            4 (2%)        1 (0.5%)
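The percentages follow from 200 trials per site (20 students, 10 tasks each, on both sites); a quick tally check:

    # Sanity check: 20 students x 10 tasks = 200 trials per site
    libanswers = {"accurate": 182, "wrong_path": 5, "missed": 3,
                  "related": 6, "off_topic": 0, "gave_up": 4}
    how_do_i = {"accurate": 175, "wrong_path": 15, "missed": 3,
                "related": 3, "off_topic": 3, "gave_up": 1}
    for name, counts in [("LibAnswers", libanswers), ("How Do I?", how_do_i)]:
        assert sum(counts.values()) == 200
        print(name, {k: f"{100 * v / 200:g}%" for k, v in counts.items()})
    # e.g. How Do I? gave_up: 1/200 = 0.5%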
21. LibAnswers Features Used

Feature                  Number Who Used   Percent
Search Box               16                80%
Auto-Suggest             12                60%
Popular Answers          9                 45%
Tag Cloud                8                 40%
Related Questions        4                 20%
Change Topic Drop-down   2                 10%
Recent Answers           2                 10%
22. Satisfaction

Likely to use again   Very unlikely   Unlikely   Undecided   Likely    Very likely
LibAnswers            0               3 (15%)    5 (25%)     5 (25%)   7 (35%)
How Do I?             0               3 (15%)    3 (15%)     5 (25%)   9 (45%)
23. Satisfaction

Overall preference   Response
LibAnswers           8 (40%)
How Do I?            12 (60%)
24. Patterns
• Overall, 9 of 20 students performed worse with the site they said they preferred.
• 4 of 5 freshmen performed worse with the site they said they preferred; upperclassmen were more consistent.
• Females tended to perform better with their preferred site; males did not.
• 75% of the males preferred How Do I? over LibAnswers, while females were evenly divided.
25. LibAnswers

Likes:
• Keyword search "like a search engine"
• Auto-suggest in the search bar
• Popular topics list
• Friendly / pleasant to use
• Don't have to read through categories

Dislikes:
• Overwhelming / cluttered interface
• Long list of specific questions, but hard to find the info you want
• Less efficient than the "How Do I" page
• Once you do a search, you lose your original question
• Auto-suggestions are ambiguous or too broad, and sometimes don't function properly
26. How Do I?

Likes:
• Fast / efficient to use
• Everything is right there in front of you: "I don't have to type, just click"
• Simple, clearly laid out categories
• Organized and clean appearance

Dislikes:
• Less efficient than the LibAnswers page: have to read a lot
• Too restricted: needs a search box
• Have to guess a category to decide where to look
• Limited number of too-broad questions
• Boring / basic looking
27. Sharing results with Springshare
• Retain the question asked on the search results screen.
• Add stopwords to search, so typing "How do I" doesn't drop down a long list of irrelevant results, and "Where is" and "where are" aren't mutually exclusive (see the sketch after this list).
• Remove "related LibGuides" content to reduce clutter.
• Control the list of "related questions" below an answer: they seem to be based only on the first topic assigned to a given question.
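A minimal sketch of the stopword suggestion, assuming a simple keyword-overlap matcher; Springshare's actual search internals are not public, so the stopword list and matching logic here are illustrative only:

    # Illustrative stopword filtering for FAQ auto-suggest
    STOPWORDS = {"how", "do", "i", "where", "is", "are", "the", "a", "to"}

    def suggest(query, faqs):
        """Return FAQs sharing at least one non-stopword term with the query."""
        terms = {w for w in query.lower().split() if w not in STOPWORDS}
        if not terms:
            return []  # "How do I" alone suggests nothing instead of everything
        return [q for q in faqs if terms & set(q.lower().split())]

    faqs = ["How do I print from a laptop?", "Where are the copy machines?"]
    print(suggest("How do I", faqs))                    # []
    print(suggest("Where is the copy machine?", faqs))  # is/are mismatch no longer blocks the match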
28. Take the best of… How Do I?
29. Take the best of… LibAnswers
30. But wait… there is another.
31. Take the best of… Get Help
32. The best of all worlds
33. Conclusions
• Ended up with a balance between two extremes rather than one or the other
• Think-aloud method: gave up control; no preconceived ideas could influence the outcome
• Sitting in silence watching the participants made them nervous; next time, maybe leave the room and have a self-guided test
• Efficiency is difficult to measure: moved away from counting clicks
34. Acknowledgements
Thank you:
• Shannon Billimore
• Jennifer Masunaga
• LMU Office of Assessment / Christine Chavez
• Springshare
35. Bibliography
• Ericsson, K.A., & Simon, H.A. (1980). Verbal reports as data. Psychological Review, 87(3), 215-251.
• Norlin, E. (2002). Usability testing for library web sites: A hands-on guide. Chicago: American Library Association.
• Porter, J. (2003). Testing the three-click rule. Retrieved from http://www.uie.com/articles/three_click_rule/
• Smith, A., Magner, B., & Phelan, P. (2008, Nov. 20). Think aloud protocol part 2 [Video]. Retrieved May 3, 2012, from http://www.youtube.com/watch?v=dyQ_rtylJ3c&feature=related
• Willis, G.B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage Publications.
36. Additional Information

Presentation slides: bit.ly/gardnersimon

Contact us:
• Ken Simon, Reference & Instruction Technologies Librarian, Loyola Marymount University. Twitter: @ksimon. Email: ksimon@me.com
• Susan [Gardner] Archambault, Head of Reference & Instruction, Loyola Marymount University. Twitter: @susanLMU. Email: susan.gardner@lmu.edu
