NCSU Libraries Usability Testing


  • Several participants rated their comfort level on a 1-10 scale, all rating themselves at 7-8. We see these same "most frequent uses" play out across both of the tests we ran.
  • Menus were learnable: a number of users scanned menus to orient themselves to the options before making selections, and went back to easily find the correct links. Find seems to resonate as a broader term than Search for this user group; many started there when asked to find almost anything. Users seemed to quickly develop a mental model for "Find" and moved on to explore other menus. Services worked well as a catch-all; many users seemed to understand the concept of Services.
  • Users seemed to quickly adopt a mental model for Find.
  • In general, users seemed to understand the nature of the Services category. Some users started here for things that were on the Research Help menu, but quickly adapted their understanding of the Services menu.
  • Task 9: You've written a paper but need to make sure the citations are correctly formatted. How would you locate the tool that can help you with this? While users in this test did not identify the Citation Builder label, we scored the task as a success if they identified the Citation Tools tab, because Citation Builder would be listed prominently on the target page. However, because we know from user interviews and search logs that Citation Builder has strong name recognition, we recommend retaining the trigger words in the sub-link on this menu.
  • There were only two tasks that required use of the About menu, because we assumed that navigation to core applications and functions on the Web site is covered by the other three menus, so we focused our testing efforts there. We will be doing a card sort to further test menu composition, including how users group items within menus.
  • Task #8: 8 of 10 participants selected the About menu for locating maps, but only 1 located the View Library Maps link in the bottom section of the menu. Links in the bottom section of the menus may not be visible. Most people went to D.H. Hill Library for maps, which may indicate that users expect to find wayfinding information organized by location. This could be a design issue; we should test again after the design is applied, across multiple menus. If this is still a problem, consider incorporating these links into the menu lists. Task #10: 4 of 8 participants correctly identified the Find → Databases link. The remaining participants' confusion about how to locate databases/article search tools highlights the perpetual problem libraries have in describing and providing access to article databases. We found related problems in our search testing.
  • Task #11: We were looking for Tripsaver name recognition here. 4 of 9 participants found Tripsaver readily, though 1 user indicated that he would normally get to Tripsaver via the catalog. Of the remaining 5, 1 chose Find → Course Reserves and the other 4 went to Services → Borrow, Renew, Request. 8 of 9 users went to the Services menu to find interlibrary loan. Tripsaver should be featured on the Borrow, Renew, Request target page. Task #13: 6 of 8 users did not see the Tripsaver link in the 2nd column of the Find menu, likely because the trigger word Tripsaver was not in the task. Note that 8 of 9 users went to the Services menu.
  • Task #14: Only 2 participants selected Research Help → Course Tools for this task. Users seemed to expect to find all course-related information under Course Reserves. Promote the link to Library Tools for Courses on the Course Reserves target page. Continue to evaluate placement and labeling for Library Tools for Courses.
  • Carefully consider, and if possible test specific links or labels, before prominently featuring links to services that are primarily used in the physical space. Most participants focused on the "help" aspect of the Research Help label, but a couple expected to find research tools or databases here. This label has been changed to Get Help in later iterations to clarify the intent of the menu. More testing will be needed to make sure that the Citation Tools and Course Tools labels still work on a Get Help menu.
  • Tripsaver link: in determining how prominently to feature Tripsaver, consider how often users might look for Tripsaver outside the context of catalog searches. Our interview research indicated good name recognition for Tripsaver, but most users found it at the point of need during catalog searches.
  • What were our research questions starting out? What were our assumptions? Some assumptions: users would stay on the 'All' tab; users would switch between tabs in the search results.
  • What models were we testing? Model 1: a tabbed search box interface with tabbed search results, where the user could switch between search result tabs and the search would re-execute within the target silo. Model 3: a tabbed search box interface with non-tabbed search results, where the user would remain in a silo. To re-execute the search in a different silo, the user would need to click the 'Home' button or the browser back button to return to the tabbed search box interface.
  • We had a high response rate doing guerrilla testing in the lobby. 14 participants performed 46 tasks using Model 1 and 14 participants performed 38 tasks using Model 3.
  • Who were the participants? More than 19 departments across 3 broad fields were represented in a pool of 28 participants.
  • What did our users use on the current Web site? Research activities: finding journal articles and books (17/28). Service-based activities: booking study rooms, borrowing and renewing laptops (10/28). Course Reserves (2/28).
  • We asked users: if you could improve one thing about the Web site, what would it be? Reducing clutter was a common response, as was providing better access to the room reservation system and laptop borrowing and renewal. Two participants mentioned advanced search (a way to get to advanced search from the homepage).
  • We segmented the results into 3 high-level observations and then outlined some of the problem areas in the testing. The first observation was that in well over half of the tasks in both search models, participants selected a search tab before beginning their search task. As we'll see later on, this did not necessarily predict success in the task. In 60 tasks, users selected a tab before searching; in only 24 tasks did users stay on the 'All' tab. Users tended to stay within silos; very rarely did they search in both 'All' and a silo.
  • In over half of the tasks where users selected a tab, they selected the tab that was appropriate for that search.
  • What we don't know is whether, when users refined their searches in a silo, they were expecting that the search was executed in the same silo. This gets at whether or not the interface is learnable. Did the user not understand (in the case of Model 1) that the search would carry through to the various silos?
  • What worked for our users? For books, journal titles, and services tasks, most users went to the correct tab or searched in All; i.e., users used the 'Web site' and 'Books & Media' tabs appropriately. Trigger words in the questions directed people towards the correct tab. Again, selecting the correct tab did not predict whether the user was successful in the search task.
  • Journal articles and databases questions were problematic. In many cases for the database question, users mentioned that they were looking for a database tab or link. They understood the term databases, but were confused about where to go to search for a database in either Model 1 or Model 3. Users were looking for the trigger word databases.
  • The following chart outlines task difficulty across models and across tasks. For the DVD question, results could be skewed, as several participants commented that they would use the facets in the catalog to narrow by format type. Almost all users scanned results for the format 'DVD' rather than typing 'DVD' into their search. The photocopy question (#3) scored low in Model 3 because users scanned results so quickly that they often missed the link to photocopies; also, if they were using the Web site search, best bets were not part of it. In many cases, the correct tab was chosen, but either the search term was poorly constructed or users scanned results so quickly that they ended up failing the task.
  • Task difficulty charted. Success rates for tasks 6, 7 & 9 were quite different between the 2 models; data from each of those tasks are below. Looking back at observations from the study itself, search terms varied widely between the 2 models for these questions. For question 6, general confusion about databases led to failed searches. For tasks 7 & 9, users tended to scan search results so quickly that they quite often missed the correct answer in the search results.
    Task 6 (Web of Science database):
      Model 1 ratings: Fail, Fail, Fail, Hard, Easy — tabs used: All, Articles, Articles, Articles, Library Website
      Model 3 ratings: Hard, Easy, Fail, Fail — tabs used: All, All, Articles, Journals
    Task 7 (Lessig article):
      Model 1 ratings: Med, Fail, Fail, Fail — tabs used: Articles, All, Journals, Journals
      Model 3 ratings: Med, Easy, Fail, Easy — tabs used: Articles, Articles, Journals, Articles
    Task 9 (Borrow a laptop):
      Model 1 ratings: Hard, Med, Easy, Fail — tabs used: Website, Website, Website, Books
      Model 3 ratings: Easy, Easy, Easy — tabs used: All, Website, Website
  • We also segmented the data by user status. In the case of undergraduates, their average task difficulty rating was 2.5.
  • Issues in the test itself: scenarios were too difficult to read aloud in a guerrilla usability testing setting. Tasks should be shorter and more precise.
  • Journals / journal titles will appeal to faculty, a group we did not include in this usability test. Databases will be visibly highlighted on the new homepage design.
  • There’s nothing to indicate that search tabs hindered the users’ search. Build 2 prototypes for the new site – one with a tabbed search box and one with a single search box and links to silos. Conduct a second usability test to determine effectiveness.

    1. NCSU Libraries Usability Testing
       April, 2010
       Angie Ballard & Susan Teague Rector
    2. Navigation Testing
    3. Research question
       Do users navigate using the expected menu items?
    4. Background
       Users were recruited in situ in the D.H. Hill Library lobby and asked to complete 4 of 15 tasks using a working prototype of navigation menus only.
       Each task required the user to open a navigation menu and indicate which menu item they would select to look for the specified information.
       Facilitators recorded up to 4 of each user's menu selections, in order of selection.
       A facilitator assessment of task difficulty was also recorded.
    5. Who were the participants?
       24 Undergraduates, 3 Graduates, 2 Staff, 1 Visitor, 2 Library Staff Members
       Business: Management, Agri-Business
       Science & Engineering: Biology, Textiles, Computer Networking, Environmental Engineering, Environmental Technology, Biochemistry
       Humanities/Social Sciences: International Studies, Anthropology, Creative Writing, History Education, Library Science, 1st Year College
    6. Use of the Web site
       Nearly all participants described themselves as "pretty comfortable" or "very comfortable" with the current site.
       Users reported that they used the current site most frequently for: Research (finding articles/books), Reserving Rooms, Reserving/Renewing Laptops, Course Reserves
    7. IF YOU COULD IMPROVE ONE THING…
       Most users expressed general satisfaction with the current site or described positive experiences.
       Three users said the current site seems cluttered.
    8. Observations
       Menus were learnable. Find resonated as a broader term than Search. The Services menu worked well as a catch-all. Research Help was the most ambiguous. About was generally used as expected.
    9. What worked well: Find
       Find was very learnable. Users readily used the Books & Media label to search for DVDs.
    10. What worked well: Services
       Scan/Copy/Print worked for locating photocopy pricing. The Digital Media Lab (DML) had good name and function recognition.
    11. What worked well: Research Help
       Citation Tools label: users easily found the citation tools here, but few saw the Citation Builder sub-link.
    12. What worked well: About
       Hours: users did not hesitate to look for hours information on the About menu.
    13. Problem Areas
    14. Problem Areas
    15. Problem Areas
    16. Task Difficulty
    17. Other Observations
       Some users insisted that they would not use the website to complete certain tasks: photocopy cost; flash drive loans; ask a librarian; info on video editing.
       When asked what they would expect to see under a Research Help menu (n=6), 4 mentioned chat, tutorials, citation help, and/or research guides; 2 expected to find links to resources or search tools.
    18. Recommendations
       Feature a Tripsaver link on the Borrow, Renew, Request target page.
       Make access to Databases prominent on the homepage for those users who identify the term.
       Change the Research Help label to Get Help to better reflect the intent of the menu.
       Test to see if Citation Tools & Course Tools still work.
    19. Recommendations
       Promote the link to Library Tools for Courses on the Course Reserves target page.
       Continue to evaluate placement and labeling for Library Tools for Courses.
       Organize maps/wayfinding information by location.
       Include maps links on target pages for specific libraries/locations.
    20. Recommendations
       Make links to these top tasks easy to find on the homepage: Reserve a Study Room, Renew Laptop, Course Reserves.
    21. Search Testing
    22. Research Questions
       Do users pre-select tabs in a tabbed search box before entering search terms?
       How do users interact with tabbed or non-tabbed search results?
    23. Search Models
       Model 1: Tabbed Search Results. Model 3: Non-Tabbed Search Results.
    24. Background
       28 undergraduates, graduates, and library staff participated in a round of guerrilla usability testing of 2 proposed search models.
       Participants were recruited from the pool of patrons who walked into the lobby of D.H. Hill.
       Each participant was asked to complete 2 tasks using one of the search models; many participants volunteered to answer more than 2 questions.
       14 participants performed 46 tasks using Model 1; 14 participants performed 38 tasks using Model 3.
    25. Who Were the Participants?
       22 Undergraduates, 3 Graduates, 2 Library Staff Members, 1 Non-Traditional Student
       Science/Engineering: Aerospace Engineering, Animal Science/Biology, Biology, Biomedical Engineering, Chemistry, Civil Engineering, Computer Science, Electrical Engineering, Food Science, Human Biology/Nutritional Science, Integrated Management & Systems, Meteorology, Textiles Engineering
       Humanities: English/Creative Writing, History, Psychology
       Business: Business, Business Administration
    26. Use of the Current Web site
       Finding Journal Articles & Books; Booking Study Rooms; Laptop Borrowing & Renewal; Course Reserves
    27. IF YOU COULD IMPROVE ONE THING…
       Highlight reservations (room, laptop) and renewal options. Make the site less cluttered. Provide advanced search.
    28. Observations
       In both search models, in well over half the tasks, participants pre-selected a search tab before beginning their search.
    29. Observations
       Across both search models, in 45 of 60 tasks in which users selected a tab, they selected the appropriate tab for that task.
    30. Observations
       In both models, users rarely switched silos in the search results once in a silo. Users tended to stay in the same silo and refine searches within it.
    31. WHAT WORKED
    32. Problem Areas
    33. Task Difficulty
    34. Task Difficulty
    35. Task Difficulty by Status
    36. Other Issues
       Almost all participants scanned only the first page of search results for a quick answer to the task.
       Even if users executed a successful search, they often did not scroll down far enough in the search results to find the answer.
       Spelling was problematic.
    37. Recommendations
       Although there was confusion with the Journals and Articles tabs, consider keeping both tabs to accommodate both novice and advanced users. Consider changing the 'Journals' tab label to 'Journal Titles.'
       Look to the redesign team to better highlight an option for databases on the homepage. Do not add a tab for databases, as this could further highlight confusion among databases, articles & journals.
    38. Recommendations
       Consider implementing a tabbed search box in the new design.
       Conduct more usability testing in the context of the new site design.
    39. Next Steps
       Take recommendations from usability testing and incorporate them into homepage wireframes.
       Build 2 search prototypes in the context of the new Web site homepage: one with a tabbed interface and one with a single search box with links to silos.
       Conduct more usability testing on the working homepage prototypes.
    40. Thank you! Questions?