Several participants rated their comfort level on a 1-10 scale, all rating themselves at 7-8. We see these same "most frequent uses" play out across both of the tests we conducted.
Menus were learnable: a number of users scanned menus to orient themselves to the options before making selections, and easily went back to find the correct links. Find seems to resonate as a broader term than Search for this user group; many started there when asked to find almost anything. Users seemed to quickly develop a mental model for "Find" and moved on to explore other menus. Services worked well as a catch-all; many users seemed to understand the concept of Services.
Users seemed to quickly adopt a mental model for Find.
In general, users seemed to understand the nature of the Services category. Some users started here for items that were on the Research Help menu, but quickly adjusted their understanding of the Services menu.
Task 9: You've written a paper but need to make sure the citations are correctly formatted. How would you locate the tool that can help you with this? While users in this test did not identify the Citation Builder label, we scored the task as a success if they identified the Citation Tools tab, because Citation Builder would be listed prominently on the target page. However, because we know from user interviews and search logs that Citation Builder has strong name recognition, we recommend retaining the trigger words in the sub-link on this menu.
There were only two tasks that required use of the About menu: we assumed that navigation to core applications and functions on the Web site is covered by the other three menus, so we focused our testing efforts there. We will be doing a card sort to further test menu composition, including how users group items within menus.
Task #8: 8 of 10 participants selected the About menu for locating maps, but only 1 located the View Library Maps link in the bottom section of the menu.
-- Links in the bottom section of the menus may not be visible.
-- Most people went to D.H. Hill Library for maps, which may indicate that users expect to find wayfinding information organized by location.
-- This could be a design issue; we should test again after the design is applied, across multiple menus. If this is still a problem, consider incorporating these links into the menu lists.
Task #10:
-- 4 of 8 participants correctly identified the Find Databases link.
-- The remaining participants' confusion about how to locate databases/article search tools highlights the perpetual problem libraries have in describing and providing access to article databases. We found related problems in our search testing.
Task #11: We were looking for Tripsaver name recognition here.
-- 4 of 9 participants found Tripsaver readily, though 1 user indicated that he would normally get to Tripsaver via the catalog.
-- Of the remaining 5: 1 chose Find > Course Reserves; the other 4 went to Services > Borrow, Renew, Request.
-- 8 of 9 users went to the Services menu to find Interlibrary Loan.
-- Tripsaver should be featured on the Borrow, Renew, Request target page.
Task #13: 6 of 8 users did not see the Tripsaver link in the 2nd column of the Find menu, likely because the trigger word Tripsaver was not in the task. Note that 8 of 9 users went to the Services menu.
Task #14: Only 2 participants selected Research Help > Course Tools for this task. Users seemed to expect to find all course-related information under Course Reserves. Promote a link to Library Tools for Courses on the Course Reserves target page. Continue to evaluate placement and labeling for Library Tools for Courses.
Carefully consider, and if possible test specific links or labels, before prominently featuring links to services that are primarily used in the physical space. Most participants focused on the "help" aspect of the Research Help label, but a couple expected to find research tools or databases here. This label has been changed to Get Help in later iterations to clarify the intent of the menu. More testing will be needed to make sure that the Citation Tools and Course Tools labels still work on a Get Help menu.
Tripsaver link: in determining how prominently to feature Tripsaver, consider how often users might look for it outside the context of catalog searches. Our interview research indicated good name recognition for Tripsaver, but most users found it at the point of need during catalog searches.
What were our research questions starting out? What were our assumptions?
Some assumptions:
-- Users would stay on the 'All' tab
-- Users would switch between tabs in the search results
What models were we testing?
Model 1: Tabbed search box interface with tabbed search results, where the user could switch between search results and the search would re-execute within the target silo.
Model 3: Tabbed search box interface with non-tabbed search results, where the user could remain in a silo. To re-execute the search in a different silo, the user would need to click the 'Home' button or the browser back button to return to the tabbed search box interface.
We had a high response rate doing guerrilla testing in the lobby. 14 participants performed 46 tasks using Model 1, and 14 participants performed 38 tasks using Model 3.
Who were the participants? Over 19 departments across 3 broad fields were represented in a pool of 28 participants.
What did our users use on the current Web site?
-- Research activities: finding journal articles and books (17/28)
-- Service-based activities: booking study rooms, borrowing and renewing laptops (10/28)
-- Course Reserves (2/28)
We asked users: if you could improve one thing about the Web site, what would it be?
-- Reducing clutter was a common response.
-- Better access to the room reservation system and laptop borrowing/renewal.
-- Two participants mentioned advanced search (a way to get to advanced search from the homepage).
We segmented the results into 3 high-level observations and then outlined some of the problem areas in the testing. The first observation was that in well over half of the tasks in both search models, participants selected a search tab before beginning their search task. As we'll see later on, this did not necessarily predict success in the task. In 60 tasks, users selected a tab before searching; in only 24 tasks did users stay on the 'All' tab. Users tended to stay within silos; very rarely did they search in both 'All' and a silo.
In over half of the tasks where users selected a tab, they selected the tab that was appropriate for that search.
What we don't know is whether, when users refined their searches in a silo, they expected the search to be executed in that same silo. This gets at whether or not the interface is learnable. Did the user not understand (in the case of Model 1) that the search would carry through to the various silos?
What worked for our users? For books, journal titles, and services tasks, most users went to the correct tab or searched in All; i.e., users used the 'Web site' and 'Books & Media' tabs appropriately. Trigger words in the questions directed people toward the correct tab. Again, selecting the correct tab did not predict whether the user was successful in the search task.
Journal articles and databases questions were problematic. In many cases for the database question, users mentioned that they were looking for a database tab or link. They understood the term databases, but were confused about where to go to search for a database in either Model 1 or Model 3. Users were looking for the trigger word databases.
The following chart outlines task difficulty across models and tasks. For the DVD question, results could be skewed, as several participants commented that they would use the facets in the catalog to narrow by format type. Almost all users scanned results for the format 'DVD' rather than typing 'DVD' into their search. The photocopy question (#3) scored low in Model 3 because users scanned results so quickly that they often missed the link to photocopies; also, if they were using the Web site search, best bets were not part of this model. In many cases, the correct tab was chosen, but either the search term was incorrectly constructed or users scanned results so quickly that they ended up failing the task.
Task difficulty charted. Success rates for tasks 6, 7 & 9 were quite different between the 2 models. Data from each of those tasks are below. Looking back at observations from the study itself, search terms were quite varied between the 2 models for these questions. For question 6, general confusion about databases led to failed searches. For tasks 7 & 9, users tended to scan search results so quickly that they quite often missed the correct answer in the search results.
Task 6: Web of Science database
-- Model 1: Fail (All), Fail (Articles), Fail (Articles), Hard (Articles), Easy (Library Website)
-- Model 3: Hard (All), Easy (All), Fail (Articles), Fail (Journals)
Task 7: Lessig article
-- Model 1: Med (Articles), Fail (All), Fail (Journals), Fail (Journals)
-- Model 3: Med (Articles), Easy (Articles), Fail (Journals), Easy (Articles)
Task 9: Borrow a laptop
-- Model 1: Hard (Website), Med (Website), Easy (Website), Fail (Books)
-- Model 3: Easy (All), Easy (Website), Easy (Website)
We also segmented the data by user status. In the case of undergraduates, their average task difficulty rating was 2.5.
Issues in the test itself: scenarios were too difficult to read aloud in a guerrilla usability testing setting. Tasks should be shorter and more precise.
Journals / journal titles will appeal to faculty, a group we did not test in this round. Databases will be visibly highlighted on the new homepage design.
There's nothing to indicate that the search tabs hindered users' searches. Build 2 prototypes for the new site: one with a tabbed search box and one with a single search box and links to silos. Conduct a second usability test to determine effectiveness.
Research question<br />Do users navigate using the expected menu items?<br />
Background<br />Users were recruited in situ in the D.H. Hill Library lobby and asked to complete 4 of 15 tasks using a working prototype of navigation menus only.<br />Each task required the user to open a navigation menu and indicate which menu item they would select to look for the specified information.<br />Facilitators recorded up to 4 of each user’s menu selections, in order of selection.<br />A facilitator assessment of task difficulty was also recorded.<br />
Who were the participants?<br />24 Undergraduates<br />3 Graduates<br />2 Staff<br />1 Visitor<br />2 Library Staff Members<br />Business<br /><ul><li>Management</li></ul>
Use of the Web site<br />Nearly all participants described themselves as “pretty comfortable” or “very comfortable” with the current site.<br />Users reported that they used the current site most frequently for:<br />Research (finding articles/books)<br />Reserving Rooms<br />Reserving/Renewing laptops<br />Course Reserves<br />
IF YOU COULD IMPROVE ONE THING…<br />Most users expressed general satisfaction with the current site or described positive experiences. <br />Three users said the current site seems cluttered.<br />
Observations<br />Menus were learnable<br />Find resonated as a broader term than Search<br />Services menu worked well as a catch-all<br />Research Help was the most ambiguous<br />About was generally used as expected<br />
What worked well: Find<br />Find was very learnable<br />Books & Media label<br />Users readily used this to search for DVDs<br />
What worked well: Services<br />Scan/Copy/Print<br />for locating photocopy pricing<br />Digital Media Lab<br />DML had good name and function recognition. <br />
What worked well: Research Help<br />Citation Tools label<br />Users easily found the citation tools here, but few saw the Citation Builder sub-link.<br />
What worked well: About<br />Hours<br />Users did not hesitate to look for hours information on the About menu.<br />
Other Observations<br />Some users insisted that they would not use the website to complete certain tasks:<br />photocopy cost; flash drive loans; ask a librarian; info on video editing.<br />When asked what they would expect to see under a Research Help menu (n=6):<br />4 mentioned chat, tutorials, citation help, and/or research guides<br />2 expected to find links to resources or search tools.<br />
Recommendations<br />Feature a Tripsaver link on the Borrow, Renew, Request target page.<br />Make access to Databases prominent on the homepage for those users who identify the term.<br />Change the Research Help label to Get Help to better reflect the intent of the menu. <br />Test to see if the Citation Tools & Course Tools labels still work.<br />
Recommendations<br />Promote a link to Library Tools for Courses on the Course Reserves target page. <br />Continue to evaluate placement and labeling for Library Tools for Courses.<br />Organize maps/wayfinding information by location.<br />Include maps links on target pages for specific libraries/locations.<br />
Recommendations<br />Make links to these top tasks easy to find on the homepage:<br />Reserve a Study Room<br />Renew a laptop<br />Course Reserves<br />
Background<br />28 undergraduates, graduates, and library staff participated in a round of guerrilla usability testing of 2 proposed search models.<br />Participants were recruited from the pool of patrons who walked into the lobby of D.H. Hill.<br /> <br />Each participant was asked to complete 2 tasks using one of the search models; many participants volunteered to answer more than 2 questions.<br />14 participants performed 46 tasks using Model 1. <br />14 participants performed 38 tasks using Model 3.<br />
Who Were the Participants?<br /><ul><li>Science/Engineering:</li></ul>
Other Issues<br />Almost all participants scanned only the first page of search results for a quick answer to the task.<br />Even if users executed a successful search, they often did not scroll down far enough in the search results to find the answer. <br />Spelling was problematic.<br />
Recommendations<br />Although there was confusion with the Journals and Articles tabs, consider keeping both tabs to accommodate both novice and advanced users. Consider changing the ‘Journals’ tab label to ‘Journal Titles.’<br />Look to the redesign team to better highlight an option for databases on the homepage. Do not add a tab for databases as this could further highlight issues between databases, articles & journals. <br />
Recommendations<br />Consider implementing a tabbed search box in the new design. <br />Conduct more usability testing in the context of the new site design. <br />
Next Steps<br />Take recommendations from usability testing and incorporate them into homepage wireframes.<br />Build 2 search prototypes in the context of the new Web site homepage: one with a tabbed interface and one with a single search box with links to silos. <br />Conduct more usability testing on the working homepage prototypes.<br />