
HYDRA Usability Testing

West Virginia University Libraries

WVU Libraries used TechSmith Morae to conduct usability testing of the Clarysville Civil War Hospital Digital Collection user interface on a Dell Windows 7 laptop (Internet Explorer 11 at a browser width of 1,366 pixels) and on various smartphone devices/browsers as provided by individual users.


  1. HYDRA USABILITY TESTING
     West Virginia University Libraries
  2. USABILITY TEST INTRODUCTION
     Problem Overview: To improve and streamline the digital collection user
     experience for West Virginia University (WVU) Libraries' patrons, across
     both responsive website and web application contexts, WVU Libraries
     implemented a tab-bar-based interaction module for mobile devices on new
     HYDRA repositories. The effectiveness of this user interface change needs
     to be tested and vetted, both for institutional research and special
     collections use and because of its importance to the HYDRA UX Interest
     Group.
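The deck does not show the tab-bar module's implementation, but its core behavior can be sketched as a small state function: tapping one tab shows that panel and hides the rest. The tab names below are hypothetical examples, not the collection's actual tabs.

```javascript
// Minimal sketch of a tab-bar interaction model. Given the list of tabs
// and the one the user tapped, return which panel should be visible.
// Exactly one panel is visible at a time.
function activateTab(tabs, tapped) {
  if (!tabs.includes(tapped)) {
    throw new Error(`unknown tab: ${tapped}`);
  }
  const state = {};
  for (const tab of tabs) {
    state[tab] = tab === tapped;
  }
  return state;
}

// Hypothetical tab names for illustration only.
const tabs = ['Search', 'Browse', 'About'];
console.log(activateTab(tabs, 'Browse'));
// -> { Search: false, Browse: true, About: false }
```

In a real page this state map would drive CSS classes on the tab-bar buttons and their panels; keeping the visibility logic separate from the DOM makes it easy to unit test.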
  3. USABILITY TEST STRATEGY
     Testing Parameters: WVU Libraries used TechSmith Morae to conduct
     usability testing of the Clarysville Civil War Hospital Digital
     Collection user interface on a Dell Windows 7 laptop (Internet Explorer
     11 at a browser width of 1,366 pixels) and on various smartphone
     devices/browsers as provided by individual users. These devices were
     chosen based on a report generated from 37,296 visits to websites using
     W3Counter's free web stats. According to that report, 93% of browsing
     occurs at a width of 1,366 pixels or less, with the largest single share
     (17%) at the 1,366 x 768 laptop resolution, so both laptops and various
     smartphones were identified as optimal testing devices.
  4. USABILITY TEST TARGET AUDIENCE
     Target Audience: The breakdown of target audiences is based on numbers
     from the WVU Office of the University Registrar, qualitative survey
     polls, and quantitative server statistics. This round of usability
     testing focused on WVU Libraries' primary target audience, recruited
     internally from WVU Libraries' undergraduate student employees.
     Primary Target Audience (73% / 107,062 students per month): WVU
     undergraduate students, 48% female and 52% male, primarily Caucasian,
     African American, Hispanic, or Asian in ethnicity, United States
     citizens or international students, averaging 21 years of age, with 49%
     from West Virginia and 51% from other US states, territories, or
     countries.
     Secondary Target Audience (19% / 27,865 students per month): WVU
     graduate students, 41% male and 59% female, 83% US citizens and 14%
     nonimmigrants (mostly from India or the People's Republic of China),
     averaging 31 years of age.
     Tertiary Target Audience (8% / 11,733 students per month): WVU
     administration, faculty, staff, and Morgantown, West Virginia community
     members.
  5. USABILITY TEST PARTICIPANTS
     Alex, Chris, Colin, Devyn, Erica, Kayla, Ryan, Taija
  6. USABILITY TEST FULL VIDEOS
     https://testing.lib.wvu.edu/hydra/alex_m.wmv
     https://testing.lib.wvu.edu/hydra/chris_d.wmv
     https://testing.lib.wvu.edu/hydra/chris_m.wmv
     https://testing.lib.wvu.edu/hydra/colin_d.wmv
     https://testing.lib.wvu.edu/hydra/devyn_d.wmv
     https://testing.lib.wvu.edu/hydra/erica_d.wmv
     https://testing.lib.wvu.edu/hydra/kayla_d.wmv
     https://testing.lib.wvu.edu/hydra/kayla_m.wmv
     https://testing.lib.wvu.edu/hydra/ryan_m.wmv
     https://testing.lib.wvu.edu/hydra/taija_m.wmv
  7. USABILITY TEST TASKS
     1. Go to the Clarysville Civil War Hospital Digital Collection.
     2. How many records are there for people from Ohio who were treated for
        gunshot wounds? (14)
     3. What was the last name of the only record for a man with the first
        name Anthony?
     4. How many records are there for Corporals with typhoid fever from
        Pennsylvania? (3)
     5. What was the first name on the record of a man from Kentucky with
        the last name Smith?
     6. On the record of Matthew Wilson, was he married?
     7. Who was the only Orderly treated at Clarysville Hospital?
     8. For the record of Michael Weaver, in what town did his relatives
        live?
     9. When was the Clarysville Hospital established?
     10. What is the age of the sergeant from New York who was married?
  8. OVERALL TASK: AVERAGE TIME (Desktop)
     [Chart: average completion time per task on desktop]
     * Average time includes delivery of tasks.
  9. OVERALL TASK: AVERAGE TIME (Mobile)
     [Chart: average completion time per task on mobile]
     * Average time includes delivery of tasks.
  10. OVERALL TASK: AVERAGE TIME (Comparison)
      [Chart: average time (minutes) per task, Tasks 1-10, desktop vs.
      mobile]
      * Average time includes delivery of tasks.
  11. OVERALL TASK: SUCCESS RATE (Desktop)
      [Chart: per-task success rates on desktop; categories: failed to
      complete, completed with difficulty, completed, completed with ease]
  12. OVERALL TASK: SUCCESS RATE (Mobile)
      [Chart: per-task success rates on mobile; categories: failed to
      complete, completed with difficulty, completed, completed with ease]
  13. OVERALL TASK: SUCCESS RATE (Comparison)
      [Chart: desktop vs. mobile success rates for Tasks 1-10; categories:
      failed to complete, completed with difficulty, completed, completed
      with ease]
  14. USABILITY TEST POST-ASSESSMENT
      1. I think that I would like to use this system frequently.
      2. I found the system unnecessarily complex.
      3. I thought that the system was easy to use.
      4. I think that I would need the support of a technical person to be
         able to use this system.
      5. I found the various functions in this system were well integrated.
      6. I thought there was too much inconsistency in this system.
      7. I would imagine that most people would learn to use this system
         very quickly.
      8. I found the system very cumbersome to use.
      9. I felt very confident using the system.
      10. I needed to learn a lot of things before I could get going with
          this system.
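The ten statements above match the standard System Usability Scale (SUS) questionnaire, though the deck does not name it or report a numeric score. For reference, this sketch shows the conventional SUS scoring: each item is answered 1-5, odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the sum is scaled by 2.5 to give a 0-100 score.

```javascript
// Conventional SUS scoring for ten 1-5 Likert responses.
function susScore(responses) {
  if (responses.length !== 10) {
    throw new Error('SUS needs exactly 10 responses');
  }
  let sum = 0;
  responses.forEach((r, i) => {
    // i = 0 is item 1 (odd-numbered, positive); i = 1 is item 2 (negative).
    sum += i % 2 === 0 ? r - 1 : 5 - r;
  });
  return sum * 2.5;
}

// Illustrative response set only, not data from this study.
console.log(susScore([4, 2, 4, 1, 4, 2, 5, 2, 4, 2])); // -> 80
```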
  15. POST-ASSESSMENT RESULTS (Desktop)
      [Chart: Likert responses (Strongly Disagree to Strongly Agree) for the
      ten post-assessment statements, desktop participants]
  16. POST-ASSESSMENT RESULTS (Mobile)
      [Chart: Likert responses (Strongly Disagree to Strongly Agree) for the
      ten post-assessment statements, mobile participants]
  17. POST-ASSESSMENT RESULTS (Combined)
      [Charts: desktop vs. mobile responses for "I think that I would like
      to use this system frequently" and "I found the system unnecessarily
      complex"]
  18. POST-ASSESSMENT RESULTS (Combined)
      [Charts: desktop vs. mobile responses for "I think that I would need
      the support of a technical person to be able to use this system" and
      "I thought that the system was easy to use"]
  19. POST-ASSESSMENT RESULTS (Combined)
      [Charts: desktop vs. mobile responses for "I found the various
      functions in this system were well integrated" and "I thought there
      was too much inconsistency in this system"]
  20. POST-ASSESSMENT RESULTS (Combined)
      [Charts: desktop vs. mobile responses for "I would imagine that most
      people would learn to use this system very quickly" and "I found the
      system very cumbersome to use"]
  21. POST-ASSESSMENT RESULTS (Combined)
      [Charts: desktop vs. mobile responses for "I felt very confident using
      the system" and "I needed to learn a lot of things before I could get
      going with this system"]
  22. FINDINGS: COMPLETION TIMES
      - 1 out of 10 tasks' average completion time was longer on a mobile
        device than on desktop.
      - 8 out of 10 tasks' average completion times were equal to or shorter
        on a mobile device than on desktop.
      - 5 out of 10 tasks' average completion times were shorter on a mobile
        device than on desktop.
      ON A MOBILE DEVICE: 50% of tasks' completion times were 9 to 41
      seconds shorter, averaging 20.8 seconds faster.
  23. FINDINGS: SUCCESS RATES
      - 2 out of 10 tasks' overall success rates were lower on a mobile
        device than on desktop.
      - 6 out of 10 tasks' overall success rates were equal to or improved
        on a mobile device versus desktop.
      - 5 out of 10 tasks' overall success rates were improved on a mobile
        device versus desktop.
      ON A MOBILE DEVICE: 50% of tasks' overall success rates improved by
      20% to 80%, 'completed with ease' rates increased by 40%, and 'failed
      to complete' rates decreased by 20%.
  24. FINDINGS: USER ASSESSMENT
      - 1 in 5 more mobile device users strongly disagreed that they found
        the system complex.
      - 1 in 5 more mobile device users rated inconsistency in the system as
        not applicable.
      - 1 in 5 more mobile device users rated confidence in using the system
        as not applicable.
      - 1 in 5 more mobile device users rated needing to learn the system as
        not applicable.
      ON A MOBILE DEVICE: User assessment improved by 20% with regard to
      complexity, learning, and consistency.
  25. PROPOSED ENHANCEMENTS
      - 100% of users don't scroll past the media in the document view. Move
        all full-record text above or beside the media.
      - 80% of desktop users found the collection through the library
        website, whereas 40% used voice search on their smartphones.
        Eliminate intermediate pages and link to the collection directly.
      - 60% of users think Blacklight options are menu options. Make the
        Blacklight header tabs take up less space and fit on one line in
        mobile browsers.
      - 40% of users thought the WVRHC/Library navigation header was the
        collection menu. Reduce branding and headers to be minimal and take
        up less space.
      - Add web application hints for searching and faceting on the
        collection's first load, then disable the hints with browser cookies
        for return visitors.
      - Incorporate web application hints explaining that facets can come
        from both accordions and search terms, and that users can further
        refine results after searching.
      - Test and fix JavaScript so menus, tabs, and search boxes
        collapse/expand on search, selection, and page load.
      - Add the word 'Menu' next to the hamburger icon to increase menu
        visibility.
      - Increase the size of the 'X's used to remove facets on mobile
        devices; users are struggling with the small selection areas.
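The cookie-gated hint enhancement above can be sketched in a few lines: show the hints only when a marker cookie is absent, and set it once the user dismisses them. The cookie name and lifetime here are assumptions for illustration, not values from the deck.

```javascript
// Hypothetical cookie name marking that a visitor has already seen the
// search/facet hints.
const HINT_COOKIE = 'hydra_hints_seen';

// Decide whether to show first-visit hints, given a cookie string in the
// same "name=value; name=value" form as document.cookie.
function shouldShowHints(cookieString) {
  return !cookieString
    .split(';')
    .some((c) => c.trim().split('=')[0] === HINT_COOKIE);
}

// Called when the user dismisses the hints: persist the marker for a year
// so return visitors skip them. (Browser-only; document is not available
// outside the page.)
function dismissHints() {
  document.cookie =
    `${HINT_COOKIE}=1; max-age=${60 * 60 * 24 * 365}; path=/`;
}

console.log(shouldShowHints(''));                    // -> true
console.log(shouldShowHints('hydra_hints_seen=1'));  // -> false
```

Keeping the cookie check as a pure function of the cookie string (rather than reading `document.cookie` directly) makes the first-visit logic testable outside a browser.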
  26. WVU LIBRARIES AND HYDRA
  27. RESPONSIVE DIGITAL REPOSITORIES
  28. MULTI-LAYERED HYDRA HEADS
  29. TWO-PART AUTOMATED INSTALLATION
  30. ADDITIONAL INFORMATION
      Michael Bond, Senior Software Engineer
      Michael.Bond@mail.wvu.edu, 304-293-0800
      Tim Broadwater, UX Designer / Front-End Developer
      Timothy.Broadwater@mail.wvu.edu, 304-293-0800
      Other Digital Repositories:
      - https://civilwarwv.lib.wvu.edu
      - https://storercollege.lib.wvu.edu
      - https://clarysville.lib.wvu.edu
      - https://holt.lib.wvu.edu
      - https://rockefeller.lib.wvu.edu
  31. THANK YOU
