This document summarizes a usability test of the Plymouth State Institutional Repository, conducted to evaluate how easily users can navigate the site and find the information they want. Participants were recruited from students, faculty, and the general public and asked to complete tasks such as finding out whether an image could be used in a scholarly article. The results yielded lessons about the repository's audience, its metadata, and its home page, and how each could be made more user-friendly.
VRA 2015 Usability Chenard
If You Build It, They Will Come, But Will They Come Back?
Supporting user-friendly online resources with usability testing
Usability Testing the Plymouth State Institutional Repository
Christin Chenard
Metadata Resources Library
Plymouth State University
March 12, 2015
Tasks & Script
“You’ve been conducting research pertaining to the architectural features of Plymouth State University’s campus buildings. You’ve gathered enough information to write a scholarly article and submit it to a reputable journal for publication. Where can you find out if you could get permission to use the image?”
All Images from the Plymouth State Institutional Repository
Slide 1: Rounds Hall (n.d.), Plymouth State Historical Images Collection
Slide 3: Krug, Steve (2010), Rocket Surgery Made Easy
Slide 4: YMCA Sunday afternoon meeting (1915), Brown Company Collection
Slide 6: Cook (n.d.), Brown Company Collection
Slide 7: Students on the steps of Mary Lyon Hall (n.d.), Plymouth State Historical Images Collection; 1970 Yearbook page 130 (1970), Plymouth Yearbooks Collection; Group in front of brick building (1938), Museum of the White Mountains Collection
Slide 8: Office personnel at work (1945), Brown Company Collection
Slide 9: Mary Lyon residence hall (n.d.), Plymouth State Historical Images Collection
Slide 10: Movie production ‘Paper With A Purpose’ (1957), Brown Company Collection
Slide 11: Berlin Milk Company Baseball Team (1924), Brown Company Collection
Slide 12: Luncheon at Costello Hotel (1958), Brown Company Collection
Slide 13: Berlin High School class (1956), Brown Company Collection
PSU has used the CONTENTdm platform for digital collections for nearly five years. There are currently eight published collections, originating from different departments and offices around campus.
Usability testing is well worth it, and it doesn’t have to be difficult.
The Institutional Review Board (IRB) is there for the protection of human subjects. Even though the regulations were set up to apply only to federally funded research, most academic institutions have a policy that any research involving human subjects conducted at the institution requires IRB approval.
The IRB process turned out to be a very positive one as it forced us to think all the way through the project.
The process requires completion of the Protecting Human Research Participants training course from the NIH (https://phrp.nihtraining.com/index.php) as well as completion of an application.
This shows only items 1–10 of the 20-item checklist. We spent a couple of days in July working on the application. The IRB process forced us to get organized well in advance of the testing; without this push, it would have been difficult, if not impossible, to complete the tests during the busy fall semester.
By the end of the IRB process, we had thought through and answered many questions:
What are the start and end dates?
Where will testing take place?
Who are the participants?
What are the risks to participants?
What is our purpose?
What instruments will be used to collect data?
How will we protect confidentiality?
What compensation will be given to participants?
And we had written…
a consent form.
the script that would be used in the sessions.
the tasks participants would attempt.
Upon completion of the materials for the IRB, we had thought through all of the “ingredients” we would need for a successful test. These were the ingredients with the most relevance to our project.
We identified three groups of users who we believed already used the repository or would potentially use it in the future, and chose to run the test with three people from each group. We suspected, and testing confirmed, that each group had distinct characteristics, levels of search experience, expectations, and habits.
Participant recruitment was one of our bigger challenges. In our first round, which sought public patrons, we tried approaching people in the public library and found few willing to participate. In the second round, involving students, we tried scheduling students ahead of time. While the initial request drew seemingly genuine enthusiasm, we didn’t ask anyone to commit to a particular time, and no one came. We ended up approaching students in the library, and the same strategy that had not worked for us with public patrons in the first round worked just fine with students. In our faculty round, we reached out via email to faculty members we had existing relationships with and scheduled time slots in advance; the response was good. The upshot is that different strategies work better for different groups.
We chose to test the public group in August, students in October, and faculty in December. The two-month intervals gave us time to make whatever fixes seemed appropriate based on the earlier rounds of testing.
Tests were conducted in quiet, private rooms in either the Plymouth Public or Plymouth State libraries.
Small thank you gifts were offered to participants upon completion of the tests. Usually these were tokens for a free beverage at the library café.
In the sessions, we started with some basic questions to get to know the participant and to try to put him or her at ease. For example: How many hours a week do you spend on the internet? or What is your favorite website?
We emphasized that it was the site we were testing, not them.
From there we started with a general question about the repository home page: What do you think you can do here, what is this page for?
And after that we had three specific tasks. We selected the tasks based on what we thought was important for people to be able to accomplish using the repository.
We used Camtasia to record the action on the screen as well as the audio. This allowed us to focus on making the session go smoothly and meant that we didn’t need to be overly worried about note taking. It was also helpful to have clips to show collection managers.
If at all possible, work with a co-conspirator. Benefits include having someone to share the preparation with, letting one person handle the recording while the other conducts the conversation during the sessions, and bringing two different perspectives to analyzing the results.
We rewarded our hard work by going out to lunch after each round of testing. It allowed us to talk things over while they were still fresh in our minds.
We learned:
Who the audience for our repository was, and wasn’t. The public testers did not like our IR; they called it “academic,” and not in a good way. We learned that the IR is not a resource for the general public, but a research resource that happens to be publicly available. This may affect future decisions about how it is marketed and described.
We learned that robust metadata is hugely important. People expect Google-level results sorting, and no matter how good your ranking algorithm is, if it has insufficient data to work on, the results ranking won’t be good. People also just want the backstory of the images they are seeing.
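The point about metadata feeding the ranking algorithm can be illustrated with a toy sketch (the records, field names, and scoring function below are hypothetical, not the repository's actual search; CONTENTdm's real ranking is more sophisticated, but the principle is the same: a search can only match terms that exist in the record):

```python
def score(record: dict, query: str) -> int:
    """Count how many query terms appear anywhere in the record's fields."""
    text = " ".join(str(v) for v in record.values()).lower()
    return sum(term in text for term in query.lower().split())

# Hypothetical records, loosely modeled on Dublin Core-style fields.
sparse = {"title": "Rounds Hall"}
rich = {
    "title": "Rounds Hall",
    "description": "Exterior view of Rounds Hall, a brick academic "
                   "building on the Plymouth State campus.",
    "subject": "architecture; campus buildings",
    "collection": "Plymouth State Historical Images Collection",
}

query = "campus architecture brick building"
print(score(sparse, query))  # 0 -- no query term matches the bare title
print(score(rich, query))    # 4 -- every term matches the fuller record
```

However clever the ranking, the sparse record can never surface for this query; the rich record matches on every term. That is why filling in description and subject fields pays off directly in search results.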
We learned that there were a lot of changes we’d like to make to the repository home page. Some of these were easier than others. We made some changes in between rounds of testing, but other changes we have not yet figured out how to do.