Barber Library Website Usability Results, Fall 2012

Results of a library website usability study.

  • A little background
  • Effectiveness is the ability of users to complete tasks using the system and the quality of the output of those tasks. Efficiency is the level of resources consumed in performing those tasks. Satisfaction is users’ subjective reaction to using the system.
  • A ton of library website usability studies have been done over the years – some focus just on the library’s homepage, some on the library website, while others even focus on the catalog or e-journal portal. With the advent of the new “discovery layers,” there are now usability studies on those.
  • Overview of the process of recruiting volunteers and the response: posters and an item in the student newsletter (notably, not on the library’s website). We secured at least one volunteer from Redmond, Madras, and Prineville. Jakob Nielsen’s famous dictum is that five people are enough for a usability study (we’re not talking quantitative academic research here), so we had more than enough in our study. This is “discount usability testing” (Krug), and experience showed it to be so: in five or six sessions, most of the problems revealed themselves.
  • Quantitatively, the numbers are not valid; there were not enough participants. The usability tests focused on finding actionable fixes and on qualitative information. This was also not a representative sample but a self-selected one, which has implications for the results. We had a participant who identified primarily as a student at each COCC campus. There was not really a significant difference between Bend users and users in other locales; self-selection could have something to do with that, since the Redmond and Madras students were both enthusiastic about the library generally. Both the Redmond and Prineville students had been to the physical library at some point and had also taken a class or two in Bend.
  • Obviously, this stands to skew the results a bit, both because people have had instruction and because they were somewhat familiar with the library’s website.
  • Share the questions. Questions were chosen to reflect a wide but practical range of library services. Students were briefed beforehand on the process and expectations, and tasks were phrased as brief scenarios, another suggestion from the literature. No questions were answered during the session. The SUS is an industry standard, developed by John Brooke (1996), that measures subjective perception of the website. We ended each session by following up on any questions students had during it. Just as we didn’t have a quantitatively valid number of users, our number of tasks is also not quantitatively valid; research suggests that 50 tasks would be required to make a quantitative evaluation of an information retrieval system. Users signed a consent form to allow recording and sharing of results.
  • I coded the “answer” to each question as follows (a small tallying sketch follows below):
    Complete, expected (CE): user reached the intended end, using the expected path.
    Complete, problems (CP): user reached the intended end almost along the expected path, but experienced problems along the way.
    Complete, unexpected (CU): user did reach the intended end and found something that would satisfy the task in reality, but did so in an unexpected way that points to usability problems and/or lack of awareness of library resources.
    Incomplete, early end (IEE): user stopped early, before reaching the intended end, whether because they thought they were through or thought they had gone far enough to show that they understood the task.
    Incomplete, wrong (IW): user reached an unintended end that would not satisfy the task in reality, but thought they were in the right place.
    Incomplete, user terminated (IUT): user chose to stop trying to complete the task.
    Incomplete, skipped (IS): user skipped the task accidentally.
    It is suggested that in live usability testing the first question should really be a “gimme” to give participants confidence and ease their nerves.
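
To make the roll-up from individually coded attempts to the per-task charts and the overall success-rate slide concrete, here is a minimal Python sketch. The participant IDs, task names, sample rows, and function names are hypothetical illustrations, not the study’s actual data or analysis.

```python
from collections import Counter

# The seven outcome codes described above.
OUTCOME_CODES = ["CE", "CP", "CU", "IEE", "IW", "IUT", "IS"]

# Hypothetical coded results: one (participant, task, code) row per attempt.
# The real study had 10 participants and 13 tasks; these rows are examples only.
coded_results = [
    ("P01", "task_01", "CE"),
    ("P01", "task_02", "CP"),
    ("P02", "task_01", "IW"),
    ("P02", "task_02", "CE"),
]

def tally_by_task(results):
    """Count how many participants received each outcome code on each task."""
    tallies = {}
    for _participant, task, code in results:
        tallies.setdefault(task, Counter())[code] += 1
    return tallies

def overall_percentages(results):
    """Share of all coded attempts that fall into each outcome code."""
    counts = Counter(code for _, _, code in results)
    total = sum(counts.values())
    return {code: round(100 * counts[code] / total, 1) for code in OUTCOME_CODES}

print(tally_by_task(coded_results))
print(overall_percentages(coded_results))
```
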
  • Most people didn’t have trouble finding the catalog search tool. Although this wasn’t a usability test of the catalog, some problems were more common than others. More than one person misspelled words in the catalog but didn’t notice when the results screen said “title not found”; one of these people actually ended up going through to Summit to find the book. Other people mysteriously had trouble when they chose the title index and typed in the whole title, including the colon. Putting the user in the shared OSU environment also caused a couple of problems: one user somehow managed to get to the OSU Libraries home page when trying to get back to the COCC Libraries, while another user was confused by the oregonstate.edu URL. This wasn’t the first instance of cross-institutional confusion.
  • This was a very hard task for users, given the lack of functionality in the “Library Services” menu (since remedied). Plus, once students got on the page, there was too much text and no sufficiently clear statement of loan periods. A couple of students mentioned that this is the type of question – along with hours and study rooms – that they would just ask a person at the library; it’s more trouble to find it than to just ask. I’ve since changed this page, based on a suggestion Lynne made a long time ago.
  • Students had no trouble getting to the “help” page, but it wasn’t clear that they saw the chat box and understood that it was for immediate help. A couple of students even clicked on the email link or highlighted the phone number for the information desk, as if those were the ways they would seek help. One student went to the help page, then backed up to the home page before clicking on the “Get a Live Answer” box. I’ve since changed the help page to have a live chat widget, which I also hope to put elsewhere on the site.
  • People had a lot of trouble with this one. Some people went to the e-reserves page first. Some people just went as far as the reserves page on the website or the reserves search screen and didn’t go any farther, so who knows if they could actually find it or not? Only one person searched for it in such a way that it came up. Everyone else put in “Math 111” which doesn’t produce any results, since we have it listed as MTH 111. Again, one respondent mentioned that they would just ask at the desk. Almost no one knew where the reserves page or search function was, but we know our reserves get plenty of use in real life, so it’s likely that this is a more in-person kind of service.
  • Most people had no problem with this one. Most used the keyword search, though one student did use the “subject” index – luckily, this topic just happens to be a subject heading as well. One student went to the Summit catalog right away, rather than starting with COCC. When students are funneled away from the site into catalogs or databases, they have trouble getting back, beyond using the “back” button.
  • Most people did pretty well with this. It was clear that several people were familiar with Academic Search Premier, while others chose it because it was linked at the top of the page as a starting place. One student chose Academic OneFile, probably because it’s first in the list; this student’s search turned up a bunch of citation-only results – that is, no full text – but she didn’t notice. One student reviewed the list of databases and chose “Health Source”; a couple of students demonstrated, unasked, that they knew how to limit and filter their searches. One student used the Library catalog and did a keyword search – possibly didn’t understand what an article was. I did learn that users on the wireless network in Prineville have to proxy into COCC resources (I don’t know about the computer lab); the person who tested there wasn’t confused by this, but others may be.
  • Again, people said they would just ask a person at the library. Some people were still having difficulty with the “Library Services” menu in this question, but most got there somehow. One user was having so much trouble with the library services menu that they gave up on this one.
  • This was a very interesting task to watch – far more interesting than I had intended. People handled it in really different ways: one student went straight to the DPL link on our catalogs page and found it there. A few went to the OSU catalog (OSU has it, too). Only three people actually went to the Summit catalog. One poor student went to the WorldCat link and got terribly confused once he did a search in there. One student just clicked around on the site and finally settled on the “Redmond, Madras, and Prineville” page, assuming, I guess, that he could get it from another COCC branch – which, of course, he can’t. One student went immediately and confidently to the “Interlibrary Loan” link, which a few other students had looked at, too. I’m not entirely sure what to make of this except that the Summit name doesn’t seem to have wide recognition among students and “interlibrary loan” seems to be expressive of the scenario described.
  • This was problematic for people, though I expected that here. Several users easily found the link to the e-journal portal, both on the left menu and on the “Articles & More” page, but then they had enormous trouble using the portal itself. They would put only “ecology,” for example, in the search box, which gives tons of results, but only of titles STARTING WITH “ecology,” not titles including it; they clearly thought it was a keyword search like so many other searches in other library resources. Two students searched the catalog using the “Journal” index, which is all well and good, except that it took them to the links to OSU’s e-journals; one student just stopped after finding the link, and another stopped at the ONID login page but still seemed confident she’d found the right thing, saying “now I’d log in” – perhaps she is used to our proxy page off campus and assumed it was the same thing. A couple of people started in Academic Search Premier and tried searching there, while another user scrolled the list of databases looking for the “Journal of Ecology”. One student got as far as finding the title in the e-journal portal but then seemed too confused by the various options to click on any of them, clicking on the “citation linker” link instead. Remarkably, most students recovered and eventually found their way to the e-journal portal and found the title. But when they found it, they almost invariably clicked on a link that would not give them access to the most recent issue – I think they clicked on it because it was listed last, and the conventions of lists would lead people to think that’s the most current one. I can’t help but think that most of them wouldn’t try that hard unless they had to for an assignment – or a usability test…
  • People mostly had no trouble finding the “resources by subject” page, though some did have a little trouble finding the “Social Sciences” category that includes psychology. (Ctrl+F was used).
  • A lot of people didn’t see the quick link on the home page or the option under the help menu on the left – most went to the “Research Tools” menu or page. A couple of people found this link on another, unrelated page (it appears in the “help” column on the right of the resources by subject page, for example). So, they got there eventually, just not as easily as I thought.
  • Most people were able to find this eventually, though I was surprised at how many people scrolled the list of titles rather than searching Credo or another collection. It was also clear that when students selected a title, then searched, they mostly didn’t know they were searching the whole Gale or Sage collection.
  • Students were able to complete more tasks than not – but would they exhibit as much patience at home? Overall observations: people have a hard time getting out of databases and catalogs; the interface changes are confusing (website to ASP to catalog, etc., for example); if people don’t return to the home page, they don’t see the quick links; a lot of people use the “Research Tools” page to navigate; and there are still many questions that these students, anyway, would just ask a person in the library rather than looking for the information online. One student used the search box on the website for one question – that was it – everyone else attempted to navigate. Some students appreciated the descriptive info with the links; others ignored it. On the databases and encyclopedias pages, no one clicked on the small orange “i” for more information – though people read info that was easily visible throughout the rest of the site, they didn’t want to click for more info, presumably. The biggest challenge is probably awareness of library resources, not website usability, honestly – both in Bend and at other campuses.
  • Share the SUS questions and process. The people in the test felt positively about the Library generally, and I gave them something for participating, so chances are these scores aren’t the most reflective of everyone’s experience – and, again, there are not enough results to be quantitatively significant; they’re just information about our volunteers in this study.
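
For reference, the arithmetic behind a SUS score is simple enough to show in a few lines. This is a minimal Python sketch of the standard scoring rule from Brooke (1996), not our analysis code, and the sample responses below are made up.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 responses.

    Standard SUS scoring (Brooke, 1996): odd-numbered (positively worded)
    items contribute (response - 1); even-numbered (negatively worded) items
    contribute (5 - response). The sum is multiplied by 2.5 to give 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Made-up example: one participant's ten responses (not real study data).
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```
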
  • Of course, library computers open to the library website as their home page, which doesn’t exclude staff usage. This is total pageviews.
  • In addition to the usability tests, I set up an experiment with Google Analytics to get some quantitative data on whether the icons or the text links were preferred on the home page. I set up the experiment with the articles icon by creating two slightly different links – many more people are clicking on the icon than on the linked text below it. Google Analytics only counts links to internal pages (not external links, without some futzing), so it’s not the most reliable count of how much of our traffic is going where, but it still shows that more people are clicking on the icons than on the text. Numbers were collected on 11/19/12; the links will go back to pointing at the same page – it’s too much trouble to maintain two separate pages, and there’s no point now that the data is gathered.
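
As a rough illustration of how the icon-versus-text comparison can be read out of exported pageview data, here is a minimal Python sketch. The page paths, the CSV filename, and the column names are all assumptions for illustration; they are not the library’s actual URLs or the exact report format.

```python
import csv
from collections import defaultdict

# Hypothetical page paths: the icon and the text link were pointed at two
# slightly different pages so that pageviews could be counted separately.
ICON_PATH = "/library/articles"        # assumed target of the home-page icon
TEXT_PATH = "/library/articles-text"   # assumed target of the linked text

def pageviews_by_path(report_csv):
    """Sum pageviews per page path from an exported report.

    Assumes a CSV with 'page' and 'pageviews' columns; real Google Analytics
    exports may name these columns differently.
    """
    totals = defaultdict(int)
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["page"]] += int(row["pageviews"])
    return totals

totals = pageviews_by_path("pageviews_export.csv")
print("icon clicks:", totals.get(ICON_PATH, 0))
print("text clicks:", totals.get(TEXT_PATH, 0))
```
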
  • This data is from Nov 1-30, 2012, but the pattern holds for the months we have data for so far. Three pages represent almost 75% of our entire use. By our 10th most-used page, we’re down to only 1% of our total usage – I think this says something important about the pages we need to pay attention to. Just because there’s a link to something on our homepage, it doesn’t mean it gets a lot of traffic. Our Madras, Redmond, and Prineville page, for example, had only 13 pageviews in November. It’s important to note that Google Analytics only counts traffic to www.cocc.edu, so we don’t know, for example, how many people are going to the ILL login page, the catalog, the Summit catalog, etc. We do have an idea of database usage – which is quite good, I think – from our COUNTER stats.
  • Based on the usability results, I’d like to change the left menu headings. I feel pretty good about all of them except “USE” – OSU uses “In the Library”, which isn’t too bad but isn’t grammatically parallel with the other headings. The home page is our best real estate – but the more stuff there, the less findable any of it is.
  • Ultimately, we want to pare down some of our lists and whatnot, but how do we do that without sacrificing the incredible richness of our resources? We don’t want to go in the other direction and provide less access inadvertently…
  • While we still have 2 years in which we need to provide a very user-friendly website, it’s worth noting that moving to the shared ILS will impact this area, too.

Transcript

  • 1. Fall 2012
  • 2. • Identify confusing/problematic areas of the site and areas for further work and development • Ensure Library website meets needs of students at all COCC campuses • Make data-driven changes • Ultimately: Support student academic success via the website (Image: Flickr, @Doug88888)
  • 3. • ISO 9241 (Ergonomics of Human System Interaction) defines usability quite well: • Effectiveness • Efficiency • Satisfaction (Image: BRLESC I, Early Electronic Computer. Photography. Encyclopædia Britannica Image Quest. Web. 10 Dec 2012.)
  • 4. • “Known problems” with Library websites: • Library jargon/terminology (databases, periodicals, peer review) • Users’ lack of information literacy training • Lack of clarity in differences between catalogs, databases, online e-book collections, etc. and what can be found in each • Too many options on Library websites, especially for novice users • Cluttered homepages (Image: District Of Columbia. Photography. Encyclopædia Britannica Image Quest. Web. 11 Dec 2012. Caption: What Library Websites Can Feel Like for Novice Users)
  • 5. • 13 scheduled volunteers • 10 completed usability tests • 18-20 additional volunteers turned away
  • 6. [Charts: # of participants by age range; time at COCC (1-4 terms, 9-12 terms); self-assessment of computer skills (average, advanced)] Students majoring in: CIS (2), Automotive, Undecided, AAOT (2), Aviation; Community Health Ed, HIT, Criminal Justice
  • 7. [Charts: # of participants who have taken a library class (yes/no); frequency of visits to library or library website (1-3 times each term, 1-3 times each month, 1-3 times each week)]
  • 8. • 13 common tasks to try to complete on the Library’s website • Screen recorded using Blackboard (Bb) Collaborate & notes taken during testing • After completing the task portion, students answered two short surveys in Bb: demographic-type information and the System Usability Scale (SUS)
  • 9. [Chart: amount of time users took to complete the 13 tasks, per participant (1-10): 16:04, 16:50, 16:57, 20:53, 28:23, 29:37, 35:03, 38:58, 43:51, 47:02] Average (Mean): 29:21:08. Range: 30:56
  • 10-22. [Charts: for each of the 13 tasks, # of participants (0-10) receiving each outcome code (CE, CP, CU, IEE, IW, IUT, IS), one slide per task]
  • 23. Success Rate, All Tasks: CE 32%, CP 21%, CU 15%, IEE 10%, IW 18%, IUT 3%, IS 1%
  • 24. [Chart: SUS scores per participant, on a 0-100 scale]
  • 25. [Chart: Library Website Pageviews by month, Jul-12* through Nov-12]
  • 26. [Chart: share of pageviews by page] default 38%, databases 18%, Research-Tools 16%, catalogs 9%, articles 6%, reference 4%, hours 3%, subjects 3%, academicencyclopedias 2%, help 1%
  • 27. • No clear definition when searching COCC web site library whether the information you are looking for is located in the COCC library. I couldn’t tell if I was still on the COCC website or was directed towards OSU's website till it loaded the screen. • Improvements in consistency in the interface and reduction of options per screen would help me. • A more obvious link that would take me back to the homepage.
  • 28. • Making a little less complex and having the links on the left side a little more prominent. • Had I not taken [Lib 127] I think I would have found the site much more difficult to navigate. • I had trouble figuring out that I had to click on the arrow on the menu on the left on the homepage in order to get a dropdown menu with further choices. • Once I got down a path to find resources I had to use the back button to get to the home page.
  • 29. • Lots of resources available • Clean interface • Icons on home page • Well-labeled links • Availability of help throughout the site
  • 30. • Improvements to “Subjects by Resources” page – both in organization/presentation and location of link • Make video tutorials easier to find, more obviously linked • Needs to be more engaging/welcoming; too boring – more color, pictures • Live chat widget instead of “Get a Live Answer” icon • Return of rotating images to home page rather than static icons • Improve Articles & More page to give students better idea of what to do when they land there • Things in left menus are too hidden • Too much scrolling necessary on articles, encyclopedias, subjects by resources pages
  • 31. “Since they lack sufficient skills for searching and evaluating resources, it is difficult for undergraduates to navigate successfully in an environment that provides an overwhelming amount of information.” (Lee, 2008) (Image: Ryoanji Temple, Dry Stone Garden And Blossom, UNESCO World Heritage Site, Kyoto City, Honshu Island, Japan, Asia. Photography. Encyclopædia Britannica Image Quest. Web. 11 Dec 2012.)
  • 32. • Winter term project focusing on Redmond, Madras, and Prineville students and how the Library website can best serve their needs. • Migrating to Ex Libris’ Alma/Primo will significantly change how people search for library resources…in ways both foreseen and unforeseen. • This is also likely to have a significant impact on our website. (Image: Sun Coming Up Over Horizon. Photography. Encyclopædia Britannica Image Quest. Web. 11 Dec 2012.)
  • 33. References:
    Brooke, J. (1996). “SUS: a ‘quick and dirty’ usability scale.” In P. W. Jordan, B. Thomas, B. A. Weerdmeester, & A. L. McClelland (Eds.), Usability Evaluation in Industry. London: Taylor and Francis.
    Dixon, L., et al. (2010). “Finding articles and journals via Google Scholar, journal portals, and link resolvers: Usability study results.” Reference & User Services Quarterly 50.2: 170-181.
    Lee, H. (2008). “Information Structures and Undergraduate Students.” Journal of Academic Librarianship 34.3: 211-219.
    Letnikova, G. (2008). “Developing a standardized list of questions for the usability testing of an academic library web site.” Journal of Web Librarianship 2.2: 381-415.
    Lown, C., Sierra, T., & Boyer, J. (anticipated 2013). “How Users Search the Library from a Single Search Box.” College and Research Libraries. Pre-print.
    McHale, N. (2008). “Toward a user-centered academic library home page.” Journal of Web Librarianship 2.2: 139-176.
    Swanson, T., & Green, J. (2011). “Why We Are Not Google: Lessons from a Library Website Usability Study.” Journal of Academic Librarianship 37.3: 222-229.
    Vakkari, P., & Huuskonen, S. (2012). “Search Effort Degrades Search Output but Improves Task Outcome.” Journal of the American Society for Information Science and Technology 63.4: 657-670.