Improving Catalyst by Reaching Out


A bit about how we're handling user testing/feedback with our Blacklight-powered library catalog.

Slide notes:
  • For the first 81 days.
  • Not 70% of search engine traffic. 70% of all traffic.
  • Because we are collecting this data in order to improve Catalyst.
  • We’re not just limited to what our programming can record.
  • Feedback form directly on every page of Catalyst. Asks for your comment, some contextual information, and an optional e-mail address if you’d like a response. Every item that has come in with an e-mail address has been responded to. 234 pieces of feedback have been submitted so far.
  • Between April 5 and April 15, Cynthia York and I conducted one-on-one user tests with members of the MSEL community.
  • Silverback allowed us to document the user tests. It records the screen activity along with the user's image and voice.
  • (Blurred out user) Recording the face and voice is important. Someone may tell you (out of embarrassment, perhaps) that they did not have any trouble performing the tasks, but a scrunched up face and exasperated breaths may indicate otherwise.
  • Example of a task that the users were asked to perform, aimed at seeing how comfortable users are with the limits, date range selection, and Selected Items features of Catalyst.
  • Handout given to the user to rank the elements on the page. We received a lot of comments that indicated that certain things should and should not be on the page. It was very useful to make those preferences quantitative.
  • Would have liked more participants, of course, but we are approaching this with a short testing cycle and then integrating the results into our updates.
  • Members of the Catalyst team met with Eisenhower librarians and members of the PI and TS HILTS groups to answer any questions and address any concerns relating to those user groups. Some feedback can be hard to articulate in a text box or e-mail, so, as with the user testing, we felt that a face-to-face approach would yield some meaningful insights.
  • So how are we using all of this data? I’m going to give you an example.
  • Not enough. We think it’s a cool feature. People love to text. Mobile devices are everywhere. Maybe they’re not into it as much as we think they are.
  • No, they do.
  • Ah ha! Got it. People don’t know what it is. When you don’t know what something is, you don’t click on it.
  • Here’s how it currently appears in Catalyst. Without all of that preamble, would you have ever clicked on that icon? Does it even look clickable, for that matter? The icon was a design compromise to address a concern that we heard early on: “There’s too much text.” It’s a compromise that didn’t work. We would never have known that, however, if we hadn’t bothered doing the research, and we would have spent time and energy on a feature that no one would use. Some brief explanatory text will return in a future revision.
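The feedback workflow in the notes above (a comment, some page context, an optional e-mail address, and a reply for every submission that includes an address) can be sketched in plain Ruby, the language of the Blacklight stack. This is a minimal illustration only; the names (`Feedback`, `triage`) are hypothetical and not Catalyst's actual code.

```ruby
# Hypothetical sketch of per-page feedback handling: a submission carries
# a comment, the page it came from, and an optional e-mail address.
# Anything with an address gets queued for a personal response.
Feedback = Struct.new(:comment, :page_url, :email, keyword_init: true) do
  # A reply is owed whenever the user left a non-blank address.
  def needs_response?
    !email.nil? && !email.strip.empty?
  end
end

# Split a batch of submissions into those awaiting a reply and the rest.
def triage(submissions)
  submissions.partition(&:needs_response?)
end

to_answer, anonymous = triage([
  Feedback.new(comment: "Love the SMS option", page_url: "/catalog/123",
               email: "pat@example.edu"),
  Feedback.new(comment: "Too much text", page_url: "/catalog", email: nil)
])
puts to_answer.length  # 1
puts anonymous.length  # 1
```

With a rule like this, "every item that has come in with an e-mail address has been responded to" becomes a checkable invariant rather than a manual habit.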

    1. Improving Catalyst by Reaching Out (Sean Hannan & Dave Kennedy)
    2. Some facts.
    3. 135,789 total page views
    4. 13,777 searches
    5. 2,446 title searches
    6. 1,003 subject searches
    7. 1,550 searches for online materials
    8. 465 searches for journals
    9. 146 records texted to mobile devices
    10. 71% of Catalyst traffic comes from search engines
    11. 70% comes from Google
    12. (no text)
    13. Most viewed item: Gil Scott-Heron’s MFA thesis.
    14. Why am I telling you this?
    15. Other data collection methods:
    16. (no text)
    17. User Testing
    18. (no text)
    19. (no text)
    20. Locate a thesis on architecture published from 1900 to 1913 that can be found at the Eisenhower location. Save these to your Selected Items.
    21. (no text)
    22. Users Tested: 5 graduate, 1 undergraduate, 1 faculty. 6 of the 7 indicated that the interface was easy to use and provided a satisfying user experience.
    23. Focus Groups
    24. An example.
    25. 146 records texted to mobile devices
    26. “I like the SMS option and the search capability seems to be a lot better than the current one.” “The option to SMS a listing from the results screen is very nice…” (via Feedback)
    27. Only 3 of the 7 users in the user tests bothered to even rank it as a visible feature. The 3 that did ranked it of low importance.
    28. (no text)
    29. Questions?