"Incorporating Usability Into the Database Review Process: New Lessons and Possibilities," Charleston Conference Presentation 2013 (Saturday)
  • In theory, every review of a database should account for the full user experience of the product. And by user experience, I don't only mean the end user; I mean the total experience a user or an institution has with a product, which contains many different aspects. There are many factors that affect someone's user experience of an online resource: how credible it is, for instance; the cost of the product; how much it is valued by the department; and, of course, how usable that product is. We have had many different ways of finding out how usable something is.
  • In the past, we've used usage statistics, our impressions of a database, and user feedback to determine how usable a product was, but we've never had a real process for deciding how usable something is. Often we know that these errors exist but lack a concrete way of getting this information back to database providers, who need that feedback. We don't need user feedback to find these errors; they are obvious to us whenever we use the product. We need a method that we can incorporate into our existing system.
  • This project used a method from user experience research that might be useful for recording usability errors during the evaluation process.
  • From Beth Associate Deans
  • Transcript

    • 1. INCORPORATING USABILITY INTO THE DATABASE REVIEW PROCESS: NEW LESSONS AND POSSIBILITIES Ilana Barnes New Lessons and Opportunities @librarianilana Business Information Specialist Assistant Professor of Library Science Purdue Libraries
    • 2. OUTLINE Problem Statement The Project Results • Satisfaction Survey • Heuristic Reviews Lessons, Opportunities Future Directions
    • 3. PROBLEM STATEMENT How can we better record usability errors in vendor products to communicate those findings back to vendors and more formally incorporate usability into our collection assessments?
    • 4. THE PROJECT Database Reviews 2013 • Yearly (summer project) on a three-year rotation • ~100 databases reviewed each year (300 total) • Each subject librarian reviews 1-8 databases • Usage, audience, content, cost per use • Certain information needed for discontinuation decisions
    • 5. THE PROJECT Database Usability Heuristic Form • Instructions to orient reviewers to researching usability • 8 sections with 1-4 Likert questions each and room for comment at the end • Required for all database reviews by the Associate Dean Satisfaction Survey: how reviewers liked the process, the form, etc. IRB-approved, administration-backed (a sketch of one possible data model for the form appears after the transcript)
    • 6. DATABASE USABILITY HEURISTIC REVIEW
    • 7. DATABASE USABILITY HEURISTIC REVIEW Libraries and Usability Tests • Websites • Vendor Databases • Vendor Discovery Layers
    • 8. Does the incorporation of heuristic reviews affect the database review process? • Is it redundant with other parts of the database evaluation process? • Is it perceived as valuable?
    • 9. DATABASE USABILITY HEURISTIC REVIEW 37 databases 8 librarians
    • 10. RESULTS- SATISFACTION SURVEYS Overall, how much impact did the Database Usability Heuristic Review have on your final selection decision?
      No Impact: 7 (88%) | Some Impact: 1 (13%) | High Impact: 0 (0%) | Total: 8 (100%)
      "Usability is seldom a determining factor. We might complain to the vendor or ask for changes for usability but unless a database is completely unusable, I don't think it would affect retention."
      "I completed the database review form prior to the heuristic review so felt that I had sufficient information to make a decision."
      "The database that I am reviewing is core to the research done in a department. The current product is well designed. I didn't need the heuristic review to realize that."
      "The deciding factor is the importance of the content. The interface has zero influence on the decision to keep these databases."
    • 11. RESULTS- SATISFACTION SURVEYS Overall, did you find the Database Usability Heuristic Review redundant with other parts of the database review form? Why or why not?
      Yes: 4 (50%) | No: 4 (50%) | Total: 8 (100%)
      "We already have to consider whether and how users are learning the database in the database review."
      "We did already evaluate the 'quality of the product' in the database report."
      "It asks different questions than the database review form."
      "The questions were focused on usability of the interface not on the content and cost of the database."
      "The database review asks for need for instruction and access to help, which are also covered in the heuristic review, although in different ways."
      "I think some of these questions are worth asking but they could be asked more effectively and efficiently."
    • 12. RESULTS- SATISFACTION SURVEYS Overall, did you find the Database Usability Heuristic Review easy to use? Why or why not?
      Yes: 8 (100%) | No: 0 (0%) | Total: 8 (100%)
      "The content was easy to use, but the formatting was not. The redundant use of the Likert scale over and over had the effect of making it hard to read. The short answer questions part also became difficult to navigate after 4 or 5 of the questions had been answered."
      "Some of the requested actions are not very well described so I was not sure how to answer the questions."
      "The form itself was easy to use; the instructions did not match the work to be done on the form."
      "Overall, yes. Some of the statements were more challenging to evaluate than others, however. The 'common platform standards' line stood out to me in particular. Many of us may not be aware of such standards...?"
    • 13. RESULTS- SATISFACTION SURVEYS Overall, did you feel that the Database Usability Heuristic Review should be done as part of the database reviews every year? Why or why not?
      Yes: 2 (25%) | No: 6 (75%) | Total: 8 (100%)
      "It could be done yearly for databases that are not on large platforms."
      "Not sure if this exact form/process is the most effective way, but something like it - yes."
      "It did not add to the decision making process. One or 2 questions from this added to the regular review might be good."
      "A heuristic review is likely to be more helpful when there is a new interface. Otherwise we've all long since learned to overlook, or adapt to, any quirks of a given database interface."
      "It would be useful in the case of a database that is really difficult to use or inappropriate for the intended audience; otherwise, not so much."
      "I think some of these questions are worth asking but they could be asked more effectively and efficiently."
    • 14. RESULTS- HEURISTIC REVIEWS [Radar charts of per-heuristic scores for ProQuest Statistical Insight (1) and ProQuest Statistical Insight (2), with axes for Visibility of System Status; Match between the System and the Real World; Consistency and Standards; Error Prevention; Recognition Not Recall; Flexibility and Ease of Use; Aesthetic and Minimalist Design; and Help Users Recognize, Diagnose and Recover from Errors.]
    • 15. RESULTS- HEURISTIC REVIEWS [Radar charts of per-heuristic scores for USA Trade Online and SRDS Media Solutions, plotted on the same heuristic axes as slide 14.] (A plotting sketch for this kind of radar chart appears after the transcript.)
    • 16. LESSONS • Communication is key. • Busy work vs. deeper dive vs. path of least resistance. • Be aware of project deadlines versus real deadlines. • Heuristic reviews might work better as a tool than a requirement.
    • 17. OPPORTUNITIES • New ways to visualize usability • Recording of errors: 37 heuristic reviews • Could be useful as a jumping-off point for discussing usability across products • What's the lowest an interface can score on the DUHR before it's too low? • New possibilities for expert feedback
    • 18. FUTURE DIRECTIONS • Look into implementation in new database acquisitions, database renewals, and borderline cases • Communicate findings to stakeholders • Hold workshops and training on heuristic reviews for collection development
    • 19. THANK YOU!
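
The heuristic form itself is not reproduced in this transcript, so the following is only a minimal sketch of a data model matching what slide 5 describes: eight heuristic sections, each holding one to four Likert responses plus free-text comments, collapsed to per-section means for reporting. The section names are taken from the result slides; the class, field names, numeric scale, and example values are assumptions for illustration, not the actual form.

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, List, Optional

# The eight heuristic sections, named as they appear on the result slides.
SECTIONS: List[str] = [
    "Visibility of System Status",
    "Match between the System and the Real World",
    "Consistency and Standards",
    "Error Prevention",
    "Recognition Not Recall",
    "Flexibility and Ease of Use",
    "Aesthetic and Minimalist Design",
    "Help Users Recognize, Diagnose and Recover from Errors",
]

@dataclass
class HeuristicReview:
    """One librarian's Database Usability Heuristic Review of one database (hypothetical model)."""
    database: str
    reviewer: str
    # Each section holds its 1-4 Likert responses; the numeric scale itself is assumed.
    scores: Dict[str, List[float]] = field(
        default_factory=lambda: {s: [] for s in SECTIONS}
    )
    comments: str = ""

    def section_means(self) -> Dict[str, Optional[float]]:
        """Collapse each section's responses to a mean, e.g. for a radar chart."""
        return {s: (mean(v) if v else None) for s, v in self.scores.items()}

# Hypothetical usage with made-up values:
review = HeuristicReview(database="Example Database", reviewer="Subject Librarian A")
review.scores["Visibility of System Status"] += [6]
review.scores["Error Prevention"] += [4, 5]
print(review.section_means())
```

Keeping the raw per-question responses, rather than only the means, would leave the 37 completed reviews reusable for later aggregation across products.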
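
Slides 14 and 15 show each review's per-heuristic means as radar charts. The sketch below shows one way such a chart could be produced with matplotlib; the axis labels come from the slides, the 0-6 range is only inferred from the charts, and the plotted scores are placeholder values rather than the actual review data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Heuristic axes as labeled on the result slides.
CATEGORIES = [
    "Visibility of System Status",
    "Match between the System\nand the Real World",
    "Consistency and Standards",
    "Error Prevention",
    "Recognition Not Recall",
    "Flexibility and Ease of Use",
    "Aesthetic and Minimalist Design",
    "Help Users Recognize, Diagnose\nand Recover from Errors",
]

# Placeholder section means for illustration only -- not the actual review data.
EXAMPLE_SCORES = [5.0, 4.0, 5.5, 3.0, 4.0, 3.0, 6.0, 5.0]

def plot_radar(labels, values, title):
    """Draw one review's per-heuristic means as a closed radar (polar) plot."""
    angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
    closed_values = list(values) + [values[0]]  # repeat the first point to close the polygon
    closed_angles = angles + [angles[0]]

    fig, ax = plt.subplots(subplot_kw={"polar": True}, figsize=(6, 6))
    ax.plot(closed_angles, closed_values, linewidth=1.5)
    ax.fill(closed_angles, closed_values, alpha=0.25)
    ax.set_xticks(angles)
    ax.set_xticklabels(labels, fontsize=7)
    ax.set_ylim(0, 6)  # the slides appear to show scores up to 6
    ax.set_title(title, fontsize=10)
    fig.tight_layout()
    return fig

if __name__ == "__main__":
    plot_radar(CATEGORIES, EXAMPLE_SCORES, "Example heuristic review (placeholder data)")
    plt.show()
```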
