Training Day Presentation

  • We have three goals for this part of the day: exploring our options for our public databases webpages and taking a look at what other libraries do with them; exploring our options for full records for databases and looking at what librarians and users find important in them; and talking a little bit about doing usability testing on our webpages, with tips for planning successful usability projects. And while this is an ERM forum, it’s important to realize that you don’t need an ERM for any of this – or, rather, whatever you use to manage this information IS your ERM.
  • So what do I mean when I say the front end? When discussing databases webpages, libraries often include four chief elements: a landing page, which provides access to the other three types of pages
  • An A-Z list of databases to help people navigate to known resources
  • Databases-by-subject pages, which are meant to facilitate choosing a database when you don’t know what to choose, or to show what options are available in a particular field
  • And full records for databases, which typically contain more information about a resource than can be contained in either of the list formats. If you think about your own library’s website, you probably have at least a couple of these types of webpages for your databases, if not all four kinds, managed either by an ERM or other commercial software like LibGuides, or by homegrown software.
  • When we were thinking about restructuring our databases webpages, which are generated by the Innovative ERM, I decided to look at the websites of all 119 ARL libraries to see what their approaches were. I looked for the presence of all four of the front-end webpage types I just mentioned, what kind of software they were using to generate these pages, what they called the link on their home page into the pages, how they arranged their databases-by-subject lists, and whether or not they used visual elements within their records.
  • My approach built on earlier, similar website surveys of ARL libraries and their findings.
  • So here’s what I learned. Most libraries are still using homegrown database-driven systems to generate their webpages for databases. The example shown here is software Kent State developed several years ago. This means that most libraries are maintaining separate software for this purpose and maintaining information about their resources both here and, most likely, in other places too, whether that means an ERM, the catalog, LibGuides, etc. This is partly because transitioning to another system is time-consuming, staff-intensive, and disruptive for patrons, but also because homegrown systems do what we want without the shortcomings of commercial software. For example, using Innovative for this purpose, we don’t have much control over many of the elements of our public webpages, and customizing them is extremely cumbersome. However, I think we’re going to see more and more libraries getting away from the build-it approach as services like LibGuides get easier to use, or as more libraries start using open-source solutions.
  • Almost every ARL library – 97% – maintains an A-Z list of databases, and the vast majority also provide databases-by-subject lists. This reflects our belief that patrons are still, for the most part, navigating to known resources rather than searching for them. When we did usability testing on our databases webpages last winter, most of our participants used the databases portal to click through to known resources, and almost none of them thought to type the resource name into the search box on our home page, though doing so would get them that resource at the top of the results list every time. We’re now implementing a discovery layer, Summon, so it will be interesting to see if discovery layers change how users find databases, but lists are a familiar approach for now.
  • For libraries that offer databases-by-subject lists, the majority simply order them alphabetically, but a surprising number of libraries are making an effort to order lists by relevance. This example is from UConn, which also did a terrific usability study on their databases webpages that was published in 2010. They not only order subject lists by relevance, but limit the initial screen to only five resources, with the option to display all for that subject with an additional click. I have not seen any usability testing that proves this is a better approach for users, but you have to think that our patrons are really used to relevance ranking, at least in response to free-text searching online. We are not able to offer this because of the limitations of the Innovative ERM, but I wish we could.
  • Another thing I think users are really used to seeing is images, and I love the University of Missouri’s approach, which includes a screenshot of each database’s search screen in its lists. A slight majority of ARL libraries use at least some images, most often to indicate access restrictions,
  • As we see in the University of Cincinnati’s list here.
  • The last point I want to make about this is – it’s OK to call our databases databases and, in fact, more ARL libraries use this term than any other for the link that gets students to their databases pages. In our usability testing, our users did not have a problem using our databases link to get to our databases lists. I was afraid that, when we asked them to find articles on a subject, they would use our e-journals link, but few did. Not all students may know what databases are, but it’s a term both we and their professors use to describe these resources. Students do NOT know what e-resources are.
  • There are a wide variety of approaches to presenting full records for databases, from Wright State’s minimal approach with descriptions, dates, and related databases, to ours at BG, which includes tutorial links, contact information, permissions, and coverage information.
  • Last summer we asked libraries in OhioLINK what information they included in their ERM records. Descriptions, permissions and vendor contact information were most often included, while information on purchasing, payments and tutorials for users was least frequently included. We also asked libraries what information they displayed to the public, and learned that most libraries in Ohio chiefly use ERMs as staff tools. As you can see from this listing of responses, libraries display far more of this information to staff than to users. This is probably because libraries are already using other systems for their public webpages and have implemented ERMs to manage information for staff, even though this approach duplicates work.
  • At Bowling Green, we include twelve of the 26 possible variable fields in our resource records in the public display, and use an additional nine fields for staff only.
  • Last winter we asked 15 users for their feedback on these records, asking them to look at a printout of the resource record for Business Source Complete and circle information they thought was important, put question marks next to information they thought was confusing, and cross out information they thought was unnecessary.
  • Users most often circled descriptions, dates and the words full text; were most often confused by our links to alternate access (mobile and on-campus) and the coverage load information; and, though they rarely chose to cross anything out, decided that links to tutorials and local contact information were the least necessary to include in the records. From this, we can conclude that Wright State’s minimal approach to record building actually covers most of the bases necessary for users, so if you were worried about not having more complete records, you shouldn’t be. Descriptions and dates really are important to users and help them decide whether or not they want to use a resource.
  • This was part of a usability study we did on our home page and database pages. We talked to fifteen undergraduate and graduate students. From this timeline, you can see that the entire project did not take a huge amount of time from planning to completion – we began it in December and were finished making recommendations last May. The most challenging part of our testing was actually designing the tasks we used and testing them to make sure they would give us the information we were looking for.
  • For this, these were our most useful sources: Steve Krug’s Don’t Make Me Think, Lehman & Nikkel, and Foster & Gibbons. Foster and Gibbons is an exhaustive, groundbreaking study that used a staff anthropologist to design research around students’ use of the University of Rochester Library. But a usability study does not have to be that involved in order to be effective, and, really, any time spent talking to students is useful. We took a very minimal approach that used minimal staff time, minimal technology (we used free software to record audio but did not film our sessions, use screen captures or set up a usability lab – instead, we relied chiefly on old-fashioned observation and detailed note taking), and minimal analysis – putting data into spreadsheets instead of exhaustively transcribing each session (a rough sketch of this kind of tally follows these notes).
  • We did make changes to our records and webpages based on the feedback we got from students. For example, we learned that our Images and Media category was equated with media studies, not videos, so we changed that and added a category for media studies that we called Film, Television and Media Studies. We also removed the search box on this page, which allowed users to search for databases by title, because we did not observe students using it for that but did observe them typing topical searches into the box and getting no results.
  • We revised some of the wording on our records and added a big orange connect button because some users did not recognize the hyperlinked database title as the link to get into the database and search it.
  • Then we did some follow-up testing by grabbing students in the main entrance to our student union and revisiting some of our tasks to see if the changes we made helped them be successful.
  • The testing we did was overall not difficult and very worthwhile, and you can do it too! It was easy to get funding for incentives (in our case, $20 gift certificates to Kroger or the bookstore) from our dean’s office; easy to recruit participants (we didn’t do anything complicated like Facebook or even campus newspaper ads, which we discovered were surprisingly expensive; most of our participants came from the flyers we hung up in the library); and we didn’t fuss with too much technology, instead focusing on developing a good testing instrument and listening and observing carefully when we were with the students themselves.
  • We would have done some things differently – for example, I wish we had been more committed to making changes to our databases pages before we began. We were not able to make any desired changes to our databases A-Z list. However, sometimes administrators and IT staff will not realize changes are necessary until you gather the data, so securing that commitment in advance may not always be possible. We’ve not been able to really assess the impact of the changes we made – for example, are our students using the film category and the connect button? I would need help sifting through Google Analytics data to learn this and don’t have a plan to make this happen. Also, we haven’t done any follow-up testing since last summer, though I suspect that is on the near horizon for us, as we are currently implementing Summon and will want to do testing on that this fall.
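To illustrate the minimal, spreadsheet-style analysis described in these notes, here is a small sketch of the kind of tally behind the “Fields in resource records” counts. It assumes a hypothetical CSV (markings.csv) with one row per participant per field; the file name, column names, and marking labels are illustrative, not the actual study files.

    # Sketch of a minimal tally of usability markings, assuming a hypothetical
    # markings.csv with columns: participant_id, field_name, marking
    # (marking is one of: circled, question, crossed).
    import csv
    from collections import defaultdict

    counts = defaultdict(lambda: {"circled": 0, "question": 0, "crossed": 0})

    with open("markings.csv", newline="") as f:
        for row in csv.DictReader(f):
            marking = row["marking"].strip().lower()
            if marking in counts[row["field_name"]]:
                counts[row["field_name"]][marking] += 1

    # List fields from most to least often circled (marked important).
    for field, tally in sorted(counts.items(), key=lambda kv: -kv[1]["circled"]):
        print(f"{field:20} important={tally['circled']:2}  "
              f"confusing={tally['question']:2}  not needed={tally['crossed']:2}")

Run against a sheet with one row per marking, this produces per-field counts like those on the “Fields in resource records” slide below.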

Presentation Transcript

  • The Front End Design of & Usability for ERM Data – Amy Fry, Electronic Resources Coordinator, Bowling Green State University – http://personal.bgsu.edu/~afry/
  • Goals for this presentation
    • What best practices for databases webpages should we follow?
    • How do libraries structure full resource records, and what do users look for in them?
    • Usability testing: tips and resources
  • The Front End: Landing page
    • Portal or landing page for all databases
    • Databases A-Z list (separate from e-journals)
    • Databases-by-subject pages (usually separate from other course and subject guides)
    • Full resource records – information pages about each individual database
    BGSU
  • The Front End: A-Z list
    • Portal or landing page for all databases
    • Databases A-Z list (separate from e-journals)
    • Databases-by-subject pages (usually separate from other course and subject guides)
    • Full resource records – information pages about each individual database
    Kent State University
  • The Front End: DBs by subject
    • Portal or landing page for all databases
    • Databases A-Z list (separate from e-journals)
    • Databases-by-subject pages (usually separate from other course and subject guides)
    • Full resource records – information pages about each individual database
    Case Western Reserve
  • The Front End: Full records
    • Portal or landing page for all databases
    • Databases A-Z list (separate from e-journals)
    • Databases-by-subject pages (usually separate from other course and subject guides)
    • Full resource records – information pages about each individual database
    OhioLINK Wright State University
  • 2010 survey of ARL library websites
    • Databases A-Z list
    • Databases-by-subject lists
    • Full resource records
    • Software
    • Discovery layer or federated search
    • Link name
    • Order of databases-by-subject lists
    • Use of icons/graphics
  • Other surveys of ARL library sites
    • Cohen and Calsada (2003): Found that 66 of 114 academic ARLs used database-driven webpages to present their e-resources in 2002.
    • Shorten (2006): Found that 88.6% of ARL libraries had databases A-Z lists in 2003, and 10.5% also categorized them by type.
    • Caudle and Schmitz (2007): Found that 97% of the 99 American academic libraries in ARL had a databases A-Z list, 96% had databases-by-subject lists and 27% had federated searching.
  • Type of system
    Homegrown: 71.1%
    Type of system | # of libraries | %*
    Homegrown | 81 | 71.1%
    Metalib | 14 | 12.3%
    Innovative | 8 | 7%
    LibGuides | 4 | 3.5%
    Xerxes | 4 | 3.5%
    WebFeat | 2 | 1.75%
    LibData | 1 | <1%
    *Percentages are based on 114 libraries (excluding 7 national/special libraries and 4 libraries whose databases pages were behind a login)
    Kent State University
  • Types of access
    Types of access | # of libraries | %
    Databases A-Z | 111 | 97
    Databases-by-subject lists | 91 | 80
    Full resource records | 83 | 73
    All three | 73 | 64
    University of Missouri-Columbia
  • Order of subject lists
    Subject list order | # of libraries | %
    By relevance | 38 | 41.8%
    By format | 7 | 7.7%
    Alphabetical only | 46 | 50.5%
    University of Connecticut
  • Use of icons and graphics
    Libraries using icons or graphics: 64 (56%)
    Icon | # of libraries
    Access restrictions | 38
    More information | 27
    Full text | 9
    Magnifying glass (Metalib: search in database) | 5
    Tutorials | 4
    Funding source | 3
    Format (audio, etc.) | 3
    Plus sign (Metalib: add to a set) | 2
    Social media | 2
    Metasearch | 2
    Logo/screenshot | 2
    RefWorks | 1
    New | 1
    Plus-star | 1
    SFX | 1
  • University of Cincinnati
  • Name of Link
    Link title begins with… | # of libraries | % | Examples
    “Databases” | 47 | 41% | Databases (30), Databases A-Z (8)
    “Articles” | 22 | 18.6% | Article Databases (4), Articles & Databases (8)
    “E” or “Electronic” | 16 | 13.6% | E-Resources (7), Electronic Resources (5)
    “Find” | 8 | 6.8% | Find Articles (3), Find Articles & Databases (1)
    “Research” | 8 | 6.8% | Research Databases (3)
    “Search” | 4 | 3.4% | Search & Find (2), Search a Database (1)
    “Indexes” | 2 | 1.7% | Indexes & Databases (1), Indexes & Databases (Articles) (1)
    “Journal” | 2 | 1.7% | Journal Articles (2)
    Branded names | 2 | 1.7% | Vera: E-Journals & Databases; Galileo @ UGA
    Other | 4 | <1% each | Resource Gateway – Resources; More Databases; All Databases A-Z and Database Finder; Online Research Resources (Databases)
  • What’s in full records? Wright State, OhioLINK, BGSU
  • Survey Question 9: What types of information are currently collected in your library's ERM system and to whom does that information display? Check all that apply.
    Answer Options | In ERM? | Display to public? | Display to staff?
    Formats:
    Databases | 14 | 8 | 13
    Electronic journals | 12 | 5 | 11
    Electronic books | 8 | 4 | 7
    “Public” info:
    Resource descriptions | 14 | 7 | 12
    License information (permissions) | 14 | 6 | 13
    Coverage dates | 6 | 5 | 6
    Resource advisories | 7 | 5 | 7
    Trial information | 8 | 2 | 8
    Tutorials/user guides | 5 | 2 | 5
    “Library” info:
    Vendor/contact information | 12 | 1 | 10
    License information for ILL/fair use | 11 | 4 | 10
    Login/passwords | 10 | 0 | 8
    Renewal information | 9 | 0 | 8
    Purchase approval information | 4 | 0 | 4
    Payment history | 4 | 0 | 4
  • Resource records at BGSU – field key: a: author; b: resource format; c: tickler log; d: subject; e: description; f: public note; g: user support; h: coverage; i: incident log; j: access information; k: resource advisory; l: usage statistics; m: administration; n: note; o: connect button; p: resource id; q: not used; r: local contact; s: pricing and payment; t: resource name; u: trial or trial info; v: resource type; w: resource contains; x: alt. resource name; y: resource url; z: resource mgmt tickler
  • Student comments on a resource record from BGSU’s 2010 usability study
  • Fields in resource records
    Field | Important | Confusing | Not needed
    Most important fields:
    Description | 14 | 0 | 1
    Dates | 10 | 1 | 0
    Full text | 7 | 1 | 0
    Most confusing fields:
    Mobile access | 0 | 10 | 3
    Coverage load | 2 | 6 | 1
    On-campus access | 1 | 4 | 0
    Least important fields:
    User support | 2 | 2 | 3
    Mobile access | 0 | 10 | 3
    Local contact | 4 | 1 | 2
  • BGSU usability study: steps and timeline
    • Identify goals (December 2009)
    • Complete Human Subjects Review Board (HSRB) training (January 2010)
    • Submit HSRB application, including script, recruitment materials, consent form (January 2010)
    • Obtain funding for incentives (January 2010)
    • Test the instrument (February 2010)
    • Recruit participants (February 2010)
    • Complete the testing (February-March 2010)
    • Analyze results (March-April 2010)
    • Present findings and recommendations (April-May 2010)
  • Lehman & Nikkel (2008); Foster & Gibbons (2007); Krug (2006)
  • Other library usability studies
    • Hammill (2003): Did common task testing with 52 users at Florida International University Libraries, including finding a named database.
    • Krueger, Ray and Knight (2004): Did common task testing with 134 users at the University of the Pacific Library.
    • Fuller, Livingston, Brown, Cowan, Wood and Porter (2009): Did three rounds of testing with five users each on the databases pages at the University of Connecticut Libraries.
  • change to Databases A-Z; change to Databases by subject; remove search box; change to Videos & Images; add Film, Television & Media Studies
  • Database title; Contains; Notes; Access for mobile devices; Alternate on-campus link; Tutorials & help; Add a connect button; Journal titles in this database; Dates included; View this title
  • Guerrilla Testing
    • July 2010
    • Twelve participants
      • 4 graduate students
      • 4 incoming freshmen
      • 2 undergraduates
      • 1 staff member
      • 1 faculty member
  • It’s easier than you think!
    • Ask your administrative office or Friends to fund the incentives
    • Recruit with signs in the library or grab people as they go by
    • Design for minimal prep and minimal analysis
    • Don’t worry about technology
  • It’s also harder than it should be.
    • Make sure people are committed to change (both intellectually and with resources).
    • Have a plan to assess the impact of your changes (a rough sketch of one such check follows this list).
    • Build time into your future schedule to do more testing.
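As one possible way to assess the impact of changes such as the new Film, Television & Media Studies category and the orange Connect button (not something actually implemented in this study), a small tally over an exported web-analytics CSV could show whether they are being used. The file name, column names, and labels below are assumptions for illustration only.

    # Hypothetical check of whether the new subject category and Connect button
    # get used, assuming an analytics export (usage_export.csv) with columns:
    # page_path, event_label, count. All names here are illustrative.
    import csv
    from collections import Counter

    totals = Counter()
    with open("usage_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            n = int(row["count"] or 0)
            if "film-television-media" in row["page_path"]:
                totals["Film, Television & Media Studies pageviews"] += n
            if row["event_label"] == "connect-button":
                totals["Connect button clicks"] += n

    for label, n in totals.items():
        print(f"{label}: {n}")

A quarterly run of something like this would give a simple before-and-after picture without anyone having to sift through the full analytics interface by hand.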