Sanjeet Mann
Arts and Electronic Resources Librarian, University of Redlands
SCELC Research Day, March 4, 2014

http://www.slideshare.net/sanjeetmann


Defined as a research method replicating the steps that library users take from a citation to full text (or an error)
System components and their local implementations (Proxy → Source → KB → Resolver → Target → ILL):

Proxy     – Innovative WAM
Source    – RILM (ProQuest)
KB        – ProQuest 360 Core
Resolver  – ProQuest 360 Link
Target    – Oxford Journals
ILL       – OCLC ILLIAD







Can users get to the full text?
How often do they get errors? What kind of errors?
Should you be satisfied with your knowledge base (KB) vendor?
Do you have enough full text in your collection? How often do your users need ILL?
Are you teaching users what they need to know to successfully access e-resources?

4 searches x 10 disciplines x 10 results = 400 item sample
(estimates overall population with 98% confidence, +/- 5% error)


2012: http://goo.gl/606us



2013: http://goo.gl/O5XK9A
[Chart: local availability vs. error rate, 2012 and 2013 — Local Avail held at 41% in both years; Error fell from 38% to 13%]
Value                % before   % after   Z score   Result
Error rate           37.5       13.5      7.79      SIGNIFICANT!
Local availability   40.8       40.5      0.09      Not significant
Error category        2012     2013    Z score   Result
1 – Proxy errors      1.3%     0%      2.25      Significant
2 – Source errors     8.5%     8.3%    0.09      Not significant
3 – KB errors         6.3%     4.0%    1.44      Not significant
4 – Resolver errors   1.8%     0.0%    2.70      Significant
5 – Target errors     3.8%     1.3%    2.27      Significant
6 – ILLIAD errors     16.0%    0.0%    8.34      SIGNIFICANT!

(Each slide highlighted the relevant stage in the chain: Proxy → Source → KB → Resolver → Target → ILLIAD)
Sample composition: articles 75%, dissertations 9%, chapters 8%, books 6%, proceedings 2%
[Chart: results by discipline, 2012 vs. 2013 — MUS, ENG, PHIL, PSYC, HIST, ECON, SOC, BIO, CDIS, MATH]
[Chart: availability as % of total sample — 2012: Online 36%, Print 5%, ILL 59%; 2013: Online 37%, Print 4%, ILL 60%]


Availability studies can help you make a statistically significant reduction in errors

Supports evidence based librarianship: advocate with vendors to improve the metadata chain; weigh more full text resources against more ILL requests; ask whether first years should avoid A&I databases and whether discovery services help or hurt


Conduct availability studies at other libraries


Availability study with Redlands students


What are your e-resource experiences?


Proxy error (domain not in forward table)


Source error (both rft.jtitle and rft.title)


KB error (“Get Article” link is missing because SerSol
collection doesn’t support article level linking)


Resolver error (match on wrong journal title)


Target error (Page unavailable)


ILLIAD error (book chapter treated as an
article; rft.atitle used for two fields)
How Much do Availability Studies Increase Full Text Success?

Availability Studies are a systems research technique that academic libraries can use to identify errors affecting access to electronic resources. Comparing two availability studies conducted before and after troubleshooting showed a statistically significant decrease in errors from 38% to 13%.

Published in: Education
  • This presentation builds off of my 2013 SCELC Research Day presentation, “Measuring Electronic Resource Availability”, available online at http://www.slideshare.net/sanjeetmann.
  • Availability studies are a research method that involves replicating the steps that library users take to go from a citation to full text (or an error). They use a sample to estimate the proportion of your entire library collection that is available in full text or generates an error.
  • Availability studies are a systems analysis research method, measuring the performance of the library as a “system” able to deliver full text information. Different researchers have defined the system in slightly different ways. In my research study, I identified six discrete parts. Each of them has to work together for users to get to full text, and each of them is a potential point of failure.
  • This example illustrates the kind of problem I hope to find and correct through availability research. The source database contributed metadata from a book chapter into an ILLIAD article request form; a Unicode error occurred, and an invalid date was provided. This request cannot be filled automatically via Direct Request, and may require significant intervention from a librarian or ILL staff before it can be placed at all. These kinds of problems cost time (and money) to troubleshoot and can be very frustrating for library users and staff alike. My availability research seeks to answer three questions: why do problems like this occur, how often do they happen, and what can we do as librarians to stop them from happening again?
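  • The Unicode failure described above can be shown with a tiny illustrative example (the title is invented; this is not the actual ILLIAD code path): UTF-8 bytes read back under the wrong encoding turn an accented character into mojibake, exactly the kind of garbling that breaks a request field.

```python
# A hypothetical chapter title containing a non-ASCII character.
title = "Wörterbuch der Musik"

# Encode as UTF-8 (as a source database might transmit it), then decode
# as Latin-1 (as a system without Unicode support might read it).
mangled = title.encode("utf-8").decode("latin-1")

print(mangled)  # the umlaut arrives as two garbage characters
```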
  • Lessons learned from availability studies pertain to other areas of librarianship besides e-resources troubleshooting. You can use study results as evidence to address a variety of questions.
  • I created a sample of 400 items by building database searches based on actual reference questions from our ref desk statistics. I ran 4 searches in each of 10 A&I databases for different disciplines spanning the humanities, social sciences and sciences, then tried to obtain the full text of the first 10 results of each search, recording the results in a spreadsheet. All items are either locally available online, locally available in print, or available from other libraries through ILL. Errors are recorded in the six categories mentioned previously. Using statistical techniques for sampling of binomial (yes/no) variables, I determined that I can be 98% confident that my sample size of 400 items will estimate the population of all items in the library's collection within a +/- 5% margin of error.
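  • The sampling arithmetic can be sketched with the standard normal approximation for a binomial proportion (a simplification; the study's exact calculation may have used a different correction or assumed proportion). A minimal sketch using Python's standard library:

```python
from math import sqrt
from statistics import NormalDist

def margin_of_error(n, confidence, p=0.5):
    """Half-width of a normal-approximation confidence interval for a
    proportion; p=0.5 is the worst case (largest margin)."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # two-sided critical value
    return z * sqrt(p * (1 - p) / n)

# For a 400-item sample at 98% confidence the worst-case margin comes
# out just under +/-6%, in the neighborhood of the figure on the slide.
print(round(margin_of_error(400, 0.98), 3))
```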
  • So far I have conducted three availability studies at Armacost Library. I conducted a pilot project to determine the sample size, and then the initial study of 400 items which I reported on last year at Research Day. After troubleshooting (discussed later) I replicated the same study again in 2013 to see if the local availability and error rates had changed. This was an important step to see if the availability study really had been effective at helping me discover problems.
  • A comparison of my results from the two studies shows that the proportion of items locally available from Armacost Library’s print and online collections stayed the same, but the error rate dropped from 38% to 13%. Is this significant?
  • I used 100 Statistical Tests by Gopal Kanji to determine that the z test for the difference of two proportions was appropriate for deciding whether the change in error rates was significant. The test measures how much my results differ from what we would expect if there were really no change in error rates; z > 1.645 is statistically significant. The reduction in the overall error rate is a very strong result, so I can easily reject H0. There was no significant change in the local availability rate, so I conclude that the troubleshooting steps I took did not make more items locally available. I also used z tests to examine the change in each category of error, to understand why I was or was not effective at fixing that category.
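  • The z test for the difference of two proportions can be sketched as follows; plugging in the reported error rates (37.5% and 13.5%, with n = 400 items in each study) reproduces the z ≈ 7.79 on the slide.

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Z statistic for H0: the two population proportions are equal,
    using the pooled estimate of the common proportion."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Overall error rate, 2012 vs. 2013 (400 items per study):
z = two_proportion_z(0.375, 400, 0.135, 400)
print(round(z, 2))  # 7.79, far beyond the 1.645 cutoff
```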
  • Proxy errors generally meant the e-journal’s domain was missing from the proxy forward table. Usually these were e-journals with a unique domain. The solution, adding the domain to the forward table, only took a few minutes and was entirely under my control. I had a statistically significant impact on the proxy error rate.
  • Source errors usually involve OpenURL metadata in the A&I database, and there were a lot of them (8% of all items in my sample!). They can be very difficult to identify, because the problem might not manifest until linking into a target database or placing an ILLIAD request; you have to look carefully at the OpenURL string before and after it enters the link resolver to determine whether you have a source error. Source errors are also difficult to troubleshoot. I contact the database vendor each time I find one, and they often need to forward the problem to a different unit, which may in turn need to ask another party to fix it. An interesting trend appeared in the 2013 study: in some situations where a library subscribes to both a full text and a citation-only database from the same publisher indexing the same item, the database inserts the PDF from the full text record into the record in the abstracting database, bypassing OpenURL linking entirely and rendering source metadata errors a moot point.
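  • One way to spot a source error like the rft.jtitle/rft.title conflict listed at the end of this deck is to pull apart the OpenURL query string before it reaches the resolver (the resolver URL and field values below are invented for illustration):

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical OpenURL from an A&I database; a well-formed article
# citation should carry the journal in rft.jtitle, not also in rft.title.
openurl = ("https://resolver.example.edu/360link?"
           "rft.atitle=Sonata+form+revisited&"
           "rft.jtitle=Journal+of+Musicology&"
           "rft.title=Journal+of+Musicology")

fields = parse_qs(urlsplit(openurl).query)
if "rft.jtitle" in fields and "rft.title" in fields:
    print("possible source error: both rft.jtitle and rft.title present")
```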
  • Knowledge base errors usually involve collections that do not resolve down to the article level. In my experience students often assume the item is not available when they are not taken directly to the article so I count this situation as an error that needs to be improved (cf. Trainor and Price 2010 who also categorized browsing for results as an error) The collections in our knowledgebase for “Freely Available” journals are not centrally managed and only resolve to the title level, so I couldn’t affect the overall error rate.
  • Link resolver errors involve problems with the logic used to match items. For example, linking from the resolver result screen to the library catalog by title may match on multiple items or an incorrect item. We fixed problems by adding a prominent link to match on ISSN or ISBN instead.
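  • The fix described above amounts to preferring a unique identifier over a fuzzy title string when matching against the catalog. A minimal sketch of that matching preference (the mini-catalog and function are hypothetical; a real resolver queries the ILS):

```python
# Hypothetical catalog keyed two ways.
catalog_by_issn = {"0277-9269": "Journal of Musicology"}
catalog_by_title = {
    "journal of musicology": "Journal of Musicology",
    "journal of musicological research": "Journal of Musicological Research",
}

def match_journal(citation):
    """Prefer an exact ISSN match; fall back to a title match only when
    no ISSN is supplied (title matches can hit the wrong journal)."""
    issn = citation.get("rft.issn")
    if issn and issn in catalog_by_issn:
        return catalog_by_issn[issn]
    return catalog_by_title.get(citation.get("rft.jtitle", "").lower())

print(match_journal({"rft.issn": "0277-9269", "rft.jtitle": "J. of Musicology"}))
```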
  • Target errors could occur because the full text database being linked into does not have the PDF loaded on their site or the site was down when I was testing it. Database vendors improved this category of problems through website redesigns, and by overlaying full text records as mentioned earlier.
  • ILLIAD errors improved because OCLC upgraded ILLIAD to support Unicode, and because our web librarian created request forms for more item types and edited the Customization Manager to match OpenURL metadata with the correct field in the correct request form.
  • Some interesting facts about the samples: 75% articles, 11% gray literature (proceedings and dissertations, ILL only)
  • Disciplines where the database indexed a lot of gray literature (Music, English, PsycInfo, Math) had very high ILL error rates. Disciplines with article-dominant databases (History, Economics) had fewer errors and more locally available items. Philosophy had an unusually large number of source errors in 2013 study.
  • Most surprising result for me was consistent 40% local availability (irrespective of whether an item produced an error or not). Research libraries often report about 60% local availability (Nisonger 2007). This means that 3 out of every 5 items in our A&I databases are not in our library collections.
  • Comparing the 2012 and 2013 studies demonstrates that availability studies can help a library make a statistically significant reduction in electronic resource errors. Availability studies lend themselves well to evidence based librarianship and can inform decision making in a variety of areas. One is advocating with vendors to improve the quality of metadata: if source errors affect 8% of items in our A&I databases, that is too many errors to fix one at a time. Regarding the 40% local availability issue, we could respond in several ways: change collection development to prioritize acquiring full text, ramp up ILL services, or scaffold library instruction to direct students to full text resources until they need to do upper division research in a major and can evaluate whether an unavailable resource really meets their research needs. We also need more information from other libraries: do discovery services help or hurt? Are different knowledge bases more or less error prone?
  • Natural next steps for this research are to conduct further availability studies at other libraries of different sizes, serving various user populations. I spent 25 minutes per search (16 hours total for my 400 item sample) but you can save time by using a 100 item sample and still get 81% confidence. I am also in the process of conducting an availability study with Redlands students where I have them test access to the resources, hoping to present preliminary findings at NASIG in May. If you are viewing this presentation online and have questions for me or are interested in sharing your e-resource availability/error experiences, write to me at sanjeet_mann@redlands.edu. I’d love to hear from you!
  • How Much do Availability Studies Increase Full Text Success?

    1. 1. Sanjeet Mann Arts and Electronic Resources Librarian, University of Redlands SCELC Research Day, March 4, 2014 http://www.slideshare.net/sanjeetmann
    2. 2.  Defined as a research method replicating the steps that library users take from a citation to full text (or an error)
    3. 3. Proxy Innovative WAM Source RILM (Proquest) KB Proquest 360 Core Resolver Proquest 360 Link Target Oxford Journals ILLIAD OCLC ILLIAD
    4. 4.       Can users get to the full text? How often do they get errors? What kind of errors? Should you be satisfied with your knowledge base (KB) vendor? Do you have enough full text in your collection? How often do your users need ILL? Are you teaching users what they need to know to successfully access e-resources?
    5. 5. 4 searches x 10 disciplines x 10 results = 400 item sample (estimates overall population with 98% confidence, +/- 5% error)
    6. 6.  2012: http://goo.gl/606us  2013: http://goo.gl/O5XK9A
    7. 7. 45% 40% 41% 41% 38% 35% 30% 25% Local Avail 20% 13% 15% 10% 5% 0% 2012 2013 Error
    8. 8. Value % before % after Z score Result Error rate 37.5 13.5 7.79 SIGNIFICANT! Local availability 40.8 40.5 0.09 Not significant
    9. 9. 2012 1- Proxy errors 2013 Z score Result 1.3% 0% 2.25 Significant Proxy Source KB Resolver Target ILLIAD
    10. 10. 2012 2 – Source errors 2013 Z score Result 8.5% 8.3% 0.09 Not significant Proxy Source KB Resolver Target ILLIAD
    11. 11. 2012 3 – kb errors 2013 Z score Result 6.3% 4.0% 1.44 Not significant Proxy Source KB Resolver Target ILLIAD
    12. 12. 2012 4 – Resolver errors 2013 Z score Result 1.8% 0.0% 2.70 Significant Proxy Source KB Resolver Target ILLIAD
    13. 13. 2012 5 – Target errors 2013 Z score Result 3.8% 1.3% 2.27 Significant Proxy Source KB Resolver Target ILLIAD
    14. 14. 2012 6 – ILLIAD errors 2013 Z score Result 16.0% 0.0% 8.34 SIGNIFICANT! Proxy Source KB Resolver Target ILLIAD
    15. 15. dissertations, 9% proceedings, 2% chapters 8% books 6% articles 75%
    16. 16. 90% 80% 70% 60% 50% 2012 40% 2013 30% 20% 10% 0% MUS ENG PHIL PSYC HIST ECON SOC BIO CDIS MATH
    17. 17. 100% 90% % of total sample 80% 70% ILL 59% ILL 60% Print 5% Print 4% Online 36% Online 37% 2012 2013 60% 50% 40% 30% 20% 10% 0%
    18. 18.  Availability studies can help you make a statistically significant reduction in errors  Supports evidence based librarianship      Advocate w/ vendors to improve metadata chain More full text resources? More ILL requests? Should first years avoid A&I databases? Do discovery services help or hurt?
    19. 19.  Conduct availability studies at other libraries  Availability study with Redlands students  What are your e-resource experiences? Further Reading  Kanji, Gopal K. 2006. 100 Statistical Tests. 3rd ed. Thousand Oaks: Sage Publications.  Mann, Sanjeet. 2013. “Measuring Electronic Resource Availability” presented at the SCELC Research Day, March 5, Loyola Marymount University. http://www.slideshare.net/sanjeetmann.  Nisonger, Thomas E. 2007. “A Review and Analysis of Library Availability Studies.” Library Resources & Technical Services 51 (1): 30–49.  Trainor, Cindi, and Jason Price. 2010. Rethinking Library Linking: Breathing New Life into OpenURL. Vol. 46. Library Technology Reports 7. Chicago: American Library Association.
    20. 20.  Proxy error (domain not in forward table)
    21. 21.  Source error (both rft.jtitle and rft.title)
    22. 22.  KB error (“Get Article” link is missing because SerSol collection doesn’t support article level linking)
    23. 23.  Resolver error (match on wrong journal title)
    24. 24.  Target error (Page unavailable)
    25. 25.  ILLIAD error (book chapter treated as an article; rft.atitle used for two fields)
