Consortia Level Collection Evaluation
Presentation given at the SCELC Executive Board Meeting, March 17, 2006

  • Individually, evaluation has typically been an institutional choice, and usually done piecemeal rather than systematically. Consortial licensing has a number of advantages. First, the number of schools participating means that any evaluation will be of interest to at least some participants. Second, the number of participants provides a comparative and relative perspective to the analysis. And, since consortial licensing has
  • Basically, evaluation has typically been an institutional choice, and usually done as individual, specifically oriented projects rather than systematically as part of a holistic approach to collection development. Consortial licensing has also not been evaluated systematically, for a number of reasons: the difficulty of collecting, merging, and manipulating data; the view of consortia as purchasing clubs rather than collection development units; and so on. Evaluation is Necessary: Good evaluation assists us in understanding what we've licensed and in negotiating future contracts. Do we know we negotiated a good deal? Can we quantify the benefits of the consortial license? Can we evaluate the costs to the consortium as a whole or to individual subscribing institutions? With electronic resources, we now have stronger quantitative data to use in evaluation. This strength translates into an increased ability to justify and rationalize decisions and to quantify the benefits of the library's collections. Evaluation is Possible: Although it takes some work, evaluating collections is possible, and it becomes easier once evaluation frameworks are established. For example, setting up our initial print journal review system was laborious at the start, but once in place, it became quick, easy, and absolutely essential in evaluating our collection. There are three aspects of evaluation that we need to remember when establishing a system: relativity, comparability, and generalizability. Relativity: Each institution has its own usage data, value data, and cost data, but what does it mean? One way to evaluate the meaning of quantitative data is by making it relative to other data. For example, is Journal A, used Y times, as valuable as Journal B, used X times? Is that value the same for School C and School D? Comparability: Consortial licensing makes the comparison of usage and content possible and promotes a relative perspective from school to school and publisher to publisher. It is easy to compare institutions, publishers, databases to databases, and journals to journals, because we have all the data at the consortium level. Generalizability: Even if we have relative and comparable data, we still need to generalize that data when doing evaluation: across products, product types, publisher types, and institutions, and even to information use behaviors in general or in particular.
  • So why should we be looking at collection evaluation at this time? Basically, it's time now. Over the past few years, at least since I've been involved in SCELC, we've spent our time negotiating new contracts and adding participants to ongoing contracts. The focus was on building up the options for the member institutions and developing SCELC into a strong consortial presence. The collection is now fairly stable and mature, so it's time to manage the collection of contracts we do have, and evaluation is a key component of that management. A systematic review of our resources can help us accomplish a few strategic tasks: comparing products and publishers, evaluating prices at a number of different layers, and reviewing our contracts, current or prospective; all of this helps the consortium and individual members plan for the future.
  • Of course, all this introduction is basically preaching to the choir, I'm sure. Very briefly, and not to give you flashbacks of library school, I've outlined some aspects of collection evaluation (or "assessment" or "review"). First, Quality is important: did the resource fit the need it was advertised to fit, or that the library expected it to fit? And was it of high quality? Second, Usage. Usage is not the only criterion for assessing libraries; looking at usage alone signifies a shift from collection building principles to collection maintenance. But when we're talking about electronic resources, especially online journals and A&I databases that change frequently, the "just in case" aspects of collection development, heavy on archival and collection building principles, aren't as important as "Was it ever used, and how much?" Third, we need to remember the principles of Value: we should assess our resources according to usage, but also relative to cost, to other resources, to our budgets, and to the other aspects of our mission (collection development based on historical trends, political issues, etc.).
  • Since Quality is so subjective, I'm not going to focus on that aspect of evaluation; each institution or librarian needs to determine the quality of each resource individually, and that judgment may vary based on subjective criteria, with objective criteria like usage factored into the equation. So I'll focus on Usage and Value, which can both be very quantitative, objective measures for evaluating collections and resources.
  • Here are 4 current evaluation projects that Jason & I have worked on that I'd like to highlight. The first two are examples of consortium-level analysis of current contracts, and the last two are examples drawn up to give the first two a comparative and relative perspective. First, Jason worked on an analysis of ScienceDirect. Second, I prepared an analysis of Wiley InterScience in preparation for the 2006 contract. Third, examples of similar analysis for society publishers only, and finally, comparisons of society and other commercial publishers for just Caltech and Claremont.
  • First, Jason worked on an analysis of ScienceDirect. Jason's analysis showed that use of the Subject Collections (SCs) followed an 80/20 rule, and found that SCs were a poor option for a number of reasons, that unique title lists were also inferior options, but that shared title lists agreed upon by all schools could be effective on a price-per-use basis.
  • Elsevier:
    Row 1: About 20% of the collection had zero uses by their subscribing school. That's only 4% of the value of the unique titles in the contract.
    Row 2: In total, 31% of the collection had fewer than 5 uses, which is a benchmark to adjust for internal staff processing use, non-use, and misuse. Based on the CCC limit of 5 articles before paying copyright.
    Row 3: Subscribers had 522 titles from SD's collection covered by at least 1 subscription. That's 75% of the total SD price.
    Row 4: We have 748 copies collectively among the 11 schools in the contract, and the total contract value in 2004 was $1.4 million.
    Row 5: 2% of the use from the consortium was of the top journal, Lancet, of which we have 3 copies collectively.
    Row 6: 16% of the usage is in the top 10% of the collection,
    Row 7: while 68% is in the top 20%.
    Row 8: That represents 37% of the journals and 60% of the dollars.
    What does this really say? The composition of consortial online journal packages should be flexible and should allow for list revision and reduction in pricing based on frequency of use.
  • This shows just the top 10 journals by usage for the consortium. As you can see, some titles are heavily represented in usage, though not in total copies across the consortium, and there are still disparities in the overall price charged per use, or even per title. Biology journals dominate the top of the usage rankings, despite the group of 11 schools not being heavily biology focused (although some are, of course). The overall price per use for some journals at the top of the scale is quite reasonable, but the overall PPU of almost $11 is still relatively high (as we'll see later).
  • Here's a quick graph of the subscribed versus unsubscribed usage. The blue line is the subscribed use per subscribed journal for each school, ranked by top average usage. The orange line is the average usage of unsubscribed titles. What's interesting is that the blue line is a standard Zipf distribution, with a few schools generating a high number of uses per journal, sloping down to many schools using very few articles per journal. That's to be expected, but the orange line is really interesting since it's basically flat on this scale, with an overall average of 11 uses per journal and a median of only 6.
  • Here’s a quick graph of the Subscribed versus Unsubscribed Price per Usage. The Blue Line is the Subscribed Price per Use with the group sorted by top usage institutions (1 through 23). As you move from heavy users to lighter users, the price per use moves upwards, but also fluctuates wildly. The Orange Line is the Price per Usage from Unsubscribed titles and is calculated by dividing total EAL online surcharge (the marginal cost to get access to the shared title list) by the number of downloads from those journals. It stays relatively flat and ranges only from about $3 to $10 per use.
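The two lines on that graph come from simple ratios. Here is a minimal sketch of the two calculations, with invented figures and function names for illustration only (not actual SCELC data):

```python
# Sketch of the two price-per-use (PPU) figures plotted above.
# All names and numbers here are hypothetical, not actual SCELC data.

def subscribed_ppu(subscription_spend: float, subscribed_uses: int) -> float:
    """Price per use for titles a school subscribes to."""
    return subscription_spend / subscribed_uses

def unsubscribed_ppu(eal_surcharge: float, unsubscribed_uses: int) -> float:
    """Marginal cost of the shared title list (the online surcharge)
    divided by downloads from titles the school does not subscribe to."""
    return eal_surcharge / unsubscribed_uses

# Hypothetical school: $40,000 in subscriptions with 3,500 subscribed
# downloads, and a $2,400 surcharge with 600 unsubscribed downloads.
print(round(subscribed_ppu(40_000, 3_500), 2))  # 11.43
print(round(unsubscribed_ppu(2_400, 600), 2))   # 4.0
```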
  • Wiley:
    Row 1:
    Row 2: Another 25% of the collection had fewer than 5 uses, which is a benchmark to adjust for internal staff processing use, non-use, and misuse. Based on the CCC limit of 5 articles before paying copyright.
    Row 3: We have 229 titles from Wiley's collection covered by at least 1 subscription. That's 35% of our total EAL price.
    Row 4: We have 675 copies collectively among the 25 schools in the contract, and the total contract value in 2004 was $1.3 million.
    Row 5: 15% of the use from the consortium was of Angewandte Chemie, of which we have 11 copies collectively.
    Row 6: 35% of the usage is in the top 4% of the collection.
    Row 7:
  • This shows just the top 10 journals by usage for the consortium. As you can see, some titles are heavily represented both in usage and in total copies across the consortium, but there are still disparities in the overall price charged per use, or even per title. Two of the biology titles are clearly overpriced relative to our usage and to the other journals in this list.
  • Here's a quick graph of the subscribed versus unsubscribed usage. The blue line is the subscribed use per subscribed journal for each school, ranked by top average usage. The orange line is the average usage of unsubscribed titles. What's interesting is that the blue line is a standard Zipf distribution, with a few schools generating a high number of uses per journal, sloping down to many schools using very few articles per journal. That's to be expected, but the orange line is really interesting since it's basically flat on this scale, with an overall average of 11 uses per journal and a median of only 6.
  • Here’s a quick graph of the Subscribed versus Unsubscribed Price per Usage. The Blue Line is the Subscribed Price per Use with the group sorted by top usage institutions (1 through 23). As you move from heavy users to lighter users, the price per use moves upwards, but also fluctuates wildly. The Orange Line is the Price per Usage from Unsubscribed titles and is calculated by dividing total EAL online surcharge (the marginal cost to get access to the shared title list) by the number of downloads from those journals. It stays relatively flat and ranges only from about $3 to $10 per use.
  • Here is a comparison of the raw usage counts for the 9 schools that were in both the Wiley and Elsevier contracts, sorted by total usage. What is interesting is that for some schools, usage of subscribed titles is higher than of unsubscribed (which is to be expected), but as the number of titles subscribed to falls, so does the percentage of usage from subscribed titles (right side of graph). Comparatively, the value of the Elsevier contract was 4 times higher than Wiley's, but usage was 7 times higher for subscribed titles and 5 times higher for unsubscribed; by comparison with Wiley, Elsevier is the better deal. This suggests that shared title lists disproportionately benefit smaller schools, although in raw numbers the larger schools account for a disproportionate share of the usage of those titles.
  • Where does usage-based pricing come into play? Usage-based pricing promises two things: it gives the publisher an economic incentive to increase our access and to improve the quality of the product (retrieval, interface design, etc.), and it can level the economic costs across the members of the group. But most libraries and publishers have valid concerns with usage-based pricing; Dialog is a good example:
  • So, here's a "pie-in-the-sky" usage-based pricing example that could be negotiated: a minimum and maximum spend, with usage-based pricing in between. This chart shows the difference in the amount paid under this model at $5 PPU for the two Wiley/SD contracts. A few schools (5) would have substantial savings, while most fall into the 20% to 40% savings range.
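The tiered model can be sketched in a few lines. The $5 PPU, 50% floor, and 125% cap are the parameters from the example above; the per-school usage and prior-spend figures below are invented for illustration:

```python
# Minimal sketch of the tiered usage-based pricing model: pay a flat
# price per use, floored at 50% and capped at 125% of prior spend.
# Tier parameters match the example above; the school data is invented.

def tiered_price(uses: int, prior_spend: float, ppu: float = 5.0,
                 floor: float = 0.50, cap: float = 1.25) -> float:
    """Clamp the usage-based spend between the minimum and maximum tiers."""
    usage_spend = uses * ppu
    return min(max(usage_spend, floor * prior_spend), cap * prior_spend)

# Three hypothetical schools: (annual fulltext uses, prior spend)
for uses, prior in [(2_000, 60_000), (5_000, 30_000), (15_000, 40_000)]:
    paid = tiered_price(uses, prior)
    savings = (prior - paid) / prior
    print(f"uses={uses:>6} prior=${prior:,} paid=${paid:,.0f} savings={savings:.0%}")
```

The clamp is what provides budgeting stability: a low-use school never pays less than the floor, and a heavy-use school's bill can never exceed the cap.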
  • So why should we be looking at collection evaluation at this time? Basically, it's time now. Over the past few years, at least since I've been involved in SCELC, we've spent our time negotiating new contracts and adding participants to ongoing contracts. The focus was on building up the options for the member institutions and developing SCELC into a strong consortial presence. The collection is now fairly stable and mature, so it's time to manage the collection of contracts we do have, and evaluation is a key component of that management. A systematic review of our resources can help us accomplish a few strategic tasks: comparing products and publishers, evaluating costs at a number of different layers, and reviewing our contracts, current or prospective; all of this helps the consortium and individual members plan for the future.

Consortia Level Collection Evaluation: Presentation Transcript

  • Consortia Level Collection Evaluation John McDonald SCELC Executive Board Meeting March 17, 2006
  • Why Evaluate at the Consortia Level?
    • Evaluation is Necessary
      • Negotiation review
      • Cost/Benefit analysis
      • Quantitative data
    • Evaluation is Possible
      • Relativity
      • Comparability
      • Generalizability
  • Why Evaluate Now?
    • History of SCELC Use
      • Most major products and vendors now licensed.
      • Most communities saturated or nearly so after multi-year access.
      • Data is available
    • Review of SCELC Use
      • Compare products and vendors.
      • Evaluate prices at a number of different layers.
      • Review of contracts that can help strategic planning.
  • Aspects of Collection Evaluation
    • Quality
      • Did the resource do what it purported to do?
      • Did it do that well?
    • Usage
      • Was it used?
      • How did the usage relate to other resources or schools?
    • Value
      • Did the usage justify the cost?
      • Did it meet other value criteria aside from usage & cost?
  • Quantitative Collection Evaluation
    • Usage
      • Quantitative
      • Objective
      • Immutable
    • Value
      • Price
      • Benefit
      • Other Value Criteria
  • Current Collection Evaluation Projects
    • Evaluation of Elsevier ScienceDirect
      • Consortium
    • Evaluation of Wiley InterScience
      • Consortium
    • Comparison of Society & Commercial Publishers
      • Caltech
      • Claremont
  • Elsevier ScienceDirect Analysis
    • Who: Subject Collection Analysis for 11 schools
    • What: Access included subscribed title list (full print price + 10-12% e-access fee) & leased titles (2-7% subject collection fees)
    • When: SCELC 2003 Usage
    • How: Total Fulltext & 2003 Prices by institution
    • Why: Evaluate SC or Unique Title List options for individual subscriptions
  • Elsevier ScienceDirect Analysis
    Row | Meta-statistic | Number | Price
    1 | Subscribed Titles with Zero Use | 98 | $52,710
    2 | Subscribed Titles with <5 Uses | 161 | $109,543
    3 | Unique Titles | 522 | $1,085,530
    4 | Total Copies | 748 | $1,298,170
    5 | % of Use from Top Title | 2% | $1,052
    6 | % of Use from Top 10 Titles | 16% | $63,645
    7 | % of Use from Top 20% of Titles | 68% | $883,698
    8 | % of Copies in Top 20% of Titles | 37% | $883,698
    9 | % of Dollars in Top 20% of Titles | 58% | $883,698
    10 | Price of 2004 Content | | $1,178,289
    11 | Price of 2004 E+SC | | $269,054
    12 | Total Price of 2004 Elsevier | | $1,447,943
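Meta-statistics of this kind can be derived mechanically from a per-title usage and price list. Here is a sketch of a few of them (zero-use count, the <5-use CCC benchmark, and the top-20% shares), using a small invented title list rather than actual contract data:

```python
# Sketch: deriving a few of the table's meta-statistics from per-title
# (uses, price) data. The sample titles below are invented placeholders.

def meta_stats(titles: list[tuple[int, float]]) -> dict:
    """titles: (annual fulltext uses, subscription price) per title."""
    ranked = sorted(titles, key=lambda t: t[0], reverse=True)
    total_use = sum(u for u, _ in ranked)
    total_price = sum(p for _, p in ranked)
    top20 = ranked[: max(1, len(ranked) // 5)]  # top 20% of titles by use
    return {
        "zero_use_titles": sum(1 for u, _ in ranked if u == 0),
        "under_5_use_titles": sum(1 for u, _ in ranked if u < 5),  # CCC-based benchmark
        "pct_use_top20": sum(u for u, _ in top20) / total_use,
        "pct_dollars_top20": sum(p for _, p in top20) / total_price,
    }

sample = [(3103, 2450.0), (1828, 5095.0), (40, 900.0), (3, 1200.0), (0, 500.0)]
print(meta_stats(sample))
```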
  • SCELC Use by Journal (Top 10 Used)
    Title | Copies | Total Use | SCELC Payment | Price/Use
    Lancet | 3 | 3,103 | $2,449.71 | $0.79
    Trends in Ecology & Evolution | 4 | 1,828 | $5,094.79 | $2.79
    Obstetrics & Gynecology | 0 | 1,707 | $0.00 | -
    American Journal of Medicine | 3 | 1,527 | $1,167.53 | $0.76
    Advanced Drug Delivery Reviews | 1 | 1,437 | $4,316.26 | $3.00
    Journal of the American College of Cardiology | 3 | 1,297 | $1,454.07 | $1.12
    Journal of Controlled Release | 1 | 1,278 | $3,285.83 | $2.57
    American Journal of Cardiology | 3 | 1,221 | $1,320.84 | $1.08
    International Journal of Pharmaceutics | 1 | 1,197 | $7,774.64 | $6.50
    American Journal of Gastroenterology | 0 | 1,145 | $0.00 | -
    Totals | 775 | 120,199 | $1,307,898 | $10.88
  • SCELC Use by Elsevier Subscription
  • SCELC Price per Elsevier Use
  • Wiley InterScience Analysis
    • Who: Usage Analysis for 23 contract participants
    • What: Access included subscribed title list (full print price + 6% annual inflation + 3% e-access fee) & shared title list.
    • When: SCELC 2004 Usage
    • How: Total Fulltext & 2004 Prices by institution
    • Why: Review of previous contract in anticipation of 2006 renewal
  • Wiley InterScience Analysis
    Row | Meta-statistic | Number | Price
    1 | Subscribed Titles with Zero Use | 56 | $36,546
    2 | Subscribed Titles with <5 Uses | 117 | $162,056
    3 | Unique Titles | 229 | $455,756
    4 | Total Copies | 675 | $1,296,813
    5 | % of Use from Top Title | 14% | $49,202
    6 | % of Use from Top 10 Titles | 35% | $235,533
    7 | % of Use from Top 20% of Titles | 68% | $648,546
    8 | % of Copies in Top 20% of Titles | 32% | $648,546
    9 | % of Dollars in Top 20% of Titles | 50% | $648,546
    10 | Price of 2004 Content Fee | | $1,253,891
    11 | Price of 2004 E-Access Fee | | $37,617
    12 | Total Price of 2004 EAL Fee | | $1,291,507
  • SCELC Use by Journal (Top 10 Used)
    Title | Copies | Total Use | SCELC Payment | Price/Use
    Angewandte Chemie | 11 | 19,549 | $49,202 | $2.52
    Journal of Comparative Neurology | 4 | 5,220 | $70,885 | $13.58
    Cancer | 9 | 4,130 | $4,518 | $1.09
    International Journal of Cancer | 7 | 3,869 | $18,993 | $4.91
    Chemistry - A European Journal | 4 | 3,286 | $14,035 | $4.27
    Hepatology | 5 | 3,218 | $6,037 | $1.88
    BioEssays | 9 | 2,502 | $8,950 | $3.58
    Journal of Neuroscience Research | 5 | 2,435 | $40,354 | $16.57
    Developmental Dynamics | 3 | 2,259 | $11,412 | $5.05
    International Journal of Eating Disorders | 7 | 2,225 | $11,148 | $5.01
    Totals | 675 | 137,701 | $1,296,813 | $9.42
  • SCELC Use by Wiley Subscription
  • SCELC Price per Wiley Use
  • SCELC Wiley Use vs. Elsevier Use
  • Comparison of Society to Commercial
    • Why not to Compare
      • Societies don’t usually package content
      • Societies don’t usually make consortia deals
      • But, if they do, there is one price rather than surcharges, content fees, and access fees.
    • How to Compare
      • Overall price per use metrics
      • Overall rate of usage relative to other titles
  • SCELC Price per Use Comparison
  • Usage Based Pricing Models
    • Usage
      • Economic incentive for publisher to improve access, quality, retrieval
      • Spread costs across members of consortia equitably
    • Tiered Usage
      • Encourages publishers to increase access and quality
      • Divorces e-access pricing from print legacy pricing
      • Corrects baseline of historical inflationary pricing
      • Corrects for payment disparities between Society & Commercial
      • Provides for library budgeting stability
      • Limits need to mediate end user usage
  • Usage Based Pricing Example (Tiered)
    • Minimum Spend = 50% of Prior Spend
    • Usage Based Spend = $5 PPU
    • Maximum Spend = 125% of Prior Spend
  • Why Evaluate Now?
    • Review of current & prospective contracts
      • Continuing price escalation not sustainable
      • Evaluate prices to consortia and members
      • Review contracts with additional criteria
      • Promote models for quality not just quantity
      • Plan for future
  • We may never know why faculty do what they do . . . "If we knew what it was we were doing, it would not be called research, would it?" Albert Einstein