This document discusses the transition from manually collecting and compiling electronic resource usage statistics to using the SUSHI protocol and EBSCO's Usage Consolidation tool over a five-year period at two universities. It describes setting up the tool to automatically harvest COUNTER-compliant usage reports from various vendors via SUSHI, loading the remaining reports manually, and the ongoing cleanup work. The tool provides centralized access to usage data and standardized reporting to support collection development decisions. While it has reduced the manual work, some spreadsheet calculations are still needed and not all resources can be loaded. The document envisions further automation through a shared SUSHI portal for libraries in a consortium.
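A minimal sketch of the kind of calculation such a tool automates: totaling per-title use from a harvested COUNTER-style usage report. The JSON shape loosely follows the report schema used by later COUNTER_SUSHI releases (Report_Items, Performance, Instance), but the function name and sample data here are invented for illustration.

```python
# Sketch: total Total_Item_Requests per title from a COUNTER-style
# SUSHI JSON report. Structure loosely follows the COUNTER R5 JSON
# schema; the sample data is invented.

def total_requests_by_title(report):
    """Sum Total_Item_Requests for each title in a SUSHI JSON report."""
    totals = {}
    for item in report.get("Report_Items", []):
        title = item.get("Title", "(unknown)")
        for perf in item.get("Performance", []):
            for inst in perf.get("Instance", []):
                if inst.get("Metric_Type") == "Total_Item_Requests":
                    totals[title] = totals.get(title, 0) + inst.get("Count", 0)
    return totals

# Tiny hypothetical report covering two months of use for one journal.
sample_report = {
    "Report_Items": [
        {
            "Title": "Journal of Examples",
            "Performance": [
                {"Period": {"Begin_Date": "2017-01-01", "End_Date": "2017-01-31"},
                 "Instance": [{"Metric_Type": "Total_Item_Requests", "Count": 40}]},
                {"Period": {"Begin_Date": "2017-02-01", "End_Date": "2017-02-28"},
                 "Instance": [{"Metric_Type": "Total_Item_Requests", "Count": 35}]},
            ],
        }
    ]
}

print(total_requests_by_title(sample_report))  # {'Journal of Examples': 75}
```

In a real harvest the report dict would come from a SUSHI response rather than an inline literal; the aggregation step is the same either way.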
As libraries move to become centers of digital collections, maintaining information on the usage of these collections is ever more critical. It is also essential to maintain common measures across heterogeneous collections in order to analyze effectively how the library's collection dollars are being spent. The Project COUNTER Code of Practice and the SUSHI protocol aid in this work. This session will explore the newly published Release 4 of the COUNTER Code of Practice for e-Resources and highlight its use in conjunction with the SUSHI (Standardized Usage Statistics Harvesting Initiative) protocol in an active library environment.
This presentation illustrates an online A-Z usage statistics web site at Arizona State University Libraries and how usage reports are gathered, stored, and made accessible to decision makers. It also covers the details of creating a usage web site and the challenges one may encounter, along with potential uses and future plans.
Building a foundation for collection management decisions: two approaches (NASIG)
Salisbury University and the University of Maryland both undertook projects to evaluate the effectiveness of EBSCO Information Services' Usage Consolidation product and the usefulness of the data extracted for collection development decisions. The goals of implementation were to centralize the collection and analysis of e-resource usage data and to allow collection management librarians easy access to usage and cost per use data to aid in their decision-making. The presenters will discuss how staff at each institution populated Usage Consolidation and presented usage reports to collection managers; how collection managers responded to the data; and how they used the data to inform collection management decisions.
Leigh Ann DePope
Serials/Electronic Services Librarian, Salisbury University
Leigh Ann DePope is the Serials/Electronic Services Librarian at Salisbury University. She is responsible for all aspects of serials and electronic resource management. She has serials experience in both public and academic libraries. Leigh Ann has earned her MLS from Clarion University of Pennsylvania and a BA from the Pennsylvania State University.
Mark Hemhauser
Systems Librarian, University of Maryland
Mark Hemhauser has 18 years of experience managing serials acquisitions and is currently the Systems Librarian for the Aleph Acquisitions and Serials module at the University System of Maryland and Affiliated Institutions. He also serves on the e-Acquisitions Team of the Kuali OLE (Open Library Environment) project--an open-source, library-driven project to build a truly integrated library system.
Rebecca Kemp
University of Maryland
Rebecca Kemp is Continuing Resources Librarian at University of Maryland, College Park. She has served as a continuing resources librarian since 2004, has served on national library association committees, and has participated in a variety of state and national conferences.
Presented at the 2010 Electronic Resources & Libraries Conference. --
Mary Feeney, Jim Martin, Ping Situ, University of Arizona --
Abstract: Searches, sessions, article requests: you have access to the data, but what's the next step? Learn how the University of Arizona Libraries' Spending Reductions Project analyzed usage of different types of resources to assess them against quality standards and make cancellation decisions. Tools, challenges, and organizational approaches will also be discussed.
Data Stories: Using Narratives to Reflect on a Data Purchase Pilot Program (NASIG)
Anita Foster and Gene R. Springs, presenters
The Ohio State University Libraries, driven by campus demand, developed and implemented a data resource purchase pilot program that took place over one fiscal year. Having previously only prioritized the purchasing of subject-related data resources on a small scale, this initiative included large data resources, most of which can meet the research and teaching needs of a variety of academic disciplines. Beginning the pilot with very few criteria for selection and potential acquisition, the Collections Strategist and Electronic Resources Officer encountered various challenges along the way, each requiring additional exploration, research, and eventual resolution. As the pilot program proceeded, other criteria emerged as important considerations when examining data resources, particularly for content and licensing.
To best develop an understanding of what was learned over the year of this pilot program, the Collections Strategist and Electronic Resources Officer collaborated in writing "data stories," or narratives about each of the data resource options investigated for acquisition. Each narrative is structured similarly, from the requestor and initial stated need through the end result. Any pertinent details regarding content, access, or licensing were incorporated to complete the narratives. The data stories will be further analyzed to track commonalities among both the successful and unsuccessful acquisitions, with the proposed outcome of developing tested criteria for future acquisition of data resources.
Evaluating the Big Deal: What metrics matter? (Selena Killick)
In April 2010 Cranfield University Libraries embarked upon a review of its electronic journal packages. Following research into the usage metrics employed at other institutions, a number of key performance indicators were developed and assessed using a standardised Excel template. The resulting information helped to inform a cancellation decision.
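As a rough sketch of the kind of key performance indicator such a template might compute, the example below calculates cost per use and flags packages above a locally chosen threshold. All package names, costs, usage figures, and the threshold are invented for illustration.

```python
# Sketch: a cost-per-use KPI of the kind assessed in a cancellation
# review. Package names, costs, and usage figures are invented.

def cost_per_use(annual_cost, total_downloads):
    """Annual subscription cost divided by full-text downloads."""
    if total_downloads == 0:
        return float("inf")  # no use at all: flag immediately
    return annual_cost / total_downloads

packages = {
    "Package A": {"cost": 12000.0, "downloads": 8000},
    "Package B": {"cost": 9000.0, "downloads": 450},
}

# Flag packages whose cost per use exceeds the local threshold.
THRESHOLD = 10.0
flagged = [name for name, p in packages.items()
           if cost_per_use(p["cost"], p["downloads"]) > THRESHOLD]

print(flagged)  # ['Package B']
```

In practice a review like Cranfield's would weigh several indicators together (trend over time, subject coverage, overlap), not a single threshold.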
The Road from Millennium to Alma: Two Tracks, One Destination (NASIG)
In 2016, two academic libraries migrated from Innovative Interfaces' Millennium to Ex Libris' Alma. Though both libraries came from a similar starting point in terms of library software, their migration environments were quite different: Colorado State University's migration involved two campuses, CSU Fort Collins and CSU Pueblo, while Central Connecticut State University migrated with a newly formed consortium comprising 18 institutions. Even though both libraries share the same proprietary ILS, the environmental differences between the two libraries shaped their experiences throughout the migration process. The presenters will share their libraries' unique experiences while also addressing commonalities germane to the ILS migration process, such as pre-migration data cleanup, data migration, training, and designing workflows. Particular attention will be paid to the data migration process, detailing the extraction work and how those efforts were coordinated. Because Alma is designed on a different concept than III's Millennium, the redesign of workflows is critical prior to the final cutover to the new system. In light of this, the presenters will address the engagement of staff during these discussions along with their professional growth. In addition to explaining the technical aspects of this migration, they will also delve beneath the surface of the intellectual labor required for implementation and examine the psychological impact on all constituents who will use the new system for their daily work.
Kristin D'Amato
Central Connecticut State University
Kristin D’Amato is the Head of Acquisitions and Serials at Central Connecticut State University’s Elihu Burritt Library. She received her master’s in Library and Information Science from SUNY Albany and her bachelor’s in English Literature from SUNY Geneseo.
Rachel Erb
Colorado State University
Rachel A. Erb is the Electronic Resources Management Librarian at Colorado State University’s Morgan Library. She received her master's in Library Science from Florida State University, a master's in Slavic Languages and Literatures from Ohio State University, and her bachelor’s in Russian from Dickinson College.
Appfluent enables Enterprise Data Administration by providing IT organizations with unprecedented visibility into their Big Data systems to reduce costs and optimize performance.
Appfluent is the only solution that provides detailed insight into business activity, data usage and resource consumption across heterogeneous analytic data platforms including Teradata, Oracle Exadata, IBM DB2, IBM® PureData™ for Analytics powered by Netezza and Hadoop.
Presenters: John Stephens, Lydia Hofstetter.
Presented at the Georgia Libraries Conference in Columbus, GA on October 5, 2017.
This session will give an overview of COUNTER statistics, including their development, use, and relation to other library metrics. The presentation will also discuss practical issues related to gathering and reporting usage statistics, along with common problems.
A tool for librarians to select metrics across the research lifecycle (Library_Connect)
These slides introduce a range of research impact metrics. They were presented at the ER&L Conference (April 2017) by Chris James, Product Manager Research Metrics, Elsevier.
This presentation was provided by Karen Wetzel and Todd Carpenter of NISO, Peter Shepherd of Project COUNTER, Tansy Matthews of George Mason University, and Susan Golden of Serials Solutions during the NISO Webinar "COUNTER and Usage Data, Part One: COUNTER: A How-To Guide," held on May 6, 2009.
Evaluating the Big Deal: Usage Statistics for Decision Making (Selena Killick)
Presentation delivered at the UKSG Usage Statistics for Decision Making workshop, held at the Institute of Materials, Minerals and Mining, London, 2 February 2012.
Quick reference cards for research impact metrics (Library_Connect)
When meeting with students, researchers, deans or department heads, the metrics on these quick reference cards can serve as a jumping off point in conversations about where to publish, adding to researcher profiles, enriching promotion and tenure files, and benchmarking research outputs. The cards were co-developed by librarian Jenny Delasalle and Elsevier's Library Connect program. Learn more and download poster versions as well at: https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics
Opening Scholarly Communication in Social Sciences (OSCOSS) (GESIS)
Our system will initially provide readers, authors, and reviewers with an alternative, thus having the potential to gain wider acceptance and gradually replace the old, incoherent publication process of our journals and of others in related fields. It will make journals that are already open access more “open” (in terms of reusability), and it has the potential to serve as an incentive for turning “closed” journals into open access ones.
OSCOSS is funded by the DFG in the Open Access Transformation programme.
2014 Charleston Conference
Thursday, Nov 6, 2:15 PM
Helen Josephine, Stanford University
Indira Yerramareddy, International Food Policy Research Institute (IFPRI)
Jennifer Chang, Elsevier/Mendeley
Presentation at the ALA Annual 2016 ALCTS/LITA Electronic Resources Management Interest Group panel “Making it count: Usage statistics and electronic resources management.”
This presentation was provided by Oliver Pesch, Chief Strategist, E-Resources, EBSCO Information Services, during the ALA Midwinter Meeting, held on January 22, 2012.
With ever-shrinking library budgets it is more essential than ever to ensure that the library collection is targeted, relevant and well-used. Return on Investment (ROI) has become the mantra of library management and libraries need to show accountability for collection decisions. This webinar will focus on speakers who have successfully implemented assessment metrics (such as COUNTER 3, Eigenfactor and impact factors) as one determining factor of collection development decisions.
Levine-Clark, Michael, “Going Beyond COUNTER: Strategies for Analyzing Data t... (Michael Levine-Clark)
Levine-Clark, Michael, “Going Beyond COUNTER: Strategies for Analyzing Data to Better Understand Collections Usage,” Invited Workshop, 14th International Southern Africa Online Information Meeting (SAOIM), Pretoria, June 19, 2018.
Expert workshop on Improving activity data for Tier 2 estimates of livestock emissions: Dealing with data gaps
July 17-18, 2018
Summary and workplan
Lini Wollenberg, Sinead Leahy, Harry Clark
ACL Software is a powerful product, yet many users find it difficult to get started and therefore may never effectively maximize the product. If you fall into this category, or just want to learn from one of the top industry experts in ACL Software (over 20 years of experience), this course will provide the key building blocks to get started quickly auditing three top audit areas for data analytics.
Using a live/video training library approach, we help companies of all sizes use audit and assurance software to improve business intelligence, increase efficiencies, identify fraud, test controls, and achieve bottom-line savings.
AuditNet and Cash Recovery Partners webinar. The recording is available at auditsoftwarevideos.com and AuditNet.tv (registration required) and is free to view.
Sample Data Files for All Courses are available for $49
To purchase access to all sample data files, Excel macros and ACL scripts associated with the free training visit AuditSoftwareVideos.
A snake, a planet, and a bear ditching spreadsheets for quick, reproducible r... (NASIG)
Presenter: Andrew Kelly, Cataloging & E-Resources Librarian, Paul Smith's College
This poster has two accompanying handouts: https://www.slideshare.net/NASIG/a-snake-a-planet-and-a-bear-ditching-spreadsheets-handout1 and https://www.slideshare.net/NASIG/a-snake-a-planet-and-a-bear-ditching-spreadsheets-handout2slides.
This webinar will provide an overview of the current work to rewrite the techniques for electronic resource management to incorporate open access workflow management. This overview will provide insight into the key areas under exploration and outline the feedback compiled from the two interactive sessions held at the UKSG Annual Conference. We will also discuss the next steps in sharing the development of this project.
Business intelligence - benefits of using an online analytical solution (HeadChannel)
Have you ever considered using business intelligence solutions in your company? Here are several important benefits of business intelligence: data analysis can be faster, more efficient, and more accurate; reports can be more detailed, wider-ranging, and more useful; and cost savings can come from greater efficiency and more accurate predictions of future trends.
*Updated and reorganised following feedback in the breakouts*
While many librarians have developed mechanisms and structures for managing local scholarship separate from their standard resource management practices, the intersection of the two content streams is occurring at many institutions. During the past decade the presenters have dedicated themselves to capturing best practices of electronic resource management and mapping out paths for creating open access workflows. Join them for a lively discussion and interactive session where they outline ways to bring these two initiatives together and identify the teams needed.
Graham Stone
Jisc Collections
Peter McCracken
Cornell University
Jill Emery
Portland State University Library
This presentation was provided by Chan Li of the California Digital Library during the NISO update at the ALA Midwinter Conference, held from June 23rd to June 26th, 2009.
Open data presentation on tools and automation (Pia Waugh)
This is a short presentation about how to make your data lovely, including how to prioritise, clean, extract, transform and automate data publishing in your organisation.
Lecture presented by Willian San Andres-Frias on 5 July 2016, at the Philippine Association of Academic/Research Libraries forum on the occasion of the Philippine Academic Book Fair on July 5 to 7, at the Megatrade Hall 1, SM Megamall, Mandaluyong.
NISO Virtual Conference: Expanding the Assessment Toolbox: Blending the Old and New Assessment Practices
Value in numbers: A Shared Approach to Measuring Usage and Impact
Jo Alcock MSc(Econ) MCLIP, Researcher, Evidence Base, Birmingham City University
Value in numbers: A Shared Approach to Measuring Usage and Impact (JUSPSTATS)
Presentation given as part of the NISO Virtual Conference: Expanding the Assessment Toolbox: Blending the Old and New Assessment Practices. The presentation gives an overview of JUSP and IRUS-UK and shows the value in using a shared approach to measuring usage and impact.
This workshop will provide participants with an overview of COUNTER statistics and the beginnings of a skill set for working with these reports. It is aimed at library personnel new to the area of electronic usage statistics. Participants in this workshop will:
Learn about the types of COUNTER reports
Explore questions usage statistics can help answer
Discuss ways that usage statistics should not be employed
Practice manipulating COUNTER reports in Microsoft Excel
Speaker: Jennifer Leffler, Technical Services Manager, University of Northern Colorado
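The report manipulation practiced in the workshop, pivoting the monthly columns of a tabular COUNTER report into annual totals, can also be scripted. The TSV layout below is an invented, simplified stand-in for a real COUNTER report, which carries more header rows and columns than this.

```python
# Sketch: sum the monthly columns of a simplified COUNTER-style
# tabular report, the same pivot often done by hand in Excel.
import csv
import io

# Simplified stand-in for a COUNTER report; real files have
# additional header rows and metadata columns.
tsv = """Title\tJan-2017\tFeb-2017\tMar-2017
Journal of Examples\t12\t7\t30
Annals of Samples\t0\t0\t1
"""

annual_totals = {}
for row in csv.DictReader(io.StringIO(tsv), delimiter="\t"):
    title = row.pop("Title")          # keep the title as the key...
    annual_totals[title] = sum(int(v) for v in row.values())  # ...sum the months

print(annual_totals)  # {'Journal of Examples': 49, 'Annals of Samples': 1}
```

The same loop extends naturally to flagging zero-use titles or joining in cost data, the kinds of follow-on questions the workshop's usage statistics can help answer.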
Similar to From Spreadsheets to SUSHI: Five Years of Assessing Use of E-Resources (20)
Transcript: Selling digital books in 2024: Insights from industry leaders - T... (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We also held a lovely workshop with the participants, exploring different ways to think about quality and testing in different parts of the DevOps infinity loop.
Key Trends Shaping the Future of Infrastructure.pdf (Cheryl Hung)
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The keynote covers the key trends across hardware, cloud, and open source, exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Connector Corner: Automate dynamic content and events by pushing a button (DianaGray10)
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
From Spreadsheets to SUSHI: Five Years of Assessing Use of E-Resources
1. From Spreadsheets to SUSHI
Five years of assessing use
of e-resources
Leslie Farison: Appalachian State University
Kristin Calvert: Western Carolina University
2. In the beginning
Create Spreadsheet
Populate Fields
• Acquisitions: Title, order record, publisher, platform, cost
• Collections: usage statistics, cost per use, vendor site access information (URL; login; password)
5. Which Reports?
• COUNTER provides a broadly comparable model for reporting usage data.
• COUNTER reports are the most standardized, so use those wherever available.
6. What measure is meaningful?
Sessions
Abstracts
Searches
Full Text
7. Selected Measures
• Sessions and searches not meaningful
• Full text for full text resources
• Abstracts for index/abstract resources
8. Summer 2009
Cancellation Project
• Added fields to spreadsheet:
▫ Selector
▫ One time $
▫ Date acquired
▫ Invoice date
▫ Ebsco SA
▫ Fund code
▫ Notes & cancellation restrictions
10. Preparing for CY 2013
• Many changes to take into
consideration due to COUNTER R4.
• Released April 2012
• Implementation date December 2013
11. COUNTER Release 4
• A single, integrated Code of Practice
covering
journals, databases, books, reference
works and multimedia content.
• No more sessions; result clicks & record
views.
• Increased learning curve.
14. R4: Journal Reports (M = mandatory, O = optional)
• Journal Report 1: Number of Successful Full-Text Article Requests by Month and Journal (M)
• Journal Report 1 GOA: Number of Successful Gold Open Access Full-Text Article Requests by Month and Journal (M)
• Journal Report 1a: Number of Successful Full-Text Article Requests from an Archive by Month and Journal (O)
• Journal Report 2: Access Denied to Full-Text Articles by Month, Journal and Category (M)
• Journal Report 3: Number of Successful Item Requests by Month, Journal and Page-type (O)
• Journal Report 3 Mobile: Number of Successful Item Requests by Month, Journal and Page-type for usage on a mobile device (O)
• Journal Report 4: Total Searches Run by Month and Collection (O)
• Journal Report 5: Number of Successful Full-Text Article Requests by Year-of-Publication (YOP) and Journal (M)
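Since JR1 is the workhorse report for full-text use, here is a minimal sketch of totalling its monthly counts once a report has been exported. The two-journal grid below is invented sample data; real COUNTER R4 files carry additional header rows and identifier columns that this sketch omits.

```python
import csv
import io

# Simplified JR1-style data: journal title followed by monthly
# full-text request counts (invented numbers for illustration).
jr1_csv = """Journal,Jan,Feb,Mar
Nature,120,95,110
Serials Review,14,9,17
"""

def jr1_totals(text):
    """Return {journal: total full-text requests} from a simplified JR1 grid."""
    reader = csv.reader(io.StringIO(text))
    next(reader)  # skip the column-header row
    return {row[0]: sum(int(n) for n in row[1:]) for row in reader}

totals = jr1_totals(jr1_csv)
print(totals)  # {'Nature': 325, 'Serials Review': 40}
```

The same totalling step is what a consolidation tool performs internally before matching titles against a knowledgebase.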
15. Database & Book Reports (M = mandatory)
• Database Report 1: Total Searches, Result Clicks and Record Views by Month and Database (M)
• Database Report 2: Access Denied by Month, Database and Category (M)
• Platform Report 1 (formerly Database Report 3): Total Searches, Result Clicks and Record Views by Month and Platform (M)
• Book Report 1: Number of Successful Title Requests by Month and Title (M)
• Book Report 2: Number of Successful Section Requests by Month and Title (M)
• Book Report 3: Access Denied to Content Items by Month, Title and Category (M)
• Book Report 4: Access Denied to Content Items by Month, Platform and Category (M)
• Book Report 5: Total Searches by Month and Title (M)
16. Other Reports (M = mandatory, O = optional)
• Multimedia Report 1: Number of Successful Full Multimedia Content Unit Requests by Month and Collection (M)
• Multimedia Report 2: Number of Successful Full Multimedia Content Unit Requests by Month, Collection and Item Type (O)
• Title Report 1 (formerly Journal/Book Report 1): Number of Successful Requests for Journal Full-Text Articles and Book Sections by Month and Title (O)
• Title Report 1 Mobile: Number of Successful Requests for Journal Full-Text Articles and Book Sections by Month and Title (formatted for normal browsers/delivered to mobile devices AND formatted for mobile devices/delivered to mobile devices) (O)
• Title Report 2: Access Denied to Full-Text Items by Month, Title and Category (O)
• Title Report 3: Number of Successful Item Requests by Month, Title and Page Type (O)
• Title Report 3 Mobile: Number of Successful Item Requests by Month, Title and Page Type (formatted for normal browsers/delivered to mobile devices AND formatted for mobile devices/delivered to mobile devices) (O)
17. Workflow
Librarians must download spreadsheets one file at a time from each vendor site, then load them.
Librarians need a more efficient method for getting the data.
19. SUSHI
• Not a stand-alone application
• Works with another system
• Retrieves COUNTER reports
• XML format
20. Not a panacea
To date, SUSHI has failed to live up to its
promise, largely because of the lack of
systems available to take advantage of
the protocol and the ongoing irregularity
of vendors' applications of the COUNTER
standard.
21. Hybrid Models
• Many universities are adding ERMs and/or third-party assessment products and are using SUSHI to gather some data, mostly for journals.
• Continuing to use other methods, such as spreadsheets, for others.

Fry, A. (2013). "A Hybrid Model for Managing Standard Usage Data: Principles for e-Resource Statistics Workflows." Serials Review 39(1): 21-28.
22. EBSCO Usage Consolidation
• EBSCONET Usage Consolidation
Product launched Jan 2012
• Reviewed in April 2012
• Revisited June 2013
• Added in July
23. E-resource Statistics at Western Carolina University
• Statistics we collected were…
▫ Driven by reporting agencies
▫ Entirely reactive
▫ Time consuming
▫ Lacking trend data
24. Goals for Western Carolina Univ
• Reduce time spent maintaining database
statistics spreadsheet.
• Make cost-per-use figures more accessible to
collection development.
• Be able to use journal statistics holistically and
move away from title-by-title requests
▫ “Do we have use data for Nature?”
25. EBSCO Usage Consolidation
• COUNTER-compliant statistics are loaded from all sources and matched against EBSCO's AtoZ Knowledgebase
• SUSHI enabled for harvesting statistics automatically
• Report tools look at use by platform, title, publisher, etc.
• Quickly identify most- and least-used resources
• Data is fed into EBSCOnet Subscription Management to combine use and cost data for powerful collection development analysis
26. Configuring Usage Consolidation
• Selected vendors where EBSCO loads/manages statistics for you. These vendors cannot be changed later.
• Configured remaining e-resource providers, selecting SUSHI harvesting as often as possible. Troubleshooting required.
• Manually load statistics for remaining providers.
• Load historical statistics.
30. Requires staff to touch each
exception and ignore a title or
find a match.
31. Process & Clean-up
* Set-up period: 2-3 weeks
* Initial clean-ups: 8+ hours
* Quarterly loads: 4-6 hours
* Monthly clean-up: 1 hour or less
Check that resources have successfully matched against the EBSCO AtoZ knowledgebase.
Requires some staff time to deal with the exceptions report, though the size of these reports shrinks each month as the system learns from you.
Decide how often to load statistics manually.
32. Reports
Available Criteria (with examples)
• Report types: Title, Database or COUNTER
• Date range
• Single metric per report: Sessions or FT requests
• Limit to one platform: IngentaConnect
• Limit to top/bottom use: 10 titles with most use
• Limit to use amount: All titles with use <= 5
• Estimate missing usage: Project use for the remainder of the year
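According to the speaker notes, the "estimate missing usage" option appears to take the previous year's total and spread it evenly across the months. A hedged sketch of that arithmetic follows; the function name and inputs are our own invention, not the product's API, and the even-split rule is an approximation of the tool's apparent behaviour.

```python
def estimate_missing_usage(monthly_use, prior_year_total):
    """Project a full-year figure by filling unreported months with the
    prior year's average monthly use (even-split approximation)."""
    missing_months = 12 - len(monthly_use)
    prior_monthly_avg = prior_year_total / 12
    return sum(monthly_use) + prior_monthly_avg * missing_months

# Nine months reported (300 uses so far); 480 uses last year -> 40/month.
projected = estimate_missing_usage([40, 35, 30, 38, 25, 20, 22, 45, 45], 480)
print(projected)  # 420.0
```

An even split ignores seasonality (semester peaks, summer troughs), which is worth remembering when reading projected figures.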
33. Reports, cont.
Exporting
• Export to Excel, csv, tsv
• Includes:
▫ Standard COUNTER report fields
▫ Title sort field
▫ EBSCO A-Z Holdings Y/N
Limitations
• Unable to exclude aggregator usage (as a group) from title reports
• Must export reports to include cost information, OR look up title in Subscription Management
34. Journals in UC
• With integration with EBSCO AtoZ, journal statistics work really well.
• COUNTER reports from publishers can include non-subscribed titles. UC lets you limit your reports to titles in your collection.
• Spreadsheet formatting from publishers can require tweaking before loading in UC.
• Publishers can be slow to respond to requests for SUSHI permissions.
35. Databases in UC
• WCU still using spreadsheets
• Many, many database providers are not COUNTER
compliant.
• Usage loading service is great for EBSCOhost databases
– there are a LOT of titles in the DB1 and JR1 reports.
• Database title matching is problematic and there is no
standard number to use.
▫ We cannot load stats for some because UC can’t match
the name of the database to the entry in the
knowledgebase (bug).
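Title-matching failures like these usually come down to name variants (punctuation, leading articles, spacing). As an illustrative heuristic, not UC's actual matching algorithm, a crude normalization step before comparison looks like:

```python
import re

def normalize_title(title):
    """Crude normalization for comparing a vendor's database name against
    a knowledgebase entry: lowercase, strip punctuation and a leading
    article, collapse whitespace. A heuristic sketch only."""
    t = title.lower()
    t = re.sub(r"[^a-z0-9\s]", " ", t)      # drop punctuation
    t = re.sub(r"^(the|a|an)\s+", "", t)    # drop a leading article
    return re.sub(r"\s+", " ", t).strip()   # collapse whitespace

# Two name variants reduce to the same key ('academic search complete'
# is a hypothetical example, not taken from the presentation).
print(normalize_title("The Academic Search: Complete"))
print(normalize_title("Academic  Search Complete"))
```

Without a standard identifier (the ISSN equivalent databases lack), even normalization like this leaves ambiguous cases that staff must resolve by hand, which is exactly the exceptions workflow described above.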
36. E-Books in UC
• WCU testing this now
• WCU doesn’t use EBSCO AtoZ to manage our
ebooks
• No way to integrate cost/order information
▫ Also true for databases and journals not acquired
through EBSCO
▫ Must export report and calculate in Excel
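The export-and-calculate step is straightforward to script instead of doing in Excel. This sketch assumes a hypothetical export with Title, Use, and Cost columns; the actual exported report has different fields, and cost would come from local order records for non-EBSCO acquisitions.

```python
import csv
import io

# Hypothetical export: a title report joined with a locally maintained
# cost column (titles and figures are invented for illustration).
export = """Title,Use,Cost
Journal of Things,250,1000.00
Obscure Quarterly,4,800.00
"""

def cost_per_use(text):
    """Return {title: cost-per-use rounded to cents} from the export."""
    rows = csv.DictReader(io.StringIO(text))
    return {r["Title"]: round(float(r["Cost"]) / int(r["Use"]), 2) for r in rows}

print(cost_per_use(export))  # {'Journal of Things': 4.0, 'Obscure Quarterly': 200.0}
```

A real script would also need to guard against zero-use titles before dividing.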
37. Under Construction
If there are some functional issues, and we have not eliminated spreadsheets entirely, and some calculations still need to be done in Excel, why do we want to use Usage Consolidation?
38. Gaining Efficiencies
The small amount of work it takes to load data for journal subscriptions allows us to spend more time using the data to do our jobs than having the data collection become our entire job.
49. A Vision for the Future
• SUSHI implementation seems to be
better suited for a group of libraries.
• A shared portal for some group of
libraries in North Carolina.
• Perhaps the UNC System.
• Libraries participating in the Carolina
Consortium.
• Samples of portals include:
51. SUSHI: Delivering Major
Benefits to JUSP
The use of SUSHI has demonstrably saved JUSP and
the UK HE community hundreds of thousands of
pounds of staff costs since its inception; add in an
estimated 97%+ of time in data collection and
processing every month and the dual benefits are
enormous; as more publishers join, this efficiency will
continue to increase. In an age of funding cuts and
budget restrictions, the combination of JUSP and
SUSHI thus affords an economical, high-quality
alternative to the previously onerous and unending task
of journal statistics gathering and management.
52. Shared Tools for analysis of e-resources in
Higher Education Libraries in Sweden
(NordLIC)
53. The perfect storm
• COUNTER R4: December 2013
• New data elements
• More stringent auditing
• Increasing # of vendors SUSHI compliant
• New implementation protocol
• A single portal
Editor's Notes
The collection and meaningful analysis of accurate usage statistics has become increasingly important as library budgets continue to shrink.
In-house: circulation and re-shelving statistics. Replaced by vendor-provided data for e-resources. Very valuable, but difficult to collect, manage and assess.
Where to begin: locate where to go to get usage reports. Over time, attempts to gather e-resource usage statistics had been somewhat erratic and shared by a number of different people. As a result, the library had a lot of incomplete and incorrect information stored in a variety of places. Gathered and corrected existing information, tested it to see if it worked, and contacted vendors for missing information. Store it in one place where it could be both secure and appropriately available: directly in the spreadsheet.
With some more enhancements, our spreadsheet became a sort of home-grown ERMS. Discontinued: it hasn't turned out to be what we thought. Cannot set up a logging system (with reminders/tasks) that is not check-in prompted; the current system of emailing works just fine. Will not accept locally-created vendor records for non-EBSCO-associated vendors. Will not accept attachments (i.e., scanned licenses, etc.). Clunky and slow. Is made mostly redundant by EBSCONET and A-to-Z. Seems to be more than we really need.
http://www.projectcounter.org/r4/APPA.pdf
Librarians are able to compare usage statistics from different vendors; derive useful metrics such as cost-per-use; make better-informed purchasing decisions; plan infrastructure more effectively. Help librarians to make sense of them. ARL New Measures Initiative: http://www.arl.org/stats/newmeas/newmeas.html. The ARL (Association of Research Libraries) New Measures Initiative has been set up in response to the following two needs: increasing demand for libraries to demonstrate outcomes/impacts in areas important to the institution, and increasing pressure to maximise use of resources. Of particular interest is the work associated with the E-metrics portion of this initiative, which is an effort to explore the feasibility of defining and collecting data on the use and value of electronic resources. New discipline of usage bibliometrics.
an attempt to address this difficulty
SUSHI developed as an attempt to address this difficulty. SUSHI acts as an enabling technology by allowing Usage Consolidation modules to automate the harvesting of COUNTER reports. COUNTER has also worked with NISO on SUSHI (Standardized Usage Statistics Harvesting Initiative) to develop a protocol to facilitate the automated harvesting and consolidation of usage statistics from different vendors. This protocol may be found on the NISO website at http://www.niso.org/workrooms/sushi/. NISO is the National Information Standards Organization. See also: http://www.niso.org/apps/group_public/project/details.php?project_id=97, http://www.projectcounter.org/ and http://www.projectcounter.org/documents/newsrelease_apr2012.doc
SUSHI is not a stand-alone application; it works with another system to retrieve COUNTER usage reports. The COUNTER reports are in XML format, so they are not readable by humans; therefore, COUNTER reports need to be loaded into another system for processing and reporting. For SUSHI to be effective, a usage management system must be in place.
Check-list of general features:
• Database to store usage data and relate it to titles in databases and packages and the platforms where they are hosted
• Data structure to allow usage to be consolidated (be able to find all usage for a particular title)
• A mechanism to load COUNTER reports
• A method of matching variants of titles and variant ISSNs
• An interface to generate reports
Check-list of features for SUSHI implementation:
• Develop a SUSHI client
• Store SUSHI-related data about the hosts of content: are they SUSHI compliant? URL of their SUSHI server and login credentials; day of month to retrieve reports
• Create a scheduler to detect when it's time to harvest the usage and automatically start the SUSHI client
• Create a mechanism to process and import COUNTER data in XML format
• Add some error handling
Providing tools and support to developers: web site, webinars, implementer listserv, recruiting experts -- 寿司職人 (SUSHI Shokunin)
Great article: Fry, A. (2013). "A Hybrid Model for Managing Standard Usage Data: Principles for e-Resource Statistics Workflows." Serials Review 39(1): 21-28.
The great value will be for your key platforms to be loaded into Usage Consolidation for you to see cost-per-use integration for your e-journals and e-journal packages handled by EBSCO, including the Package Dashboard in EBSCONET Analytics. Strengths/weaknesses: Usage Consolidation cannot total usage data across all platforms for reports such as ACRL's annual survey. Beta tester: The University of Alabama at Birmingham (UAB), fall 2011. If a vendor is SUSHI compliant: JR1, DB1, DB2.
Wake Forest: UC is basically their online platform; you pay an annual fee and can use their service to download & store usage statistics. ULS is when you pay EBSCO extra to do all the work for you of setting up the platform, downloading the stats, and reconciling the title differences. The real time-killer is in reconciling the downloaded reports. Whether you have UC import the reports using SUSHI or you download COUNTER reports manually and upload them into UC, you still have to manually reconcile the titles in the report against EBSCO's knowledgebase. UC gives you a list of all the journals from the COUNTER report that it couldn't automatically match to their knowledgebase (usually because there are too many possibilities), and you have to go one by one and select which knowledgebase title that journal matches. This can be a huge task at first. I have heard EBSCO reps and other librarians using UC say that the number of titles that must be reconciled decreases with each report, because the system remembers how you matched them before. But plan on lots of time up front. I thought of one other criticism I have of EBSCO Usage Consolidation that I probably should have mentioned: right now UC only supports COUNTER reports JR1, BR1, BR2, DB1, and DB2. We need DB3 (which becomes PR1 with COUNTER Release 4) for annual stats reporting.
But really I think UC is designed for compiling & consolidating journal usage stats, which makes sense for EBSCO anyway. Based on what I've seen so far, having them load the stats is the way to go.
Gather particular statistics for IPEDS or ARL; used suggested metrics. Journal statistics only gathered when a renewal decision was needed. Compiling stats to a single spreadsheet takes a long time; looking at large sets of data, like Big Deal packages, becomes a full-time job. Generally look at use for just one fiscal/subscription year. Use or CPU alone doesn't tell you how the resource measures up to the rest of your collection, or compared to last year's numbers, etc.
Attempting to automate the gathering of statistics, so we can consider what statistics would be useful for us rather than spending what little time we have collecting only what we are required to.
Valid COUNTER reports: JR1, DB1, DB2, BR1, BR2 (both Release 3 and 4). A journal's combined use shows where patrons access journal content: e.g. publisher's site, JSTOR, and EBSCO databases.
Usage Consolidation set up is done at the vendor level to apply to all journals and resources on the platform. For instance, Highwire. Includes feature called Usage Loading Service. We received a certain number of these included in our UC subscription, but had the option to pay for additional. EBSCO will manually load some historical and all current statistics twice a year, including all troubleshooting and clean-up. EBSCO vendors: We gave them the vendors with large numbers of resources and database-heavy vendors (EBSCO, Gale, etc.). Configuring vendors: data entry to copy & paste URLs, usernames, and passwords into the system and then selecting which reports are valid. No batch uploading. Additional set-up for SUSHI harvesting. Many of these providers require you to contact the publisher to activate permissions at the publisher site. Fewer providers support SUSHI than expected. I expected most COUNTER compliant providers to also be SUSHI compliant. But some do not support SUSHI and others do, but there are XML issues which cause the loads to fail in UC. We can’t use SUSHI for Elsevier and Wiley and IEEE.
If exceptions not worked, then those stats aren’t loaded and available for reports.
This is the estimated amount of time WCU spent on set-up and a breakdown of the ongoing time commitment. Before we began this process we had gathered the login credentials for our providers already; there were maybe 3 or 4 we had to add. The set-up time includes the basic data entry as well as configuring SUSHI for the vendors. This includes contacting publishers to enable SUSHI permissions and troubleshooting with EBSCO when we had errors. The portion of the ongoing process which takes staff time is the exceptions report from the last slide. These are titles where UC couldn't find a match, or there was a match against more than one title. Once you go through this clean-up, the system remembers your choices with the disambiguation and applies it to future loads. As you go on, the number of exceptions shrinks; in fact, for us, many of the publishers have 0 exceptions to clean up. But at the start it can be a few hundred. We've chosen to load statistics for the resources which aren't pulled in with SUSHI on a quarterly basis. This can be done on your schedule. It takes just as much time to load 3 months of data as it does a year. If you don't need the statistics available for reports on a regular basis, and you know you always review resources over the summer, you could easily do it once a year. The data EBSCO loads for us, for instance, is done twice a year.
Title (includes book titles and journal titles), Database, or in COUNTER format. Choose date range, and either totals or monthly breakdowns. For Title and Database reports, you can only run them for one metric at a time. Usage projection appears to use the previous year's statistics to estimate future use: it calculates the total amount of use and then divides it out evenly over the months.
Title, ISSNs, Publisher, Platform, use data. We're interested in looking at the top- and bottom-use titles we subscribe to. The same is true when looking for zero-use titles: there are a ton of journals in aggregators which would be included, so you'd have to export and filter them out first. Same with cost and CPU calculations: these features are available in Subscription Management (which I will be showing off in a few minutes), or otherwise you need to do that in Excel.
Enhancements of the SUSHI (Standardised Usage Statistics Harvesting Initiative) protocol designed to facilitate its implementation by vendors and its use by librarians. This schema covers all the usage reports listed below. COUNTER reports in XML must be downloadable using the SUSHI protocol, or manually, or both.
The JUSP portal contains (JR1 and JR1a) COUNTER-compliant usage statistics. JUSP is a shared service that aims to save academic libraries time and duplicated effort by providing a single gateway for them to access their usage statistics from all publishers, gateways and host intermediaries participating in the NESLi2 consortium agreements and from other publisher deals.
SUSHI: Delivering Major Benefits to JUSP. Ariadne: Web Magazine for Information Professionals, 5 December 2012. http://www.ariadne.ac.uk/issue70/meehan-et-al
Transcript of Shared Tools for analysis of e-resources in Higher Education Libraries in Sweden (NordLIC). Background: Swedish Higher Education (HE) libraries spend on average 70% of their media budget on e-resources. Harvesting is easy; analysis is hard! Are budgets spent on the right resources? Aim: "facilitate analysis of usage in relation to economic factors for HE libraries in Sweden." Project description: a joint project between Karolinska Institutet University Library and Stockholm University Library, funded by the National Library of Sweden during 2011. The project group consisted of two librarians from each institution and one statistician; we also had contact with a bibliometrician. Outcome: 1) create a template for the workflow for harvesting and collating usage stats during the year; 2) develop practical tools to facilitate easier and more effortless evaluation of e-resource package deals (KPIs for e-journals, e-books and databases; bibliometric analysis of titles cited by the university versus in the library collection, with instructions using Web of Knowledge and BibExcel, abbreviated title to ISSN, and a search strategy). Identify trends! What now? The National Library administers the tools, freely available online. Community? See http://www.lib-stats.org.uk/. Demonstration by Henrik Åkerfelt, Karolinska Institutet University Library; Key Performance Indicators by Selena Killick. Contact: henrik.akerfelt@ki.se, karin.perols@sh.se
New data elements: DOI and proprietary identifier. Improves identification of items listed in the reports and matching of usage to the correct items in a knowledge base. Involves coordination with KBART so the same identifier is on both the vendor's title list and their COUNTER report. The difference between COUNTER and many other standards initiatives is that a content provider must pass a formal audit in order to be recognized as compliant. The goal is for auditors to use the COUNTER-SUSHI Implementation Profile as a guide and thus help eliminate the sometimes seemingly minor inconsistencies that cause major headaches. Adoption of SUSHI and the required COUNTER XML reports is increasing; however, for those of us who work with these technologies, inconsistencies between implementations can be a problem. To address that, the SUSHI Standing Committee has released an Implementation Profile for COUNTER and SUSHI; we will review what it includes. SUSHI is an XML-based standard that requires the COUNTER report to also be delivered as XML, so a new Code of Practice means the XML schemas need to be brought in line. Of the challenges faced by developers of usage consolidation tools, many are not technical errors in the implementation of the SUSHI standard or the COUNTER XML it delivers, but rather differences in interpretation of the underlying standards. The COUNTER-SUSHI Implementation Profile has been created to set clear expectations as to how the standards should be interpreted and implemented, and to improve consistency.