As libraries become centers of digital collections, maintaining information on the usage of those collections is ever more critical. It is also essential to maintain common measures across heterogeneous collections so that libraries can effectively analyze how their collection dollars are being spent. The Project COUNTER Code of Practice and the SUSHI protocol aid in this work. This session will explore the newly published Release 4 of the COUNTER Code of Practice for e-Resources and highlight its use in conjunction with the SUSHI (Standardized Usage Statistics Harvesting Initiative) protocol in an active library environment.
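SUSHI began as a SOAP-based exchange (ANSI/NISO Z39.93) and, under COUNTER Release 5, became the RESTful COUNTER_SUSHI API. Purely as an illustration (the base URL and credentials below are placeholders, not a real service), a harvesting client might assemble a report request like this:

```python
from urllib.parse import urlencode

def sushi_report_url(base_url, report_id, customer_id, requestor_id,
                     begin_date, end_date):
    """Build a COUNTER_SUSHI (Release 5) report request URL.

    The /reports/{report_id} path and the query parameter names follow
    the COUNTER_SUSHI API pattern; the endpoint and credentials passed
    in are placeholders for whatever a provider actually issues.
    """
    query = urlencode({
        "customer_id": customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin_date,   # YYYY-MM, first month of the range
        "end_date": end_date,       # YYYY-MM, last month of the range
    })
    return f"{base_url.rstrip('/')}/reports/{report_id.lower()}?{query}"

# Hypothetical endpoint and credentials: request a Journal Requests
# (TR_J1) report for calendar year 2023.
url = sushi_report_url("https://sushi.example.org/r5", "TR_J1",
                       "cust-001", "req-001", "2023-01", "2023-12")
```

The returned URL would then be fetched with an ordinary HTTP GET, the response being COUNTER report JSON.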
This presentation was provided by Karen Wetzel and Todd Carpenter of NISO, Peter Shepherd of Project COUNTER, Tansy Matthews of George Mason University, and Susan Golden of Serials Solutions during the NISO Webinar "COUNTER and Usage Data, Part One: COUNTER: A How-To Guide," held on May 6, 2009.
About the Webinar
In a time of shrinking budgets and growing reliance on electronic resources, the collection and analysis of usage statistics has become a staple of the library world. But while usage statistics may be ubiquitous, many librarians still struggle with the best methods of interpreting the data. The ability to effectively understand and apply usage data is an important skill for librarians to master as they attempt to analyze their collections and justify their expenses to administrations.
This webinar will highlight the ins and outs of COUNTER, as well as discuss the process of analyzing the data once harvested.

Introductions
Agenda
Todd Carpenter, Executive Director, NISO
Todd Enoch, Head, Serials and Electronic Resources, University of North Texas Libraries;
Chair of the Continuing Education Committee, NASIG
* * * * * * *
COUNTER Update: Release 4 of the COUNTER Code of Practice for e-Resources
Peter Shepherd, Project Director, COUNTER
Integrating COUNTER Statistics within the Information Workflow
Oliver Pesch, Chief Product Strategist and Senior Vice President, EBSCO Information Services
Usage in the Eye of the Beholder: Developing Academic Library Usage Reports that Meet the Needs of Your Institution
Jill Emery, Collection Development Librarian, Portland State University Library
In July, COUNTER will publish Release 5 of the Code of Practice. This webinar will describe the new master reports at the heart of the new Code of Practice, explain the metrics, attributes, and standard views derived from the master reports, and set out the timeline for publisher compliance and the tools and guides that will help content providers and librarians prepare for the Release 5 effective date.
This presentation by Kornelia Junge explains the COUNTER Code of Practice Release 5. It describes why it was necessary to develop Release 5, which took effect in January 2019, and the development process. It goes on to describe the key features of Release 5, including metric types, attributes, and report formats.
A tool for librarians to select metrics across the research lifecycle (Library_Connect)
These slides introduce a range of research impact metrics. They were presented at the ER&L Conference (April 2017) by Chris James, Product Manager Research Metrics, Elsevier.
Irene Barbers, Forschungszentrum Jülich GmbH

COUNTER's new Code of Practice took effect in January 2019. This breakout session will explain how librarians can make effective use of the new metrics to support decision making. It will explain how librarians can use the new reports to understand user behaviours, perform cost-per-use calculations on the articles they have paid for, compare book usage across different e-book platforms, investigate usage of A&I databases and full-text databases, and evaluate usage of open access content. The session will also explain how COUNTER is ensuring compliance with the new Code of Practice, and how librarians can confidently tell whether a publisher or vendor is compliant.
INFLATED JOURNAL VALUE RANKINGS: PITFALLS YOU SHOULD KNOW ABOUT HTML AND PDF ... (Chan Li)
The California Digital Library (CDL) developed a value-based strategy to assess journals which is now used as a major part of the University of California's systemwide e-journal collection planning process. The strategy involves using objective metrics to calculate the value of scholarly journals and identify titles that make a greater or lesser contribution to the University's mission of teaching, research, and public service. A key aspect of this strategy is the use of the CDL Weighted Journal Value Algorithm to assess multiple vectors of value for each journal title under review: utility, quality and cost effectiveness.
Among all the metrics used for the Algorithm, usage data is still the key metric. However, the usage data is not as reliable and comparable as might be expected. One of the reasons is that the design of a publisher's electronic interface can have a measurable effect on electronic journal usage statistics. In 2014, CDL conducted a research project to study the impact on usage data of publisher website design. The presentation also covers how vendor interfaces and other factors impact usage data.
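CDL's actual weights and normalization procedure are not reproduced here; purely as an illustration of the weighted-sum idea behind a value algorithm of this kind, with made-up weights and values:

```python
def weighted_journal_value(metrics, weights):
    """Combine normalized per-journal value vectors into one score.

    `metrics` maps vector names (utility, quality, cost_effectiveness,
    as in the CDL framing) to values already scaled to 0..1.
    `weights` are illustrative only, not CDL's published weights.
    """
    total = sum(weights.values())
    return sum(weights[k] * metrics[k] for k in weights) / total

# Hypothetical journal: strong usage, middling quality, poor
# cost effectiveness, with utility weighted most heavily.
score = weighted_journal_value(
    {"utility": 0.8, "quality": 0.6, "cost_effectiveness": 0.4},
    {"utility": 0.5, "quality": 0.3, "cost_effectiveness": 0.2},
)
# 0.5*0.8 + 0.3*0.6 + 0.2*0.4 ≈ 0.66
```

Ranking titles by such a score, rather than by raw usage alone, is what lets a low-cost, moderately used title outrank an expensive, heavily used one.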
Presented at the 2010 Electronic Resources & Libraries Conference. --
Mary Feeney, Jim Martin, Ping Situ, University of Arizona --
Abstract: Searches, sessions, article requests - have access to data, but what's the next step? Learn how the University of Arizona Libraries' Spending Reductions Project analyzed usage of different types of resources to assess them against quality standards and make cancellation decisions. Tools, challenges, and organizational approaches will also be discussed.
This presentation was provided by Todd Carpenter, Executive Director of NISO, and Nettie Lagace of NISO on June 25, during an ALA session devoted to Altmetrics.
With ever-shrinking library budgets it is more essential than ever to ensure that the library collection is targeted, relevant and well-used. Return on Investment (ROI) has become the mantra of library management and libraries need to show accountability for collection decisions. This webinar will focus on speakers who have successfully implemented assessment metrics (such as COUNTER 3, Eigenfactor and impact factors) as one determining factor of collection development decisions.
COUNTER's team of volunteer experts have developed Release 5 of the COUNTER Code of Practice. They have designed fewer but more flexible usage reports and a reduced number of metric types with the aim of greater consistency and clarity. Release 5 seeks to address changing needs and to ensure that all publishers and content providers can achieve compliance. This session will explain the new release and answer questions from stakeholders.
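The reduced report set can be summarized in a small lookup. A sketch in Python, using the master report and standard-view identifiers published in the Release 5 Code of Practice:

```python
# Release 5 master reports and the standard views derived from them
# (identifiers as published in the COUNTER R5 Code of Practice).
MASTER_REPORTS = {
    "PR": ["PR_P1"],                              # Platform report
    "DR": ["DR_D1", "DR_D2"],                     # Database report
    "TR": ["TR_B1", "TR_B2", "TR_B3",             # Title report: books
           "TR_J1", "TR_J2", "TR_J3", "TR_J4"],   # ...and journals
    "IR": ["IR_A1", "IR_M1"],                     # Item report
}

def standard_views(master):
    """Return the pre-filtered standard views for a master report."""
    return MASTER_REPORTS[master]
```

Each standard view is just the master report with fixed filters applied, which is the flexibility the abstract above refers to: one customizable report per level, plus a handful of ready-made views.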
Turning the Corner at High Speed: How Collections Metrics Are Changing in a H... (NASIG)
Collections metrics have always been an important component of effectively managing libraries. But today they are more important than ever before as user-focused libraries and information centers attempt to adjust their collections to current and future library user needs. Frequently this requires sharp turns, smart traffic control, and even drafting behind other libraries who might be in the lead at any given stretch in order to achieve ultimate success. In this presentation, perspectives from a corporate library context and a liberal arts college library will be presented. What are the key metrics today vs. five years ago? What factors are at work that create changes in metrics value over time? What changes might we expect to see in the future? These and other questions will be addressed.
Speakers:
Marija Markovic, Independent Consultant
Steve Oberg, Wheaton College (IL)
Levine-Clark, Michael, “Going Beyond COUNTER: Strategies for Analyzing Data to Better Understand Collections Usage,” Invited Workshop, 14th International Southern Africa Online Information Meeting (SAOIM), Pretoria, June 19, 2018.
Els lc metrics_reference_cards_v2.0_slides_dec2016 (Jenny Delasalle)
Version 2 includes the new Citescore metric. I worked on the research behind these cards, but am not the copyright owner. Originals provided at https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics
Presentation by Todd Carpenter and Nettie Lagace of NISO's Altmetrics Recommended Practice Outputs, delivered to the Charleston Library Conference on November 4, 2016
JUSP report features update: title master report filtering and database stand... (JUSPSTATS)
Presentation from JUSP webinar run on 9 July 2019. How to tailor the COUNTER Release 5 title master reports to meet your specific requirements. How to use the database standard views and metrics to evaluate databases. Future report development plans.
This presentation was provided by Paul Needham of Cranfield University and Johan Bollen of Indiana University, during the NISO webinar "Measuring Use, Assessing Success, Part Two: Count Me In: Measuring Individual Item Usage," which was held on September 15, 2010.
By The Numbers: Can Usage Stats Lend A Helping Hand (cdabel)
Elizabeth Darocha Berenz, Outreach and Instruction Librarian, ARTstor, presentation from VRA 28 Atlanta. "Can Usage Statistics Lend a Helping Hand" for the "By the Numbers" session.
Panel presentation describing potential uses of statistical reports available to ARTstor administrators through Usage Reports. (March 18, 2010)
Presented at the 2010 Electronic Resources & Libraries Conference. --
Lauren Fancher, GALILEO, University of Georgia --
Abstract: GALILEO, Georgia's Virtual Library, has been capturing usage data from its system since 1995 and aggregating data from vendors since 2002. The history of GALILEO's engagement with providing meaningful data for meaningful purposes is full of adventure, hope, setbacks, and opportunities. Learn how one consortium has braved the impossible to deliver the adequate, and hopes for a more a-COUNTER-able future.
Cornell University has had mixed results obtaining accurate cost-per-use data for e-journals. In many cases it is a simple matter of comparing the subscription cost to the COUNTER-compliant usage data, but as we look deeper, and as we continue to attempt to automate this process as much as possible, we uncover complexities that make this a considerable challenge. We will share our experiences and help attendees better understand the complexities involved.
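The simple case described in that abstract, dividing subscription cost by COUNTER-reported usage, can be sketched as follows (the figures are hypothetical, and the zero-usage guard illustrates one of the small complexities an automated pipeline has to handle):

```python
def cost_per_use(annual_cost, total_uses):
    """Cost per use for one e-journal title.

    Guards against zero recorded usage, which would otherwise divide
    by zero; such titles need manual review rather than a number.
    """
    if total_uses <= 0:
        return None  # flag for manual review
    return round(annual_cost / total_uses, 2)

# Hypothetical subscription: $2,500/year with 410 reported uses.
cpu = cost_per_use(2500.00, 410)  # 6.1
```

The harder, real-world complexities (titles split across packages, mid-year transfers, inconsistent vendor reporting) are precisely what resist this one-line arithmetic.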
As electronic resources become more prevalent in academic libraries, communication between electronic resources librarians and other departments, such as Reference or Interlibrary Loan, becomes increasingly important. The results of a survey of academic librarians regarding interdepartmental communication will be presented, along with a demonstration of the intranet used by Montana State University Libraries.
Presentation at the ALA Annual 2016 ALCTS/LITA Electronic Resources Management Interest Group panel, "Making it count: Usage statistics and electronic resources management."
Join us for a comprehensive insight into COUNTER and the COUNTER Code of Practice including:
What is COUNTER?
Why COUNTER is important to library customers
Why COUNTER is important to publishers
How to become COUNTER compliant and the COUNTER Code of Practice
COUNTER reports for books, journals and databases
As more resources are indexed online and as more researchers begin their quest in a digital environment, unique local collections and institutional repositories play an ever more important role. The development of standards for these materials and ensuring their long-term preservation is crucial. Please join the Standards Committee and the Holdings Committee to learn more about RDA for Non-MARC testers. Discover how the PIRUS2 project (Publisher and Institutional Repository Usage Statistics) is enabling the recording and reporting of articles hosted by aggregators or in repositories. Learn how preservation standards can ensure the long-term protection of digital collections.
These slides about the COUNTER Code of Practice Release 5 reflect recent clarifications and amendments. They provide an overview of Release 5 metrics and reports.
Webinar on Release 5 of the COUNTER Code of Practice (Lorraine Estelle)
As the standard for usage statistics for electronic resources, the COUNTER Code of Practice[1] gains a new structure with Release 5. New reports and metrics will improve the accuracy of usage statistics for electronic resources. Release 5 of the Code of Practice takes effect in January 2019 and is mandatory for all COUNTER-certified providers.
In the webinar we offer an overview of the new metrics, explain the new reports, and give examples of how they can be used to evaluate your own portfolio of electronic resources. Questions may be asked during the webinar, and we will answer them either directly or afterwards. We also welcome your feedback on Release 5. Two members of the COUNTER Executive Committee will conduct the webinar in German.
The webinar is aimed both at librarians who already have experience with COUNTER statistics and want to prepare for the new release, and at staff of electronic-product vendors who are involved in producing COUNTER usage statistics.
The webinar will be conducted by:
Irene Barbers, head of the acquisitions department in the Central Library of Forschungszentrum Jülich. She is responsible for the operation and further development of Jülich's electronic resource management system and for the evaluation of usage statistics.
Bernd Oberknapp, overall head of ReDI (the joint operating facility of the Baden-Württemberg consortium) at the Freiburg University Library. ReDI provides services for more than 100 academic libraries. Bernd Oberknapp works on the National Statistics Server and is involved in the development of the electronic resource management system LAS:eR.
Irene Barbers and Bernd Oberknapp are members of the COUNTER Executive Committee and of the COUNTER R5 Technical Working Group.
[1] https://www.projectcounter.org/
This presentation was provided by Laura Morse, informing participants about progress made on the Open Discovery Initiative, at the NISO Standards Update event held during ALA Midwinter on Saturday, January 25, 2020.
Laura Wong - Jisc
In the UK, the increase in transitional agreements (TAs) has prompted us to ask new questions about how we measure the impact of the transition to OA, the performance of agreements, and the metrics we need. With COUNTER usage reports, we expect to see a shift in interest to global usage for individual research outputs. In this presentation, we cover:
• Drivers, opportunities and challenges in open access usage reporting for libraries and consortia such as Jisc
• Roles of publisher and institutional repository usage statistics
• How COUNTER 5.1 supports this work
• Next steps, lessons learned and the practical takeaways
Everything Counts in Large Amounts: Measuring the Impact of Usage Activity in... (OpenAIRE)
OpenAIRE usage statistics session slides at #DI4R2017, 1 December 2017, by Dimitris Pierrakos, Athena Research Center, and Jochen Schirrwagen, Bielefeld University.
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio, cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors, and newer malware including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
2. Release 4 of the COUNTER Code of Practice for e-Resources
Peter Shepherd
COUNTER
June 2012
3. COUNTER Release 4 - objectives
A single, unified Code covering all e-resources, including journals, databases, books, reference works, multimedia content, etc.
Improve the database reports
Improve the reporting of archive usage
Enable the reporting of mobile usage separately
Expand the categories of “Access Denied” covered
Improve the application of XML and SUSHI in the design of the usage reports
Collect metadata that facilitates the linking of usage statistics to other datasets, such as subscription information
4. Release 4: main features
A single, integrated Code of Practice covering journals, databases, books, reference works and multimedia content
An expanded list of Definitions, including terms such as “Gold Open Access”, “Multimedia Full Content Unit”, “Record View”, “Result Click”, as well as different categories of “Access Denied”, etc., that are used for the first time in Release 4
Enhancements of the SUSHI (Standardised Usage Statistics Harvesting Initiative) protocol designed to facilitate its implementation by vendors and its use by librarians
5. Release 4: main features
A requirement that Institutional Identifiers, Journal DOI and Book DOI be included in the usage reports, to facilitate not only the management of usage data, but also the linking of usage data to other data relevant to collections of online content.
A requirement that usage of Gold Open Access articles within journals be reported separately in a new report: Journal Report 1 GOA: Number of Successful Gold Open Access Full-text Article Requests by Month and Journal.
A requirement that Journal Report 5 must be provided
6. Release 4: main features
Modified Database Reports, in which the previous requirement to report Session counts has been dropped, and new requirements, to report Record Views and Result Clicks, have been added. (Database Report 3 has also been renamed Platform Report 1.)
A new report, Multimedia Report 1, which covers the usage of non-textual multimedia resources, such as audio, video and images, by reporting the number of successful requests for multimedia full content units
New optional reports covering usage on mobile devices
A description of the relative advantages of logfiles and page tags as the basis for counting online usage
Flexibility in the usage reporting period that allows customers to specify a date range for their usage reports
7. Release 4: Standard Usage Reports
Journal Report 1: Number of Successful Full-Text Article Requests by Month and Journal
Journal Report 1 GOA: Number of Successful Gold Open Access Full-Text Article Requests by Month and Journal
Journal Report 2: Access Denied to Full-Text Articles by Month, Journal and Category
Journal Report 5: Number of Successful Full-Text Article Requests by Year-of-Publication (YOP) and Journal
Database Report 1: Total Searches, Result Clicks and Record Views by Month and Database
Database Report 2: Access Denied by Month, Database and Category
Platform Report 1: Total Searches, Result Clicks and Record Views by Month and Platform
Book Report 1: Number of Successful Requests by Month and Title
Book Report 2: Number of Successful Section Requests by Month and Title
Book Report 3: Access Denied to Content Items by Month, Title and Category
Book Report 4: Access Denied to Content Items by Month, Platform and Category
Book Report 5: Total Searches by Month and Title
Multimedia Report 1: Number of Successful Full Multimedia Content Unit Requests by Month and Collection
11. Release 4: recording and reporting usage on mobile devices
The following optional additional reports enable usage on mobile devices to be reported separately:
Journal Report 3 Mobile: Number of Successful Item Requests by Month, Journal and Page Type for usage on a Mobile Device
Title Report 1 Mobile: Number of Successful Requests for Journal Full-text Articles and Book Sections by Month and Title (formatted for normal browsers/delivered to mobile devices AND formatted for mobile devices/delivered to mobile devices)
Title Report 3 Mobile: Number of Successful Requests by Month, Title and Page Type (formatted for normal browsers/delivered to mobile devices AND formatted for mobile devices/delivered to mobile devices)
COUNTER will recognize as usage on a mobile device, which may be reported in the above reports, any usage that meets one of the following criteria:
User agents that are included in the WURFL list. WURFL is the Wireless Universal Resource FiLe, a database containing the profiles of mobile devices; this database may be found at: http://wurfl.sourceforge.net/
Usage via a proprietary mobile app provided by the publisher/content provider
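The two criteria above reduce to a simple classification rule per request. A minimal illustrative sketch in Python, assuming a hypothetical device-substring list and app user-agent token (the real WURFL database is far richer than this):

```python
# Illustrative sketch: classify a request as mobile usage per the two
# COUNTER Release 4 criteria -- (1) a user agent matching a WURFL-style
# mobile device list, or (2) usage arriving via the publisher's own
# mobile app. The device substrings and app marker are hypothetical.

KNOWN_MOBILE_AGENTS = ["iphone", "android", "blackberry", "windows phone"]
MOBILE_APP_MARKER = "PublisherMobileApp"  # hypothetical app user-agent token

def is_mobile_usage(user_agent: str) -> bool:
    """Return True if the request should count as mobile usage."""
    ua = user_agent.lower()
    if MOBILE_APP_MARKER.lower() in ua:  # criterion 2: proprietary mobile app
        return True
    # criterion 1: user agent matches a known mobile device profile
    return any(device in ua for device in KNOWN_MOBILE_AGENTS)

print(is_mobile_usage("Mozilla/5.0 (iPhone; CPU iPhone OS 5_0)"))  # True
print(is_mobile_usage("Mozilla/5.0 (Windows NT 6.1; WOW64)"))      # False
```

A production implementation would query the WURFL database itself rather than a hand-maintained substring list.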
12. Release 4: timetable for implementation
Deadline date for implementation of Release 4: 31 December 2013
- After this date only vendors compliant with Release 4 will be COUNTER compliant
Between now and 31 December 2013, Release 4 and the existing Releases of the Codes of Practice are valid
13. Release 4: compliance process
1. Agree Release 4 usage reports that are relevant to publisher content
2. Review of reports by a COUNTER library test site
3. Vendor signs Declaration of COUNTER compliance
4. Vendor added to the Register of COUNTER-compliant vendors
5. Independent audit must be carried out and passed within 6 months of being added to the Register
14. Release 4: independent audit
Three aspects of the audit:
Check report formats
Check data integrity
Check delivery process
Three possible audit outcomes:
A Pass, in which case no further action is required by the publisher as a result of the audit. In some cases the auditor may add Observations to the audit report, which are designed to help the vendor improve its COUNTER usage reports, but which are outside the scope of the audit itself.
A Qualified Pass, in which the auditor deems the publisher to have passed the audit, but where the auditor raises a Minor Issue requiring further action to maintain COUNTER-compliant status. A Minor Issue does not affect the reported figures, but is one which should be resolved within 3 months of the audit to maintain COUNTER-compliant status. An example of a Minor Issue is where a report format does not conform to the COUNTER specifications.
A Fail, where the auditor has identified an issue that must be resolved immediately for the vendor to maintain COUNTER-compliant status
15. COUNTER Code of Practice - Release 4
Full details of Release 4 can be found on the COUNTER website at:
http://www.projectcounter.org/code_practice.html
16. Quality Content • Resource Management • Access • Integration • Consultation
Making Better Decisions with Usage Statistics
SUSHI: More relevant than ever
Oliver Pesch
Chief Strategist, E-Resource Access and Management Services
17. Overview
• Why SUSHI
• COUNTER Release 4
• Changes to schemas
• Procedural changes
• Applying lessons learned
• Other tools and resources available
18. Why SUSHI…
• Librarians doing more with less rely on usage statistics as one measure of the value of their purchases
• Usage consolidation applications and related services rely on COUNTER
• To be valuable, usage collection needs to be comprehensive
• Efficiency depends on automation
• SUSHI is a very scalable standard for harvesting COUNTER reports!
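At the wire level, a SUSHI harvest is a SOAP exchange: the client sends a ReportRequest naming the requestor, the customer, the report, and a usage date range. A minimal sketch of such a request body, built with the Python standard library; element names follow the SUSHI schema (Requestor, CustomerReference, ReportDefinition with a UsageDateRange filter), but namespaces are omitted for readability and the requestor/customer IDs are made-up placeholders:

```python
# Sketch: build the body of a SUSHI ReportRequest with the stdlib.
# Namespaces and the surrounding SOAP envelope are omitted; IDs are
# hypothetical placeholders, not real credentials.
import xml.etree.ElementTree as ET

def build_report_request(requestor_id, customer_id, report, release, begin, end):
    req = ET.Element("ReportRequest")
    requestor = ET.SubElement(req, "Requestor")
    ET.SubElement(requestor, "ID").text = requestor_id
    customer = ET.SubElement(req, "CustomerReference")
    ET.SubElement(customer, "ID").text = customer_id
    # Which report, which Code of Practice release, and for what period
    rdef = ET.SubElement(req, "ReportDefinition", Name=report, Release=release)
    filters = ET.SubElement(rdef, "Filters")
    drange = ET.SubElement(filters, "UsageDateRange")
    ET.SubElement(drange, "Begin").text = begin
    ET.SubElement(drange, "End").text = end
    return ET.tostring(req, encoding="unicode")

# Request Journal Report 1 under Release 4 for January 2012:
xml = build_report_request("harvester-01", "lib-12345", "JR1", "4",
                           "2012-01-01", "2012-01-31")
print(xml)
```

In practice a SUSHI client wraps this body in a SOAP envelope, POSTs it to the content provider's SUSHI server URL, and receives the COUNTER report back as XML in the response; one hosted client can repeat this for many customers and providers.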
20. COUNTER 4: Schema Changes…
SUSHI Schema (the actual SUSHI standard)
No changes were required!
21. COUNTER 4: Schema Changes…
COUNTER XML Schema for reports: changes were minor and geared towards better compliance, including:
• ItemIdentifier and ItemPublisher elements now optional
• PubYr attributes validate as year
• PubYrFrom, PubYrTo attributes added to support Journal Report 5
22. COUNTER 4: Schema Changes…
COUNTER Elements XML Schema: lists valid values for certain data elements
• DataType: added Collection and Multimedia options
• Categories: added Access_denied and removed Turnaways
• MetricType: added several new values to support reports for mobile use, the new database reports, multimedia reports and revised access denied reports
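To make the element changes concrete, here is a heavily simplified, namespace-stripped sketch of one item from a COUNTER 4 XML report, parsed with the Python standard library. The fragment and its metric values are illustrative only; the real schema is namespaced and considerably richer, so treat the element layout here as an approximation rather than the normative structure:

```python
# Sketch: where MetricType/Count pairs live inside a COUNTER 4 report
# item, using a toy namespace-free fragment (illustrative, not schema-
# valid). Parsing pulls out one count per metric type.
import xml.etree.ElementTree as ET

fragment = """
<ReportItems>
  <ItemName>Sample Database</ItemName>
  <ItemDataType>Database</ItemDataType>
  <ItemPerformance>
    <Period><Begin>2012-01-01</Begin><End>2012-01-31</End></Period>
    <Category>Requests</Category>
    <Instance><MetricType>search_reg</MetricType><Count>120</Count></Instance>
    <Instance><MetricType>record_view</MetricType><Count>45</Count></Instance>
    <Instance><MetricType>result_click</MetricType><Count>60</Count></Instance>
  </ItemPerformance>
</ReportItems>
"""

item = ET.fromstring(fragment)
# Collect each metric's count for this item and period
counts = {inst.find("MetricType").text: int(inst.find("Count").text)
          for inst in item.iter("Instance")}
print(counts)  # {'search_reg': 120, 'record_view': 45, 'result_click': 60}
```

A real consolidation client would validate against the published COUNTER schemas, which is exactly what makes the stricter Release 4 schema valuable: non-conforming values fail validation instead of silently loading.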
23. COUNTER 4: Procedural Changes related to SUSHI
• To be COUNTER compliant, a content provider must provide a working SUSHI server
• Testing the SUSHI implementation will be part of the audit
24. Lessons Learned: Perspective as a Usage Consolidation Vendor
COUNTER…
• Offers the promise of consistency necessary for consolidated reporting
• Combined with SUSHI is potentially a significant time-saver
• Covers a broad range of reports providing usage for journals, books, databases
25. Lessons Learned: Perspective as a Usage Consolidation Vendor
Challenges with COUNTER…
• “Almost” compliant reports require manual intervention
32. Lessons Learned: Perspective as a Usage Consolidation Vendor
Challenges with COUNTER…
• “Almost” compliant reports require manual intervention
• SUSHI implementations falling short
33. Lessons Learned: Perspective as a Usage Consolidation Vendor
Challenges with COUNTER…
• “Almost” compliant reports require manual intervention
• SUSHI implementations falling short
SUSHI Challenges
• Overly complex authentication methods
• Lack of understanding of market (most SUSHI clients are hosted -- one client will harvest usage for many customers)
• Support staff unfamiliar with set-up needs or operational details
34. Lessons Learned: Perspective as a Usage Consolidation Vendor
Challenges with COUNTER…
• “Almost” compliant reports require manual intervention
• SUSHI implementations falling short
• Many content providers still not COUNTER compliant
35. NISO SUSHI Standing Committee: Release 4 of the COUNTER Code of Practice
Applying the lessons learned…
• COUNTER schema stricter
• COUNTER SUSHI Implementation Profile removes ambiguity
• Free web-based client available for testing or as a basis for development
• Even more resources available at the NISO SUSHI web site
36. NISO SUSHI Web Site: Support for developers and librarians
Addressing the challenges…
• COUNTER schema stricter
• COUNTER SUSHI Implementation Profile removes ambiguity
• Free web-based client available for testing or as a basis for development
• Many resources available at the NISO SUSHI web site
49. Conclusion
• SUSHI is both relevant and necessary, but we need…
• more compliant content providers
• with better interoperability
• COUNTER Release 4's more comprehensive audit, coupled with the COUNTER SUSHI Implementation Profile, is key to progress
50. SUSHI: More relevant than ever
Thank You!
Visit the NISO SUSHI web site
http://www.niso.org/workrooms/sushi
Or email me at:
opesch@ebsco.com
51. Streamlining Stats:
Managing Usage Statistics Efficiently and Effectively
Amy Fry, Electronic Resources Coordinator, Bowling Green State University
images by Ken Fager, http://www.flickr.com/photos/kenfagerdotcom/
52. Usage statistics: what libraries are thinking
What do I get? How do I get it?
I need it now! How do I put it all together?
58. BGSU’s three principles to streamline stats
Data control: Everything in its (one and only) place, all linked together.
Organization of work: Stats are integrated into the resource’s lifecycle.
Distribution of labor: There are jobs for librarians, staff and students.
59. Data control
ONE place for logins and passwords: our ERM.
62. Data control
ONE place for downloading instructions: our wikis
67. Organization of work
Collecting stats has been simplified
• Reports are saved as downloaded (by calendar year)
• Stats are collected once a year
68. Distribution of labor
Everyone gets to help!
Librarian jobs:
• Manage logins and passwords
• Manage reporting
• Manage the SUSHI table in the ERM
• Manage stats through each resource’s lifecycle
Staff jobs:
• Write downloading instructions
• Make changes as needed in the wiki (adding, removing, and changing resources)
• Convert files to XML; upload XML files into the ERM
Student jobs:
• Collect and save stats according to wiki instructions
Other jobs:
• Serials, Acquisitions & IT staff work with order records, do coverage load, and more.
69. Reporting
Our medium
– For databases: spreadsheets, for now
– For e-journals: our ERM
Our philosophy
– Focus on standard measures
– Try to paint a picture over time
– Accuracy isn’t everything
75. E-journal reporting using the Millennium ERM
– Create resource records for each journal package
– Relate order records with correct from/to dates for each subscription
– Download COUNTER JR1 reports in CSV format
– Convert to XML using the University of Nebraska-Lincoln script (http://statsconverter4erm.unl.edu/)
– Upload each XML file to the proper resource record
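The CSV-to-XML conversion step can be sketched in a few lines of Python. This toy version only mimics the shape of the task: it reads JR1-style rows and emits a simple XML structure with made-up element names. The actual UNL script targets the full COUNTER schema expected by the ERM, which this sketch does not attempt:

```python
# Hedged sketch of the CSV-to-XML step: turn rows of a COUNTER JR1-style
# CSV into a simple XML tree. Column names, sample data, and the output
# element names are all hypothetical; the real converter produces
# schema-conformant COUNTER XML.
import csv, io
import xml.etree.ElementTree as ET

jr1_csv = """Journal,Print ISSN,Online ISSN,Jan-2012,Feb-2012
Journal of Examples,1234-5678,8765-4321,10,12
Annals of Sampling,2345-6789,9876-5432,3,7
"""

root = ET.Element("Report")
reader = csv.DictReader(io.StringIO(jr1_csv))
for row in reader:
    item = ET.SubElement(root, "ReportItem")
    ET.SubElement(item, "ItemName").text = row["Journal"]
    # One performance entry per month column in the CSV
    for month in ("Jan-2012", "Feb-2012"):
        perf = ET.SubElement(item, "ItemPerformance", Period=month)
        ET.SubElement(perf, "Count").text = row[month]

print(ET.tostring(root, encoding="unicode"))
```

The resulting XML files are then uploaded to the matching resource records in the ERM, as the steps above describe.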
80. Feel like outsourcing?
Scholarly Stats
– Basic stats collection service, pay per platform
– Basic collated reports
SwetsWise Selection Support
– Stats collection service (pay per platform)
– Options to upload data manually
360 Counter (Serials Solutions)
– Unlimited platforms with annual subscription
– Options to upload data manually
82. Remember…
You’re never, ever going to get out of collecting data altogether unless you decide to stop looking at your stats.
Editor's Notes
Let’s look at COUNTER from the perspective of someone providing a usage consolidation service. COUNTER is the whole reason such services can exist in a scalable way… Having consistent, comparable reports is a must. Combine COUNTER with SUSHI and the potential is there for great time savings by having reports automatically retrieved and loaded. COUNTER also covers a broad range, which means that simply by supporting COUNTER a fairly comprehensive (and useful) service can be created.
But… there are some challenges. We have run across several “compliant” vendors whose reports do not really meet the Code of Practice. Let me walk you through an actual example (with the names removed)…
Here is a Database Report 1… From a human-readable perspective, this looks pretty good. However, it is not COUNTER compliant.
Note the missing data elements…. Forget about a usage consolidation system for the moment and just consider what would happen if you used Excel to filter by “Total sessions” – you would have no clue what the numbers were for.
So to get this report to work, the missing data must be added
But there is more… there are blank rows between entries… this is not part of the COUNTER CoP…
So a librarian has to remove the blank lines.
THEN they have a file that can load… I am told it takes about 30 minutes to fix each file from this content provider.
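The manual fix just described, removing the blank rows between entries, is also the kind of thing a short script can automate. A sketch under stated assumptions: the sample rows and column layout below are hypothetical, and a real cleanup would also restore the missing data elements, which this fragment does not attempt:

```python
# Sketch: drop the blank rows that a non-compliant Database Report 1
# file contains between entries. Sample data is hypothetical; restoring
# the missing metric labels would still require separate handling.
import csv, io

raw = """Database,Publisher,Platform,Jan-2012
Sample DB A,Acme,AcmePlatform,100

Sample DB B,Acme,AcmePlatform,250
"""

# csv.reader yields an empty list for a fully blank line, so keep only
# rows that contain at least one non-empty cell.
cleaned = [row for row in csv.reader(io.StringIO(raw))
           if any(cell.strip() for cell in row)]
print(cleaned)
```

Even so, automating around each vendor's quirks is exactly the manual intervention the Code of Practice is meant to make unnecessary.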
And there are more issues like this… some are subtle like listing the report name in cell A1 as “COUNTER Journal Report 1” -- it is supposed to be “Journal Report 1” or missing the Platform name in the Totals row… or adding extra comments…