Loyola University Chicago migrated from its Voyager system to Alma and Primo after 15 years, on an aggressive six-month timeline. Key aspects of the project included data migration, system configuration, staff training, and testing of the new systems. Functional workshops and weekly calls helped address workflows, and data and locations were consolidated before transfer. Implementing resource management, acquisitions, and the Primo discovery layer required ongoing setup and customization.
The University of Salford implemented the Alma library management system (LMS) to replace its aging Talis system. This was part of a larger digital library plan and aimed to reduce costs by consolidating systems and to improve the user experience. The implementation involved extensive data cleanup, process mapping, configuration, integration testing, and training. While change was challenging, Alma has streamlined workflows, and the library is continuing to optimize usage and integrate additional services.
The document discusses Lancaster University's transition from its legacy library system to a new unified library services platform called Alma. Key points include:
- Lancaster signed a contract with Ex Libris in 2011 to implement Alma to improve efficiency, enhance services, and position the library for the digital environment.
- The implementation involved migrating data from previous systems, configuring Alma's functionality, integrating with other campus systems, and optimizing workflows.
- Initial challenges included slow performance and incomplete integrations, but the library has now established basic workflows and sees potential for future improvements through analytics and community collaborations.
- Moving to a cloud-based system with Ex Libris provides benefits like reduced infrastructure costs and
Full Catalog RDA Enrichment in Alma (ELUNA 2015) - trail001
The document summarizes the process taken by 21 libraries to enrich over 5.8 million bibliographic records in their Alma/Primo system. They partnered with Backstage Library Works to conduct authority control and descriptive enrichment on the records. Testing was done in phases on sample record sets to refine the customization profile before processing the full catalog. Careful monitoring was required as record sets were imported and published to ensure successful processing. While challenges included identifying record subsets and addressing failed imports, the outcome provided more consistent and clear records for users.
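The phased approach described above (test a sample record set first, then run the full catalog in monitored batches and queue failed imports) can be sketched roughly as follows. Every function name and threshold here is hypothetical, invented for illustration; none of this is Alma's or Backstage Library Works' actual tooling.

```python
# Hypothetical sketch of phased batch enrichment: run a small sample set
# first, check the failure rate, then process the remaining record sets
# in batches while collecting failed imports for re-processing.

def enrich(record_id):
    """Stand-in for sending one bib record for enrichment.
    Here we simply fail records with a flagged suffix to simulate bad data."""
    return not record_id.endswith("-bad")

def process_set(record_ids):
    """Run one record set; return lists of succeeded and failed IDs."""
    ok, failed = [], []
    for rid in record_ids:
        (ok if enrich(rid) else failed).append(rid)
    return ok, failed

def run_catalog(record_ids, sample_size=3, batch_size=4, max_sample_failures=1):
    """Phase 1: sample test to refine the profile. Phase 2: full catalog."""
    sample, rest = record_ids[:sample_size], record_ids[sample_size:]
    _, sample_failed = process_set(sample)
    if len(sample_failed) > max_sample_failures:
        raise RuntimeError("sample failure rate too high; refine profile first")
    all_failed = list(sample_failed)
    for i in range(0, len(rest), batch_size):
        _, failed = process_set(rest[i:i + batch_size])
        all_failed.extend(failed)  # queue failed imports for follow-up
    return all_failed
```

The sample phase acts as a cheap gate: if the customization profile mangles the sample, the full run never starts.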
Discover some free tools to help you collaborate online, including Trello and Twodoo for project management, Conceptboard for visual projects, and Lucidchart for working with charts and diagrams, plus an overview of how to assess collaborative tools for your needs.
During 2014-15 the Technical Services department at Auraria Library lost over one-third of its workforce to resignations and retirements. After an organizational assessment, it was determined that Technical Services could be more successful, efficient, and communicative if the Acquisitions and Access & Discovery teams were merged to form Resource Management. A combined team would provide a holistic understanding of the e-resources lifecycle, creating the ability to analyze existing workflows and tools to maximize staff efficiencies. This presentation will discuss the creation of Resource Management, how lost positions were rewritten to create new positions, and how the department is approaching retention.
Katy DiVittorio, Acquisitions Librarian, Auraria Library
Danielle Williams, Periodicals Librarian, University of Evansville
Richard Guajardo, Head of Resource Discovery Systems, University of Houston
Elena Romaniuk, University of Victoria Libraries
Kathy Varjabedian, Los Alamos National Research Library
ER&L 2015 Simplifying Your DDA Program For A Better User Experience - Rene Erlandson
The panel presentation discussed experiences with demand-driven acquisition (DDA) programs and efforts to simplify the user experience. Rene Erlandson described how using WorldCat Discovery and EBL provided an easy-to-implement DDA program at the University of Nebraska Omaha. Holly Tomren discussed Drexel University Libraries' DDA plan through Ebrary and efforts to streamline record workflows. Kelly Drake explained how the Fenway Libraries Online consortium aimed to create a simple shared DDA collection but found the needs of individual libraries made full simplicity challenging.
Providing and Maintaining Access to Electronic Serials: Consortium and Member... - NASIG
Ontario Council of University Libraries (OCUL) is a consortium of Ontario’s 21 university libraries in Canada. Scholars Portal (SP) is an OCUL-sponsored digital repository of over 40,000,000 full-text scholarly articles drawn from 18,000 journals covering every academic discipline. Scholars Portal exports its holdings to the knowledge bases of SFX, 360 Link, EBSCO, and the Keepers Registry, and maintains a central SFX instance for member libraries for content subscribed via OCUL. The University of Windsor Library is an OCUL member that uses SFX as its OpenURL link resolver for both OCUL and local subscription content. This study will examine the workflow and the problems encountered in maintaining central and local SFX instances, and discuss the advantages and challenges of providing and maintaining access to electronic serials in a consortium and its member libraries.
Presenters: Shuzhen Zhao, Bibliographic Services Librarian, Leddy Library, University of Windsor; Wei Zhao, Senior Metadata Librarian, OCUL -- Scholars Portal; Katie-Scarlett MacGillivray
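The SFX workflow above centers on OpenURL requests from sources (databases, discovery layers) to the resolver. As a rough illustration of what such a request contains, here is a minimal sketch that builds an OpenURL 1.0 (Z39.88-2004) query for a journal article with Python's standard library. The resolver base URL and the helper function are invented; this is not the actual Windsor or Scholars Portal configuration.

```python
from urllib.parse import urlencode

# Placeholder resolver endpoint, not a real SFX instance.
RESOLVER_BASE = "https://sfx.example.edu/sfx_local"

def openurl_for_article(issn, year, volume, issue, spage, atitle):
    """Return an OpenURL 1.0 query for a journal article (KEV format)."""
    params = {
        "url_ver": "Z39.88-2004",                       # OpenURL 1.0 version
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # journal metadata format
        "rft.genre": "article",
        "rft.issn": issn,
        "rft.date": year,
        "rft.volume": volume,
        "rft.issue": issue,
        "rft.spage": spage,
        "rft.atitle": atitle,
    }
    return RESOLVER_BASE + "?" + urlencode(params)
```

The resolver parses these key/value pairs against its knowledge base of subscriptions to decide where (and whether) the user has full-text access.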
The document discusses LIBISnet's implementation of Alma, Ex Libris' unified library management system. LIBISnet serves as the central library automation provider for KU Leuven University and several other institutions. It was seeking to consolidate multiple legacy systems onto a single platform to optimize workflows and extend services. After several years of testing and pilot implementations with Ex Libris, LIBISnet launched Alma in July 2014, migrating nearly 6 million bibliographic records and over 1 million patron records for 33 institutions onto the new system. Considerable training, communication, and planning went into preparing staff and ensuring a smooth transition.
Alma, the Cloud & the Evolution of the Library Systems Department - Kevin Kidd
As libraries implement Alma and other cloud-based technologies, there are many questions about the future role of the traditional, sysadmin-focused library systems department. What opportunities and challenges will systems departments face as libraries push their applications and services into the cloud? What will be the practical effect of implementing Alma on your systems department? What tasks will systems librarians give up? What new duties will they take on? What new skills will systems librarians need to develop? I will discuss these questions in the context of the implementation of Alma at the Boston College Libraries. As the first adopters of Alma, we would like to share thoughts and experiences in a broad discussion of the effects of cloud computing on library systems and services.
Reading Lists and Digitised Content at RGU: experiences and expectations - Talis
This document summarizes the experiences of Robert Gordon University (RGU) with managing reading lists and digitized content for its students and faculty. It describes the history and growth of RGU, the transfer of responsibility for reading lists from liaison librarians to bibliographic services staff, improvements made to streamline processes and address backlogs, and the introduction of a new digitization system to provide more online readings for students. Key challenges included managing increasing workloads, lack of procedures during transitions between departments, and fully utilizing support for new systems. Overall, the university aims to better advocate for reading lists, integrate digitization efforts, and prepare for upcoming upgrades and projects.
Cooperative Cataloging Projects: Managing Them for Best Results - NASIG
This document summarizes key considerations and best practices for managing cooperative cataloging projects. It discusses factors to evaluate like collection characteristics, staffing needs, record processing workflows, and strategies for ongoing maintenance. Specific projects highlighted include CONSER's Open Access Journal project, the Dacheng Old Periodical Database project, and various Council on East Asian Libraries initiatives. The importance of communication, standards, and feedback with providers to improve metadata quality is emphasized.
Open Information in need of liberation: Aspire and the conundrum of linked data - Talis
This document summarizes a presentation about the challenges of extracting tailored management information from Talis Aspire. While Aspire data is openly available on the web, independent reporting and access to item information is limited. The presentation outlines issues libraries face in accessing Aspire data and suggests potential solutions like enabling API access for batch data requests, custom reporting, or integrating a reporting dashboard. The goal is to balance Aspire's open data principles with giving libraries better tools to manage and leverage resource list information.
One101 Double the Trouble, Double the Fun: Migrating to OneDrive and SharePoi... - Craig Jahnke
Craig Jahnke presented on migrating to OneDrive and SharePoint Online. He provided an overview of OneDrive and SharePoint Online, outlining key differences. He discussed the migration process and tools that can be used, including manual migration, PowerShell, the SharePoint Migration Tool, and third party tools. He covered specifics on migrating file shares to OneDrive, OneDrive to OneDrive, and SharePoint on-premises to SharePoint Online. The presentation emphasized planning, cleaning data, and using multiple tools and accounts to complete migrations.
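One small planning step of the kind mentioned above, mapping on-premises file-share paths to their OneDrive destinations before a migration run, can be illustrated with a short sketch. The share root and destination folder names below are made-up examples, not any real tenant's layout.

```python
from pathlib import PurePosixPath, PureWindowsPath

def plan_onedrive_paths(share_root, files, dest_folder="Documents/Migrated"):
    """Map Windows share paths to forward-slash OneDrive-style paths,
    preserving the folder structure relative to the share root."""
    root = PureWindowsPath(share_root)
    plan = {}
    for f in files:
        rel = PureWindowsPath(f).relative_to(root)        # path under the share
        plan[f] = str(PurePosixPath(dest_folder, *rel.parts))
    return plan
```

Producing a source-to-destination manifest like this up front supports the presentation's emphasis on planning and cleaning data before any tool, manual or third-party, actually moves files.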
Baha Mar's All in Bet on Red: The Story of Integrating Data and Master Data w... - Joseph Alaimo Jr
Baha Mar is the Bahamas' newest luxury resort, featuring world-class hotels, restaurants, shopping, spas, a golf course, and a casino. Baha Mar has gone all in, betting big on red by adopting a Planning and Budget Cloud Service (PBCS), a Financial Close and Consolidation Cloud Service (FCCS), and an Account Reconciliation Cloud Service (ARCS). Within PBCS, Baha Mar has implemented daily operational reporting (DOR), which captures daily activity across the entire property.
This presentation explores how, and more importantly, why FDMEE (on-premises) was used in favor of cloud data management. It covers the myriad of disparate and highly specialized data sources needed to support loading detailed operational data on a daily basis. It studies the various methodologies used to streamline data integration and enrich the data load cycle.
Additionally, EDMCS and cloud data management were leveraged to reduce duplicate master data maintenance while sharing key findings about this exciting new cloud service.
This document discusses moving from a snapshot command-line process to a new snapshot flat-file and IMS LIS framework for updating student enrollment data in the system. The new process leverages SAP HANA for more frequent data pulls and an event-driven process for near real-time student enrollments. It also uses flat files containing user, course, and enrollment data extracted from Active Directory and SAP HANA. The process was implemented to reduce enrollment processing time from 48 hours to near real time and to decrease service desk calls about missing student information.
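The flat-file side of such a process might look roughly like the sketch below: parse a delimited enrollment extract into records a loader could consume. The pipe-delimited column layout here is invented for illustration; real IMS LIS extracts define their own schemas.

```python
import csv
import io

# Hypothetical column layout for an enrollment extract row.
ENROLLMENT_FIELDS = ["user_id", "course_id", "role", "status"]

def parse_enrollments(flat_file_text):
    """Parse one pipe-delimited enrollment extract into dicts,
    keeping only rows for active enrollments."""
    reader = csv.DictReader(io.StringIO(flat_file_text),
                            fieldnames=ENROLLMENT_FIELDS, delimiter="|")
    return [row for row in reader if row["status"] == "active"]
```

Filtering at parse time keeps dropped enrollments from ever reaching the load step, one way to cut down on bad data reaching the LMS.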
These slides about the COUNTER Code of Practice Release 5 reflect recent clarifications and amendments. They provide an overview of Release 5 metrics and reports.
Making the Big Move: Moving to Cloud-Based OCLC’s WorldShare Management Servi... - Charleston Conference
The library migrated from their previous integrated library system to OCLC's WorldShare Management Services over a six-month period. They moved their search, circulation, and catalog infrastructure to the new cloud-based system. The migration process involved preparing data for transfer and working through various technical issues. The library has made changes to workflows for acquisitions, technical services, and electronic resources management as a result of the new system. They have also provided feedback to OCLC on ways the system and services could be improved.
Managing Your Hyperion Environment – Performance Tuning, Problem Solving and ... - eCapital Advisors
Casey Ratliff from eCapital Advisors provides recommendations on Oracle Hyperion performance tuning at a Hyperion User Group meeting in Minnesota.
Diagnostics/Troubleshooting
- Where are all the logs?
- Using the Log Analysis Utility
- EPM System Registry
- Deployment Report
- EPM Diagnostic – Validation
- Zip to Logs
Changes that can improve performance
- Java Heap
- Data Connections
- Essbase/
Casey Ratliff, Lead System Architect
http://www.eCapitalAdvisors.com
Embracing Changing Technology and New Technical Services Workflows in Migrati... - NASIG
In 2015, Radford University’s McConnell Library migrated to OCLC’s WorldShare Management Services (WMS), relinquishing three legacy systems in the process. As a result, many of the Collection and Technical Services Department’s workflows changed considerably, beginning months before the contract was signed. An extensive temporary departmental focus on data clean-up and training was required while maintaining core duties. New opportunities for collaboration and communication emerged. As new workflows developed, the Core Competencies for Electronic Resources Librarians and the Core Competencies for Print Serials Management were consulted in revising positions. This presentation focuses on the process, management, communication, and outcomes involved in migrating to a new next-generation library management system.
Kay Johnson, Head of Collection and Technical Services
Jessica Ireland, Serials & Electronic Resources Librarian
An integrated library management system (ILMS) is an automated software package containing modules for key library functions like cataloguing, circulation, acquisitions, and serials management. It comprises a database, software to interact with the database, and two user interfaces - one for patrons and one for staff. Implementing a new ILMS is a complex process requiring documentation of needs, evaluation of alternatives, negotiation of agreements, and careful implementation according to a project plan.
This document discusses data warehousing concepts and technologies. It defines a data warehouse as a subject-oriented, integrated, non-volatile, and time-variant collection of data used to support management decision making. It describes the data warehouse architecture including extract-transform-load processes, OLAP servers, and metadata repositories. Finally, it outlines common data warehouse applications like reporting, querying, and data mining.
(ATS6-APP05) Deploying Contur ELN to large organizations - BIOVIA
Introducing new IT systems that affect many users can be challenging, particularly for large organizations. This session describes how Contur ELN has been deployed to 1,000+ users in different fields of R&D, with case studies illustrating strategies and practical considerations.
A slightly revised version of the presentation that I gave at the Victorian Australian Society of Archivists, covering open-source projects using AtoM, Archivematica, and other tools.
Data warehousing is an architectural model that gathers data from various sources into a single unified data model for analysis purposes. It consists of extracting data from operational systems, transforming it, and loading it into a database optimized for querying and analysis. This allows organizations to integrate data from different sources, provide historical views of data, and perform flexible analysis without impacting transaction systems. While implementation and maintenance of a data warehouse requires significant costs, the benefits include a single access point for all organizational data and optimized systems for analysis and decision making.
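The extract-transform-load pattern described above can be shown in a minimal, self-contained sketch: pull rows from differently shaped "operational" sources, unify them into one model, and load them into a store optimized for querying (an in-memory SQLite database here). The source data and the fact table are invented for illustration.

```python
import sqlite3

def extract():
    # Two operational systems with different shapes for the same fact.
    sales_a = [("2015-01-02", "widget", 3)]
    sales_b = [{"day": "2015-01-03", "sku": "widget", "qty": 5}]
    return sales_a, sales_b

def transform(sales_a, sales_b):
    # Unify both shapes into (date, product, quantity) rows.
    rows = [(d, p, q) for d, p, q in sales_a]
    rows += [(r["day"], r["sku"], r["qty"]) for r in sales_b]
    return rows

def load(rows):
    # Load into a store kept separate from the transaction systems.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fact_sales (date TEXT, product TEXT, qty INTEGER)")
    con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
    con.commit()
    return con

def run_etl():
    con = load(transform(*extract()))
    # A typical analytical query over the integrated data.
    (total,) = con.execute(
        "SELECT SUM(qty) FROM fact_sales WHERE product = 'widget'").fetchone()
    return total
```

Because analysis runs against the loaded copy rather than the sources, the "no impact on transaction systems" property in the abstract falls out of the architecture itself.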
The experience of the Private Academic Library Network of Indiana in implementing WMS, a cloud-based, next-generation ILS, including the selection process and change management work.
*Updated and reorganised following feedback in the breakouts*
While many librarians have developed mechanisms and structures for managing local scholarship separate from their standard resource management practices, the intersection of the two content streams is occurring at many institutions. During the past decade the presenters have dedicated themselves to capturing best practices of electronic resource management and mapping out paths for creating open access workflows. Join them for a lively discussion and interactive session where they outline ways to bring these two initiatives together and identify the teams needed.
Graham Stone, Jisc Collections
Peter McCracken, Cornell University
Jill Emery, Portland State University Library
This webinar will provide an overview of the current work undertaken to rewrite the techniques for electronic resource management, incorporating open access workflow management. It will provide insight into the key areas under exploration and outline the feedback compiled from the two interactive sessions held at the UKSG Annual Conference. We will also discuss the next steps we will take to share the development of this project.
- The document discusses using Atlassian tools like Confluence and JIRA to improve processes for managing documentation, communication, resource allocation, and change management for software development projects.
- Previously, projects experienced scope creep, lack of documentation, unpredictable development timelines, and frequent refactoring due to unclear processes.
- The Atlassian tools provide features for documentation management, issue tracking, time tracking, integration with other systems, and visualizing workflows to enable clearer scoping, documentation, communication, and reporting.
- An example project guide outlines how the tools can be used to support an iterative development cycle with documentation in Confluence, tracking tasks in JIRA, and regular reporting.
Various Applications of Data Warehouse.ppt - RafiulHasan19
The document discusses various applications of data warehousing. It begins by describing problems with traditional transactional systems and how data warehouses address these issues. It then defines key components of a data warehouse including the extraction, transformation, and loading of data from various sources. The document outlines how online analytical processing (OLAP) tools, metadata repositories, and data mining techniques analyze and explore the collected data. Finally, it weighs the benefits of a data warehouse against the costs of implementation and maintenance.
The experience of the Private Academic Library Network of Indiana implementing cloud based next gen ILS system WMS, including the selection process and change management work.
*Updated and reorganised following feedback in the breakouts*
While many librarians have developed mechanisms and
structures for managing local scholarship separate from
their standard resource management practices, the
intersection of the two content streams is occurring at
many institutions. During the past decade the presenters
have dedicated themselves to capturing best practices
of electronic resource management and mapping out
paths for creating open access workflows. Join them for a
lively discussion and interactive session where they outline
ways to bring these two initiatives together and identify the
teams needed.
Graham Stone
Jisc Collections
Peter McCracken
Cornell University
Jill Emery
Portland State University Library
This webinar will provide an overview of the current work undertaken to re-write the techniques for electronic resource management with the incorporation of open access workflow management. This overview will provide insight into the key areas under exploration and outline the feedback compiled from the two interactive sessions held at the UKSG Annual Conference. We will also talk about the next steps we undertake to share the development of this project.
- The document discusses using Atlassian tools like Confluence and JIRA to improve processes for managing documentation, communication, resource allocation, and change management for software development projects.
- Previously, projects experienced scope creep, lack of documentation, unpredictable development timelines, and frequent refactoring due to unclear processes.
- The Atlassian tools provide features for documentation management, issue tracking, time tracking, integration with other systems, and visualizing workflows to enable clearer scoping, documentation, communication, and reporting.
- An example project guide outlines how the tools can be used to support an iterative development cycle with documentation in Confluence, tracking tasks in JIRA, and regular reporting.
Various Applications of Data Warehouse.pptRafiulHasan19
The document discusses various applications of data warehousing. It begins by describing problems with traditional transactional systems and how data warehouses address these issues. It then defines key components of a data warehouse including the extraction, transformation, and loading of data from various sources. The document outlines how online analytical processing (OLAP) tools, metadata repositories, and data mining techniques analyze and explore the collected data. Finally, it weighs the benefits of a data warehouse against the costs of implementation and maintenance.
- Data warehousing aims to help knowledge workers make better decisions by integrating data from multiple sources and providing historical and aggregated data views. It separates analytical processing from operational processing for improved performance.
- A data warehouse contains subject-oriented, integrated, time-variant, and non-volatile data to support analysis. It is maintained separately from operational databases. Common schemas include star schemas and snowflake schemas.
- Online analytical processing (OLAP) supports ad-hoc querying of data warehouses for analysis. It uses multidimensional views of aggregated measures and dimensions. Relational and multidimensional OLAP are common architectures. Measures are metrics like sales, and dimensions provide context like products and time periods.
PERUSE Technologies offers online and offline training courses for Oracle applications such as Oracle Financials, Supply Chain Management (SCM), Procure to Pay (P2P) and Order to Cash (O2C) cycles. The training includes modules on inventory, purchasing, order management, bills of material, work in process, and system administration. Courses cover the full project lifecycle from implementation to support and customization using the Application Implementation Methodology (AIM) framework.
Who says you can't do records management in SharePoint?John F. Holliday
Although records management features have steadily improved with each new SharePoint version, many industry observers are starting to express their doubts as to whether SharePoint is a viable platform for building real-world ERM solutions. This session will explore the enhanced RM capabilities of SharePoint 2013 and show how to leverage them to full advantage. The session will also introduce several third-party tools that further enhance the platform to enable true enterprise-class content lifecycle management.
Presented at the 2015 Charleston Conference by Mingyu Chen, Head of Metadata Services, University of Texas at Dallas, and Ellen Safley, Dean of Libraries, University of Texas at Dallas
The document summarizes a presentation about data vault automation at a Dutch department store chain called de Bijenkorf. It discusses the project objectives of having a single source of reports and integrating with production systems. An architectural overview is provided, including the use of AWS services, a Snowplow event tracker, and Vertica data warehouse. Automation was implemented for loading data from over 250 source tables into the data vault and then into information marts. This reduced ETL development time and improved auditability. The data vault supports customer analysis, personalization, and business intelligence uses at de Bijenkorf. Drivers of the project's success included the AWS infrastructure, automation approach, and Pentaho ETL framework.
As electronic serials have shifted from being the exception to the norm, libraries are becoming increasingly reliant on knowledge base driven systems to help manage their electronic resource holdings. In 2011, after over a decade of managing e-serials within a local database, the University of Toronto Libraries migrated its electronic serial holdings to a fully integrated commercial e-resource management system. Now, with two years of experience under our belts, we endeavored to take stock and analyze how our library is coping with e-serial management within this new environment. How accurate are our e-journal holding statements within the ERM? How effective are we at managing e-serial title changes? How well are we tracking journal purchases that fall outside of the big package deals? Throughout this study, we have encountered many of the benefits and pitfalls of managing electronic journals within a knowledge base-driven system. While using a commercial ERM and companion MARC record service has allowed the library to present better data to users and expose previously hidden collections, there are several new challenges that we must contend with in a knowledge base environment. A common issue hindering access to our e-journals is the supply of incorrect, outdated or incomplete metadata within the data supply chain. These metadata problems have a detrimental effect on libraries, and consequently on our users, as it affects the accuracy of our e-journal holdings within our e-resource inventories. Although the study began as an internal investigation of our e-serials management practices and workflows, the results highlight the need for greater standardization within the data supply chain, better communication with publishers and knowledge base providers, and increased collaboration to improve the e-resource management process.
Presenters:
Marlene van Ballegooie
Metadata Librarian, University of Toronto Libraries
Juliya Borie
Cataloguing Librarian, University of Toronto Libraries
Facing our e-demons: challenges of e-serial management in a large academic li...NASIG
As electronic serials have shifted from being the exception to the norm, libraries are becoming increasingly reliant on knowledge base driven systems to help manage their electronic resource holdings. In 2011, after over a decade of managing e-serials within a local database, the University of Toronto Libraries migrated its electronic serial holdings to a fully integrated commercial e-resource management system. Now, with two years of experience under our belts, we endeavored to take stock and analyze how our library is coping with e-serial management within this new environment. How accurate are our e-journal holding statements within the ERM? How effective are we at managing e-serial title changes? How well are we tracking journal purchases that fall outside of the big package deals? Throughout this study, we have encountered many of the benefits and pitfalls of managing electronic journals within a knowledge base-driven system. While using a commercial ERM and companion MARC record service has allowed the library to present better data to users and expose previously hidden collections, there are several new challenges that we must contend with in a knowledge base environment. A common issue hindering access to our e-journals is the supply of incorrect, outdated or incomplete metadata within the data supply chain. These metadata problems have a detrimental effect on libraries, and consequently on our users, as it affects the accuracy of our e-journal holdings within our e-resource inventories. Although the study began as an internal investigation of our e-serials management practices and workflows, the results highlight the need for greater standardization within the data supply chain, better communication with publishers and knowledge base providers, and increased collaboration to improve the e-resource management process.
Presenters:
Marlene van Ballegooie
Metadata Librarian, University of Toronto Libraries
Juliya Borie
Cataloguing Librarian, University of Toronto Libraries
Similar to Alma_Implementation_slides_May06_2016 (20)
Facing our e-demons: challenges of e-serial management in a large academic li...
Alma_Implementation_slides_May06_2016
1. Moving to Next-Gen in 6 Months: A Journey to Alma/Primo after 15 Years with Voyager
Hong Ma, Ling-Li Chang, Margaret Heller, Ursula Scholz
Loyola University Chicago
May 6, 2016
2. Agenda
• Introduction
• Quick review of library automation history, why next-gen?
• Project overview
• Data migration overview
• Functional areas
3. Loyola University Chicago Libraries
• Four campuses: Lake Shore, Water Tower, Health Sciences, and Rome, Italy.
• Three separate library administrations: University Libraries (three libraries, two archives & special collections, an information commons, and a storage facility), the Law Library, and the Health Sciences Library.
• 1.5 million print volumes, 3,276 print subscriptions, 626,850 ebooks, and 135,812 e-journals.
• Three major digital collections management platforms with many objects.
• 37 staff, 34 librarians, 98 student workers.
4. Disintegrated library systems vs. Alma
[Diagram: legacy silos on one side, the unified Alma platform with the Primo discovery layer on the other]
Legacy silos:
• Voyager legacy ILS (acquisitions, cataloging, circulation)
• Serials Solutions ERM, 360 Link resolver, and e-resource knowledge base
• Digital collection management (CONTENTdm)
• IR (eCommons)
• Discovery layer: WorldCat Local and the local OPAC (WebVoyage)
Next-gen: Alma platform with the Primo discovery layer
5. Agenda
• Introduction
• Quick review of library automation history, why next-gen?
• Project overview
• Data migration overview
• Functional areas
6. Aggressive Timeline
[Timeline graphic, Dec 2014 – Sep 2015]
• Dec 10, 2014: contract signed
• Dec 10 – Jan 16: pre-implementation phase (watching online training videos)
• Jan 16, 2015: kick-off
• Jan 26–27: onsite analysis
• Feb 6 – Sep 22: weekly project status call
• March 16: Alma production environment with test load
• April 16: Primo environment
• April 20–23: onsite functional workshop
• July 10–21: Alma certification training
• Cutover (freeze & load) immediately before go-live
• July 22: go-live, followed by Ex Libris consultation until the switch to support
• Aug 15 – Sep 17: additional weekly functional call
• Sep 18–22: switch to support
7. Library project team and collaboration
• Project manager
• Executive group
• Functional groups: Access Services; Acquisitions and Resource Management; E-resource Management; Primo/Discovery; Systems
• Electronic resources access and troubleshooting group
• Other library departments and committees
8. [image-only slide]
9. Agenda
• Introduction
• Quick review of library automation history, why next-gen?
• Project overview
• Data migration overview
• Functional areas
10. What we did before migration
• No time for a full cleanup
• Got familiar with the migration process
• Defined libraries
• Consolidated locations
• Created "DELETE" locations for each library (CDELETE, HDELETE, WDELETE) and mapped no-longer-used locations to them
• Purged expired patron records in Voyager
• Identified items for post-migration cleanup
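The location-mapping step above can be sketched as a simple lookup table. This is a hypothetical illustration: only the per-library DELETE codes (CDELETE, HDELETE, WDELETE) come from the slide; the library keys and location codes are invented.

```python
# Hypothetical sketch of pre-migration location mapping: each library gets a
# "DELETE" location, and Voyager locations that are no longer used are mapped
# to it so they can be emptied and removed after migration.
# Library keys and location codes below are invented for illustration.
delete_location = {"cudahy": "CDELETE", "health": "HDELETE", "watertower": "WDELETE"}

# None marks a Voyager location that should be retired rather than kept.
location_map = {
    ("cudahy", "stacks"): "MAIN_STACKS",   # kept (consolidated)
    ("cudahy", "oldbind"): None,           # no longer used
    ("health", "tmpres"): None,            # no longer used
}

def alma_location(library, voyager_location):
    """Return the Alma location a Voyager location migrates to."""
    target = location_map[(library, voyager_location)]
    return target if target is not None else delete_location[library]

print(alma_location("cudahy", "oldbind"))
```

Routing retired locations through an explicit DELETE bucket, rather than dropping them, keeps every item accounted for in the migrated data until the post-migration cleanup.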
11. Data Migration
Data sources
• Voyager data
• Serials Solutions e-resources data
Migration process
• The test load cycle
• The final cutover cycle
Areas migrated only once
• Libraries
• Locations
• Vendors
Things that can't be migrated
• Unfulfilled hold requests are not migrated to Alma
• Historical loan transactions are not migrated to Alma
12. System Configuration
Inputs
• Training videos / weekly calls
• Essential documentation
• Forms
Forms
• Migration Form
• P2E file
• Configuration Form
Essential documentation
• Getting Ready for Alma Implementation
• Voyager to Alma Migration Guide
• Voyager to Alma AutoExtract Migration Guide
• Electronic Resource Handling in Alma Migration
• Alma Resource Management Guide
• Alma Cutover Process
Revising process: system configuration starts at the test load (initial setup) and is revised continually through implementation and cutover.
13. Aggressive Timeline [timeline graphic repeated from slide 6]
14. Functional Workshop
• Reviewed implementation and configuration decisions
• Ensured that all processes were working as expected
• Helped determine localized workflows
• Acquisitions
• Resource management
• Fulfillment
• Integration between Alma and Primo
• Integration with third-party systems
• Assisted in revising the migration/configuration decisions
15. Cutover
• Freeze work in technical services
• Final data migration from Voyager to Alma
• Publish Alma data to the production Primo environment
• Freeze all activities in Voyager, including circulation (use offline circulation in Alma)
• Extract & load circulation transactions and patrons
• Load Alma offline circulation files into Alma
• Final tests
16. Aggressive Timeline [timeline graphic repeated from slide 6]
17. Agenda
• Introduction
• Quick review of library automation history, why next-gen?
• Project overview
• Data migration overview
• Functional areas
22. [Diagram: Terms of Use attached to Fulfillment Unit 1 and Fulfillment Unit 2]
• Terms of Use (TOU) are groups of policies, such as loan periods and overdue fines, that apply to certain patron groups.
• TOUs are then attached to fulfillment units, which are groups of locations with similar policies.
• A TOU can be used by more than one fulfillment unit.
• There are 3 kinds of TOUs: loan, request, and booking.
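The relationships above can be sketched as a small data structure. This is a minimal illustration of the TOU/fulfillment-unit model, not Alma's actual configuration format; all policy values, unit names, and patron groups are invented.

```python
# Hypothetical sketch of the Alma-style fulfillment model described above:
# a TOU bundles policies for certain patron groups, fulfillment units group
# locations, and one TOU can be attached to more than one unit.
# All names and values here are invented for illustration.

tou_loan_regular = {
    "kind": "loan",  # the three TOU kinds are loan, request, and booking
    "policies": {"loan_period": "28 days", "overdue_fine": "0.25/day"},
    "patron_groups": ["undergraduate", "graduate"],
}
tou_loan_reserve = {
    "kind": "loan",
    "policies": {"loan_period": "2 hours", "overdue_fine": "1.00/hour"},
    "patron_groups": ["undergraduate", "graduate", "faculty"],
}

# Fulfillment units are groups of locations with similar policies.
# Note that tou_loan_regular is reused by both units.
fulfillment_units = {
    "general_stacks": {"locations": ["main-stacks", "law-stacks"],
                       "tous": [tou_loan_regular]},
    "course_reserves": {"locations": ["main-reserve"],
                        "tous": [tou_loan_reserve, tou_loan_regular]},
}

def loan_policy(unit_name, patron_group):
    """Return the loan policies applying to a patron group in a unit."""
    for tou in fulfillment_units[unit_name]["tous"]:
        if tou["kind"] == "loan" and patron_group in tou["patron_groups"]:
            return tou["policies"]
    return None

print(loan_policy("course_reserves", "faculty"))
```

The point of the sketch is the sharing: because a TOU is a standalone bundle of policies, changing it once changes behavior in every fulfillment unit it is attached to.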
24. Where is all of the stuff on the configuration form going?
• Most (but not all) policies are part of the terms of use
• Some policies are configured under the circulation desk
• Hold shelf policies
• Printing policies
• Fine payment policies
• Some are configured under the library
• Opening hours
• Lost and overdue notice intervals
• Some are system-wide
• Block preferences
• User groups
• Text for notices (as well as sender address)
25. Some things to remember
• The configuration is actually a lot more flexible and customizable than the form implies, which Ex Libris doesn't make clear
• It's smart to focus on streamlining your policies, but you can add more fulfillment units later; you are not limited to the 5 the form allows
• The system-wide policies will have to be unified; that can't be changed
• Focus on understanding Alma terminology. Learning the vocabulary is three-quarters of the battle.
27. Everything is different
• A unified process for managing all types of resources
• Electronic resources can be managed within Alma
• Name and subject authorities in the Alma Community Zone
• No provision for separate "owning libraries"
• New concepts for holding & item records
• Physical inventory (holding & item) or electronic inventory (portfolio)
• The holding record plays a limited role
• One holding record for multiple copies in the same location
• Every physical item ordered/received gets an item record created
• Missing functionality such as serials check-in and claiming
• Item statuses & back-office locations are replaced by work order processes
28. High priority record cleanup
• Issues with item barcodes (empty barcodes, non-unique barcodes, multiple barcodes)
• Inconsistent item numbering, e.g. v.1, vol. 2, vol.3 (harder to fix in Alma!)
• Location in the item record doesn't match the location in the holding record
• Items with a Voyager item status that will be migrated to the Alma process "Technical–Migration", e.g. Cataloging Review, Withdrawn
• Issues with LC-type call numbers
• Issues with e-resource records
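A pre-migration scan for the barcode problems listed above (empty and non-unique barcodes) can be sketched in a few lines. The CSV layout and sample data below are invented; a real Voyager item extract would differ.

```python
# Hypothetical sketch: flag empty and duplicated barcodes in an item export
# before migration, since these break fulfillment after go-live.
# The export layout and sample rows are invented for illustration.
import csv
import io
from collections import Counter

sample_export = """item_id,barcode,location
1001,31102000001,stacks
1002,,stacks
1003,31102000001,reserve
1004,31102000002,stacks
"""

rows = list(csv.DictReader(io.StringIO(sample_export)))

# Count only non-empty barcodes; empties are reported separately.
counts = Counter(r["barcode"] for r in rows if r["barcode"])

empty_barcodes = [r["item_id"] for r in rows if not r["barcode"]]
duplicate_barcodes = sorted(b for b, n in counts.items() if n > 1)

print("items with empty barcodes:", empty_barcodes)
print("duplicated barcodes:", duplicate_barcodes)
```

Running a report like this against the full extract gives cleanup staff a worklist before (or right after) the data freeze.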
29. Top 10 setups for back office work
• Integration profile for OCLC Connexion
• External search for WorldCat
• Import profiles (on-order records, shelf-ready books)
• Item description templates
• Ledgers and funds finalized
• Vendor EDI profile setup
• Labeling setup
• Normalization rules on saving records
• Merge rules on overlaying records
• Templates for bibs, holdings, and PO lines
31. Tips and Lessons Learned
• Consolidating locations was worthwhile (we reduced the number of locations from 260 to 88)
• We wish we had tested acquisitions more thoroughly before cutover:
o Invoices with a zero total amount failed to migrate (the test load statistics didn't report them!)
o Reporting fund migration
o Standing orders
• We wish we had a more solid understanding of the e-resource conversion (P2E) process
• Pre-migration cleanup may not be necessary; in some cases cleanup may be easier to do post-migration in Alma
• Everyone, old and young, can learn new tricks!
• All-staff participation is really helpful!
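The P2E (print-to-electronic) conversion mentioned above takes a list of bib records to be treated as electronic inventory. A rough sketch of building such a list follows; the record layout is invented and the two-column output only approximates, not reproduces, the actual Ex Libris P2E file format.

```python
# Hypothetical sketch of assembling a P2E candidate list: bib records that
# carry an 856 URL are flagged for conversion to electronic inventory
# (portfolios). Record layout and output format are assumptions, not the
# exact Ex Libris specification.
records = [
    {"bib_id": "100", "urls_856": ["http://example.com/ejournal"]},
    {"bib_id": "101", "urls_856": []},  # print only, stays physical
    {"bib_id": "102", "urls_856": ["http://example.com/ebook"]},
]

def build_p2e(records):
    """Return (bib_id, type) rows for records carrying an 856 URL."""
    return [(r["bib_id"], "Portfolio") for r in records if r["urls_856"]]

for bib_id, resource_type in build_p2e(records):
    print(f"{bib_id},{resource_type}")
```

As the lessons-learned slide notes, the selection criteria matter: URLs left behind in bib or holding 856 fields (and missing provider names) translate directly into post-migration cleanup.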
33. Primo Implementation and Testing
Primo/Discovery team division of labor:
• Prototype: Ex Libris team, systems staff, data-checking group
• Testing: web team, Primo team
• Review: Primo team, reference staff
Implementation process: data testing → interface testing → outreach and user education
34.–37. [image-only slides]
38. Highlights / things we found very useful
• Functional workshop in April for analyzing workflows
• Weekly functional calls in addition to weekly project management calls
• Tuesday focused on functional topics
• Friday focused on project progress
• Extra Ex Libris consultant service after going live, before switching to support
• Strong recommendation: schedule Alma certification training at a much earlier stage
39. Continually setting things up
• Resource management
• Authority control
• Publishing profiles in a multi-campus environment
• Publishing to OCLC
• Publishing to PubMed
• Continually harvesting/enhancing content for Primo discovery
• eCommons de-duplication
• Harvesting content from CONTENTdm
• Harvesting content from LibGuides, etc.
• Integration and interoperability
• Single sign-on for Primo and others such as ILLiad and EZproxy
• Alma integration with the financial system
• Alma & ILLiad integration
40. Hong Ma hma2@luc.edu
Ling-Li Chang LCHANG@luc.edu
Margaret Heller mheller1@luc.edu
Karen Cherone kcherone@luc.edu
Editor's Notes
Today, we are going to share our experience of migrating to Alma/Primo after 15 years of using Voyager.
My name is Hong Ma, Head of Library Systems and project manager for the Alma migration.
Ling-Li Chang, Head of Monograph Acquisitions and Cataloging, led the implementation of resource management and acquisitions.
Margaret Heller, Digital Services Librarian, led the Primo implementation.
Ursula Scholz, Head of Access Services, led the access services implementation.
A traditional legacy ILS is no longer able to handle the increasing complexity of library collections. Aging legacy systems: libraries on average operate a legacy ILS for about 10 years; in our case, it was more than 15.
Marshall Breeding renamed the next-generation ILS the Library Services Platform (LSP). LSPs aim to provide a more comprehensive approach to managing library collections, a promising move and an initial step toward rebuilding systems in alignment with library reality. It is time for libraries to take a forward-looking approach and start adopting next-generation automation systems.
LSPs emphasize managing library collections through shared metadata rather than a traditional local bibliographic database.
Most libraries, especially academic libraries, are at the stage of creating technology infrastructure components in tune with the library's strategic needs.
An organization requires a well-designed, well-maintained technology infrastructure to carry out its mission effectively and efficiently.
Reviewing implementation and configuration decisions
Ensuring that all processes are working as expected
Helping determine localized workflows
Acquisitions
Resource management
Fulfillment
Integration between Alma and Primo
Integration with third-party systems
Assisting in revising the migration/configuration decisions
Cutover
Freeze work in technical services
Final data migration from Voyager to Alma
Publish Alma data to the production Primo environment
Freeze all activities in Voyager, including circulation (use offline circulation in Alma)
Extract & load circulation transactions and patrons
Load Alma offline circulation files into Alma
Final tests
Project manager: serves as the primary contact with the Ex Libris project manager and manages the library implementation team.
Executive group: focused on policy, vision, planning, coordinating, and decision-making.
Functional groups: responsible for the detailed configuration and workflows of the functional areas.
We had overlap between groups and also made sure branch libraries participated.
An additional group, the electronic resources access and troubleshooting group, was created midway through the implementation process.
Other library departments/committees also got involved or were consulted at specific implementation or training stages.
Due to the aggressive timeline, we didn't have much time to accomplish massive cleanup before the migration, so we had to plan the migration effectively, design an efficient way to organize things, and identify potential tasks for post-migration cleanup.
To complete the migration form, we redefined our libraries. For the Location Mapping tab, we created a "deletion" location for each branch library and mapped to it the Voyager locations we no longer needed.
The configuration form needs to be filled out at the start of the process, before you are able to look at the back end of the system at all. Each line of this spreadsheet populates a "term of use"; the terms of use are then collected into fulfillment units. The line highlighted here is shown in Alma in the next slides.
This is where the TOU appears within a fulfillment unit
Here is the actual term of use
I am going to talk about our implementation experience with resource management & acquisitions.
We knew Alma would be quite different from Voyager, but we didn't expect it to be completely different in almost every area.
We love many features of Alma. One example is Alma's unified management environment: we can manage e-resources and do authority control within Alma.
We had a hard time adapting to some of the changes. For example, Voyager can restrict staff permissions based on "owning library", so each library (Law, Health Sciences, and the University Libraries) can have its own bib records and a completely separate workflow. Alma bib records have no ownership. We continue our separate "owning library" practice but have to rely on staff to manually check the local field for "owning library".
Alma's "inventory" concept took us a while to get used to. Starting from on-order items, Alma needs an item record for the physical-resource workflow to work well. It even creates item records for predicted future issues a year ahead.
We also found that Alma holding records play a less prominent role. A holding record may be deleted automatically when an item record is deleted or updated with a different permanent location. Also, Alma has limited options for manipulating and reporting holding information.
Two important functions for print serials (predictive check-in and claiming) have serious flaws. Alma's check-in prediction pattern creates an item record for every predicted issue a year ahead, which is confusing to users in the discovery interface. Alma identifies order lines that may need to be claimed only at the title level, not at the issue level.
"Work orders" were another challenging transition for us. The purpose is good, but they are complicated to put to work.
We didn't have time to do cleanup before migration, so we identified some high-priority tasks to start with before and right after migration.
The item barcode issues (empty barcodes, non-unique barcodes, and multiple barcodes) cause problems for fulfillment, so we cleaned them up as quickly as we could.
Inconsistent item numbering (called "item description" by Alma) can result in a very messy display of a title's multiple items in Alma and in Primo. It's good to clean them up for a better user experience, but it can only be done one record at a time. It is comparatively harder after migration because you can open only one item record at a time in Alma. However, very few libraries have the time to clean them up before migration.
A holding record and its items may have mismatched locations. The migration process takes the item's permanent location as the location of the migrated holding and item records, so a holding record may be split into multiple records due to the different locations in use by its items.
If an item has a back-office status (such as Withdrawn), the migration process assigns it an Alma status called "Technical Migration" and displays it as "not available". The problem is that if a withdrawn book is returned at a service desk, Alma will clear the "Technical" status but will not alert the desk operator to forward the book to Cataloging to unsuppress the record.
We have serious problems with LC-type call numbers for older books because we used multiple subfield $i's for the call number in the holding 852 field. We are cleaning them up one by one whenever we can; there is no batch way to do it.
We did a lot of cleanup for electronic resources after migration. We had a good strategy for selectively migrating Voyager records to Alma electronic inventory, but we still had a lot of post-migration cleanup to do due to unwanted 856 URLs and the lack of provider names in the 856 field.
This is my top 10 list for getting ready for real acquisitions and resource management work. All of the institution level setups require administrative or managerial permission but individual operators can create their private templates.
OCLC Connexion and WorldCat External Search are set up by ExLibris during implementation so they are really easy.
We do a lot of batch processes for our YBP shelf-ready books and for ebook vendor records. Import profiles were high priority to set up so that we could push through bulk of work in Alma. Our experience with Voyager bulk import had helped since the setups are similar. YBP and ExLibris Project Team were both very helpful to guide us through the setup process. The hardest part for bulk import was to figure out the overlay and merge behavior of the out-of-the-box merge rules; they were odd. After ExLibris gave us full administrative permission, we created our own merge rules and then it became much easier to test and troubleshoot.
In Voyager we batch loaded EDI invoices from YBP, EBSCO and Hein. Our Law Library is the first Alma library doing EDI invoicing with Hein! It took a lot of testing and communication with Hein and Ex Libris. We didn’t test EDI with YBP or EBSCO until we went live with Alma; they both went smoothly.
For labeling, we installed and configured SpineOMatic (developed by Boston College Library Systems) to work with our Zebra printers to print call number labels.
Templates are easy to create and use; they cut down on keystrokes and help enter data correctly. We created several shared and private templates for bib records, holdings, and PO lines soon after we went live.
Here’s our current workflow using work orders.
We are happy that we finalized how we wanted to migrate the libraries and locations early on and used the migration process to consolidate locations. It’s difficult to delete a location in Alma.
We wish we had tested the acquisitions migration results more thoroughly. We had many informational invoices in Voyager that carried journal subscription prices in the invoice lines but had a zero invoice total because those subscriptions had been prepaid, and invoices with a zero total amount were not migrated to Alma. The migration options for Voyager reporting funds and standing orders were hard to understand; we chose one option for the test load and another for the production load, and still wonder whether a different option would have served us better.
We wish we had a better understanding of the P2E conversion process. For example, we cleaned up some 856 URLs in Voyager bib records but didn't realize we should also have cleaned up the 856 URLs in the associated holdings records.
It is easy to create record sets and run batch jobs against a set in Alma, so some data cleanup may actually be easier to do after migration. It's worth finding out before migration which Alma data elements can be searched for set creation, to be sure.
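Sets can also be created programmatically through the Ex Libris Alma REST APIs (the Configuration API exposes a sets endpoint). A hedged sketch that only assembles the request for creating an itemized bib set, without sending it; the payload shape below is a simplified assumption based on the documented JSON body, and the base URL and MMS IDs are placeholders:

```python
def build_create_set_request(base_url, name, mms_ids):
    """Assemble (url, payload) for creating an itemized bib set
    via Alma's Configuration API (POST /almaws/v1/conf/sets).

    This is a sketch: the payload is a simplified version of the
    documented body, and no HTTP request is made here.  Consult
    the Ex Libris API documentation (and supply an API key)
    before sending it for real.
    """
    url = f"{base_url.rstrip('/')}/almaws/v1/conf/sets"
    payload = {
        "name": name,
        "type": {"value": "ITEMIZED"},
        "content": {"value": "BIB_MMS"},
        "members": {"member": [{"id": m} for m in mms_ids]},
    }
    return url, payload
```

A cleanup workflow can then pair such a set with one of Alma's batch jobs, which is exactly why knowing what is searchable (and settable) ahead of migration pays off.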
After we went live, we felt very proud of ourselves and confident that we could learn anything and deal with any change.
It really helped to have all acquisitions and resource management staff members watch Alma training videos and participate in workflow discussions.