
Hitting the Road towards a Greater Digital Destination: Evaluating and Testing DAMS at the University of Houston Libraries


Since 2009, the University of Houston (UH) Libraries has digitized tens of thousands of rare and unique items and made them available for research through its UH Digital Library (UHDL), built on CONTENTdm. Six years later, the Libraries needed a digital asset management system (DAMS) that could facilitate large-scale digitization, provide innovative features for users, and offer more efficient workflows for librarians and staff. To address these needs, UH Libraries formed the DAMS Task Force in the summer of 2014. The group’s goal was to identify a system that could support the growing expectations of the UHDL.

This presentation will focus on the two core activities that the task force completed: a needs assessment and a DAMS evaluation. The key portions of the needs assessment included a literature review on DAMS evaluation and migration, research on tools utilized by peer institutions, and library stakeholder interviews. The presentation will then cover how task force members compiled the results of the assessment to establish DAMS evaluation criteria. The evaluation process consisted of an environmental scan of possible DAMS to test, the creation of criteria to narrow the list down for in-depth testing, and comprehensive testing of the DSpace and Fedora systems.

The presentation will conclude with a discussion of the task force’s results as well as the lessons learned from the research and evaluation process. It will also reflect on the important role that collaboration, project management, and strategic planning played in this team-based approach to DAMS selection.


  1. Hitting the Road towards a Greater Digital Destination: Evaluating and Testing DAMS at the University of Houston Libraries. Annie Wu, Rachel Vacek, Santi Thompson, Andy Weidner, Sean Watkins, Valerie Prilop. Texas Conference on Digital Libraries | Austin, TX | April 27, 2015
  2. Overview: Background and Methodology; Evaluation and Results; Insights; Next Steps
  3. Background and Methodology ...with Annie
  4. UH Digital Library ● Launched in 2009 ● 64 collections ● 60,000+ digital objects ● Images, text, audio, video ● CONTENTdm with a custom user interface
  5. DAMS Evaluation - Rationale ● CONTENTdm allowed us to build an initial digital library ● We needed a more robust DAMS for new digital initiatives that ○ Is more flexible, scalable, and interoperable ○ Manages larger amounts of data in a variety of formats ○ Accommodates creative workflows ○ Allows for configuration of additional functionality ○ Supports and enables linked data
  6. UH Libraries Strategic Directions: “...assiduously to expand our unique and comprehensive collections that support curricula and spotlight research. We will pursue seamless access and expand digital collections to increase national recognition.”
  7. DAMS Implementation Task Force
  8. The Team ● Head of Metadata & Digitization Services (Annie Wu) ● Metadata Services Coordinator (Andy Weidner), Co-Chair ● Head of Web Services (Rachel Vacek) ● Head of Digital Repository Services (Santi Thompson) ● Web Projects Manager (Sean Watkins), Co-Chair ● Special Collections Digital Projects Librarian (Valerie Prilop) ● Departments represented: Special Collections, Digital Repository Services, Web Services, Metadata & Digitization Services
  9. The Charge ● Perform a needs assessment and build criteria and policies based on evaluation of the current system and requirements for the new DAMS ● Research and explore DAMS on the market and identify the top 2-3 systems for beta testing in a development environment ● Generate preliminary recommendations from stakeholders' comments and feedback ● Coordinate installation of the new DAMS and finish data migration ● Communicate the task force's work to UH colleagues
  10. Methodology ● Needs assessment ● Broad evaluation ○ Evaluate 12 systems using broad criteria ● Detailed evaluation ○ Install systems in a testing environment ○ Evaluate the top 2 systems from the previous round using detailed criteria ● Data analysis and recommendation
  11. Needs Assessment - Goal ● Collect key requirements of stakeholders ● Identify future features of the new DAMS ● Gather data in order to craft criteria for evaluation
  12. Needs Assessment - Activities ● Internal focus group interviews ● Reviewed scholarly literature on DAMS evaluation and migration ● Researched peer/aspirational institutions ● Reviewed national standards around DAMS ● Identified materials and users ● Determined both the current and projected use of the DL
  13. Evaluation and Results ...with Sean
  14. Broad Evaluation ● Building the criteria: ○ Focus group interviews ○ Literature reviews ○ DAMS best practices ● Organizing the criteria into 4 topics: ○ System Function ○ Content Acquisition / Management ○ Metadata ○ User Interface and Search Support
  15. Broad Evaluation Results ● Scoring: scores were determined by reviewing supporting documentation, marketing information, or talks with vendors ● 0 - Does not support criterion ● 1 - Supports criterion ● Scores were totaled and the top 2 systems advanced to the final round of evaluation
  16. Scores of the 12 DAMS evaluated against the broad criteria (out of a possible 29): Fedora 27; Fedora/Hydra 26; Fedora/Islandora 26; Collective Access 24; DSpace 24; CONTENTdm 20; Rosetta 20; Trinity (iBase) 19; Preservica 16; Luna Imaging 15; RODA 6; Invenio 5
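The broad-evaluation tally described in the slides above can be sketched in a few lines of Python. The system names below come from the presentation, but the per-criterion values are illustrative placeholders, not the task force's actual data:

```python
# Broad evaluation sketch: each system receives 0 or 1 per criterion,
# totals are summed, and the top two systems advance to detailed testing.
# NOTE: the criterion values below are hypothetical, for illustration only.

criteria_scores = {
    "Fedora":    [1, 1, 1, 1],   # hypothetical 4-criterion subset
    "DSpace":    [1, 0, 1, 1],
    "CONTENTdm": [1, 0, 0, 1],
}

# Total each system's binary scores
totals = {system: sum(scores) for system, scores in criteria_scores.items()}

# Rank descending by total and keep the top two for the detailed round
ranking = sorted(totals, key=totals.get, reverse=True)
top_two = ranking[:2]
```

With the real 29-criterion data this same tally produces the ranking shown on the results slide, where Fedora (27/29) topped the list.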
  17. Detailed Evaluation ● Detailed criteria drawn from the same sources used previously ● Looked at specific DAMS features ● Criteria were divided into eight testing sections: ○ System Environment and Testing ○ Administrative Access ○ Content Ingest and Management ○ Metadata ○ Content Access ○ Discoverability ○ Report and Inquiry Capabilities ○ System Support
  18. System Setup ● Virtual servers were set up for each system ● Latest stable versions were chosen (no betas) and installed ● All supporting server software was installed ● Additional software to support the evaluation testing was installed and set up
  19. Gathering Collections for Testing ● Wide variety of item formats currently available ● Variety of item formats for future projects ● Large collections ● Included items with bad metadata, wrong formats, and corrupted data
  20. Detailed Evaluation Results ● Scoring: each system was tested with the same set of collection items ● Scores ranged from 0 (failed) to 3 (fully supported) ● Yes/No criteria were scored 0 or 3 ● System documentation was still used in some scoring
  21. Scores of the top 2 DAMS against the detailed evaluation criteria, by testing section (DSpace / Fedora / possible): System Environment and Testing 21 / 21 / 36; Administrative Access 15 / 12 / 18; Content Ingest and Management 59 / 96 / 123; Metadata 32 / 43 / 51; Content Access 14 / 18 / 18; Discoverability 46 / 84 / 114; Report and Inquiry Capabilities 6 / 15 / 21; System Support 12 / 11 / 12; Total 205 / 300 / 393
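The detailed scoring scheme above (graded criteria on a 0-3 scale, yes/no criteria mapped to 0 or 3, rolled up by testing section) can be sketched as follows. The section names come from the slides; the scores are made-up examples, not the published results:

```python
# Detailed evaluation sketch: graded criteria earn 0-3 (0 = failed,
# 3 = fully supported), yes/no criteria map onto the same scale, and
# scores roll up per testing section into a grand total.
# NOTE: all score values here are hypothetical, for illustration only.

def yes_no(supported: bool) -> int:
    """Map a yes/no criterion onto the 0-3 scale used for graded criteria."""
    return 3 if supported else 0

# Two of the eight testing sections, with a mix of graded and yes/no criteria
section_scores = {
    "Metadata":       [3, 2, yes_no(True), 1],
    "Content Access": [3, yes_no(False), 2],
}

section_totals = {name: sum(scores) for name, scores in section_scores.items()}
grand_total = sum(section_totals.values())
```

Summing the eight real section totals this way yields the grand totals on the slide above: 205 for DSpace versus 300 for Fedora, out of a possible 393.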
  22. Top 2 System Evaluation Results ● Scores were tallied to show the top-ranked system ● Each system's results were summarized into advantages and disadvantages ● The recommendation was compiled and written into the final report
  23. Fedora/Hydra - Advantages: Open source; Large development community; Linked Data ready; Modular design through API; Scalable, sustainable, and extensible; Batch import/export of metadata; Handles any file format. Disadvantages: Steep learning curve; Long setup time; Requires additional tools for discovery; No standard model for multi-file objects.
  24. DSpace - Advantages: Open source; Easy installation/turnkey system; Existing familiarity through TDL; User group/profile controls; Metadata quality module; Batch import of objects. Disadvantages: Flat file and metadata structure; Limited reporting capabilities; Limited metadata features; Does not support Linked Data; Limited API; Not scalable or extensible; Poor user interface.
  25. Insights ...with Rachel
  26. Project Management ● Initial planning ● Tools ● Flexibility ● Meetings ● Food ● Milestones
  27. Committee ● Size ● Representation ● Shared understanding ● Diverse skillsets
  28. Time ● Manage expectations ● Balance ● Intensive research
  29. Communication ● Administration ● Key stakeholders ● Entire library ● Other libraries
  30. Testing ● Turnkey vs. framework systems ● User interfaces ○ Local installation ○ Other institutions
  31. Evaluation ● Different criteria sets ● Consistency ● Vetting process ● Awareness of possibilities
  32. Next Steps ...with Andy
  33. Next Steps ● Phase 1: Systems Installation ● Phase 2: Data Migration ● Phase 3: Interface Development ● Assessment, Documentation, and Training
  34. Phase 1: Systems Installation ● Development environment ● Rewrite DL front-end for Fedora/Solr ● Data models ● Digital preservation workflow ● Hydra content management application
  35. Phase 2: Data Migration ● Content migration strategy ● Migrate test collections ● Data migration ● Continued Hydra development
  36. Phase 3: Interface Development ● Re-evaluate the user interface ○ Conduct user testing ○ Re-write the DL front-end in Hydra, or update the current PHP front-end ● Inter-departmental production workflows ● Refine the Hydra content management application
  37. Assessment, Documentation, and Training ● Assess service impact ● User testing ● System enhancements ● New workflows ● Create and maintain documentation ● Get training ○ Conferences, webinars, workshops, etc.
  38. Thank you! Annie Wu, Head of Metadata & Digitization Services; Rachel Vacek, Head of Web Services; Santi Thompson, Head of Digital Repository Services; Andy Weidner, Metadata Services Coordinator; Sean Watkins, Web Projects Manager; Valerie Prilop, former Digital Projects Librarian. Slides at: