Evergreen Migration Success Stories
Presented at the 2012 Evergreen International Conference in Indianapolis, IN by me and James Fournie (Sitka).


  • context of Emily Carr library - 1800 FTEs, 50,000 bibs

    14 staff, including 2.6 librarians. staff are mostly young, almost everyone is comfortable with technology, and everyone is curious and open to learning new things.

    photos in presentation are mostly student art
  • automated in 1997, this was our first ILS migration.

    we’d hit the end of our upgrade path because of our server. the hardware was more than a year past end of life and we couldn’t upgrade our version of Sybase, so we were one release behind and unable to upgrade.

    i remember when the announcement was made that Horizon 8 was not coming. for me, this broke a huge trust with the library community. also, some of the things we’d been dreaming about--like better highlighting our artist book collection, a mobile catalogue, and having more control over what our catalogue looks like--have limited possibilities with a proprietary vendor.
  • i started in Sept 2010, spent the first semester trying to keep up, then started cleaning up Horizon, which was a huge mess: 350 circ exceptions down to about 90.

    through the process of cleaning up your data you will intimately understand your library’s data. this will help with profiling and mapping data, and then testing your library’s data before you go live. simplifying your circ rules will make it easier to set things up and test. however, make sure that your circ rules are still complex enough to deal with how your library does things. for us, our circ rules had grown more complex each and every year without ever being revised or simplified. this cleanup made things easier for circulation staff in the end.
  • our users are awesome. and weird.

    my background is neither art nor design. one of the things i found really frustrating, as an outsider, was hearing some odd local practices defended because “we are special, we are an art and design library”. this sentiment is not uncommon in any kind of library. i imagine that small public libraries, law libraries, government libraries, and large academic libraries view themselves as special snowflakes too.

    it seems, as an outsider, that art and design colleges often look to see what other art and design schools are doing. which makes sense, to some degree, but the functional requirements for an integrated library system are not that unique to the type of library.
  • before selecting Evergreen, and Sitka as a way to migrate, implement and support Evergreen, i did a thorough needs assessment. it drives me a bit bonkers when people generalize that a specific ILS is for academic libraries, or is better suited to public libraries...

    my library is a one-branch library with fairly modest needs, but big, creative dreams for the future.

    you need to look really closely at practice, current functionality and desired functionality for the future. we definitely could have used Horizon in a more effective way.

    one of the criticisms that i hear about Evergreen for academic libraries is the lack of cataloguing authorities, and that the serials and acquisitions modules aren’t very robust. none of these were deal breakers for us, as we had never maintained our authorities--so they were useless. in the past we lacked the staff expertise to correctly set up serial prediction patterns for all of our serials, and we were rethinking the value of claiming missing issues. while we used acquisitions, we didn’t manage our budgets in our ILS.

    i knew where the shortcomings were (acq, serials), had thought out workarounds, and staff were committed to making it work.

    know what is important for your library and what you need to prioritize. for me, this meant making sure our OPAC looked good. i’m the liaison to the design department and i knew my colleagues and students would not believe me if i told them the new system was better if it didn’t also look better. at another conference John Richardson from Polaris said that our OPAC was one of the most attractive he’s ever seen.
  • our users are, not surprisingly, really visual people. one of the things that i pushed really hard for was a catalogue that didn’t suck. it had to look better than what we had before. thankfully the Sitka team understood this, and James helped make the KCLS OPAC work for us.
  • the two things that are scary for me are working with code and doing design work. i chatted with the library staff to figure out what information we wanted to highlight on the catalogue. i’d been doing “pssst...! library recommendations” for the website for a while, and staff suggested repurposing those to put a face to the people who use and love our library.

    i was constrained by the structure of the KCLS PAC, as i don’t really have the skills and didn’t have the time to change it. these constraints were useful.

    i made a paper prototype, got staff feedback and suggestions, and remade the prototype about 5 times. i learned afterwards from design colleagues that this is actually a valid design methodology. because it didn’t look polished and perfect, people felt like their feedback would actually be incorporated. staff suggestions were better than the ideas i had. initially i wanted to promote our top databases, but i’m really glad we went with user photos and quotes.
  • ensure you have support from your managers/directors - this is a big job and you need to make sure you have the money, moral support and buy-in from the top. you will need help from everyone, from the person who does the scheduling, to your reference staff, to staff to help test the data.

    one of my first mistakes was timing. our former library director retired after 20 years, and the new UL started in Sept 2011. i told him in his first week that we needed to migrate. in retrospect, i should’ve let him get settled a little bit first.
  • we scheduled our go-live for May, so we had plenty of time to work out migration issues during the quieter summer months.

    map out your go-live date (it might be a good plan to target having circ working on the go-live date, and put your cataloguing, acquisitions, etc. on hold for a bit), final data snapshot, other data snapshots, staff training days, and the deadline for notifying your previous ILS vendor that you’re leaving. know what dates are completely fixed, and know that you may need to work some long hours.

    include relevant contract dates, both for the legacy contract and for your new support vendor.
  • find, read and understand the exit clauses in your current ILS contract - if you are currently paying maintenance to another company for your ILS, figure out how much notice you will need to give them so you aren’t charged for another year of maintenance. make sure you have good data snapshots/exports from your old system before you contact the vendor to say you’re leaving. i was so excited dreaming about a go-live date that i almost overlooked this--i had an unnecessarily stressful week while we scraped in under the deadline.

    our VP Finance needed to sign off on the contract. i imagine at larger universities a lawyer might also need to be involved.

    at this stage we started carefully reading our legacy ILS contract. if you know you will be considering a migration in the next few years, i would thoroughly read the contract, especially the exit clauses. we had a slightly different understanding than our legacy vendor; in the end we gave them enough notice for termination of service, and were not charged extra money.

    optional: thank the support people you’ve worked with. i wasn’t a fan of the previous company we got support from, but the people who worked there were excellent. after wrapping up the contract stuff with the right person, i wrote the support folks who had helped me to thank them. they are nice, smart, hardworking people who helped me better understand the system and helped me do some technical things that i didn’t know how to do.
  • Excellent training from the Sitka team: in-person circ training, and online cataloguing, reports, and acquisitions training.

    support from the person doing scheduling to make sure people were here.

    Part of my job is being the technical services supervisor. I spent 6 months doing Excel training with my staff so that they would be more comfortable and have the skills they needed to use Excel to temporarily track acquisitions and new serials if necessary. We have access to Lynda.com and as a group we identified the modules that everyone would do. Lynda.com comes with fake data and exercises, which were useful. This turned out to be necessary: even though the software supports basic serials and acquisitions, we are not yet using them. i hope to have our serials set up by the end of the semester, and there’s a tentative plan for Sitka to roll out support for acquisitions system-wide for the beginning of next year. thankfully we are a small library.
  • manage staff expectations - an ILS migration is the biggest change a library will go through. many libraries have never been through a migration; they might have automated 10 years ago. no one knows what to expect. our migration went very smoothly, but some staff were cranky because not every bell and whistle was working on go-live day.

    I found myself getting cranky when staff would ask “why doesn’t it do things like Horizon?” if i’m in this position again i will try harder to be patient--it’s a big change, and supporting staff through the transition is really important.
  • communicate lots with staff - i gave a migration update at every staff meeting, sent out the occasional email update, and posted updates on our wiki/intranet. if i were to do this again, i would’ve posted a big timeline in the workroom and crossed stuff off so everyone could see the progress (even if it didn’t directly affect them), and emailed regular updates. our staff are flexible, keen to learn new things and technologically savvy--still, this was stressful for people and i could’ve done a better job at communicating clearly what was happening.
  • i didn’t fully understand the nature of a shared database and didn’t correctly identify local tags that were being used (that i didn’t know about); i’m unsure if one librarian even looked at the data.

    i’m not the most detail-oriented person, so i often ran things by one of our circ staff and our cataloguer, who are very detail-oriented people. this reduced some of the friction between us, as they felt valued and included, and i was able to learn about problems (either in the software or with the way we do things, like policy and workflow) and help fix them before they became bigger problems.
  • use the migration as a chance to rethink other things and figure out what other changes you want to make - we rejigged our student information export so circ staff wouldn’t be manually entering new staff and faculty into the ILS, got receipt printers, and made a bunch of small procedural changes. library workflows are often dependent on the ILS, so it’s a chance to think of ways to make your library (especially circ and tech services) work better.

    this migration was a chance for us to rethink a lot of our business processes, rejig workflows, and set things up efficiently.

    our extract from the Student Information System hadn’t been updated in over 10 years. because staff found the import into our legacy system so cumbersome, we were only updating once a year and relying on circ staff to do a lot of manual updating.

    we wanted to extend borrowing privileges to the 5000 continuing studies students, so rethinking the export/import process so that it could be done much more regularly was necessary. this meant working with the SIS coordinator, the company who does support for our hosted SIS, the circulation supervisor, and the Sitka support team (again Mark B--thank you!!). unfortunately, due to staff vacations, we were still tweaking things during the first two weeks of the school year. this was a bit annoying for circ staff.
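  • a hedged sketch of the kind of export/import rejig described above: a small script that maps a nightly SIS extract onto the columns a patron loader expects. every field name, the CSV format, and the “Continuing Studies” rule are assumptions for illustration, not Emily Carr’s actual extract:

    ```python
    import csv
    import io

    # Hypothetical SIS export; real column names depend on your SIS.
    SIS_EXPORT = """student_id,first,last,email,program
    20011,Ada,Lovelace,ada@example.edu,Continuing Studies
    20012,Emily,Carr,emily@example.edu,Fine Arts
    """

    def to_patron_rows(sis_csv: str) -> list[dict]:
        """Map SIS rows to the fields a patron loader might want."""
        rows = []
        for r in csv.DictReader(io.StringIO(sis_csv)):
            rows.append({
                "barcode": r["student_id"].strip(),
                "name": f'{r["last"]}, {r["first"]}',
                "email": r["email"].strip().lower(),
                # Invented rule: continuing studies students get a
                # distinct patron profile so they can borrow too.
                "profile": ("continuing"
                            if r["program"] == "Continuing Studies"
                            else "student"),
            })
        return rows

    patrons = to_patron_rows(SIS_EXPORT)
    print(patrons[0]["profile"])  # continuing
    ```

    the point is less the code than the shape: once the transform is a script rather than a manual cleanup, running it weekly instead of yearly costs nothing.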
  • some of the money savings are going into booking module enhancements. other key things that we would like to see, and are willing to fund, include multiple cover art lookup (especially for Artist Books and other weird and wonderful local collections).
  • Our Evergreen hosting service, Sitka, began humbly as the BC PINES project around 2007.

    We started out with Prince Rupert library in 2007, in a port city on the north coast of BC. After Prince Rupert, my colleague Steven Chan and I were tasked with migrating Fort Nelson. It was an extremely stressful experience; I was up late. The first of many.
  • Over the course of 2008 we continued to migrate other libraries, including our largest library yet, Whistler Public Library. We also migrated more sites in the North Coast library district along Hwy 16.
  • 2009 was a huge year for us. We migrated our first multibranch library, Lillooet Area Library Association, and a much larger library system, Thompson-Nicola, based in Kamloops BC. We also migrated our first college, College of the Rockies in Cranbrook, with several branches along the Alberta border in the Rockies.
  • 2010 was another big year as we added more sites in the North East and another large multibranch system, Cariboo District.
  • Just last year, Manitoba Spruce was looking for a new hosting arrangement, so we included them in our database. Additionally, we added two post-secondary libraries, ECU and CBC, and two special libraries, Volunteer MB and the Libraries and Literacy Professional Collection in Victoria BC.

    In 2012 we have added the MB Leg library and Kimberley BC, and next week we will be migrating Nicola Valley Institute of Tech, an aboriginal post-secondary school. Later in May we will migrate Chalo School, an independent First Nations K-12 school and our first K-12 school.
  • The first thing you need to do is a rundown of your general IT needs. Many libraries are used to either having an ILS that is hosted in the same building, or using a Telnet connection. If this is you, your staff need to be aware that doing things over the web will feel a bit different -- it's a bit slower, but also it's entirely dependent on the speed of your internet connection to the outside world.

    The important thing to remember is that if you're not hosting in house, the quality of your ILS experience is going to be very dependent on the quality of your internet connection. Make sure you have separated your public and private networks, and make sure you are buying enough bandwidth from your ISP to accommodate this change -- Emily Carr used about 6 GiB of traffic in February of this year; our largest library used 30.

    BC has a lot of unique challenges. I had the opportunity to visit Hazelton BC, a gold rush town in northern BC. The biggest barrier they've faced is a lack of broadband internet -- the major ISP in the region only provided a connection to the edge of town, leaving the "last mile" up to the locality, simply because it wasn't profitable for them. Hazelton Library was using a very shoddy WiMax connection. We've gone through various attempts to use satellite internet and 3G, but frankly there aren't a lot of options up there, and this is complicated by the fact that their IT person must drive in from another community about an hour away.

    Many of Sitka's libraries have their IT managed by the municipality, or, with government libraries, workstations can be heavily locked down and browser choices restricted to terrible versions of IE. The only advice I can give you here is just to try to work with the IT staff as best you can and keep them in the loop.
  • When finding someone to do your migration, there are a number of vendors that do Evergreen migrations. If you have one or more people on your team with SQL skills and familiarity with MARC processing in a scripting language such as Perl or Ruby, you already have someone with the base knowledge, and you can build from there. All that remains is to learn the Evergreen database schema and some basics about Evergreen. While that sounds like a big job, it's a really good investment to have on-site staff with that knowledge if you plan on having an Evergreen server. There aren't currently a lot of people with Evergreen experience, relatively speaking, and learning about Evergreen will be easier and more helpful to both you and the Evergreen community than having someone else learn about your legacy ILS. Sitka typically uses data migration as an area for new staff to cut their teeth, as it provides an excellent introduction to the overall structure of Evergreen while avoiding some of the complexities such as OpenSRF.
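  • As a hypothetical illustration of the record-munging those scripting skills get used for -- the "garbage in, garbage out" barcode example from the slides -- a normalizer might look like this. The prefix-plus-digits label format and the zero-padding rule are assumptions about one library's barcodes, not a general recipe:

    ```python
    def normalize_barcode(raw: str) -> str:
        """Normalize a barcode so that label variants like 'T 000152637',
        'T 152637' and 't 152637' all compare equal to what the scanner
        reads. Assumes a letter prefix, a space, then digits."""
        cleaned = raw.strip().upper()
        prefix, _, digits = cleaned.partition(" ")
        if digits.isdigit():
            # int() drops the zero-padding from the numeric part.
            return f"{prefix} {int(digits)}"
        return cleaned  # leave anything unexpected alone

    for raw in ("T 000152637", "T 152637", "t 152637"):
        print(normalize_barcode(raw))  # each prints: T 152637
    ```

    Small scripts like this, run against the legacy export before load, are where most of a migration's "SQL plus scripting" effort actually goes.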
  • At Sitka, our first migration was done by ESI, but since then, every migration has been into a live system that other libraries are already using. For large libraries that are the only library using their Evergreen instance, this makes things a bit easier -- if your migration fails, you can continue using the old system, wipe Evergreen, and try again. However, in our case, we've found it useful to have a migration test server on which we can do a dry run of our migration scripts. When conducting a migration, our migration team will take a snapshot of the production database, and then write the migration scripts to load the data into the snapshot. The snapshot can then be tested by the migrating site.

    I can't recommend this approach enough. You will have a test server in which you can look at and test your data. Ideally, your migrator should get the data well in advance and begin writing scripts, and once they are reasonably complete, they can load the data into your test instance. At this point it's very important that your front-line staff get on there and poke it. The hard part here is -- what to test? Most libraries don't know what they should be testing. It's important to have libraries identify some patron records, items, and transactions in the legacy ILS and see if they match up in Evergreen.
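  • One way to make "what to test" concrete is a small spot-check script over a handful of records the library has identified. Everything below -- barcodes, field names, the fine amounts -- is invented sample data; the shape of the comparison is the point:

    ```python
    # Hypothetical spot-check: a few records exported from the legacy ILS
    # versus the same records pulled from the Evergreen test instance.
    legacy = {
        "21234000001": {"name": "Carr, Emily", "fines": 2.50},
        "21234000002": {"name": "Reid, Bill", "fines": 0.00},
    }
    migrated = {
        "21234000001": {"name": "Carr, Emily", "fines": 2.50},
        "21234000002": {"name": "Reid, Bill", "fines": 5.00},  # planted bug
    }

    def spot_check(legacy, migrated, fields=("name", "fines")):
        """Return (barcode, field) pairs that differ between the systems,
        or (barcode, 'missing') when a record didn't migrate at all."""
        problems = []
        for barcode, rec in legacy.items():
            mig = migrated.get(barcode)
            if mig is None:
                problems.append((barcode, "missing"))
                continue
            problems += [(barcode, f) for f in fields if rec[f] != mig[f]]
        return problems

    print(spot_check(legacy, migrated))  # [('21234000002', 'fines')]
    ```

    Handing front-line staff a short list of known patrons and items to verify by hand catches the cases a script like this can't anticipate.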
  • A basic consideration is circ policy. Some more basic ILSes base how they generate fines on the type of item being circulated. They may vary a great deal on how an "item type" is defined; for example, it could be in the MARC record, or a custom field of some kind. Other systems base fines on the patron type, and others have myriad options like Evergreen. Every ILS does this a little differently, so it's important that you learn about Evergreen -- Evergreen primarily uses circulation modifiers, which are essentially tags for a grouping of loan rules and fine rules.

    Another area of confusion is that Evergreen has two concepts: "status", which tends to indicate a transactional state such as "checked out", and "shelving location", which indicates a location in the library. This can become a bit confusing because other ILSes may vary in these concepts. A common example of confusion is "On Display" -- should that be a shelving location, or a "status"? Your migration will be a lot easier for your staff to understand if you can figure out where there will be conflict between your old system and new system, and ideally communicate these mappings to your migration team or work with them to create those mappings.
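  • The mapping exercise can be written down as simple lookup tables long before anyone writes SQL. All the values below are invented examples of the kind of decisions a library records -- e.g. that a legacy "display" status is really an Evergreen shelving location:

    ```python
    # Hypothetical legacy-to-Evergreen mapping tables; real values come
    # out of the policy discussions with your migration team.
    CIRC_MODIFIER_MAP = {  # legacy item type -> Evergreen circ modifier
        "bk": "book",
        "vid": "av",
        "ref": "book",  # reference-ness becomes a flag, not a modifier
    }
    LOCATION_OR_STATUS_MAP = {  # legacy "status"-ish values, disambiguated
        "display": ("shelving_location", "On Display"),
        "co": ("status", "Checked out"),
        "avail": ("status", "Available"),
    }

    def map_legacy_item(item_type: str, legacy_status: str) -> dict:
        """Translate one legacy item into Evergreen-style fields."""
        kind, value = LOCATION_OR_STATUS_MAP[legacy_status]
        return {
            "circ_modifier": CIRC_MODIFIER_MAP[item_type],
            "is_reference": item_type == "ref",
            kind: value,
        }

    print(map_legacy_item("ref", "display"))
    ```

    A table like this doubles as the document you hand to your migration team, and as the spec your test scripts check against.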
  • For the rest of your data, staging tables are a good way to do your data migration. You simply create a set of tables based on your legacy ILS's dataset right in Evergreen's Postgres database. From there, you can do all the data manipulation you need right in the database, and you'll have the power and speed of SQL. An additional benefit is that if you are migrating into an existing Evergreen system, you'll be able to dedupe barcodes and the like against the master Evergreen database.
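  • A toy version of the staging-table idea, using Python's sqlite3 in place of Evergreen's Postgres database. The table and column names here are invented stand-ins; the real target is Evergreen's own schema:

    ```python
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE live_copies (barcode TEXT PRIMARY KEY);   -- stand-in for live data
        CREATE TABLE staging_items (barcode TEXT, title TEXT); -- legacy ILS dump
    """)
    db.execute("INSERT INTO live_copies VALUES ('T 152637')")
    db.executemany("INSERT INTO staging_items VALUES (?, ?)",
                   [("T 152637", "Some Book"), ("T 999999", "Another Book")])

    # The payoff: plain SQL finds staged barcodes that collide with ones
    # already in the shared system, before anything touches live tables.
    dupes = [row[0] for row in db.execute("""
        SELECT s.barcode
        FROM staging_items s
        JOIN live_copies l ON l.barcode = s.barcode
    """)]
    print(dupes)  # ['T 152637']
    ```

    Because the staging tables live in the same database as the target, every cleanup step is a query you can re-run on the next data snapshot.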

Transcript

  • 1. migration success stories: stuff we learned (Tara Robertson and James Fournie)
  • 2. EMILY CARR UNIVERSITY OF ART + DESIGN
  • 3. AUTOMATION HISTORY (http://www.flickr.com/photos/sporkist/134487256/)
  • 4. CLEAN UP YOUR DATA (http://www.flickr.com/photos/jeffwerner/4594193367/)
  • 5. WE ARE SPECIAL SNOWFLAKES (http://www.flickr.com/photos/jeffwerner/56807966/)
  • 6. DO A NEEDS ASSESSMENT
  • 7. OLD
  • 8. NEW (http://ecuad.catalogue.bclibraries.ca/)
  • 9. GET SUPPORT FROM MANAGEMENT (image used with permission from Kara Pecknold)
  • 10. DEVELOP A CLEAR PROJECT PLAN
  • 11. READ YOUR CONTRACT (http://www.flickr.com/photos/wwworks/1431384410/)
  • 12. STAFF TRAINING AND SUPPORT (http://www.flickr.com/photos/kwl/2948818020/)
  • 13. MANAGE STAFF EXPECTATIONS (used with permission from Manuel Gogolin)
  • 14. COMMUNICATION (http://www.flickr.com/photos/trufflepig/4189165819/)
  • 15. STAFF FRUSTRATIONS (http://www.flickr.com/photos/o_0/5724900309/)
  • 16. CONSIDER MAKING OTHER CHANGES (http://www.flickr.com/photos/retrocactus/4342948954/)
  • 17. BUILDING WHAT WE NEED
  • 18. Data Migrations (James Fournie & Tara Robertson)
  • 19. Sitka, circa 2007
  • 20. Sitka, circa 2008
  • 21. Sitka, circa 2009 (pushpin: branch of a multibranch system; yellow: post-secondary libraries)
  • 22. Sitka, circa 2010
  • 23. Sitka, circa 2011 (purple: special/government library)
  • 24. Hazelton B.C.
  • 25. What’s in your network closet? (photo by geo462rge, flickr)
  • 26. Requisite Skills
    • Awareness of MARC
    • Scripting language (Perl, Ruby, Python)
    • SQL
    • New ILS
    • Old ILS
  • 27. DIY vs Outsourcing
    • Outsourcing leaves it to experts
    • Outsourcing lets your staff focus on other things
    • DIY builds capacity
    • DIY offers more control
  • 28. Test Early, Test Often (photo by kbaird, flickr)
  • 29. Garbage In, Garbage Out
  • 30. (Barcode example) Items labelled "T 000152637", "T 152637", and "t 152637" all scan as "T 152637", which is checked against the legacy data.
  • 31. Data Mapping (example item: Some Book by Somebody, 3rd Ed.)
    Old ILS: ItemType: ref; Status: ref; Location: 2nd floor Ref
    Evergreen: Reference flag: true; Circ modifier: book; Shelving location: 2nd floor; Status: Available
  • 32. Standardizing Migration Scripts (photo by It’sGreg, flickr)
  • 33. Staging Tables (photo by ianalexandermartin, flickr)
  • 34. Get Reports on Data