AAUP 2012: Digitizing the Backlist II (C. Lewis Evans)


  • Okay, so some organizations are so ship-shape that they can execute an all-hands-on-deck review of their backlist. But not every press is in a position to do that, for various reasons. That doesn’t mean you can’t still make progress, even if it’s interns who shoulder most of the load while other departments carry on with their regular duties. (The U.S.S. Texas, Laura)
  • If you take a closer look at the way the AAUP digital publishing survey was designed, you can actually see an organizational perspective in which rights & permissions departments don’t really fit into the mix. And yet here we are, wagging about the long tail. At any rate, the survey shows a wide range of backlist digitization efforts: some presses are nearing completion, others are stretched thin, and many are being strategic in their focus.
  • As you can see, survey respondents were concerned about rights, but were equally if not more concerned about financing and how-to issues: resources and business model at the utmost, followed by production issues, metadata, and database systems. As far as IP goes, I wonder whether the survey would have skewed differently had this area also been polled? But really, it’s clearly all tied together.
  • It’s all very interconnected, and there are many ways to approach it, both in angle of approach and in order of proceeding.
  • For Alabama, it all started in 1999, when we signed an agreement with the original NetLibrary. One staffer led the charge, single-handedly going through the files of titles then in print with dogged determination and laser-like focus. Amusingly enough, we only placed 341 titles with NetLibrary, which meant shipping actual books to them to digitize and own the files. We just got royalties; no digital assets were created for us. No doubt we would have placed more, but our director in the 80s reverted rights to many titles back to authors when they went out of print. Not having a way to do anything with those titles after selling through the printing, it seemed to him the right thing to do, bless his heart. (I inherited the NetLibrary hourglass when JK retired.)
  • But it was the beginning of our rights metadata. At about the same time that JK was working on the NetLibrary project, the Press retired Cats Pajamas and built “George” using Access 97. Here is an example of the legacy rights data I inherited in 2006. Note: the licensed-to-NetLibrary box, SDRP, rights holder, no other boxes, and the mysterious “art” -- ? Also, yes, that is a redaction that you see.
  • Two later examples -- evolving records, evolving structure (sub-restrictions). Discuss the “Press has E-Rights” box and launch-form confusion. This is also an example of something Kathleen & Laura pointed out: just because permissions problems have been identified does not mean that someone automatically becomes responsible for addressing them. Addressing them can mean deciding to redact the material, or to deal with it. How does a press decide how to approach that? An acquaintance of mine just read an ebook released by one of the Group 5 presses from which ALL the illustrations had been stripped, including those by the author. I’m guessing they just did a top-to-bottom pass to get everything on the market ASAP. But doesn’t that undermine the integrity of the book? Is that reflected in the price? Have buyers been warned?
  • On the Cusp: 1) e-rights but no ebook -- confusion. 2) Note it has 12 specific restrictions. 3) Ghost ebook ISBN. 4) Because it was too complicated to make into an ebook, no files were sent to our DAD, and none were sent to marketing/promo services either. Bad for discoverability.
  • So here’s what I’ve been hiding: the ‘cleared for ebooks’ box. And a full shot at the rights screen. Also: redaction notes. Current effort to better structure redaction information to make sure it gets implemented and that customers know about their choices.
  • And then the next thing we developed, because we really needed a way to focus in on all the fields that had to be in place and connected for the ebook to make it through our workflow and out to downstream vendors, of whom there were more than these five (plus CCC) by this time: we literally could not fit them all in our database.
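  • The kind of field-completeness gate described above can be sketched as a simple pre-flight check. This is only an illustrative sketch; the field names are hypothetical and do not reflect George’s or Firebrand’s actual schema.

```python
# Sketch of a pre-release rights check for an ebook record.
# All field names are hypothetical, not the Press's actual schema.

REQUIRED_FIELDS = [
    "rights_holder",      # who controls the e-rights
    "press_has_erights",  # True once e-rights are confirmed
    "cleared_for_ebook",  # the "cleared for ebooks" box
    "ebook_isbn",         # assigned only after clearance
]

def ready_for_vendors(record: dict) -> list:
    """Return a list of problems; an empty list means the record can go out."""
    problems = [f for f in REQUIRED_FIELDS if not record.get(f)]
    # Policy: no ebook ISBN until rights have been cleared.
    if record.get("ebook_isbn") and not record.get("cleared_for_ebook"):
        problems.append("ebook ISBN assigned before clearance")
    return problems

record = {"rights_holder": "Author", "press_has_erights": True,
          "cleared_for_ebook": False, "ebook_isbn": ""}
print(ready_for_vendors(record))  # → ['cleared_for_ebook', 'ebook_isbn']
```

  A gate like this also makes the redaction decision explicit: a record that fails the check is flagged for a person to resolve, rather than silently slipping out to vendors.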
  • And meanwhile, all the other departments at the Press were also building George out in a very organic way that was specific to the Press -- like a coral reef. But by 2008, the handwriting was on the wall: with more than 7,000 entries and no system support for Access 97, it was time to migrate to a new system.
  • Again, in a rather organic way that was specific to our situation (due to our relationship with Chicago and their use of the Title Management database), we ended up with Firebrand running our entire enterprise system. They are specific to publishing, and already had a data structure in place. This graphic is just for fun -- it’s ONE PART of our overall cyberinfrastructure! (* Point out CDC’s TMM, Eloquence, us, Bibliovault, CDC’s r&p database thingy)
  • So, here is the Firebrand record for the last book I showed you. I’m showing this to illustrate our evolving data structure, and the fact that we actually lost some highly granular material specific to George because it would have been too expensive to pay to port it over -- for example, itemized restrictions, and the actual check marks in all my boxes. Intern power will replace those, and put itemized-restriction records in as I discover them. Also, you can see we’ve continued to tweak the system, adding a ‘no ebook at this time’ box just to make it crystal clear. (Also, our policy that ebook ISBNs shall not be assigned until I’ve cleared them.) You can also see the ebook redaction note, the territories, and a little bit of the production tasks at the top. While this works for us, especially at the degree of granularity we can handle, there is more on the horizon to keep an eye on: rights metadata standards.
  • What you can see from what all three of us have shown you is that we really are trying to structure and manage our rights metadata in order to curate and manage our collections. Which seems impressive -- but look at what Mike Shatzkin blogged last week. On a careful read of this, and in conversations and in my research, what strikes me is the different sets of rights metadata people talk about: subrights information, geographical sales restrictions, in-house records like these... but an effort is underway to develop international standards that will allow players to communicate more effectively, internally and in the market.
  • Controlled Vocabulary, Version 1.0, released April 2012 -- it’s only 10 pages, not scary. (See the page for their Rights Committee to get the full PDF.) The Book Industry Study Group’s Rights Committee is “working to create a standardized framework that can be used to describe any rights transaction, from a traditional language and territory book deal to serialized content used in educational programs for schools or a cookbook app for the iPhone enhanced with how-to videos and shopping lists. This framework will have a plain language form that will be publicly available for use as a reference both by companies in the traditional book sphere looking for guidance, and the digital players like tech companies that are entering book publishing and need to understand the rights landscape. We are also developing a code-based version of the framework under the ONIX standard that can be used for system-to-system transactions.”
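  • To give a flavor of what a code-based, system-to-system rights expression looks like, here is a minimal sketch that builds an ONIX 3.0-style SalesRights composite (exclusive sale in the US and Canada) using Python’s standard library. The element names follow the ONIX 3.0 SalesRights composite, but the surrounding message structure is omitted; this is an illustration, not the BISG framework itself.

```python
# Minimal sketch of an ONIX 3.0-style SalesRights composite.
# Element names follow ONIX 3.0; this is an illustration only,
# not a complete or validated ONIX message.
import xml.etree.ElementTree as ET

sales_rights = ET.Element("SalesRights")
# Code "01" = for sale with exclusive rights (ONIX code list 46).
ET.SubElement(sales_rights, "SalesRightsType").text = "01"
territory = ET.SubElement(sales_rights, "Territory")
ET.SubElement(territory, "CountriesIncluded").text = "US CA"

# Serialize the composite to a string for a system-to-system feed.
print(ET.tostring(sales_rights, encoding="unicode"))
```

  The appeal of a coded form like this is exactly what the committee describes: a vendor’s system can parse it without a human reading a plain-language rights memo.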
  • So, it does not take a battalion to make progress -- Start where you are, and if you can’t go A>Z in one big pass, there are ways to think about this.
  • Assuming the goal is to increase revenues to better enable us to serve our mission, most of us – with exceptions – are probably focused on the consumer market right now, because that seems to be where the action is.
  • This is anecdotal and experience-based; I don’t have hard data. And I can imagine scenarios in which sales are not the main driver: OA publishers, library partnerships....
  • The point is, there are many layers to the onion. And many paths to the top of the mountain. Let me know what you find out on your journey, because I’m still investigating.
  • It’s a constantly evolving information ecosystem.

    1. Tackling the Unthinkable: Digitizing the Backlist Part 3: Controlled Chaos -- Claire Lewis Evans, The University of Alabama Press
    2. Backlist digitization: A number of presses indicated that all possible backlist titles have been digitized, and many indicated this as a goal. One press reported that resource constraints mean they are perhaps 10 years out from complete backlist digitization. Many presses are focusing efforts on key backlist titles.
    3. The original NetLibrary, 1999-2001 -- change is inevitable
    4. George -- Access 97 database
    5. Reflowable formats
       • Kindle variants (AZW, .kf8)
       • ePub: Nook, Kobo, iBooks-flavored ePub, and many more
       • HTML5 and the future?
    6. Prioritizing Your Approach -- For the commercial market: trade titles, fiction, narrative nonfiction (history, biography, memoir)
       • Although creative writers tend to be savvier in negotiating terms and retaining rights, which can cut into ROI
       • Titles with strong sales history
       • Titles for which you have digital assets
       • Titles that lend themselves easily to reflowable text with minimal quality-assurance issues
       • Series and titles with flexible support funding that can be used for digitization (memorial funds, etc.)
    7. 7. PDF I’m not giving up on PDFs yet, especially since they are inexpensive to make. • Library aggregation market is primarily PDF-based • SEO and promotional: Bowker, Google Book Partners, Amazon Search Inside this Book, Dial a Book, etc. • O’Reilly report high sales of ebooks in PDF format • Some types of books really do work better when the layout is fixed, and many readers prefer them for ease of use • It’s not headline-making, but there is greater diversity of software out there to support basic reader-interaction with the text (annotation, etc.)
    8. Factors to Consider
       • The market is going to reflowable formats, but the cost of conversion adds up
       • Is it worth it to spend $200-$350 for post-production conversion to reflowable formats for titles that are not likely to deliver timely ROI?
       • QUALITY CONTROL is resource-intensive, too
       • Where is the library aggregation market going? It’s still oriented to PDF, as are many of the search-oriented sites and services. Will they diversify to accommodate an investment in making reflowable versions of backlist titles?
    9. Claire Lewis Evans
       cevans@uapress.ua.edu