Personanondata Predictions and Posts: 2006 - 2013

When I started my blog PersonaNonData in 2006, I had no idea where the journey would take me, but I have always seen the blog as a close extension of what I’ve been doing since my career in publishing started: trying to make sense of the business amid an ever-changing environment.
That’s the nature of business, and I’ve always believed that if you don’t have an interest in the dynamics and influences pervading your company and business, you should probably think about making a change. I frequently meet people who, after long periods in publishing, really have no idea what’s going on. I find that quite sad.
PersonaNonData has always been a selfish preoccupation, but I do hope I’ve been able to shed some light on the business – at least as I see it – for the small group of publishers who enjoy my point of view.
And the photos of course!
Best regards,
Michael Cairns (aka Personanondata)

Transcript

  • 1. Predictions & Commentary from Personanondata – Michael Cairns, Information Media Partners, April 2013
  • 2. Dear Reader, When I started my blog PersonaNonData in 2006, I had no idea where the journey would take me, but I have always seen the blog as a close extension of what I’ve been doing since my career in publishing started: trying to make sense of the business amid an ever-changing environment. That’s the nature of business, and I’ve always believed that if you don’t have an interest in the dynamics and influences pervading your company and business, you should probably think about making a change. I frequently meet people who, after long periods in publishing, really have no idea what’s going on. I find that quite sad. PersonaNonData has always been a selfish preoccupation, but I do hope I’ve been able to shed some light on the business – at least as I see it – for the small group of publishers who enjoy my point of view. And the photos of course! Best regards, Michael Cairns (aka Personanondata) Michael.cairns@infomediapartners.com @personanondata 908 938 4889 Text and Images © Michael Cairns
  • 3. Table of Contents
Predictions 2013: The Death of the Middleman
Predictions 2012: The Search for Attention
Predictions 2011: The Growth of Intimacy
Predictions 2010: Cloudy with a Chance of Alarm
Predictions 2009: Death and Resurrection
Predictions 2008: Return to the Scene of the Crime
Predictions 2007: Taking It in Leaps and Bounds
Corporate Data Strategy and the Chief Data Officer – Sept. 8th, 2011
Business Out of the Ordinary
I Steal Stuff – March 15, 2011
Welcome to the Migration (and Other Lessons) – February 8th, 2011
Confusing a Silo with a Business – August 3rd, 2010
United Artists Redux – July 20th, 2010
The Baked Beans Are Off – July 13th, 2010
Do You Sincerely Want to Sell Business? – June 29th, 2010
The Curator and the Docent – June 22nd, 2010
A Database of Riches: A Report on the Market and Pricing for the Google Books Project – April 20, 2010
Your Price May Vary – November 18th, 2009
Segmenting the Publishing Industry – November 9, 2009
580,388 Orphan Works – Give or Take
The ISBN Is Dead – August 4th, 2009
A Digital Concierge – May 21st, 2009
Silos of Curation – April 29th, 2009
Who Wants to Pay for “Content”? – March 9th, 2009
Presuming No Book – February 17th, 2009
Pimp My Print – December 10th, 2008
Death of the Big Box – December 3rd, 2008
Rack Jobbing the E-Book – July 16th, 2008
Amazon the Monopoly – March 28, 2008
Munich: February 6th 1958 – February 6, 2008
Brands to Publish – January 13th, 2007
New Model Army of Self-Publishers – September 19th, 2007
The New Publishing Experience: Build Your Own Book – July 10, 2007
Hail the Death of the Book Review Section – April 30th, 2007
Why Don't Libraries Have Publishing Programs? – April 9th, 2007
Borders Strategy Plan: What They Could Have Said – March 26, 2007
Gift Registry for Libraries – January 19, 2007
Just Trying to Keep My Customers Satisfied – January 18, 2007
My Education Space: ‘Ed-Space’ – October 17th, 2006
The Textbook in the 21st Century – July 13, 2006
  • 4. Predictions 2013: The Death of the Middleman

It seems to some that we’ve entered a period of stasis in the ongoing transformation of the publishing industry. This time last year, I noted that the routine operations of many publishers had fully realized the transition to electronic content and absorbed its implications. So perhaps the last twelve months have been about catching our collective breath. But anyone who thinks the big changes are behind us is probably fooling himself, and may be lulling himself into catastrophic inaction. The harbingers of dislocation are easy to see . . . if you know where to look. In the second half of 2012, we saw a slowdown in the growth rate of eBook unit sales; indications of a possibly significant substitution of tablets for eBook readers; a major strategic publishing merger destined to create a trade publishing goliath; and the sale of one of the big three education companies. Any one of these developments occurring independently could have been analyzed at length but, taken together, they suggest to me that more, rather than fewer, changes are on their way. The expectation that the big trade houses would consolidate has been going around for at least five years: in fact, it may be more surprising that the Random House/Penguin deal didn’t happen sooner. Now that it has, it’s a foregone conclusion that there will be another trade merger announced in the next few months, involving some combination of HarperCollins, Simon & Schuster and Hachette. Perhaps all three will combine and, if so, that deal would equal the one announced last year in scale and significance. But that’s probably unlikely. One publisher will almost certainly end up the “odd man out” and it will be interesting to see which it is and what they do next. On the education front, there has been widespread speculation that some merger of Cengage and McGraw-Hill Education will take place this year, since the two companies may end up with a common owner. In the short term, there may not be a full combination, but some trading of assets may take place immediately to rationalize the respective businesses, with deeper integration to come, perhaps, in 2014-15. In education more broadly, all education content companies (other than Pearson) are only at the beginning of their transition from content providers to embedded content and services providers. Professional information publishers provide content and services at the point of need, and education publishers will be doing the same thing in the not-too-distant future. At CES this week, McGraw-Hill made some interesting announcements about product development investments they have been making, which presage how this “services approach” may take shape.
  • 5. To segue slightly, the justification for a merger is often presented as an opportunity to save cost, apply economies of scale and/or gain access to a new market. At this point, expense and efficiency gains are more likely to be the primary drivers in both the McGraw-Hill and Random House/Penguin cases. Each publisher anticipates significantly reducing costs in headcount, facilities, distribution and other areas in order to deliver the same total quantity of titles. They need to undertake this effort because the publishing value chain is compacting, making it easy for content producers/authors to reach consumers directly which, in turn, is also changing the financial model on which publishing is based. The functional areas where publishers added margin in order to make a profit – overhead, distribution, marketing & sales – are becoming less important (though not unimportant) when authors and contributors can reach their market directly. The implications of these changes for publishing houses have been clear for many years, but addressing how their businesses must change to cope with them is nowhere near complete in the larger houses. Smaller, more nimble companies like Hay House and Sourcebooks have travelled much further down this path. Education publishing is seeing similar changes, but the process of dealing with them will be different. I expect we will see an aggregation model emerge in education, where content ‘platforms’ deliver content and services on a per-user basis. As I’ve mentioned before, this model is already in operation. Academic librarians and universities will be offered an extensive database of educational material from which faculty can choose the material – probably pre-selected, topic-driven packages – best suited to their classes. Platform providers such as Amazon, Blackboard, Pearson and EBSCO may soon be the only efficient way for publishers wanting to sell content (or access to their content) to reach students. The platform providers will negotiate distribution agreements with all other content providers and may compete against each other to offer the best combination of content. But a more likely and important point of differentiation will be the unique services and level of integration they can provide faculty, administrators and students. Instead of Pearson and EBSCO, think Reuters and Bloomberg. And instead of profit models based on revenue per book, think per head or per desk. (This model may begin to undermine the argument for DRM in education.) A very positive byproduct of this change in education will be a complete integration of library resources, institutional resources and consortia buying/negotiation that will allow better alignment with objectives for student success. It seems odd (to me) that educational content components, as they are currently supplied to students and faculty on campuses, often stand independently of each other and can only be ‘integrated’ through a manual, rudimentary process. And it’s even odder when you consider that libraries have long been licensing tools and services from EBSCO and Serials Solutions which provide deep integration of and access to the databases and content the academic library licenses. It will only be a matter of time before pan-university content assets and access are brought together. Ultimately, 2013 may bring more significant change in the trade and educational landscape than we’ve seen in many recent years.
While there will be a lot of focus on the big trade merger and its constituents, the industry’s other players will have to fight aggressively not to lose any advantage—we all recognize that “bigger is better” when it comes to applying economies of
  • 6. scale in a business whose underlying business model is changing radically. In education, we may be paying attention to McGraw-Hill and Cengage, but Pearson, as the market leader, is likely to embark on even more aggressive strategies this year--under its new CEO, and with the divestiture of Penguin and possible sale of the FT Group, the company has forcefully declared education to be its focus. Opportunities for innovators will continue to emerge, as one would expect in a rapidly changing market. However, many of the niche or narrow solutions currently on offer--whether they be assessment, content-delivery or search tools--are likely to ‘run out of market space’ as these solutions become embedded in, and offered as an attribute of, the platform solution. I see opportunity in the delivery of solutions that help specific users – say, university faculty – take full advantage of the integration of content and services that will occur--for instance, on campus--since many user groups may need to change the way they conduct their usual activities. The outcome of these work changes will be greater productivity and better solutions, but getting there will require ‘intelligent agents’ to facilitate--to help assemble content, training programs, workflow and productivity tools and similar applications to rewire their work environments. In education, this requirement may give platforms like Blackboard and Desire2Learn an advantage, given their installed base on campus. While change often produces anxiety, I see dynamism in the book (and “content”) industry that is exciting and invigorating. There will be many big changes in 2013 but, in contrast to the most recent past, where publishers were buffeted by macroeconomic changes, I expect these changes to reflect the elements of a positive shift in the way publishing companies operate and consumers – especially in education – consume content. We're going to end 2013 thinking completely differently about this industry. Some other trends I see emerging during 2013:
- There will be more experimentation with subscription programs, possibly similar to the old book club model, with curated collections and incentives (such as free eBook readers). These could also be combined with social network capabilities to allow reading group subscription models and social networking. For example, these could be ‘self-defined’ groups: A reading group could set itself up and choose from a variety of subscription models that allow the group to read a specific number of books, use a unique set of social tools and pay by subscription rather than per book.
- A Chinese- or Middle East-based publisher will acquire a major, brand-name media company. CurrentTV notwithstanding, this will be in information/professional or education.
- “Self-publishing” will see huge international growth as Asian and Latin American markets develop.
- The unlucky leftover of the three trade houses in play will be immediately acquired by Amazon (just wanted to see if you read this far).
- There will be some consolidation in the eBook provider market.
- The Apple bookstore will be re-launched and will end 2013 poised to surpass the Nook and move into second place behind the Kindle.
- Similar to what takes place in gaming, innovative publishers will begin to engage readers in new book and content ideas even before the book is ‘completed’. In game
  • 7. development, it isn’t unusual for games to be released with only the first three out of 10 game levels completed--why build the whole product if no one wants to play? Book and content producers may try to adopt and adapt this model for their new product development.
- Manchester United will win the English Premier League title.

Predictions 2012: The Search for Attention

There’s little more to say about eBooks these days: The migration is now embedded into business operations across the industry. Yes, there remain some issues and problems day-to-day, but it would seem that the issue of most concern to publishers for the past five years (trade particularly) is now subsumed under business operations as usual. And that bores me. Sure, we could argue about the future purpose and value of a publisher, but most (if not all) of the big trade houses are doing better now than they were three years ago and continue to sign the big authors and sell lots of units. The amount of attention given to the self-publisher model is disproportionate to its viability as a solution better than that delivered wholesale by a traditional publisher. Yet, to some, the counter-argument or disruptive solution is always more interesting and therefore garners more attention. There will be more big success stories in self-publishing, but the larger point isn’t about replacing the old model with the new--it’s more about incorporating the new model into the old. Where self-publishing was derisively termed ‘vanity publishing’ 10 years ago, it could now be considered a vital component of a better, more efficient publishing industry. This set of predictions was harder to conceive than those in prior years, and I am not sure why that is. I’ve been going through this exercise since 2007 (the year I started this blog) and so went back over some of the things I suggested in years past. For example, in 2007, I said:
- Several major US colleges will teach various social science courses entirely in simulation. The courses will not be taught in traditional lecture form but entirely within the software simulation.
Now, five years later, there have been some experiments in this area, but my comment was uttered at a time when everyone was building a home in the simulation game world and, at the time, it seemed inevitable we would all be spending half our lives in Second Life. Clearly that never happened. On the other hand, during 2011, I spent many weeks looking into the medical simulations ‘business’, which is very impressive and continues to push the boundaries of real simulation in education and training. What’s important here is that simulations solve several business, operational and administrative issues for schools and hospitals, which drives
  • 8. the business case for their adoption. That might not have been the case for Second Life (at least in a comprehensive sense). The anticipated benefits of simulated learning will only be realized if it solves real business problems. As I saw during my short research project, in medicine and especially nursing, there are very real, addressable problems that simulations solve for educators and administrators. Some of the simulation centers I visited are almost exact replicas of hospital wards and operating theatres. It is quite incredible. The money poured into hardware at these centers is significant (and growing), but the next big change in simulations training will be how traditional medical content is integrated into delivery in the simulations context. No easy thing, but the merging of the practical and the theoretical is viewed as critical by educators and practitioners. The medical segment is representative of how education publishing in particular still has significant challenges to address as the industry deals with changes in technology, delivery and performance measurement. The following year (2008), I incorrectly predicted “McGraw-Hill will reorganize its business much as Thomson [Cengage] has done. MGH education could be sold to private equity.” The impact of the sale of MGH in 2012 is unlikely to drastically change the publishing landscape in the short term, but there may be larger structural changes across the entire business that will be more interesting. As we know, Apple is set to make an announcement soon, which is rumored to be about educational publishing. If that’s true, it might stimulate some fundamental changes in education similar to the impact iTunes had on the music business. Sticking with education, in 2009 I suggested that the Obama administration would make wholesale changes in education policy and become more ‘federalist’ in approach. As some ‘celebrate’ the ten-year anniversary of ‘No Child Left Behind,’ the administration is pushing more (or allowing more) responsibility to the states for education policy while at the same time providing more assistance to ‘failing’ schools so they can improve. If anything, the Obama administration may be more ‘activist’ with their assistance versus the prior administration, and this policy (or set of policies) is likely to aid education publishers in the provision of the next generation of assessment tools, which will be oriented more toward remediation and intervention (and which I touched on in 2010). Last year, I focused my prognostications on the concepts of curation and community: The growth of intimacy assumes that users will seek closer relationships with their core community of friends, workers or communities of interest in order to make decisions about the content they access, the products they use and the entertainment decisions they make. Book publishers, retailers and authors will need to understand how to actively participate in these communities without ‘marketing’ or ‘selling’ to them. Facebook is obviously the largest social community but, within Facebook, there are a myriad of smaller ‘communities’ and, within these communities, the web becomes highly personal. The relationships among the participants become ‘intimate’ in the sense that the participants share knowledge, information, even personal details that in a traditional selling or marketing environment would never be breached
  • 9. by the vendor. The dynamic of selling becomes vastly different in this context, and publishers must find a way to understand these new communities, the influencers that dictate behavior and the motivations that contribute to selling products (and potentially services). I still believe the above to be a trend even though it hasn’t developed as quickly as we might have expected. I fully expect the concept to mature over the coming years. Which suggests a lead-in to a theme for my 2012 predictions: Where 2011 was about the community providing a filter for its ‘members,’ 2012 will be more about the community helping focus/apportion the attention of its members. In a screen-based entertainment world, publishers will struggle to assert their right to a user’s time against competition that includes every media option out there, from games to TV to social networks. This is different from the former paradigm because all media usage is rapidly migrating to tablet- and applications-based consumption. And this includes television. With both major book retailers actively engaged in the tablet wars, it seems inevitable that tablets will be the predominant delivery mechanism for publishers’ content, including trade and education content. So, if our content is delivered on these devices, how do we establish and hold the user’s attention in an environment where the user can skip from media to media with almost no friction whatsoever? The answer to this question is partially reflected in last year’s post regarding community and curation. The most significant challenge publishers will face is getting their content shared and linked to, and powerful social network marketing programs will be at the center of this effort. This doesn’t only apply to trade content--‘communities’ organized around ‘influencers’ such as academics/professors, institutions, specific courses, etc. will also drive the sharing and linking of educational publisher content. For example, an individual interested in business entrepreneurship might ‘friend’ the Harvard class ‘Entrepreneurship 101’ and use the reading list to guide his or her personal reading. Another key aspect of the quest for attention revolves around the metadata and the supplemental content publishers produce for all their content. Most of this remains either dis- or un-organized. A lack of depth and accuracy of metadata is still a deficiency shared by most publishers, even as the need for more metadata expands. On the whole, publishers are probably getting further behind. The thing that will help publishers win a larger share of attention will be multiple ‘entry points’ that enable the user to interact with their content and allow influencers to share and link to it. Not only do metadata files need to be robust and detailed, but users need to be able to easily find references, indexes, TOCs, links and reviews, as well as alternate views of the content (audio, video, even perspectives). Not only do these various elements provide ‘hooks’ which users can grab in multiple ways, they will also serve to build loyalty and authority for the content itself. And this could ‘index’ the content so that it scores high-ranking positions when consumers seek the content you are selling. Thus, the entire process feeds on itself.
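[Editor's illustration] To make the ‘entry points’ idea concrete, here is a minimal sketch of what a richer metadata record might carry. This is my own construction, not anything from the original post: every field name and value is hypothetical, loosely in the spirit of ONIX-style book metadata.

    # Illustrative only: a hypothetical "rich metadata" record for one title,
    # showing the kinds of entry points (TOC, reviews, links, alternate
    # formats) the post argues publishers should expose. Names are invented.
    book_record = {
        "isbn": "978-0-00-000000-0",          # placeholder identifier
        "title": "Example Title",
        "contributors": [{"role": "author", "name": "Jane Doe"}],
        "subjects": ["publishing", "media strategy"],  # discovery hooks
        "toc": ["Chapter 1: ...", "Chapter 2: ..."],   # browsable entry point
        "reviews": [{"source": "Example Review", "url": "https://example.com"}],
        "supplements": {                      # alternate views of the content
            "audio_sample": "https://example.com/sample.mp3",
            "video_trailer": "https://example.com/trailer.mp4",
        },
        "share_links": ["https://example.com/excerpt/ch1"],  # linkable excerpts
    }

    def entry_points(record):
        """Count the distinct ways a user or community can reach this title."""
        hooks = len(record.get("toc", [])) + len(record.get("reviews", []))
        hooks += len(record.get("supplements", {})) + len(record.get("share_links", []))
        return hooks

    print(entry_points(book_record))  # -> 6 for the sample record above

The design point is simply that each additional hook (a browsable TOC, a linkable excerpt, an audio sample) is one more path by which a community or an influencer can reach and share the title, which is what feeds the ranking loop described above.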
  • 10. Searching for Attention will represent a significant challenge for all content owners, but particularly publishers, as content amalgamates via the tablet platform. Not for nothing, I think I’d rather go on that journey with B&N and Amazon versus Apple or Google, because at least they are booksellers. Whether that’s enough remains to be seen. Here are some additional trends to watch for over the next year or two:
- The MGH deal aside, there’s a good chance we will see additional movement in the ownership of segments of the education business. Cengage will have little difficulty with their refinancing (which doesn’t mean there won’t be any pain), but educational units on the periphery (medical, legal, etc.) may witness more consolidation in the coming year.
- With the ‘settling’ of eBook content and processes within many publishing houses, we’ll begin to see more experimentation from publishers, especially with expanded definitions of traditional book content. We’ll see eBook content where the ‘book’ part is a component of something that looks more like an issue of an online magazine – obviously, an ‘issue’ where the ‘book’ part is the focus, but where ancillary material (in the magazine sense, the supporting articles) lends deeper meaning and context and even leads to obvious tie-ins and sequels. Essentially, I think we will see the beginnings of the renaissance of the ‘book’ that everyone has been moaning about.
- In an area that I am focused on, we will begin to see a rapid movement towards atomizing educational content. Apple may well announce an educational publishing version of iTunes where content such as chapters, cases and articles is sold in parts, as songs are sold versus albums. Watch for a painful realization about pricing. The à la carte approach to content purchasing is something educators and institutions are looking for, and initiatives similar to the iTunes model (and AcademicPub) are being welcomed because they empower people to make better choices.
- In sport, it will be a tight-run thing at the top of the Premiership this year, but I still believe Manchester United will beat out Manchester City for the title. England will come second in the medals table at the London Olympics.
  • 11. Predictions 2011: The Growth of Intimacy

Things might have been worse: As 2009 came to a close, there wasn’t a lot of optimism about 2010; yet, as the year unfolded, things were neither worse nor better than they had been. And now, there is even some excitement, spurred on by the launch of the iPad and the rapid growth of eBook sales. Certainly any analyst, technology company or consultant publicizing his or her [proprietary] forecast of eBook and eReader sales for the next decade was almost guaranteed to gain some attention, especially as each successive forecast sought to outdo the prior reports. Encouraged by the boosterism, many pundits think this is either the end of book publishers or a new dawn. I don’t think it’s either, but the transition from print to electronic could mimic the transition music made from vinyl to disc, which stuffed record company profits in the short term (only to entirely undercut the industry in the long run). It is too early to tell how book publishing will survive this transition, but it is entirely possible that we will look back on these ‘transition’ years as ones in which publishers missed an opportunity to connect directly with their readers, having limited their ‘opportunity’ merely to replicating the book experience on the screen. Change and progress are glacial in the book industry while, all around it, media markets and products advance at breakneck pace. Evidence of massive and rapid change surrounds the publishing industry: This time last year, tablet computers were utilitarian business equipment; now, with the iPad, they are status symbols and, for millions, a gateway page to life online. In 2009, few televisions were web-enabled, but this year it is a standard feature, opening up the web for living-room leisure activity on a big screen. Content produced by publishers is now showcased in these channels and on these devices, yet book publishers continue to be bit players in the evolution of eContent, and indications are this is unlikely to change appreciably in the future. Some of the macro changes I mentioned last year continue to roll out into the mainstream, such as the migration toward subscription models for education content and trade reference, collaborative content and data sharing in academic publishing and an adoption of the rent-vs-buy model for content. And while none overtook the business in any wholesale manner, all continued to grow in significance during 2010, as they will in 2011.

The Growth of Intimacy

In 1961, Newton Minow (newly installed as chairman of the Federal Communications Commission) made a famous speech to the National Association of Broadcasters in which he described television
  • 12. programming as a ‘vast wasteland’, and he suggested those in attendance watch a day of television where, “You will see a procession of game shows, violence, audience-participation shows, formula comedies about totally unbelievable families, blood and thunder, mayhem, violence, sadism, murder, western badmen, western good men, private eyes, gangsters, more violence and cartoons. And, endlessly, commercials--many screaming, cajoling and offending. And most of all, boredom. True, you will see a few things you will enjoy. But they will be very, very few.” The web may be all of this in spades but, increasingly, the web user is demanding guidance and intermediaries who will then aid in their selection of appropriate and meaningful content. As I’ve discussed before, curation will become a marketable skill set, and audience building around specific interests and specialties will be increasingly valued by content users. Just as publishers may have purchased publishing companies with defined title lists in years past, they may now consider purchasing “communities of interest” (and their associated apps and Facebook pages, etc.) to which they can market content/products. These communities may become the ‘imprints’ of tomorrow, with defined – even built-in – product development, marketing and selling channels. The growth of intimacy assumes that users will seek closer relationships with their core community of friends, workers or communities of interest in order to make decisions about the content they access, the products they use and the entertainment decisions they make. Book publishers, retailers and authors will need to understand how to actively participate in these communities without ‘marketing’ or ‘selling’ to them. Facebook is obviously the largest social community but, within Facebook, there are a myriad of smaller ‘communities’ and, within these communities, the web becomes highly personal. The relationships among the participants become ‘intimate’ in the sense that the participants share knowledge, information, even personal details that in a traditional selling or marketing environment would never be breached by the vendor. The dynamic of selling becomes vastly different in this context, and publishers must find a way to understand these new communities, the influencers that dictate behavior and the motivations that contribute to selling products (and potentially services). This is the next level of social networking: It isn’t enough to have a Facebook page or a Twitter account. Authors and publishers need to engage deeply where it matters in order to build awareness, build their brand (if necessary) and establish selling channels. In the case of Facebook, the company already has a vast amount of book-related information broadly collected from their community, and undoubtedly the sales volume that results from the discussions on Facebook is large. Most importantly for vendors, the ‘conversion’ rate from an ‘intimate’ recommendation to purchase is likely to be far higher than from any other source or marketing activity. Finding and understanding the applicable nexus within these communities that delivers the widest possible ‘conversion’ rate will be critical if publishers are to participate in the growth of intimacy.
  • 13. While publishers may think the ‘growth of intimacy’ will have more relevance to trade publishing, this may not be the case. As LexisNexis and some other professional publishers have proven, a social strategy that encourages users to act as curators for other users has significant value in building and supporting the publisher value proposition and brand. I see this evolving in education as publishers encourage academics and students to participate in social networks focused on specific topics and content. But a word of caution: Building a social network simply to facilitate the sale of your content or textbooks will never work. A critical aspect of Facebook is that it is vendor-agnostic and thus provides the latitude for the community to come up with the right solution or product. With reference to Minow, it won’t be the ‘broadcasters’ that ‘[could] do better’ as he suggested; it will be the consumer that will find a way to get to the content they value using their web of ‘intimate’ relationships. Curators (or docents) will become critical for users in this discovery process and, if publishers aren’t connected to this network in a meaningful way, they will be consigned to the vast wasteland of skateboarding dogs and porn. The growth of intimacy will be a recurring theme for all content producers over the coming years, and addressing the various aspects of this trend may result in important changes in the way publishers develop and market their products. Here are some additional trends to watch for over the next 12-24 months:
- Prices for dedicated eReaders will fall to $30-50, and eReaders will increasingly be used in “free-with-purchase” subscription promotions with newspaper and magazine subscriptions or combinations thereof.
- That newspapers will be moving toward a paid subscriber model is rapidly becoming old news (with the NYTimes expected to launch their service in January); however, to raise their value proposition, newspapers will be more interested in limited content syndication partnerships that lower the number of outlets with access to specific content, thus raising the exclusivity of the content and the value proposition for consumers. Rather than the same story appearing in hundreds of outlets, consumers will be looking for exclusive insights, analysis and commentary that can’t be found elsewhere. (Again, there’s a ‘curation’ theme going on here.)
- Tentatively, rankings of “best social sites” will attempt to do the same thing that bestseller lists do in reflecting interest and popularity. The parameters will be unclear (or experimental) initially, but this data – organized as a ranking – will become a valid measurement of commercial success and reader interests in the same way that bestseller lists are today.
- Print will increasingly be diminished by publishers – not directly because of electronic versions, but by their dismissive attitude to the quality of paper and bindings. Shoddy quality will serve to undermine value as paper rapidly yellows, bindings split and pages fall out.
- The popularity of eBooks and eContent will also chip away at the Byzantine (or British Empire-like) organization of many international publishing companies, which effectively splits rights by country and region rather than by language. We will start to see international publishing companies completely rethink the ‘local office’ formula, wherein
  • 14. different editions with different pricing, layouts, covers, release dates, etc. are produced by local staff. Instead, publishers will begin to dismantle these operations and replace them with ‘centers of excellence’, where specific offices prove their expertise in specific functional or content areas and provide these services to the rest of the worldwide publishing operations. Direct customer-focused staff will remain, but the duplication of functions – driven primarily by the content normalization that eContent imposes – will result in the elimination of functions across the global enterprise. Publishing companies will become stronger as a result, since they will be able to aggregate expertise in specific areas and distribute it broadly across their operations.
- International ownership of publishing companies is par for the course, but we haven’t seen entities from China, India or the Arab world make a material impact on English-language publishing. That will change as these markets mature and local investors determine they needn’t be simply buyers of English-language materials; they could own the producers of this content as well. Most of these markets are still untapped: the market for English-language content continues to grow, and the supply of content locally produced and distributed internationally is still in its infancy. There are over 5 million college graduates in China each year versus fewer than 2 million in the US. This represents a vast market opportunity for all types of content, and it is more than possible that a Chinese investor will buy a large English-language publisher to address both supply and demand in this market. The same scenario could be true of the Indian and Arab markets. Watch for a big news takeover during 2011.
Lastly, in sports: Last year I predicted that Manchester United would win the Premier League title over Arsenal but, in fact, United lost by a point to Chelsea. The point was effectively lost in a late-season loss to Chelsea but, this year, Chelsea look well out of it. So again I predict United will win the title over Arsenal. I also predicted that England would win the Ashes series in Melbourne, which they did last Tuesday. Hooray!
  • 15. Predictions 2010: Cloudy With A Chance of Alarm

As we greeted the New Year in 2009, we knew we were in for it economically and, as I suggested in my prediction post this time last year, one of the most obvious assumptions was that things would get worse before they got better. Contrary to expectations, publishing may have come out a winner in spite of the steady litany of bad news on the magazine, newspaper and television fronts that percolated all year. While recognizing the economic challenges in store for us back in 2009, I also suggested a resurrection of sorts could be had as businesses began to accommodate the fundamental changes taking place in the industry as they executed their business plans. Sadly, there have been few bright spots in media during 2009 and, after having taken the pulse of views on the near-term future in publishing by speaking to a number of senior publishing executives, my belief is we will not see any appreciable improvements during 2010. While some of their collective views can be attributed to ‘hedging,’ external trends support the lack of optimism, whether they be reductions in education funding and library budgets, the increasing reliance on “blockbuster” authors or pricing issues. Many of the macro trends that I have noted in years past remain prevalent and in some cases have accelerated. For example:
- Educational publishers appear to be increasing – rather than decreasing – their investment in electronic media and, more importantly, are beginning to think of their electronic products as distinctly different from their print precursors. In particular, educational publishers have started to talk meaningfully about “databases” and “subscriptions.”
- Newspapers – particularly NewsCorp – have been especially active in attempting to build paid content models which support the separation of ad-based and subscription-based models. Newspapers aside, even trade publishers – notably Disney – are beginning to experiment in interesting ways with paid subscription models.
On the other hand, my expectations for further compacting of the publisher supply chain and increasing collaboration across publishing segments appear to have run aground. Interestingly, an executive I recently spoke to noted that the separation of publishing units that historically sat together – education with trade with information, for example – has negatively impacted publishing companies’ ability to learn and benefit from the experience and market testing of their sister companies. Possibly a decrease in access to ‘institutional knowledge’ has, in general, contributed to some media companies’ hesitancy to experiment. Prognostication being the point of this post, there are some newer macro changes I see that
  • 16. will define the publishing and media space more and more over the next three to five years, and it will be interesting to see how these develop.
- Firstly, 2009 was the ‘year of the eBook’, as new devices seemed to launch each week. But the eBook, as we understand it today, only has three more years to run. By the end of 2010, we will be focused on the ‘cloud’ as the implications of the Google Editions product become clearer. This accelerated migration away from a physical good – even with an eBook, the title was ‘physically’ downloaded – will challenge our notion of ‘ownership’, rewrite business rules and provide the first true ‘strata’ for communities (or social networks) to develop around content. The Apple iSlab (iSlate, iTablet, iEtc.) will become a key driver in this development as the company becomes the first consumer electronics maker to apply its design expertise to multi-content delivery. (I don’t count Sony because they got it completely wrong.)
- A closely related (but somewhat tangential) development will be the realization by publishers that the library market could become a threat to their business models as mobile and remote access is aggressively marketed by companies such as Serials Solutions and EBSCO. Currently, these products are not specifically related to trade and academic titles; however, the implications for all published product will become clearer as patrons’ ease of access to ‘free’ content grows and as the resolution services improve. Remote access to information products by library patrons is obviously not new but, applied to mobile computing, it will change many things about the library model. This trend, coupled with the ‘cloud’ concept above, will require an industry-wide rethink of the library business model.
- There are hints that the silo-ing of content that has been endemic to information and education for many years could become a trend in trade as well. Examples remain sparse, though Harlequin and Tor are routinely cited as exemplars of this trend.
- Subject-specific concentrations of content in trade will become a more broadly viable model, but simply concentrating content is not enough. Trade publishers will begin to license or commission ancillary content that adds a transactional element to their offering (not exclusively in a monetary sense). In effect, this additional content will provide a reason for consumers to return periodically to the site for free reference, news or dictionary content. Thus, this content will complement the subject-specific content that publishers generate themselves. As each segment develops, the ancillary content will also become core content to the publisher and may eventually be produced by them (although, initially, the content may be licensed). Over time, other services will be built within each subject silo, and this maturity will replicate the product development seen in information publishing over the past ten years as those businesses established subject-specific franchises around topics such as business news, tax and legal information.
Aside from these macro trends that will grow in importance over the next few years, maintaining the status quo will still be the operative task during 2010. Here are some more specific predictions for 2010:
- Certain segments (financial, legal and tax information and education, for example) continue to be challenged, and any business that relies on the library market will face a very difficult
  • 17. time. Funding will be worse in the coming year (fiscal 2011), making retention, renewals and price increases problematic. By the end of this year, we could see some consolidation in the information media space.
- We will see the return of an old model of collaboration between magazines and traditional publishers as magazines look for ready-made content. Witness the return of the serial and the short story to the pages of periodicals as their publishers look for low-cost content for their plodding (but suddenly more aggressive) migration to electronic delivery. In turn, electronic magazines will offer publishers a more effective, targeted and supportive mode of marketing than publishers have seen in years.
- 2010 will be a year of warfare: Publishers against retailers, wholesalers against retailers, retailers against retailers, publishers against consumers. It may be nasty, brutish and short, but will any of them truly understand the stakes? (See macro trend number one.)
- Finally, we will see consolidation of at least two major trade houses. This is likely to precipitate another combination by year end. An outsider company (not a current trade publisher) may make a major move into the trade market.
- Last year, I predicted that out-of-work journalists would become ‘content producers’, and we have seen that develop as companies like Demand Media and Associated Content build market share. I see this trend accelerating during 2010. As magazines migrate to platform models, they become 24/7 publishing operations with a significantly increased demand for content, far beyond their capabilities. Where they will succeed is in curating content for their specific audiences; however, much of this content will be produced for them, rather than by them in the traditional manner. In effect, magazines will outsource editorial.
- And in sports, Manchester United will retain their Premier League title, winning on goal difference over Arsenal; Barcelona will win the Champions League; and England will win the deciding fourth Ashes test in Melbourne in December.

Predictions 2009: Death and Resurrection

Like the guy who, asked how he went bankrupt, answers ‘slowly and then quickly’, the escalating economic downturn in 2008 had really been brewing since the end of 2007, but we only fell off the cliff in fall 2008. I still believe (as I noted in January 2008) that 2007 will be viewed historically as a watershed year for media: the economic decline will further accelerate the macro trends the industry witnessed as 2007 evolved. These trends include:
- The rapid commitment to electronic delivery of content in both education and trade
  • 18. - Separation of ad-based and subscription-based models in both information and professional publishing
- Forced concentration in the traditional publishing supply chain, countered by (nascent) new channels including direct-to-consumer
- Further blurring of the edges across media segments: More publishers will offer all content – not just their own – wider services and applications, and broader linking and partnerships designed to draw customers
The economic difficulties today are stark compared to the boisterous 2007, when the price for publishing assets kept going up and many big deals were made. During 2008, many high-profile divestitures were either abandoned (Reed Business) or ignominiously completed (TV Guide magazine sold for $1). 2009 is likely to see both the unraveling of some of the deals done in 2007 and some opportunistic buying but, more generally, the deferment of many companies' corporate development strategies. Naturally, 2009 will see new companies emerge, and there are numerous precedents for companies launched in economically challenging markets ultimately becoming very successful. Perhaps challenging the status quo is easier when the status quo is concentrating on just staying "status".

Predictions 2009
- An easy one: It gets much worse before it gets better. When times were good, an oversupply of market options – particularly in retail – hid a myriad of structural problems. Right-sizing in media retailing and distribution will result in one major physical book retailer, one wholesaler and one online retailer. Media will be a sideshow compared with some other segments, particularly clothing and department stores.
- Another easy one: Several major city newspapers will change hands for less than the debt they carry. Local and hyper-local models will expand and further encroach on the market for traditional big-city news. Coupled with linking, content licensing and arrangements with classified providers like Craigslist and eBay, there will be a rapid expansion of the (hyper-)local online news provider market.
- Out-of-work journalists will see increasing opportunities to become ‘content producers’ as more and more companies seek to enrich their sites with professional content that appeals to their target market. The typical website ‘experience’ becomes more expansive and deeper than a company catalog or press-release site. Journalism as a function becomes more widely dispersed across many business segments.
- The NYTimes will either close the Boston Globe and ‘rebrand’ a NYTimes version for the Boston market or sell it for $1 saddled with as much debt as they can get away with. Sunday's paper will now come on Saturday: The UK market successfully went down this road and the US will (belatedly) follow.
- In media M&A, look for companies that have lots of cash to act opportunistically: NewsCorp, Holtzbrinck, Bauer, Bonnier, Bertelsmann, Axel Springer, Lagardère, BBC (Commercial).
  • 19. - Gathering of ‘equals’: Media owners unable to sell assets may seek to partner with another media company in the same boat and combine assets to form a new company. One combination that could be interesting is Nielsen Business Media with Reed Business Information (pure speculation on my part). Others in this space looking for options might include PRIMEDIA, McGraw-Hill and Penton.
- The Obama administration will make wholesale revisions to education policy, which will pain education publishers who have made particular investments in assessment companies. Long term, the assessment market will be robust; however, with explicit indications that student performance is no better for the ‘No Child Left Behind’ programs, fundamental changes will be instituted, including a more federalist direction. Ready your lobbying dollars.
- Evidence that the edges of media segments continue to overlap: Google will bid for the 2014 and 2016 Olympic broadcast rights.
- Social networking and ‘community’ building will become the CEO’s pet project as a ‘cheap’ alternative to decreased ad and marketing budgets, with predictably negative results. Senior-level misunderstandings of what constitutes effective social programs will result in efforts being treated casually. Piecemeal approaches will predominate, and there will be a continued lack of cohesion between marketing and social networking. Programs will backfire as customers witness the cynicism. Effective social networking is not just for Christmas.
- Professional and information publishing will effectively leverage LinkedIn-like networks (possibly using their platform) to extend social networking to their closed networks. Lexis/Martindale is creating a private legal social network platform.
- Too much LinkedIn with my Facebook: More people will do what I did in 2008 and build barriers around their online social networks and selectively cull ‘friends’ or ‘connections’. Sheer numbers have no logic; friendship is earned and legitimate business connections are money. Quality, not quantity, will reign. Don't take it personally.

Predictions 2008: Return to the Scene of the Crime

This is the time of year when prognosticators attempt to handicap the future while, at the same time, trying to explain why they were so horribly wrong with respect to the prior year. I am no different. 2007 saw some stunning developments in the publishing and media space--particularly in mergers & acquisitions--and, broadly speaking, I see several trends emerging. First, I expect more change driven by M&A activity in 2008. Second, as more companies bound by traditional publishing models migrate online and join those already there, the application of technology in our industry will accelerate. Third, we will see a ‘squeezing’ of the value chain (from author to publisher to consumer), driven by publishers looking to build community models around content and authors. Associated media markets, such as broadcasting, newspapers and games, also influence our industry in interesting ways. We are starting to see our traditional segment descriptors –
  • 20. publishing, newspaper, broadcasting, information – become meaningless as content becomes ubiquitous and network access (or distribution) becomes universal. Publishers and information providers must expand their capabilities beyond traditional market segments if they want to remain competitive. On a related note, there is an escalating ‘compacting of media’ taking place, where the interests of all media players are converging on issues like rights, piracy, market concentration and access to markets or even consumer attention. (Text)book content, broadcast TV programs, movies, music, games and news can all be delivered via Xbox or iPod: In this environment, where does the power lie? Who “owns” the customer? And how are content-selection decisions made? A publisher can no longer be one-dimensional and hope to survive, which is why companies like Lexis, West and Pearson are building delivery ‘platforms’ where (traditional) content is only a part of the offering. In the not-too-distant future, we may look back on 2007 as a significant transition year for the media business. In education, a number of large companies were taken private and will reemerge five years from now as fundamentally different, platform-based companies. The Hollywood writers' strike will redefine how content producers are compensated as content distribution expands to the Internet. Journal publishers will trace the history of their ad-based revenue models back to Reed Elsevier's experiment with oncology online. And in the news & information segment, NewsCorp's purchase of Dow Jones and Thomson's acquisition of Reuters will radically change the model of information delivery. Even the self-publishing market showed a level of maturity with the consolidation of AuthorHouse and iUniverse.com. Outside our immediate universe (but no less relevant as advertising becomes a more important revenue stream) is the purchase by Google of ad server DoubleClick. Here are my predictions for 2008:

Education:
- Recognizing the potential for aggressive competition, McGraw-Hill will reorganize its business much as Thomson has done. MGH education could be sold to private equity. Cengage will spend aggressively to round out its content and assessment products with course management and school resource planning tools.

Information:
- We will see at least one mega-deal involving, perhaps, D&B, the information assets of McGraw (S&P) or Bloomberg. Following closely on the heels of past investments in tax, legal and financial information, the insurance segment will become a focus of aggressive new investment for information providers.

Trade:
- It also seems inevitable that there will be some additional consolidation in trade, and this could result in a higher profile for Hachette, Bloomsbury and/or Macmillan. One publisher
may get out of the self-publishing market, but another will jump in with both feet. A company like Lulu.com or the AuthorHouse/iUniverse combination could be targeted by a trade publisher seeking to expand its market and build an author community. More trade publishers will eliminate imprints in favor of theme-specific content.

Retail:
• The ongoing rumors of a Barnes & Noble/Borders combination will continue until one of these retailers purchases a third. This new combination will not materially change the book retail market, but the combination of the two companies will result in a financially stronger retailer.

Other:
• Broadcasters will have a strong advertising year due to the political calendar and the Olympics. (A three-party race for President will be an added boost.)
• Facebook and LinkedIn will join forces, but we will also see the development of more 'by invitation only' social networks. (Potentially, these could be administered by the current incumbents, but they are more likely to be new entrants.)
• As many as ten brand-name magazines will cease publication or reduce their frequency due to declines in their advertising base and the rise of specialty web sites.
• News sites (branded or not) will ramp up efforts to harness niche bloggers and online publishers (either by acquisition or association) in an effort to boost traffic, broaden audience and develop more relevant op/ed and reportage. Incumbent news providers are realizing that acquiring an established online presence with a built-in audience represents a path to growth, and they will begin to employ this tactic during 2008.

As always, it looks like the coming year will be an exciting one in media. At least according to me.

Predictions 2007: Taking it in Leaps and Bounds

There are any number of people offering media predictions for 2007, and it is a fun exercise which can also be a useful tool for strategic planning. Consultants use a tactic called 'scenario planning' to generate discussion and thought focused on issues impacting a business. In sessions I have managed, I have placed up to ten 'scenarios' or predictions on the walls of a conference room, where each member of the group is given instructions to vote on the likelihood of each scenario without speaking to the other participants. The scenarios reflect a combination of the existing status quo and an extrapolation or exaggeration of anticipated
market change. Each scenario should be plausible and represent a challenging future environment in order to generate legitimate discussion. A red dot placed on a scenario means it will never happen; green means the participant agrees it will happen. The scenarios can be anything the facilitator decides could be relevant to the company, but they should be developed in consultation with someone at the company. (The scenarios are not shared before the meeting.) Additionally, they can be absolute ('this will happen') or more general ('over the next five years…'). As the group completes the 'voting', the facilitator has the group examine each scenario in detail and encourages the group to think about the implications of each scenario along a few dimensions: technology, human resources, competitors, etc.

The outcome of this exercise is a better understanding of the company's challenges and of the company's possible weaknesses (or strengths) relative to the scenarios the group thinks most likely. A document should be prepared from this seminar session, and it can become a material part of the development of a strategic plan. Even discussion of the scenarios the group does not believe are likely can be useful in challenging the executives to closely examine their assumptions. This is an exceptional exercise in encouraging senior management to examine, understand and interpret what is going on in the wider world as a fundamental requirement of their daily responsibilities. It can be the case that management develops a bunker mentality and is subsequently blindsided by events they should have anticipated.

My predictions below are not fully thought-out scenarios for a number of reasons – for one thing, they are not specific – but nevertheless they are fun to think about. As an editorial comment, I emphasize that I have no inside information on the likelihood of any of these.

Predictions for 2007:
• NYTimes will eliminate the Saturday print edition of the newspaper. It will also create local web news sites for every major metropolitan city in the US and will stream video from its owned broadcast television stations; classified advertising will be free. The company will also launch a citizens' paper: The New World Times. NYT will create a suite of news-gathering tools – web services – and make available to 'citizen journalists' the content and research traditionally available only to professional journalists.
• YouTube TV: Just like America's Funniest Home Videos, we will see a TV show based on original YouTube video content. It will win its night by 10% and will be turned into a weekly Saturday night talent show.
• Using cell phone cameras as barcode readers will lead to an explosion of in-context/in-situ mobile advertising – followed in 2008 by RFID-based in-store advertising (with software for cell phones). Mobile advertising will surpass 5% of all ad dollars spent by agencies by the end of 2007. (The web is currently at 20%.)
• Google launches a product placement advertising program. Based on similar keyword algorithms, advertisers will bid for placement in movies, television, other broadcast, sports, etc. prior to production and/or live telecast. The program will represent 10% of all fall 2007 upfront spend.
• The FCC will hold hearings on standards related to product placement advertising in late 2007 as the market explodes.
• Apple will think about buying Disney and Electronic Arts but will buy TiVo and SlingBox. Apple will also launch a Beatles version of the iPod including the entire Beatles catalog plus video/movies. The Beatles iPod will retain the traditional Apple artwork (green apple on the front, cut-away apple on the back).
• Yahoo will buy EA and, within six months, launch a social network gaming site based on EA content.
• No one will buy Netflix.
• Social media in education: Several major US colleges will teach various social science courses entirely in simulation. The courses will not be taught in traditional lecture form but entirely within the software simulation.
• News Corp will buy Dow Jones and the Financial Times and sell HarperCollins, and Hachette will buy HarperCollins.
• eBay will buy Linden Labs (Second Life). Within six months they will integrate eBay selling tools into Second Life, enabling virtual storefronts, sales assistance and virtual trading. They will launch the program with major retailers and create the first Second Life mega-mall in cooperation with Westfield. eBay will also launch a Second Life media placement agency to handle all media inventory on Second Life.
• T-Mobile buys Skype from eBay.
• Linden dollars will be included in the Fed's M1 currency calculation.
• Neil Young's Living with War wins the Grammy for Best Rock Album.

Corporate Data Strategy and The Chief Data Officer – Sept. 8th, 2011 (Part 1 of 4)

Are you managing your data as a corporate asset? Is data – customer, product, user/transaction – even acknowledged by senior management? Responsibility for data within an organization reflects its importance; so, who manages your data?

Few companies recognize the tangible value of the data their organizations produce. Some data, such as product metadata, are seen as problematic necessities that generally support the sale of the company's products; but management of much of the other data (such as information generated as a customer passes through the operations of the business) is often ad hoc and creates only operational headaches rather than usable business intelligence. Yet a few data-aware companies are starting to understand the value of the data their businesses generate and are creating specific business strategies to manage their internal data.
Establishing an environment in which a corporate data strategy can flourish is not an inconsequential task. It requires strong, active senior-level sponsorship, a financial commitment and the adoption of change-management principles to rethink how business operations manage and control internal data. Without CEO-level support, a uniform data-strategy program will never take off, because inertia, internal politics and/or self-interest will conspire to undermine any effort.

Which raises a question: why adopt a corporate data strategy program? In simple terms, managing proprietary data more effectively can help a company grow revenue, reduce expenses and improve operational activities (such as customer support). In years past, company data may have been meaningless insofar as businesses did not or could not collect business information in an organized or coordinated manner. Corporate data warehouses, data stores and similar infrastructure improvements are now commonplace and, coupled with access to much more transaction information (from web traffic to consumer purchase data), these technological improvements have created environments where data benefits become tangible. In data-aware businesses, employees know where to look for the right data, are able to source and search it effectively and are often compensated for managing it well. Recognizing the potential value in data represents a critical first step in establishing a data strategy, and an increasing number of companies are building on this to create a corporate data strategy function.

Businesses embarking on a data-asset program will only do so successfully if the CEO assigns responsibility and accountability to a Corporate Data Officer. This position is a new management role and not additive to an existing manager's responsibilities (such as the head of marketing or information technology). To be successful, this position carries with it the responsibility for organizing, aggregating and managing the organization's corporate data to improve communications with supply chain partners, customers and internal data users. Impediments to implementing a corporate data strategy might include internal politics, inertia and a lack of commitment, all of which must be overcome by unequivocal support from the CEO. Business fundamentals should drive the initiative so that its expected benefits are captured explicitly. Those metrics might include revenue goals, expense savings, return on investment and other, narrower measures. In addition, operating procedures that define data policies and responsibilities should be established early in the project so that corporate 'behavior' can be articulated without the chance for mis- and/or self-interpretation.

Formulating a three-year strategic plan in support of this initiative should be considered a basic requirement; it will establish clear objectives and goals. In addition, managing expectations for what is likely to be a complex initiative will be vital. Planning and then delivering will enable the program to build on iterative successes. Included in this plan will be a cohesive communication program to ensure the organization is routinely made aware of objectives, timing and achievements. In general terms, there are likely to be four significant elements to this plan: (1) the identification and description of the existing data sources within an organization; (2) the
development of data models supporting both the individual businesses and the corporate entity; (3) the sourcing of the technology and tools needed to enact the program to best effect; and then, finally, (4) a progressive plan to consolidate data and responsibility into a single entity. Around this effort would also be the implementation of policies and procedures to govern how each stakeholder in the process interacts with the others.

While this effort may appear to have more relevance for very large companies, all companies should be able to generate value from the data their businesses produce. At larger companies the problems will be more complex and challenging but, at smaller companies, the opportunities may be more immediate and the implementation challenges more manageable. Importantly, as more of our business relationships assume a data component, data becomes integral to the way business itself is conducted. Big or small, establishing a data strategy with CEO-level sponsorship should become an important element of corporate strategy.

Setting the Data Strategy Agenda – September 8th, 2011 (Part 2 of 4)

Once a company has recognized the tangible value of its data, the CEO will assign the role to a direct report, making this initiative his or her sole responsibility. As noted in my first post, managing data as an asset is so important that it requires direct senior management responsibility and should not be delegated to the head of marketing & sales or IT (in either case, both bias and conflict with their 'real' responsibilities will prevent the program's benefits accruing to the entire business). In addition, the ability to break down silos, confront the assumed internal politics and impose a solution will be greatly diminished if the executive in charge already has other functional responsibilities. In my view, if this is the approach of the CEO, the initiative will fail. By way of example, the Chief Data Officer at Dun & Bradstreet sits at the highest levels of the company, in recognition of the strategic importance of data to that organization.

The first task of the Chief Data Officer (CDO) and their team would be to conduct a business-wide data audit to determine where data is being collected, both through internal efforts and externally by customers, vendors and other partners. This is an important first step and should be as expansive and specific as possible. Things to look for include not only the data sources themselves but their quality (perceived or explicit), frequency of update or creation, source/owner, format, whether the data conforms to any particular standard and whether the data is ever validated against that standard. Reviewing standards might also include not just the data's technical framework but also whether it is subject to normalization – to a specific categorization method such as BISAC, for example. I once undertook an effort like this at a large educational publisher; the process took about six weeks and required site visits, interviews and reviews of policies and procedures.

During this exercise, it is more than likely that some amount of 'visioning' will take place, and this feedback should be captured during the analysis; however, at this early stage, the only really important aspect of this effort would be where someone might say, 'we currently make use of this particular data source, but we would really like to use this other one and can't because our systems can't handle it'.
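None of this requires exotic tooling to get started. As a purely illustrative sketch (Python, with invented field names and entries), the audit can be captured as structured records rather than prose notes, so the baseline analysis can be sorted, filtered and revisited later:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataSourceAudit:
    """One row of the business-wide data audit (all fields illustrative)."""
    name: str                 # e.g. "title metadata feed"
    owner: str                # business unit or person responsible
    origin: str               # "internal", "customer", "vendor", "partner"
    update_frequency: str     # "daily", "monthly", "ad hoc" ...
    data_format: str          # "XML feed", "spreadsheet", "database table" ...
    standard: Optional[str]   # e.g. "BISAC" for subject codes, or None
    validated: bool           # is conformance to the standard ever checked?
    quality_notes: str = ""   # perceived or measured quality issues
    wish_list: str = ""       # 'visioning' feedback captured during interviews

audit = [
    DataSourceAudit(
        name="product metadata", owner="editorial operations", origin="internal",
        update_frequency="daily", data_format="XML feed", standard="BISAC",
        validated=False, quality_notes="subject codes differ across imprints"),
    DataSourceAudit(
        name="web traffic", owner="marketing", origin="internal",
        update_frequency="continuous", data_format="log files", standard=None,
        validated=False, wish_list="tie sessions back to product identifiers"),
]

# The baseline analysis then becomes a query, not a shuffle through notes:
unvalidated = [s.name for s in audit if s.standard and not s.validated]
print("sources with a standard that is never enforced:", unvalidated)
```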
Questioning and challenging the current state should be the focus during this evaluation stage. There will be ample opportunities to focus on the 'to be' environment as the project evolves.

Out of this 'baseline' analysis will come several recommendations concerning how the data strategy initiative will roll out. The baseline analysis will define all existing data sources, the resources required to manage each effort, the tools deployed in support and perhaps an estimate of what would be required to centrally manage the data operations. In turn, this analysis will form the basis of the business operating plan for the data strategy program and, in common with any business plan, it will include a set of objectives, a suggested approach or tactical plan and the requirements or measurements for success.

It is likely the initial plan will be progressive in terms of scope and complexity, for several reasons. Firstly, this initiative is likely to face significant internal skepticism and, therefore, early observed success will be important in retaining support. Remember, the CEO has also placed his or her 'reputation' behind this initiative and, for this reason, the initial roll-out should be modest. Secondly, a full-scale approach is unlikely to produce visible benefits quickly enough to retain anyone's support, and finding and assigning an appropriate level of resources to an initiative of that breadth would be cost prohibitive. As often happens, trying to 'boil the ocean' will fail. Thirdly, starting the project on a relatively small scale will enable the project team to select a project scope they feel they can comfortably deliver on. There are likely to be varying approaches to initiating a corporate data strategy but, for my money, I would start with product metadata.

Corporate Data Program: Where to Start? – September 8th, 2011 (Part 3 of 4)

There are likely to be many approaches to initiating a corporate data strategy but, for my money, I would start with product metadata. Theoretically, this is the data you maintain the most control over (you will find out whether you actually do during your initial review). In my opinion, product metadata should be considered the basic foundation upon which to build a corporate data strategy. On this foundation, a "data value" can be assigned to all the data a business produces. Product data might also be seen as the data asset from which all other corporate data flows. For example, as businesses begin to generate more user/transaction data, this information will be far more valuable if it is tied back to robust product metadata.

Establishing your corporate data strategy around product metadata also has another advantage, in that the company is always best placed to manage the information that describes its own products. How companies describe and inter-relate information about their products is increasingly viewed as the most proactive activity a company can impose on supply-chain partners (or offer directly to consumers) in order to positively impact sales. The deeper, more descriptive and interconnected the metadata is, the better all products will perform in a sales environment that is increasingly congested. Unfortunately, only a small number of companies manage their data in a uniform manner and, typically, descriptive metadata continues to be poorly managed and locked in silos deep within organizations.
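To make the "foundation" claim less abstract, here is a deliberately minimal sketch (invented records and identifiers): a transaction stream only becomes analyzable business intelligence once it shares a key with a well-managed product record.

```python
# Invented records. The point: a shared identifier (here an ISBN) is what
# lets user/transaction data be "tied back" to robust product metadata.
product_metadata = {
    "9780000000001": {"title": "Example Title", "subjects": ["LAW032000"],
                      "format": "hardcover", "imprint": "Unit A"},
}

transactions = [
    {"isbn": "9780000000001", "channel": "web",    "units": 3},
    {"isbn": "9780000000001", "channel": "retail", "units": 1},
]

# A raw transaction is an operational headache; joined to the descriptive
# record it becomes reportable -- sales by subject, format or imprint.
for t in transactions:
    meta = product_metadata.get(t["isbn"], {})
    print(t["channel"], t["units"], meta.get("imprint"), meta.get("subjects"))
```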
Ironically, if the logic of a "corporation" as a collection of assets makes sense from a financial point of view, isn't that logic undermined by the disaggregation of information about the collective assets of the organization? For many companies, this describes their product data management 'philosophy'. As such, their data management is more reflective of their own internal structures than of a pragmatic understanding of the mechanics of their markets. Just as consumers seek products and services – often via search – in the broadest sense and not in accordance with artificial corporate hierarchies, the smart approach to product metadata management would be to centralize it at a broad or corporate level. This approach would facilitate the most effective integration of all products so that the best combination of product options can be presented to a customer.

In choosing product data as the first practical implementation of your data-strategy effort, your team will also benefit from an existing set of methodologies, policies and procedures. (This wouldn't necessarily be the case if you were to choose customer data or web traffic data, for example.) In launching this first initiative, your internal communications will want to explain to the individual business units and the business as a whole what benefits they will realize as the project becomes operational. All participants in the metadata initiative will be striving for a 'future state' where each business and constituency will be able to spend more time analyzing and leveraging better and more complete data. Thus, the future state will be materially different from the "legacy environment," where staff spend their time chasing and remediating data rather than performing the value-added tasks that support their business units. In my next post, I will spell out in more detail what benefits may accrue from this initiative but, overall, they include the application of scale economies to the management of data, the attribution of control mechanisms (such as thesauri) and a greater ability to merge and mingle metadata to improve revenues.

Strategically Managing Data for Long Term Benefit – September 8th, 2011 (Part 4 of 4)

In this series of articles, I have attempted to describe the organization and development of a corporate-wide approach to data management. In doing so, I have suggested that product metadata is a good place to initiate a phase-one approach to centralizing management of a business's data. The underlying premise of this series is that data represents a core asset of any organization and, if effectively managed, can produce incremental benefits to the organization as a whole. In my view, the efforts detailed in the prior articles will have a material impact on the business in three ways: (i) the application of scale economies to the management of data; (ii) the attribution of control mechanisms (such as thesauri); and (iii) a greater ability to merge and mingle metadata to improve revenues. Below, I suggest how some of these benefits might be actualized:

Scale:
• Centralizing metadata management allows the organization to take advantage of scale economies in factor costs, technology and expertise. Not every business unit can afford to acquire state-of-the-art technology or the market's best metadata expert, but these investments are easier to justify if their benefits can be spread across the enterprise. The financial benefits of better data management can also be most appreciated and captured at the corporate level, thereby providing greater financial justification for the adoption of technology and staffing to support the data strategy.

Collective Dictionaries ("You say tomato, I say tomato") – thesauri, ontologies and the attribution of consistent cataloging rules:
• Business units don't speak to one another nearly enough, and this is absolutely the case in the way they manage information about the products they sell. The manner and method one business unit uses to describe a product could be vastly different from that which a sister unit applies to a similar or compatible product. Take, for example, a large legal publisher publishing journals and educational materials: it would make logical and strategic sense for the metadata describing these complementary products to be produced using the same metadata language and dictionary, yet that is rarely the case. (Think of this as a 'chart of accounts' for data; a toy sketch of the idea follows this list.)
• Additionally, the manner and method by which authors and contributors are required to compile (write) their authored materials are unlikely to take account of the potential for compatibility and consistency across content types. As is readily apparent, traditional content silos are breaking down as users are accorded more power in finding specific content, and an organization will be significantly hampered if it cannot provide relevant material to customers irrespective of its format.

Inter-relationships and cross selling:
• Companies frequently leave it up to channel partners to aggregate compatible or complementary products. Naturally, at the retail level this activity happens across publishers; however, failing to provide the supply chain with an integrated metadata file that represents the best complete presentation of all the company's products contradicts the wider corporate business strategy of acquiring or developing products that 'fit' with corporate objectives. In other words, why doesn't the company's data strategy support the corporate business strategy of managing a collection of related and complementary products and services? To do this, data strategy should be a component of business strategy planning.
• The opportunity inherent in managing data in this manner is a real ability to sell more products and services to customers. Providing relevant "packages" of content that add related and complementary products to the item originally sought will generate cross- and up-sell opportunities. Why rely on the hit-or-miss approach provided by your channel partners? (Or worse, an association with a competing product.) This activity is only possible with good data yet, if done effectively, it can become a significant competitive advantage with incremental sales. Return on investment would be seen in metrics such as average revenue per customer and/or average shopping cart revenues. Importantly, selling more products to a customer who is already interested in buying from you is always easier and more profitable than finding new customers.
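To make the 'collective dictionary' and the cross-sell point concrete, here is the toy sketch promised above (all terms, codes and SKUs invented): once each unit's local vocabulary is mapped onto one shared thesaurus, related products across units surface automatically.

```python
# One shared thesaurus for the whole corporation; each unit's local terms
# map onto it. Without this, the three spellings below never match.
shared_thesaurus = {
    "tax law": "TAXATION",
    "taxation": "TAXATION",
    "fiscal law": "TAXATION",
}

catalog = [
    {"sku": "J-101", "unit": "journals",  "subject": "Taxation"},
    {"sku": "E-202", "unit": "education", "subject": "Tax Law"},
    {"sku": "E-303", "unit": "education", "subject": "Fiscal Law"},
]

def normalize(term: str) -> str:
    # Map a unit-specific term onto the corporate vocabulary; unknown
    # terms fall through so they can be reviewed and added.
    return shared_thesaurus.get(term.lower(), term.upper())

def cross_sell(sku: str) -> list:
    # Suggest products from any unit that share a normalized subject.
    subject = next(normalize(p["subject"]) for p in catalog if p["sku"] == sku)
    return [p["sku"] for p in catalog
            if p["sku"] != sku and normalize(p["subject"]) == subject]

print(cross_sell("J-101"))   # -> ['E-202', 'E-303']: siblings the journal
                             # unit would never have surfaced on its own
```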
Market, promotions and branding:
• Combining a company's products in a logical manner reinforces branding and messaging. If product information is disaggregated and disorganized, it is likely that the branding and messaging in the minds of consumers similarly lack effectiveness.

Channel Partner Relationships:
• A company may be able to exert more leverage with a channel partner if it is in a position to represent all of its products in a coordinated and managed manner than if this interaction is dispersed across the organization.
• Matching and marrying data with partners will be less problematic and more effective if data models can be aligned – time to market will be significantly reduced, and the planned benefits of the relationships should accrue in shorter time frames.
• Additionally, providing a well-managed metadata file that supplies the type of product-descriptive cohesion described above is going to benefit your channel partners as well, not only making their lives easier but also making them money by selling more of your products.

Acquisitions – integration of new data:
• Historically, the integration of companies and new products with respect to product metadata might have been a haphazard affair. At a consolidated level this task becomes much easier, with the added benefit that connections between products and the adoption of standard dictionaries may have an immediate financial impact. As noted before, it is likely that the justification for acquiring the company in the first place was its compatibility with a strategy, and it is logical that this be reflected in the manner in which the products are managed.
• It is likely we will see that companies with 'best in class' approaches to data asset management will be valued more highly than those without. Increasingly, companies will be asked about their data policies and management practices, and those which 'under-manage' their data will be seen as less attractive – for acquisitions, partnerships and other relationships.

Those are some of the general benefits of better corporate data management and of developing a corporate data strategy. The effort required to implement a data strategy program isn't inconsequential, and planning should be rational and realistic; however, as data management across a business becomes more and more 'strategic', the faster you adopt an approach, the faster your business will benefit. If you believe the tasks involved are difficult (or near impossible) now, they are only likely to get more so; therefore, it would be best to get started now.
Business Out of the Ordinary

By nature, business development is often more disappointing than validating. That brilliant deal you thought was so obvious and beneficial can fail for all kinds of reasons. Rallying internal support for a business initiative can be harder than finding the perfect external partner, and there's no accounting for how short-sighted or uninspired a company might be in assessing its position and prospects. Most of my experience is in publishing, and I am routinely surprised by the wide gulf between the small number of companies interested in exploring new ideas and the majority who can't or choose not to. Business development is all about mutual benefit: defining an outcome that balances what each party puts into the product or project against what each gets out of its success. Publishers, in particular, often seem to go out of their way to discourage advances from potential partners.

Thinking about this topic recently, I was reminded of a series of meetings I organized while at Berlitz, the language company. When I assumed income statement responsibility for one of their business units, I found I'd inherited an unexpectedly lucrative licensing deal: the first-quarter royalty check came in at more than $98,000 when we had budgeted $100,000 for the full year. In addition to being pleasantly surprised, I realized we could do even more with the relationship, now about to enter its second year.

As business partnerships went, this was one of the best. One day a couple of years before, two IT professors had talked their way into Berlitz and happened to reach the right executive – purely by chance – with the idea that they could marry their CD-ROM technology with Berlitz's self-teaching language material. The product got off to a slow start and revenues were small for the first year, but the Berlitz executive who did the deal had pressed his advantage and we were taking 20% of each sale. By the time that check came in, the guy who did the deal was gone and I (happily) was running my first real business. The second-quarter check that year was $150,000, and we finished the year over $400,000.

That first foray seemed to indicate that technology had a role to play in the delivery of educational material, despite the fact that, in those early years of 'multimedia' publishing, the CD-ROM product was simply a direct translation of the print/audio product. But more sophisticated opportunities were obvious to anyone willing to think differently and, even before our one-product CD-ROM partner was acquired by a much larger company, I'd started to explore how we could expand this relationship. Once The Learning Company (publisher of the Reader Rabbit series) acquired our partner, I began discussions to form a deeper association. I got very little support from senior management at Berlitz, many of whom believed self-teaching materials would cannibalize in-class instruction and somehow damage the Berlitz brand (an insane proposition). Over the course of the next six months, I wrote business plans and extrapolated models with the sole goal of forming an equal partnership between Berlitz and The Learning Company (it took that long to get the two groups together). The idea was simple: apply the Berlitz brand to TLC's entire language learning product line rather than to just the one product they had acquired with their purchase.
By that time, I had decided to leave Berlitz; arranging the CEO meeting was the last meaningful thing I accomplished, and I did it proudly. On my way out, I commented to our CEO that there wasn't anything else to do: we had agreed terms, which were being reviewed, but it still took another year to complete the deal. That unproductive year probably cost Berlitz over $1mm in profit. Even more dispiriting was the fact that, once the deal was done, neither Berlitz nor TLC was able to expand beyond the language titles each already produced. Had they done so, they would have created an education brand of far greater strength and depth. Successful business development requires vision and commitment and, if these aren't in play, all initiatives fail.

Good business development requires that you do your homework and have a clear idea how the project will be mutually beneficial. In my development dealings, I'm less concerned about wasting my potential partner's time than I am about wasting my own. This isn't arrogance on my part: I'm always comfortable in the knowledge that I'm proposing something of significant value to the target and, given the chance, I can present my case convincingly. Sadly, many other industry overtures don't fall into this category because no real thought has gone into the proposal, and it's often glaringly obvious. In the run-up to the London Book Fair last month, I received many proposals for business services bearing no relationship to what our company does. Bogus proposals like these cause all of us to be circumspect about any business development initiative because, all too frequently, they are irrelevant. It can be hard to overcome this barrier.

Nevertheless, despite these spurious approaches, I am frequently baffled by the sheer lack of curiosity exhibited by some of the companies I've contacted. Publishing and media are currently undergoing wrenching change: the way we now do just about everything will be very different five years from now (and maybe even next year). Some, with withered hands firmly pressing against the figurative front doors of their publishing houses, deny entry to any new force, as if they could overcome change with feigned ignorance and exasperation. I don't forget the few slammed phone receivers or nasty outbursts, but I do relish when they come back for a second look: it proves that good business development eventually wins out.

Curiously, digital conferences are big business because, supposedly, there is a collective striving for greater knowledge or understanding about the direction of our business. But if attending a digital conference is a substitute for real, proactive business development – actively looking for new ideas and partners – have we changed how we conduct daily business operations in order to be more receptive to development opportunities from potential partners outside our traditional universe? Not in my experience. (Presumably, your conventional partners are all tapped out!)

Who on your staff is responsible for fielding the occasional new business opportunity? Are they identified on your web site? Let me take a step back: is anyone identified on your web site? And, if they are, do you list their phone numbers and e-mail addresses? I've visited many web sites where the only communication option is via an anonymous "info@" e-mail
address! I understand there's resistance to hearing from all kinds of wackos, but this is the way business is conducted. And I've got news for you: for anyone with a modicum of patience, tracking down almost any phone number and email address is ridiculously easy, so not putting these on your website is only an irritant. A far better approach would be to identify the executive responsible for fielding business development proposals, describe the kind of business ideas you are interested in and define a format for proposing them. Present press releases from completed deals to give visitors to your site essential background and context for prospective partnerships. Proactively manage this process and you will not only save yourself (and your potential partner) valuable time but also open up your company to new and interesting business opportunities.

One thing I learned when I attained a certain level of management responsibility was to be immediately, but politely, forthright about my interest level in proposals. Don't just slam the phone down; take the time to clarify your understanding of the overture and then be clear about why it's not something you're interested in pursuing. There's nothing worse than wasting everyone's time on repeated attempts to make something work when one party is really not interested: end it as soon as possible, but do so professionally and with an explanation. You say you don't need to provide an explanation? I would disagree; not only is it professional courtesy, but the intellectual exercise of thinking things through is essential for seeing how, in the future, out-of-the-ordinary ideas could support your business objectives in ways you hadn't thought of.

As our industry continues to corkscrew through changes, it increasingly feels like 'business out of the ordinary', but this is manageable and, while some of us might want to ignore our circumstances, most of us would like to survive in this new and different world. That requires a receptiveness to new ways of working and an openness to new partners offering ideas and solutions we've never thought of. Therein lies the true opportunity for good business development. Once, Berlitz was a company virtually devoid of technology: the customer experience was completely dependent on humans delivering live language instruction, and their salaries and the rent for school locations around the world comprised the vast majority of the company's expenses. We knew nothing of technology but saw the potential in a simple CD-ROM product, and it is sometimes upon the most tenuous of platforms that key relationships can be established. I believe that those companies open to extrapolating beyond their current circumstances are those that will embrace business development as a function for growth beyond the confines of their existing environments and will prosper as a result. Companies who don't, won't.

I Steal Stuff – March 15, 2011

I remember a conversation early on in my tenure at my last company that ended along the lines of 'well, you know, consultants just take credit for the ideas they hear from the employees'.
There is some truth to this and, having been on the employee side of that observation/accusation, I can attest it can be demotivating to hear consultants spouting your ideas and concepts to a room full of people. Consultants (me, for example) might counter that, if they're doing their job well, they are able to weave together the implications of discrete ideas and potential strategies across the organization. That, of course, isn't always within the purview of individual employees.

Every one of the operations jobs I have undertaken in my career has either been a new position or one where I replaced someone who was fired. I've never come into a job where the person who held my new position was still with the company. What I've found consistently is that good ideas and strategies for building or expanding the business were all in plain sight and that, with a little diligence and execution, success can come quickly. Another interesting aspect of my career is that I've been hired into positions where I had little to no specific experience. I have also found that skillful 'adaptation' and execution of the ideas already known or present in the organization can aid in building a deeper and more rapid understanding of the dynamics of the business you've been tasked with operating.

Early in my career at Berlitz, I was given responsibility for their direct-mail business, selling language learning materials in airline and travel magazines. I had no experience in direct mail, and the manager in charge of the unit had neglected the business and was no longer employed. Direct mail is all about testing and math, and the prior manager had run a test which called for placing a four-color tri-fold insert in the American Airlines AAdvantage mileage statement received each month by members. The test had been done six months before I took responsibility and had been successful (it actually made money) but, through inattention, he had not expanded the program to the full membership list. My staff told me about this, and we decided to go all in with a full mailing to over 12 million recipients. When I inherited the business mid-year we were already running well short of budget but, thanks to the strength of this program and other improvements, we made our budget that year with room to spare. That American Airlines program was a great start, and in the following year we placed the insert four more times. We also explored other, similar mileage programs, but none performed as well.

With this strong performance as a foundation, we began looking at other aspects of the business for opportunities. I had a small telemarketing team that took all the incoming phone orders, but this team had never been properly trained (nor really paid any attention). As I began working with them, it became clear they could do much more to expand sales. One aspect of language self-learning is that very few people finish the course. We even used to joke that no one ever got past the first of the twelve cassettes. (Cynical, I know, but customers generally blame themselves, not the product – just like health clubs.) Berlitz offered a second-level product for the same price but, unsurprisingly, less than 10% of buyers purchased the second (more advanced) level course. We now had a tremendous number of calls coming in to our little telemarketing operation, due to bigger, more frequent mailings to American frequent flyers and all the other direct response ads we continued to place.
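Since direct mail really is 'all about testing and math', it is worth seeing how small that math is. A sketch with wholly invented numbers (the program's real economics were never published): run the arithmetic on the test cells first, and if the test makes money per piece, the same arithmetic prices the full mailing.

```python
# All figures hypothetical. A direct-mail test either "makes money" per
# piece mailed or it doesn't; if it does, you scale to the full list.
pieces_mailed     = 12_000_000   # full membership mailing
cost_per_piece    = 0.05         # printing + insertion
response_rate     = 0.002        # 0.2% of recipients order
revenue_per_order = 60.00        # average order value
margin            = 0.45         # contribution margin before mail cost

orders = pieces_mailed * response_rate
profit = orders * revenue_per_order * margin - pieces_mailed * cost_per_piece
print(f"{orders:,.0f} orders, contribution ${profit:,.0f}")
# 24,000 orders, contribution $48,000 -- the same sum, run on a small test
# list, is what tells you whether to mail all 12 million.
```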
As a team, we determined that if we could hook the callers into purchasing both level 1 and level 2 at the same time, we could potentially generate far more revenue per call.
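The leverage in that idea is easy to see on the back of an envelope. A sketch with invented figures (the real Berlitz prices and close rates aren't public):

```python
# Hypothetical per-call arithmetic behind the level 1 + level 2 offer.
price_per_level = 100.00   # both levels sold at the same price
close_rate      = 0.30     # share of inbound calls that buy level 1
attach_old      = 0.10     # historical share of buyers also taking level 2
attach_new      = 0.35     # assumed share after a scripted two-level offer

def revenue_per_call(attach_rate: float) -> float:
    return close_rate * price_per_level * (1 + attach_rate)

print(f"before: ${revenue_per_call(attach_old):.2f} per call")
print(f"after:  ${revenue_per_call(attach_new):.2f} per call")
# before: $33.00, after: $40.50 -- the same call volume, roughly 23% more
# revenue, with no additional media spend.
```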
As we implemented this upselling activity, we also put in place the first sales targets and rewards program the team had ever had. Starved of attention and recognition by prior managers, my little team of telemarketers – who worked in an office park in Delran, NJ – took to the program like ducks to water.

What, in retrospect, seem like quite basic operating improvements were implemented, and ultimately very successful, because of a fresh perspective on, and an open attitude to, the potential of the business and the employees. Some of the improvements occurred simply because someone listened more attentively to the ideas. Obviously, business improvement opportunities are not always so easily identified, but they can be found by anyone with an open mind about the business and, perhaps, a different view of how the ideas could be implemented. In this context, anyone in the business can be a "consultant" and "steal" ideas that, in retrospect, might appear to be in plain sight. In the second year of my tenure, our direct mail business doubled its profit and contributed twice the profit per employee of any other Berlitz business unit, and we did it all by working with what was already in the business.

Welcome to the migration (and other lessons) – February 8th, 2011

The day before we went live with our first web product at Bowker, customer service was fielding the typical customer complaints. One customer in Cleveland was missing volume two of a three-volume set, and another in Jacksonville was questioning their standing order discount. These queries, and many more like them, went hand in hand with moving physical units to customers, but all that changed the day we went live. The changes we were forced to make didn't happen instantaneously and, while progressive, only became apparent when we reflected periodically on our progress. Over time, 'customer service' morphed into 'technical support' and became concerned with logins, sluggish search times and IP ranges. Customer service was only one example of a line function forced to reevaluate how it operated and interacted with customers. I wish I could say that our transition was managed and handled methodically and perfectly but, like many businesses in our situation, we oscillated at times between confused and frantic. We did reach a point where we took the unanticipated in stride and became expert at dealing with unforeseen developments. Ironically, the supposedly 'impersonal' nature of the internet created an environment in which we were far closer to the customer than we ever were in the print world.

Expediency can define the future

When I joined Bowker in 1999, the future of our business was seriously challenged. No need was more imperative than that for a web version of our primary database product, Books In Print. Deluged with cancelled print orders, we couldn't even engage our customers in a discussion about migration because we had no online option. Expediency ruled our web development: we had little or no time to do extensive user testing and involve our customers
in any UI development and, with limited options available to us, we chose to replicate the functionality of our CD-ROM product. That strategy proved highly effective (though perhaps not optimal), and we launched the product in 2000. That's when the fun really began.

Getting a field sales force in place became a strategic imperative if we wanted to effectively sell our online products. Selling static print products is completely different from selling an online product, which can be sold effectively via telesales; however, we had an appallingly bad approach to the sales function prior to 2001. As we implemented the field sales effort, the company became a more effective sales organization, and we gained deeper insight into our market. (I'm glossing over how difficult the task of putting a new team in place was – maybe for a later post.) Feedback from the sales force, coupled with a directive from me that product managers and senior staff make frequent 'visits' into the market with the sales force, meant we became far more aware of and attuned to what our customers were looking for. Had this sales organization been in place prior to our web transition, it would have fed our initial web development effort.

Who's your buyer?

The assumption that your buyers remain the same in changing circumstances should be challenged before you waste a lot of time. In the move to field sales, we also found that the buyers of online products at our institutional customers (mainly libraries) were frequently different from the typical buyers of the print titles. The money for electronic products often came out of a different budget and, increasingly, consortia sales (group buying at various levels) became a regular component of our selling process. Historically, our sales approach hadn't been sophisticated enough to address these changing market dynamics, but we addressed that issue by hiring more effective sales management, which brought a different philosophy to sales. More staff attuned to the market did lead to more insight.

An important missing input to our initial development effort was deep knowledge about how our customers were using our products. This sounds startling (and is), but what became clear to us was that many customers of the print version were using BIP as a simple look-up tool. This print 'look-up' tool was pulled off the shelf to find an ISBN or confirm a title name, and then the volume was returned to its place. When Amazon came along (and that data was never licensed to them by Bowker), retailers and librarians saw a far simpler and more effective mechanism for finding this information. Suddenly we were competing with free, made worse by the fact we didn't have an online version. (So, I guess 'competing' overstates our position.) Had we understood their behavior at a deeper level during our development effort, it might have impacted how we designed the search and UI. That said, we were lucky enough to do very well with the initial development, and the team pulled off a triumph in the launch and subsequent roll-out. While it sounds obvious, understanding how your customers use your products and what issues they face in their daily business should be considered vital to your initial planning process. Don't take it for granted that you understand this; prove it via primary research.

Renewals are about usage
In the first year of launch, our renewal rate for Booksinprint.com was in the 70% range. Feeble, and, in a market that doesn't grow, finding new customers to take the place of the 30% of subscribers we were losing became an impossibility. Admittedly, our first-year subscriber base was low; however, this was the future of the company and, unless we raised the renewal rate, the future success of BooksinPrint.com would have been in jeopardy. As any sales manager knows, selling to a current customer is a lot easier than selling to a new one. A focus on the customer experience – particularly user stats – and strong sales management enabled us to get the renewal rate up to the low-to-mid 90% range, an incredible improvement and a testament to the strength of the sales team we had in place at the time.

As I look back on our transition, I see three distinct stages, two of which I have described above: firstly, the product development and market assessment phase, where we conceived the product; secondly, the management of the implications of this transition (particularly on customer service and sales). This leads me to the third phase of our migration process: maintenance.

Significant in our sales improvement was the sales administration support we gave sales reps in two areas: user statistics and training. When we started to analyze our renewal statistics in the early post-launch years, we saw we could predict which customers were likely to renew based on how, and how much, they were using the product. Sounds obvious, but we were learning as we went. We used this data to intervene throughout the subscription term via sales and sales admin outreach and training. On-site customer training in our market wasn't a new thing, and our parent company had embarked on a similar effort several years before. Our customer training program was designed to ensure that customers understood how to use our product and what features of the product were available to them. We hired dedicated trainers to travel around the country giving these sessions (often in a group setting) at client sites. Our trainers were able to generate significant customer engagement, and the program proved instrumental in pushing the renewal numbers higher each year. The payback was measurable, but having trainers face-to-face with customers also created a feedback loop for product development as we considered new enhancements to the product. Since we sold multiple products to the same institution, the trainers often delivered multiple product training sessions during each institutional visit.

Maturity shouldn't mean complacency

I often reflect on how distinct these migration phases were, from development and launch through to maintenance, and how our activities changed over time to reflect those changes. For example, in the early launch days, our sales staff was focused on making sales and finding customers yet, as the cycle reached maturity and our renewals exceeded 90%, the sales activity became focused on making sure customers were truly engaged with our product(s). New business was still important, since any percentage point below 100% renewal means the organization needs to keep finding new customers to fill the gap (but the salesperson's time becomes weighted differently).
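The renewal arithmetic, and the usage-based intervention described above, are easy to make concrete. A toy sketch, with invented figures:

```python
# Invented figures: why every point of renewal matters in a flat market,
# plus a toy version of the kind of usage-based at-risk flag described above.
subscribers = 2_000

for renewal_rate in (0.70, 0.93):
    lost = subscribers * (1 - renewal_rate)
    print(f"at {renewal_rate:.0%} renewal, {lost:,.0f} new subscriptions are "
          "needed each year just to stand still")
# at 70%: 600 replacements a year; at 93%: 140.

# Low usage mid-term triggers training or outreach before the renewal date.
quarterly_searches = {"acct-1": 220, "acct-2": 8, "acct-3": 95}
AT_RISK_THRESHOLD = 25
at_risk = [a for a, n in quarterly_searches.items() if n < AT_RISK_THRESHOLD]
print("schedule outreach for:", at_risk)   # -> ['acct-2']
```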
Migrating from a print focus to online delivery changes every part of a business and, in the discussion above, I've barely scratched the surface of how one organization made this tremendous turnaround over 36 months or so. In different circumstances we might have done this faster – the company was sold in the middle of this transition – but I do think the company managed exceedingly well to reestablish a future for itself that, in 1999, didn't look so rosy. One note of caution: the maintenance phase can be dangerous because it has an ugly sister named complacency.

Confusing a Silo with a Business – August 3rd, 2010

The strategy of organizing content around a common topic, such as legal or medical information, is mature in information publishing. As other publishers mimic the strategy of organizing their content into silos, they would be wise not to confuse their efforts with community building or market making. Users are interested in accessing validated, useful and important topical information, but this could just as easily be web-based content as published content. Often it is just that.

Information companies formally organized their businesses around topics (medical, tax, legal, etc.) more than 15 years ago, and they quickly understood that their customers needed more. Initially, it was often the integration across what had been independent databases that produced the most utility for their users, and their early work led to the development of taxonomies, search techniques and applications which enabled workflow integration. But nothing stands still and, as the information business continues to evolve, what is happening currently in information should be of interest to all publishers. In short, their experience suggests it may be simplistic to believe establishing a silo of content will produce a community of willing publishing consumers.

Having built platforms supporting information products, information companies now recognize that their customers are looking for integration across subject areas. Importantly, the customers are looking for ways to validate a much wider pool (ocean) of potentially useful and important information. To Thomson Reuters (and others) the silo increasingly looks like a pyramid, and they have begun to conceptualize the management of information and data using this framework. In part, this has to do with the excessive growth of information: increasingly, information providers are as useful to their customers as filters of a vast catalog of information as they are as providers of tools, techniques and proprietary data. Consequently, information providers are beginning to see themselves providing access to as much content and information as possible – available on their platforms – and then progressively adding value to the consumer as they move up the pyramid in terms of need and application. At the top of the pyramid are those publisher-specific technologies and content that provide the most value to customers. Companies like Thomson Reuters recognize customers have broad needs, and thus there is business logic to providing different services at each level of this pyramid, as well as integration points with companies outside the Thomson Reuters family.
Inherent in this approach is the recognition by Thomson Reuters and others that it may not be possible to operate in a closed environment any longer. The information space is simply too large to organize in the manner in which information providers aggregated content in the 1990s. The more addressable issue is to provide consumers with the information critical to their needs and to filter that information or content such that it is unambiguous. The lesson for less advanced publishers is that building a concentration around siloed content is not enough; in fact, aggregating consumer interest and appeal around publishing content will fail unless that concentration includes content from the web, television, radio, newspapers, magazines, etc., which is also organized, validated and served up in the most effective manner for the consumer. Information publishers have been able to evolve their model to support the needs of their professional customers, but the consumer market is more anarchic, and it remains to be seen whether trade publishers can pull it off. Silos may not be worth the effort.

United Artists Redux – July 20th, 2010

Amidst radical change forced on them by major advances in technology (largely out of their control), a small group of leading media producers have joined together to establish their own (insert word): broadcaster, publisher, studio, agency. Unlikely? Not now, because the functions that support these traditional media companies are increasingly becoming commoditized, enabling the creative producers (writers, authors, producers, etc.) to potentially collect more of the revenues generated from their creative output. While individual authors have gained some attention by 'going direct', either by working through Amazon (J.A. Konrath) or direct to consumers via the iPad (Ryu Murakami), it may be that traditional publishers have more to fear from groups of authors, editors and agents conspiring to establish their own media companies. These new companies would leverage the available low-cost 'back office' functions and the readily available supply-chain provision to disintermediate the traditional publishing monolith.

In 1919, United Artists was formed by D. W. Griffith, Charlie Chaplin, Mary Pickford and Douglas Fairbanks. These four performers established United Artists to gain greater control over their own work and to produce other work they thought valuable. The four partners eventually hired an executive to run the operation who, in addition to signing new actors and producers to United Artists, also established a movie theater chain. United Artists, however, was ultimately unsuccessful, as the changes in the industry largely exceeded the ability of the partners to adapt. Yet this model resonates in an age where 'infrastructure' is becoming less important than author, character and content branding.

If a similar group of content creators were to establish a new "United Artists" organization, they wouldn't find it difficult to hire executives to act on their behalf to establish a new publishing organization. This new organization would be unencumbered by either the traditional publishing model or (more importantly) the cost structure of the business. These United Artists would sit atop an organization largely supported by external third-party agreements with accounting firms, editorial and production services, distribution and
fulfillment, etc. Important value-added services such as marketing, promotion, content rights and licensing – those functions that, by definition, work closest to the content creators and add real value to the consumer experience – would be full-time hires of United Artists.

In discussing authors 'going direct,' there are frequent suggestions that this could become an avalanche, with traditional publishers seeing their best and most profitable content producers leave the fold. This overlooks the difficulty of an author having to do all the nasty stuff the publisher does for them if they go it alone. However, what if the author became a partner in his or her own publishing company? Then, perhaps, the model changes and the options begin to look more appealing for the content producer and potentially problematic for the traditional publisher. Could the news recently reported by Variety – that Steve Ross has joined Abrams Artist Agency to provide "consulting services to a select list of clients" – be an indication that PND isn't the first to revive the United Artists idea?

The Baked Beans Are Off – July 13th, 2010

When I joined Macmillan, Inc. in 1989, the company was rounding out the decade nicely, having gone from losing over $1mm per week, with a share price of less than $2, in 1980 to being sold to Robert Maxwell for 19x earnings and $92 per share. Application of economies of scale helped build Macmillan into a $2 billion publishing conglomerate where each newly acquired publishing company was just ‘more beans for the baked bean company,’ which was how senior executives referred to their “factory acquisition” process. In fact, some of the executives, notably CEO Bill Reilly, had come from industrial manufacturing and had a deep understanding of how to effectively apply scale economies to operations.

All the largest publishing companies were following a similar ‘baked bean’ approach as the industry consolidated: Publishing lists were separated from their original companies and progressively (sometimes immediately) overhead expenses were eliminated as the acquired company was absorbed. At one point, I was tasked with following up on the ROI for a slew of companies acquired over a two-year period. This proved difficult because their operations had been so effectively integrated into the parent company that constructing a post-acquisition income statement proved virtually impossible.

Fast forward 20 years and the scale economic model is falling apart for trade publishing. Publishers were so effective at applying scale to accounting, manufacturing, management, production and other overhead that it is ironic that, in the internet world, everyone now has access to similar scale benefits. Publishing companies now realize they have achieved scale advantages in the wrong functions. Scale advantage in editorial, marketing, promotion and content management is almost non-existent at the level that would ensure competitive advantage, yet these are the functions important to future success. (As an isolated example, I would argue that authonomy.com by HarperCollins represents an attempt to build scale into the editorial process.)
We all know seismic change – prevalent everywhere – has to come to the cost structures of publishing companies. Squeezed by downward pricing and potential revenue-share models that provide more to authors and contributors, publishers will wonder where the money is going to come from. The scale model that built companies like Macmillan, Inc. is irreparably dead to anyone thinking about the future of publishing. The only way out – and it’s not an easy suggestion – is to recognize that those functions that used to provide scale benefits are no longer doing so and need to be carved out. Some of this has happened in manufacturing, where companies like Donnelley and Williams Lea have taken over the manufacturing and production function for publishers: Those departments no longer exist at the publisher. Decisions to outsource non-value-added functions such as accounting, distribution and fulfillment, and information technology must be made as the publisher contemplates its future.

Once unencumbered, the real test will be whether publishers can re-work their structures so that they build scale economies in those functions that do provide value: Content acquisition, editorial, marketing & promotion, and content licensing and brand building. There is little evidence that this is happening or that the realization has set in. Instead of seeing a publishing company improve its performance over ten years as Macmillan did in the 1980s, we are likely to see many examples of the exact opposite over the next ten. Will companies rise to the challenge or are they so wedded to the old ‘baked bean’ model that they expect it to go on forever? Clearly, it won’t.

Do You Sincerely Want to Sell Business? - June 29th, 2010

There are various approaches to selling a business, and selling a publishing business is no different. The circumstances surrounding the decision to sell can greatly influence how smoothly the process goes; however, as with many things, the amount of preparation that goes into the process will ultimately determine whether there is a successful outcome.

As a seller, your immediate task is to eliminate questions, cynicism and doubt about your business in the minds of potential buyers. No matter how excited the potential purchaser seems to be about your company, they are going to be skeptical about key information. Their job is to (cynically) use anything negative to undercut a purchase price; your job is to be open and to back up the answers to any questions they have with facts. (Bear in mind that adequately addressing these issues to their seeming satisfaction early in the process doesn't mean the purchaser won't raise them again during negotiations, so keep your story straight and simple.)

If, as an owner, you always believed you would sell the business, then you should have a reasonable understanding of when you would like this to happen. As a prelude to this event, you will want to focus on a number of key areas.

First, your financial statements: If Aunt Sally has been doing your taxes for the life of the company and you have never had periodic management accounts, then you are not in a position to achieve full value for your company. Treat Aunt Sally with respect but get yourself an accountant and a bookkeeper to put the numbers in order. At least a full year's audited
financials and management accounts, prepared by a qualified financial accountant, should be considered the basic financial reporting requirement. (The accountant does not need to be full-time staff.)

Second, if Aunt Sally is just one of several family members taking a salary in the company, you may want to think about their continued involvement in the run-up to the sale. A buyer will want to know the actual operating cost of the business and you, as a seller, want to provide the best possible view of the business (that is, without extra expenses). Now, if the family member(s) has a legitimate and key role, then you may have other issues to address (such as their position with the company post-sale).

Third, many buyers will focus on future revenue growth. Do you have formal contracts or handshake deals? Is revenue dependent on one source? The buyer is going to second-guess your revenue projections; therefore, any 'soft spots' will undercut their confidence in the business overall. Saying so-and-so has always bought from us is not as valuable as being able to say 'we have a negotiated five-year deal and we are currently in year two.' If your revenue growth is rock solid – even if it is based on a small number of authors, commercial accounts or subscribers, but supported in each case contractually – that will place you in a stronger position.

Fourth, your accountant will also create a balance sheet for the company, and the key items concerning a buyer are those that deal most immediately with cash. As a seller, you need details about your inventory turn, accounts receivable collections and accounts payable. Assuming you have prepared for the sale of the business more than twelve months in advance, you should have a clear picture of these items. Just because an item is listed as a company asset doesn’t mean a potential buyer is going to agree on its value. Other balance sheet items that require attention are fixed assets, which may include the building in which the company is located. Sometimes a seller wants to keep the building (if they own it), in which case you and your accountant will need to determine the best way to handle this. Bear in mind that the property could be the most valuable asset owned by the company. Similarly, the company may own patents and intellectual property that must be properly accounted for and (for the benefit of the acquirer) properly documented. In summary, get your accounts audited, create a 'clean' income statement, deal proactively to get your revenue sources locked down and establish formal procedures to manage your cash flow and balance sheet items. Obviously, the value of a business is stated in black and white in its financial statements, but a potential buyer will be just as interested in the products you're selling and their future value as in your accounting policies.

Fifth, you must have clear ownership rights to any content or technology that represents a primary asset of the company. If contracts aren't transferable, if certain rights are retained by content producers or if you 'collected' data to create your products without proper authority, these issues and others like them should be addressed and resolved before you market your company. If there is any doubt in the mind of a
buyer that they will be able to carry the business forward, this will either scuttle a deal or significantly reduce your purchase price. And don't think they won't find out.

Sixth, your organization's human capital is important to the business for continuity reasons (if not for other reasons). Don't believe that you can keep the selling process a secret, because even if your employees don't know everything, they will make up the rest. As a seller, you must maintain momentum and, for that, you need to maintain decent employee morale. Bonuses and incentives can play a role, as can simple communication. Unfortunately, you can't control what the purchaser chooses to do with the business, and placing restrictions on post-sale activities – even if you can get away with this – will only reduce your take. Key employees are important to the purchaser and they will want to know who these people are. The purchaser may want some guarantee that these key employees will remain with the business for some stated period after the sale and will be willing to pay the employees a bonus to stay. As an owner, you may have provided equity to employees over the years, which would give them a piece of any sale. Often these deals can be 'casual,' which is not what a buyer wants to hear. The last place a buyer wants to find themselves is in the middle of an ownership dispute, so, no matter how painful this process may be, get those agreements formalized in advance of a sale.

Finally, as a seller you will want to practice speaking about your company so you are effective in communicating to potential buyers why acquiring your company represents good strategy. Your understanding of your market, your competitors' market positioning, and market trends and opportunities all represent key components of your company's selling attributes – and reasons why a purchaser will see opportunity in acquiring your company. Work to prepare a briefing document about your company which you can use in presentations and discussions. Importantly, at industry events, seek out speaking and panel discussion opportunities where you can both present your company and your understanding of the market, as well as learn about what other similar companies are doing in your marketplace. Not everyone is comfortable with this type of communication; however, during a sales process the buyer is going to rely a lot on your perspective about the business, and the more comfortable you are, the better your views will come across. The only way to become a better and more effective communicator is through practice.

In summary, any hiccup in the process of acquiring your company could result in a buyer or buyers either getting cold feet or simply moving on to something else. There are lots of companies drawing acquisition attention and, having gained attention, you don't want to lose it and fall to the bottom of the pile. By the time you regain their interest, circumstances could have changed significantly and may no longer work to your advantage – or worse, the opportunity may be permanently lost to you.
The Curator and the Docent – June 22nd, 2010

Walking around a vast museum can be interesting and sometimes serendipitous, but often it is an incomplete experience. Items are organized in specific groups, yet not always in a manner that encourages exploration of the most important items. Presented with a gallery full of amphorae, it can be difficult to recognize the single important item while on your own and without a guide. Surfing the web for information and knowledge can offer a similar experience: Access and proximity are no guarantee you will happen on relevance.

Museums and libraries are good proxies for the concept of “curation,” which we’re hearing a lot about at the moment. Private equity (for one) has found its next buzzword, and funding vultures are lacing their presentations with references to ‘curation’ in an effort to gain financial support for their new business ideas. But curation is an old concept: Television networks, newspapers, magazines, journals and other media have all practiced a form of content curation for hundreds of years. We’ve just recently latched onto the idea of curation as though it were something new. The need for curation in the old media world wasn’t as obvious as in the internet world because, on the web, ‘everything carries the same weight’ and the average user has difficulty discerning good content from bad. Indeed, as content on the web exploded over the past fifteen years, users accepted the “good enough” concept – free content was plentiful – and were content to ‘satisfice’ either knowingly or obliviously. User behavior and expectations are changing, and investors are now chasing businesses that profess to actively curate content and communities of interest.

In recent years content curation has emerged out of the wild, wild west of ‘mere’ content. Sites such as The Huffington Post, Red State and Politico all represent new attempts to build audiences around curated content. While they appear to be successful, at the same time there are other sites (such as Associated Content and Demand Media) contributing to the morass of filler content that can plague the web user’s experience. The buzzword ‘curation’ does carry with it some logic: As the sheer amount of information and content grows, consumers seek help parsing the good from the bad. And that’s where curation comes in.

The amount of content available to consumers – much of it free of charge, but scattered across thousands of websites – is growing exponentially every day. At the same time, consumers are increasingly doing independent research and attempting on their own to source important information to support their increasingly complicated lives. Questions or information relating to healthcare, finances, education and leisure activities represent a small sample of the range of topics on which consumers look for accuracy and relevance, yet encounter an immense sea of specious or outdated content. In many ways, the web – in its entirety – is the new dictionary, directory or reference encyclopedia, but users with specific interests are increasingly beginning to understand they need to spend as much time validating what they find as they do consuming their research. In the old days, it was as simple as pulling the volume off the shelf and, while the web offers a depth and accuracy of content that far outstrips any from the old days, finding content of similar veracity can be a challenge.
For the past two years, I was working on a project with Louis Borders at Mywire.com in an attempt to build a curated news and information service we called Week’s Best. For a variety of reasons we put the project on hold in February, but the concept was simple: Identify experts that can curate content on a range of specific topics and build a community of interested subscribers around the content. Our model was to find expert ‘content producers’ who retain unique knowledge and understanding of a specific topic and would filter content from across the web specific to their topic of expertise. Mywire.com built a unique editorial tool to make this process almost routine by pre-selecting topic-specific content from both brand-name sources and from across the web. Our experts – the content producers – logged on each day and selected from this pre-sorted list only those items they considered the best content. Consumers interested in each of these topics subscribed to a free weekly email digest of the material selected. Our revenue model was based on turning a subset of our free email subscribers into paid subscribers who would gain access to high-quality content – such as content from Oxford University Press.

While we were unable to execute as we expected, we did gain validation of our concept from both the publishing and the private equity communities. Publishers, whom we were chasing to be our ‘content experts,’ liked that there was a low cost of entry for their participation and liked the editorial platform we had invested in. The equity community liked the ‘curation’ model, the people involved in the project and the investment that Mywire had made in the platform. However, we suffered the ‘prove it’ syndrome: Both publishers and equity partners wanted to see the model work before they committed, and we ran out of time and resources. Mywire.com continues to invest in other curation-type models.

I remain convinced that applying technology to the selection of useful, valid and appropriate content is only part of the solution. At Mywire, we used a text-mining tool as part of the editorial process and, on simple news items – which are increasingly generic – placing content items into subject/topic groupings was relatively easy. The process isn’t perfect and requires frequent ‘fine tuning’ but, while the tools are improving, human intervention is still required. Earlier this month we learned that even Google was applying some human filtering to their news site.

There is a real debate about whether consumers will pay for real expertise and knowledge: I believe they will, just as they paid for specialist magazines, journals, cable channels and similar media in the analog centuries. The atomization of content has complicated matters in that it has taken the proverbial covers off the print limitation of the traditional magazine. While a reader or subscriber will buy into the expertise of ‘Glamour’ or ‘Men’s Health,’ they now expect all important and relevant content and not just the content prepared by the magazine’s writers. After all, there is a low hurdle in the user’s ability to search for content on their own and it is silly to ignore this ability. Acting as a ‘content producer,’ the editors of ‘Glamour’ should be able to provide their paying subscribers with a collective representation of all content that’s important and relevant to their readers, even if the content is produced by Glamour’s competitors.
This is an important service and doesn’t limit the ability of Glamour to produce their own content; rather, it enhances it because they are able to view in detail the interests of their subscribers and produce applicable content to match.
In the above example, generic news is never going to be the basis for paid subscriptions. The news that suntan lotion causes skin cancer, for example, is a hyped news story. In the Glamour example, this news story would always remain in the free section of their site; however, available to subscribers would be a curated selection of in-depth content, including reference material, added to over time, with commentary and discussion from their ‘expert’ editors and advisors about the real issue of sun protection products. With a brand such as Glamour, the number of expert-curated topics made available to subscribers could easily exceed fifty and would be likely to grow over time. Strongly associated with this approach would be the development of communities around each topic, leading in turn to additional business opportunities such as ad programs, events and special publishing programs.

The interest of consumers across a wide variety of subjects and topics continues unabated, and the internet has only facilitated that interest, although our expectations have been reduced or marginalized due to undifferentiated content. The consumer is increasingly smarter about the content they consume, and they also continue to impress with their ability to seek out and absorb what, in the analog world, was considered too “advanced” for their understanding. There was always an arbitrary wall between “professional or academic” content and consumer content: Increasingly, consumers are making it clear that they want to decide for themselves whether particular content is or is not too advanced for their comprehension or enjoyment.

Recently, as I wandered around a museum with overwhelming breadth and depth of content, I was lucky to be guided in my travels by a professional. When she introduced herself to me, she used the term ‘docent’ to describe her function. A docent is a ‘knowledgeable guide’ and the function seems to me to perfectly complement the process of curation. In an online world, where more and more content appears to “carry the same weight,” we will look to and pay for the combination of curator and docent – sometimes the same person or entity – who can organize and manage a range of content and also engage with the user so they gain insight and meaning from the material. At Mywire.com, we intentionally approached branded media companies because they were recognized as experts in their segments. These are the companies which should be able to build revenue models around the curation of content to offer subscribers a materially different experience than simply performing a Google search query that delivers up generic news and semi-relevant content.

A Database of Riches: A Report on the Market and Pricing for the Google Books Project – April 20, 2010

Notes on this Report: In the summer of 2009, I started to wonder about the potential market opportunity that the Google Book Settlement could represent. Fellow industry consultant Mike Shatzkin and I began to discuss the agreement and I agreed to pull together a spreadsheet that could represent an ‘order of magnitude’ estimate of the market opportunity. This report does not rely on any direct interviews with Google or representatives of the Book Rights Registry (BRR) and, as such, it only represents a structured approach to analyzing the opportunity. Nor is this report a
definitive declaration of pricing, of market penetration or of the manner in which this market opportunity may be leveraged.

Introduction: Almost five years ago, Google embarked on the most ambitious library development project ever conceived: To create a “Noah’s Ark” of every book ever published and to start by digitizing books held by a rarefied group of five major academic libraries. The immediate response from US publishers was muted, until the implications of the project became clear: Google proposed no boundaries to the digitization effort and initiated the scanning of books both in and out of copyright and in and out of print. Adding to publishers’ concerns, Google planned to display “snippets” (small selections) of each book’s content in search results. Despite some hurried conversations among publishers, author groups and Google, Google remained convinced that what they were doing represented a social ‘good’ and that the partial display of the scanned books was legally within the boundaries of fair use.

From the publisher perspective, this was a make-or-break moment, and the implications were more acutely felt by trade publishers, who saw the potential for their business models to be obliterated by easy and ready access to high-quality content via a Google search over which they would exert little or no control. Even worse was the fear that rampant piracy of content would also develop – a debated and contentious point – given the easy access to a digitized version of a work that could be e-mailed or printed at will. The publishers determined that if Google were to ‘get away with it’ without challenge, then anyone would be able to digitize publisher content and possibly replicate what has been going on in the music and motion picture industries for almost ten years. In mid-2005, prompted by a lawsuit filed by The Authors Guild, the Association of American Publishers (AAP), led by four primary publishers, filed suit against Google in an effort to halt the scanning of in-copyright materials. (The Authors Guild and AAP ultimately combined their filings.)

The initial Google Book Settlement (GBS) agreement, given preliminary approval by a court in October 2008, generated a vast amount of argument both in support of the agreement and in challenges to it. A revised agreement was drafted after the Federal District Court of Southern New York and Judge Chin agreed to delay the adjudication, and final arguments were heard in late February 2010. To date, Judge Chin has given neither a timetable nor an indication of when and how he will decide the case.

From the perspective of the early leading library participants, Google’s arrival and promise to digitize their purposefully conserved print collections looked like a miracle. Faced with forced declines in the dollars spent on monographs and the ever-rising expense of maintaining over 100 years of print archives, the Google digitization program provided a possible solution to many problems. All libraries believe they hold a social covenant to collect, maintain and preserve the most relevant materials of interest to their communities, but maintaining that
covenant becomes a challenge in an environment of increasing expenses while also enduring the challenges of migrating to an online world.[1]

The library world is typically segmented into public and academic institutions and, while these often varied ‘communities’ may differ in their philosophy towards, for example, collection development or preservation, they do share some common practices. Most importantly, all libraries are committed to resource sharing and, while materials use has historically and primarily been ‘local’ to the library, every institution wants to make its collections available to virtually any patron and institution who requests them. In short, these library collections were always ‘accessible’ to all regardless of geography or copyright: First US Mail, FedEx, e-mail and then the Internet progressively made this sharing easier but, until Google arrived with their digitization program, any sharing beyond the local institution was via physical distribution.[2] In effect, it could be argued that the Google scanning program simply makes an existing practice vastly more efficient.

Even though the approval of the Google Book Settlement (GBS) hangs in the balance under review by Judge Chin of the Federal District Court of Southern New York, an Executive Director has been named to head the Book Rights Registry (BRR)[3] and is preparing the groundwork to establish the organization in advance of approval.

This report represents an attempt to analyze the market size opportunity for Google as it seeks to exploit the Google Book Settlement. Following are our summary findings, which are discussed in more detail in the ensuing pages of this report.

Summary Findings of the Report:

• Libraries will see tremendous advantages – both immediate and over time – from the GBS, although concerns have been voiced (notably from Robert Darnton of Harvard[4])
• Google’s annual subscription revenue for licensing to libraries could approach $260mm by year three of launch
• Over time, publishers (and content owners) will recognize the GBS service as an effective way to reach the library community and are likely to add titles to the service[5]
• Google will add services and may open the platform for other application providers to enhance and broaden the user experience

[1] It is important to acknowledge that, initially, the GBS may have been seen as a solution to libraries’ conservation and preservation needs; however, subsequently, libraries have determined that they need to develop their own preservation options, in which The Hathi Trust is a clear leader.
[2] Resource sharing and improvements in the ‘logistics’ provided by OCLC (WorldCat) or via consortia such as OhioLink have made physical distribution effective and comparatively efficient.
[3] The BRR is the management body tasked with administering the GBS and representing the interests of authors and publishers once approval has been granted by the court.
[4] Robert Darnton, NY Review of Books
[5] The settlement doesn’t provide for adding content prior to 1/5/09; however, we are suggesting that, by mutual consent, additional published content may be added as an expedient method of reaching the library market.
• The manner in which the GBS deals with orphan works will provide a roadmap for other communities of ‘orphans’ in photography, arts and similar content and intellectual property

Business Analysis: By mid-2008, the lawsuit was background noise adding to the general malaise and discomfort characterizing the media industry, and the announcement that the parties had agreed to settle their differences was initially greeted with support, relief and some surprise. Yet, as the implications of the complex settlement agreement became clearer, a strong (and, at times, strident) opposition developed to argue for substantial revisions to, or the elimination of, key sections of the agreement. Importantly, this opposition also succeeded in enjoining the Department of Justice (DoJ) to voice ‘strong opposition’ to segments of the agreement. When combined with the concerns expressed by the DoJ, the opposition to the agreement was able to exact significant changes to the agreement’s terms. A ‘revised agreement’ was presented to and is now pending approval by Judge Denny Chin of the Federal District Court of Southern New York.

Among the principal arguments against approval of the original settlement agreement were the following:

• Opponents argued Google would attain an insurmountable monopoly over in-copyright but out-of-print works
• The obligation to ‘opt out’ of the agreement places an undue burden on the copyright holder (author)
• Foreign rights holders were underrepresented (or insufficiently consulted) and thus disadvantaged by the original agreement
• Monies collected on behalf of copyright holders but never disbursed would be paid into a ‘general expenses’ fund to benefit the Book Rights Registry[6]
• Some authors believed their moral rights to determine the use and replication of their works were circumvented
• The agreement itself will in effect create copyright ‘legislation,’ which should be the purview of Congress

The revised agreement has partially addressed these issues (excepting the last item) but has not fully incorporated all of the challenges raised by the settlement opposition and the Department of Justice.

Two aspects of the agreement which generated attention and hyperbole concerned the number of “orphan works” and the revenue model Google would implement to market their full-text database. Both of these issues are used by settlement opponents to justify the

[6] Changed in the second version of the settlement so that uncollected funds would eventually be distributed to designated charities.
agreement’s rejection by the Court. In each case, very little real analysis has been conducted to determine the true parameters of both the ‘orphan’ issue and the market opportunity.

In August 2009, we published an estimate of the potential number of orphan works that may exist. We are unaware of any other detailed analysis that attempts to quantify the collection of titles which remain in copyright but whose copyright holder has not been located. This analysis is included as an attachment to this document.[7] The following chart summarizes the findings of potential orphan works:

Estimate of Orphan Works (as a percent of title output, 1920–2000):
• Base Case: 580,388 (24%)
• High/Aggressive: 824,553 (34%)

In summary, the orphan analysis estimated a potential orphan population of 580,388 based on a review of pre-existing statistical information documenting the numbers of new titles published in the US since 1920 (both scenarios imply a total title universe of roughly 2.4 million for the period). While we estimated that ‘orphans’ would be more prevalent among older titles, the total annual title output only exceeded 15,000 for the first time in 1960 (according to our source data); therefore, the universe of all titles published between 1920 and 1980 is actually relatively small. Publishing output increased rapidly only during the late 1980s, and it is assumed that the majority of these titles will not be ‘orphans’ because copyright information is readily available and confirmable. As noted, the full report is included as an attachment to this report. We believe our analysis to be sound, and the results were supported by a different methodology based on data from OCLC’s WorldCat database (as noted in the full report).

After estimating the total number of ‘orphans,’ we also estimated the number of foreign works that could potentially be included in the GBS. This analysis is more tenuous statistically because we relied entirely on the OCLC WorldCat database[8] and made several key assumptions and extrapolations. Based on this conditional estimate, we determined there could be approximately 1.2 million titles from the ten largest publishing languages and an additional 0.2 million from all other languages.

Currently, the content potentially covered by the GBS represents over 12mm titles scanned. Multiple versions of the same work are included in this total; however, even if all foreign works are to be excluded from the database and authors and publishers voluntarily remove their titles from inclusion, the Google Book subscription product will remain a compelling database for the academic and public library market as well as schools and certain corporations. A significant

[7] A related analysis that extrapolates the potential number of foreign language titles that may fall under the umbrella of the settlement has also been completed but is not included in this document.
[8] This is not to assert that the WorldCat data is inaccurate in any way; rather, our assumptions should be considered ‘best guess.’
change adopted in the amended settlement agreement has narrowed the class to UK-, Australian- and Canadian-published books in addition to those registered with the US copyright office.[9]

The Google Books Database Subscription and Revenue Model

Opponents have suggested that Google will be in a position to exercise monopolistic pricing and to ‘overcharge’ to extract maximum revenues from their customers. We agree that their market position could be abused; however, we believe there is a counterbalance included in the agreement that works against this tendency. Google seeks maximum exposure for the content – not only to support its stated mission of providing wide and broad access to this ‘hidden’ content, but also to support other business opportunities they may implement (such as advertising programs). We believe Google will see overly aggressive pricing as an inhibitor to wide market acceptance of the product. The Book Rights Registry will represent the interests of authors and publishers, who will argue for pricing that maximizes their opportunity. Together, balancing wide access (Google’s position) with pricing considerations will result in an optimal pricing matrix.

In developing our financial and market analysis, there are several key assumptions we have relied upon:[10]

• Pricing will be variable based on type of institution
• This will be considered a ‘must have’ database product for all libraries
• The Google product will effectively “level the playing field” from small to large academic libraries for the types of books covered by the Settlement
• Google will continue to invest in the Book database product by adding content, functionality and applications/tools to aid usage over time and may raise pricing
• Penetration will not reach 100% for any segment, but is likely to grow over time
• Corporations will be important customers (e.g., science, aeronautics and engineering-based firms)

In the following analysis, we attempt to define the Google Books Database market opportunity and estimate the potential annual revenues the company may be able to generate from database subscriptions. Google currently markets several services to publishers, which include Google Scholar, the Google Partner Program and Google Editions (which will be launched in mid-2010). These current products and services are not included or assumed in this analysis.

[9] As an upper limit, the number of ‘non-English’ language titles could be 50% of the total books scanned.
[10] Business models that include advertising are not assumed in this analysis. It may be possible that Google will use the scanned content as content around which they can tailor advertising offers; however, the second amended version has narrowed the application of varied business models and it is difficult to determine that any model other than a subscription-based service will be the primary revenue generator for Google and the BRR. Over time, this may change, but that circumstance is not anticipated in this analysis.
In estimating the market potential for the Google Settlement database product, we have taken three primary components (or drivers) into account: Market segmentation, penetration and pricing.

Market Segment

The agreement provides Google with the right to exploit certain markets including academic, public and special libraries, corporate customers, print-on-demand (POD)[11] and direct-to-consumer sales. In our analysis, we have used American Library Association data itemizing the type and number of libraries in the US and used “best guess” estimates of the market opportunity represented by corporations and consumers. Most commentary to date has focused on the library community, which is where this analysis is strongest in its estimates and where we concentrate our discussion.

An important accommodation of the Settlement is the provision of free access to the database product for all public libraries and certain “Carnegie” classed libraries. Each library accepting this access will receive the equivalent of a single-user sign-on that will allow patrons and/or staff to access the Settlement database without restriction. While an important accommodation for some libraries, for the majority this access will not be functional enough and, thus, the site-wide and unlimited user access provided under the terms of the subscription product will remain the better option. We do not believe this free access will materially impact the revenue opportunity for Google and have allowed for this circumstance in our financial model.

In our opinion, academic libraries will consider a subscription to the Google Books database a competitive necessity. For the first time, any subscribing library within the United States may gain direct access to the collections of some of the largest and most renowned academic collections in North America.[12] In addition, this access will far surpass the inter-library loan process of years past simply because the content is completely indexed. Researchers will no longer have to ‘guess’ that a title may be relevant to their research based on an index or table of contents and, moreover, they eliminate the risk that, upon requesting delivery of a title, they discover the content to be irrelevant.

Many academic library collections have been built over centuries and titles in their collections are often unique, which is another compelling reason supporting the argument that the Google database represents a singular opportunity for all academic institutions to “narrow the gap” between their research capabilities and those of the country’s largest and best endowed institutions. While some academic collections’ titles are available via inter-library loan, many older, fragile and unique works are only available at the institution itself by special request. The digitization of many (not all) of these works significantly broadens access to and distribution of

[11] POD is a right that may be granted to Google in the future pending approval of the Book Rights Registry and the rightsholders they will represent.
[12] The amended settlement has narrowed the class and effectively excludes non-English titles from the database.
this content. Undoubtedly, researchers, educators and students at all academic institutions will pressure their administrators and librarians to subscribe to the product.[13]

The following chart represents our construct for the potential addressable market segments for the Google book database:[14]

• Academic Libraries: 3,617
• Public Libraries: 9,198
• School Libraries: 99,783
• Special Libraries: 9,066
• Armed Forces: 296
• Government: 1,159

Market Penetration: We estimate that sales penetration will vary considerably across the segments; however, for the reasons presented earlier, we believe penetration into the academic library segment will lead all other markets. Public libraries (particularly metropolitan library systems) will find value in the database and, as a group, will represent the largest concentration of customers overall. School libraries are unlikely to subscribe to the database in great numbers for budgetary or relevance reasons and, moreover, students will be encouraged to gain access to the product via their public library remote-access facilities. We expect larger research public libraries (such as The New York Public Library) will be treated as academic libraries for the sake of pricing. We also expect some corporations to access the database product and, while pricing for these ‘for profit’ entities should be comparatively high, the absolute number of customers in this segment will be small.

Pricing: Database subscription pricing can be complicated and confusing. Models can be based on population served, purchasing budgets and/or enrollment, and then be subject to multiplication factors such as number of simultaneous users, number of physical locations and other factors. We don’t know which method Google will choose; however, in order to keep our analysis as simple and transparent as possible, we have built our pricing model on the basis of the following criteria:

• Unlimited users per location
• Branch public libraries priced at 25% of the base fee per additional branch
• 3% price increases per year

[13] It is likely that an extensive database of user behavior may be generated by usage of this database. This is data that publishers (and authors) may be interested in mining for product development and/or insights into consumer behavior.
[14] Source: American Library Association
• Institution ‘classification’ based on ALA data
• Full ramp-up will occur over the first three years

Additionally, we expect Google will sell to the ‘highest’ administrative level possible.[15] For example, the University System of Georgia manages licensing contracts under their Galileo program for both public and academic libraries and, therefore, this agency would be the customer rather than individual or local libraries. In New York, Google would license access to the library authorities in each borough. In New York City (Manhattan), this would mean the main library and roughly 50 satellite libraries would have unlimited access via one contract and, based on our pricing matrix, the NYPL would pay approximately $340,000 per year for access ($25,000 for the main library plus $6,250 – that is, 25% of the base fee – for each of the 50 branch locations; this rule is illustrated in the sketch following the revenue estimates below).

For-profit organizations (corporations and businesses) will have a pricing matrix higher than for non-profit libraries and institutions (generally standard practice). We would expect that only a relatively small percentage of businesses would subscribe to the entire database, and we have segmented the target market into Fortune 500, Fortune 1,000 and all others. The corporate customers most likely to subscribe would be those companies with large research needs, such as pharmaceutical, aeronautics, engineering and the like. Options to better address this market may include shorter subscription terms, usage-based metering systems or topic/subject-specific packages.

Market Opportunity Summary: We believe Google and the Book Rights Registry (a proxy for authors, authors’ heirs and publishers) will be motivated to maximize access to the Google database in order to maximize viewing of the content, which will, in turn, result in optimal revenues for both. We do not believe Google will implement a monopolistic approach to pricing and, in comparison with smaller and more segmented databases, we believe the Google pricing will appear reasonable considering the breadth and depth of content in the database.

Approach to the Market: In our view, Google has several options for marketing and selling this database product:

• Google sells the product themselves with their own sales force
• Google designates one supplier for each segment
• Google allows all vendors to integrate the books database product into their existing database products and pay Google a defined fee per user

In our view, it is unlikely that Google will establish their own sales force to sell into the library and corporate marketplaces. While Google does have an ad sales force supporting its SEM program(s), this activity is vastly different from building a sales team to call on libraries and

[15] Consortia pricing, while an important consideration, would represent a discount to the pricing matrix we present and would be negotiated on a case-by-case basis. We have not made accommodations for consortia pricing.
corporate clients. Additionally, given Google’s predilection for automation, the hiring of a human sales team doesn’t seem culturally acceptable. Lastly, and possibly more importantly, we believe licensing this product will become more of a ‘renewal’ business as the market matures (after 3-4 years), which could require far less sales effort – or one significantly different from that required in the first three years. We estimate a fully staffed Google sales force could cost the company $15 million annually but, in short, Google is unlikely to want the headache.

Given the limitations of the above approach, we believe it is more likely Google will contract with one or more of the established players and pay a standard sales commission to the provider. In this model, Google will be able to set prices and targets and retain a degree of control over both the provider of this sales effort and the market delivery (pricing) of the product. Existing providers would bid on the right to sell this database on behalf of Google and, because the product will be highly valued, the bidding would likely be highly competitive. Likely providers to Google would include ProQuest, Gale/Cengage, OCLC or EBSCO. It is also possible that an ‘outlier’ such as Ingram, Baker & Taylor or Hudson News (LibreDigital) would see representing this database as a significant opportunity. For an established player, it is likely the provider would see increased sales in their current offering – simply representing the Google Books database would open new market opportunities. For an ‘outlier,’ the Google Books product may represent an opportunity to enter the market using the Google product as a foundation.

In our estimation, the above scenario is not only practical (not having to administer their own sales force is a major advantage), but may also be cost effective. Given the ‘prize’ of representing the Google database, we believe the average cost to Google may be less than 10% of revenues. (“Renewal” sales may also be commissioned at a lower rate than initial sales.)

Working with a single provider thus represents an effective solution for Google, but this strategy may not also be efficient. In order to achieve greater efficiency in reaching their target market while also eliminating possible “political” issues caused by selecting one vendor over the others, the company may consider allowing any provider to sign a standard distribution agreement with the company and sell and market the product into all markets. This approach has several advantages:

• Immediately leverages the competitive position of all major providers that otherwise may be mutually exclusive
• Gives a library subscriber a choice of provider and/or allows them to work with an existing ‘preferred’ vendor
• Potentially enables providers to integrate the Google product with their existing products, thus providing rapid development initiatives and built-in content ‘handcuffs’ supporting renewals
• Minimizes Google’s exposure to any supplier limitations and negative customer support issues
• Provides maximum exposure to all market segments virtually immediately
• As part of these agreements, Google may gain access to index all content supplied by their third-party sales partners
Approach to the Market Summary: Based on this review of Google’s tactical options, we believe the company will enable multiple (initially ‘preferred’) vendors to market and sell the product. Google will establish pricing, and the vendors will be required to pay Google based on this set price schedule (less vendor commission). Under this model, any vendor will be free to charge the end-customer less than the ‘set price’; however, the vendor would still pay Google based on the higher ‘full’ price. (Selling below the set price could occur due to bundling with other products provided by the vendor.)

Forecasted Revenue Expectations: Based on our assumptions documented above, we believe the revenue Google may generate from the Google Books database product could approach $260 million per year. Our revenue model was based on the following set of assumptions:

• Base pricing by segment
• Price discounts based on size of library holdings or population served
• Penetration levels based on library size
• Revenue represents full implementation, which we expect by year three
The following chart documents our estimates:

Segment        Total Market   Avg. Penetration   Avg. Pricing   Revenue ($MM)
Academic       3,617          65%                $55,000        $130.1
Public         9,198          47%                $21,000        $112.8
School         99,783         0.5%               $10,000        $4.9
Special        9,066          0.5%               $25,000        $1.1
Armed Forces   296            5%                 $11,000        $0.1
Government     1,159          25%                $11,000        $3.1
Corporate      100,000        2%                 $37,500        $7.5
Total                                                           $260.0

As noted, we believe it will take Google three years to ramp up to this full implementation revenue (we do not see this as a limitation on Google’s part; rather, it is a typical expectation for a new-product roll-out). At the above levels, we believe pricing is not only reasonable and affordable, but compares favorably with existing database publishers’ pricing. There are few, if any, other publishers who have products which serve as many (all) segments as the Google Book database. At this revenue level, each of the 12mm titles in the Google database has a nominal value of $22 (per year) to Google. More importantly, the per-unit price paid by each library will be less than $0.05 (five cents).

On a pure cost-avoidance basis, licensing the Google Books database appears good value given current costs. If the costs of handling, cataloging, special requests (such as interlibrary loans) and storage are added to the base wholesale price of any title, the title’s full ‘carrying costs’ can double. Some studies have indicated that fulfilling an interlibrary loan request can cost $25 for each leg from the library to the requestor and back. This cost far exceeds the original (or, in many instances, the replacement) cost of the title.[16]

While we believe this database to be an important acquisition for most academic and many public libraries, we do expect that Google will need to sell this product aggressively in the early years to achieve the penetration levels we anticipate. There are several reasons for this: Firstly, the content of the database is largely unknown and, while representative of many important library collections, Google will need to market this collection as important and complementary to the library customers in question. Secondly, the sheer size of the database could be an inhibiting (or intimidating) factor and, therefore, navigation, bibliographic data quality and the delivery of subject ‘collections’ will be important customer acquisition and retention areas for the company to focus on.

[16] Users may print all or portions of the titles they select – although the ability (functionality) to do this may be a subsequent grant provided by the BRR to Google – and there is a cost to these activities; however, we maintain that the utility of the database and the ability of the user to be precise in their printing requests will produce only a marginal negative cost (if any) relative to the cost avoidance that is endemic to the current solution.
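To make the mechanics of these estimates concrete, the following sketch – my own illustration, not a reproduction of the report’s underlying spreadsheet – shows how the branch-pricing rule and the segment roll-up described above can be computed. The base fees, the 25% branch fee and the penetration rates come from the text; the function names are hypothetical, and the simple product of market size, penetration and average price only approximates the published figures, which also reflect size-based discounts and branch fees.

```python
# Illustrative sketch of the report's pricing and revenue arithmetic.
# Rules and figures are taken from the text above; the helper names and
# the simplified roll-up are my own, not Google's or the BRR's.

def site_license_price(base_fee: float, branches: int) -> float:
    """Base fee for the main location plus 25% of base per additional branch."""
    return base_fee + branches * 0.25 * base_fee

# NYPL example from the text: $25,000 base plus 50 branch locations.
nypl = site_license_price(25_000, 50)    # 25,000 + 50 * 6,250
print(f"NYPL annual fee: ${nypl:,.0f}")  # ~$337,500, i.e. roughly $340,000

# Segment roll-up: (total market) x (avg. penetration) x (avg. price).
segments = [
    # (name, total market, penetration, avg. annual price)
    ("Academic",     3_617,   0.65,  55_000),
    ("Public",       9_198,   0.47,  21_000),
    ("School",       99_783,  0.005, 10_000),
    ("Special",      9_066,   0.005, 25_000),
    ("Armed Forces", 296,     0.05,  11_000),
    ("Government",   1_159,   0.25,  11_000),
    # The table lists 2% penetration, but the published $7.5MM on a
    # market of 100,000 at $37,500 implies 0.2%, used here.
    ("Corporate",    100_000, 0.002, 37_500),
]

total = 0.0
for name, market, penetration, price in segments:
    revenue = market * penetration * price
    total += revenue
    print(f"{name:<12} ~${revenue / 1e6:,.1f}MM")

# The simple product under-counts Public libraries in particular, since the
# report's figure also includes the 25% branch fees modeled above.
print(f"Total        ~${total / 1e6:,.1f}MM (report: ~$260MM at full ramp)")
```

Run as written, the roll-up lands within roughly 10% of the report’s $260mm figure, with the public-library branch fees accounting for most of the gap.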
In summary, we believe Google will be able to successfully launch their Book Database product into the market with fair and reasonable pricing that will encourage a broad base of target customers to subscribe.

Future Market Growth Opportunities:[17] While the launch of this product is a focus of attention, we do believe the company has numerous opportunities to expand the product over time. We do not expect the Google Books database product to ‘stand still’; rather, we believe this product could become the primary access point for textual (monograph) materials into the library market.

• The addition of other content: Publishers may see this product as a viable library market entrance point for all their book content
• Provision of usage data to publishers (and others) for business and product development needs
• Pricing and penetration will increase over time
• Inclusion of international/non-US market content – English language
• Inclusion of international/non-US market content – non-English language
• Access to international markets
• Addition of more in-copyright materials closer to current publication dates; the product perhaps becomes a major distribution mechanism for book content
• Topic/segmented collections
• Potential to open the database for third-party application development

Summary: This analysis argues that the Google Books Database product will be seen as a ‘must have’ product for a large proportion of academic and public libraries and is, thus, valuable on its merits. Google will price this product at levels both lower than existing database providers’ and ‘economically viable’ given cost-avoidance justifications. The company retains flexibility in how it will approach selling and marketing the product; however, we believe it will contract these services out. Lastly, we believe there is potential upside to the revenue model based on adding new markets and expanding content.

Your Price May Vary – November 18th, 2009

I was enamored with the airline industry as I grew up, and close readers will know I’ve always traveled a lot. Out of business school I interviewed with three airlines, in their pricing departments, where newly hired MBAs went to learn the business. In that role, staff managed the pricing of airline seats to maximize revenue per flight. Remembering that once a flight left the

[17] We expect these opportunities to ‘evolve’ over time based on discussion, negotiation and mutual agreement of the parties.
gate any open seat amounted to zero revenue for the airline, this activity was potentially highly stressful, as the job also required close comparison with competing airlines' pricing. All this activity is now done with sophisticated real-time analytics, and people rarely enter into the equation.

Contrast this reliance on deep data analysis, which helps the airlines maximize their revenue, with the approach that media companies have used to price their products. For the most part, in the media business pricing is homogeneous across format, with little consideration given to the popularity (or lack thereof) of the artist, author or show in question. Rather than a pricing model constructed on maximizing the revenue from individual products, the content owner places a band of pricing across the range of their content. This is particularly the case in trade publishing, and in this model each artist is considered equal in their ability to generate revenue. Historically, publishers and other media companies 'jimmied' this lack of sophistication by assuming long backlist life and format sales – trade paper, mass-market, video rental, etc. – but those options look increasingly unworkable as the market migrates to e-content.

Publishers in particular are gun-shy about experimenting with pricing, opting to use the blunt instrument of scarcity rather than more sophisticated options. Numerous big-name titles this year have been 'held back' from eBook distribution in deference to their print versions. This approach has caused consternation among the consumers who have already made the transition to eBook content and want the newest titles when (even before) everyone else gets them. At some point many of these eBook owners will look upon this situation as a 'first mover' penalty.

As e-content becomes more ubiquitous, pricing should become more science than current practice would dictate. For the health of all parties in the publishing supply chain, it is vital that the price paid by consumers maximizes revenue. Understanding how the demand curve arcs is critical to pricing accurately, and many factors (some more important than others) play into this calculation, including the author's brand, time from publication, exclusive content, competition, etc. Obviously, knowing how much someone is willing to pay for something (at a point in time) is difficult, but think about how airlines do this: a seasonal traveler has far different characteristics than an executive who just has to get to Miami tomorrow. They both end up on the same flight but pay significantly different prices.

Publishers can be forgiven for a lack of understanding of the metrics of pricing in a print-based world with many intermediaries and little ability to gather empirical data. Online, things have changed, and The Economist recently reported on research published by two economists at the University of Pennsylvania which examined pricing for online music. In this research, the authors looked at iTunes and attempted to determine whether students would be more or less willing to pay a different price per song than the rigid 99 cents per tune. (There may be some correlation here between what Apple did with music and what Amazon is attempting to do with Kindle titles, and maybe publishers should ask the researchers to expand the analysis.)
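To make the mechanics concrete, here is a minimal sketch of the kind of exercise described: given each buyer's willingness to pay, find the single price that maximizes revenue. The numbers are invented for illustration and are not the study's data.

```python
# Minimal sketch of uniform-price revenue maximization. The willingness-
# to-pay (WTP) figures are invented, not data from the study.

wtp = [0.49, 0.99, 1.19, 1.29, 1.29, 1.49, 1.99, 2.49]  # $ per song, 8 buyers

def revenue_at(price, wtp):
    """Revenue if everyone whose WTP is at least the price buys."""
    return price * sum(1 for w in wtp if w >= price)

# The optimal uniform price always sits at some buyer's WTP, so it is
# enough to test each observed value.
best = max(set(wtp), key=lambda p: revenue_at(p, wtp))
print(best, round(revenue_at(best, wtp), 2))   # 1.19, 7.14
print(round(revenue_at(0.99, wtp), 2))         # 6.93 - the rigid price earns less
```

With these toy numbers the revenue-maximizing uniform price sits above 99 cents, which is the shape of the study's first finding.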
The authors of this study found that the market could sustain a higher uniform price and, knowing that price, they were then able to expand their analysis to look at per-song pricing and make some other extrapolations. The authors also experimented with a
subscription-type model that had a fixed price component with a per-use fee, and this model appeared to be more effective at maximizing revenue and value for both retailer and consumer.

Pricing is complicated: publishers can approach it in an unsophisticated manner, but in doing so they are unlikely to maximize their revenue. More analysis is likely to show that a variable approach to pricing and packaging will generate more revenue. For example, in an approach the authors suggest for music, a publisher with a selection of 10 political/legal thrillers could generate more revenue selling the package for $29.95 than relying on selling each separately for a total of $79.00. The other advantage for both publishers and consumers is that more content can be purchased, thereby increasing the market and customer base. Regardless, the decisions around pricing are worth spending more time on rather than reactively applying old pricing models to new circumstances. Perhaps we will see 'Pricing Analyst' as a new publishing job title.

Segmenting the Publishing Industry – November 9, 2009

There is a self-publishing conference in NYC this weekend which reminded me of a project I worked on several years ago. After reading an interesting article in the Harvard Business Review about defining a company's corporate strategy, I decided to use the ideas in the article to spur discussion about my client's strategy. The HBR article, Charting Your Company's Future, is available from the HBR site and is summarized as follows:

Few companies have a clear strategic vision. The problem, say the authors, stems from the strategic-planning process itself, which usually involves preparing a large document, culled from a mishmash of data provided by people with conflicting agendas. That kind of process almost guarantees an unfocused strategy. Instead, companies should design the strategic-planning process by drawing a picture: a strategy canvas. A strategy canvas shows the strategic profile of your industry by depicting the various factors that affect competition. And it shows the strategic profiles of your current and potential competitors as well as your own company's strategic profile--how it invests in the factors of competition and how it might in the future. The basic component of a strategy canvas--the value curve--is a tool the authors created in their consulting work and have written about in previous HBR articles. This article introduces a four-step process for actually drawing and discussing a strategy canvas. Readers will learn how one European financial services company used this process to create a distinct and easily communicable strategy. The process begins with a visual awakening. Managers compare their business's value curve with competitors' to discover where their strategy needs to change. In the next step--visual exploration--managers do field research on customers and alternative products. At the visual strategy fair, the third step, managers draw new strategic profiles based on field observations and get feedback from customers and peers about these new proposals. Once the best strategy is created from that feedback, it's time for the last step--visual communication. Executives
distribute "before" and "after" strategic profiles to the whole company, and only projects that will help move the company closer to the "after" profile are supported.

My client was a medium-sized publishing company in a rapidly growing market, and we met to brainstorm about redefining the organization's business strategy. Using the HBR article as a guide, we constructed a set of 'straw-man' profiles describing our client base and key characteristics. First, we constructed a customer-type segmentation:

[Chart: a 2x2 customer segmentation - Professional vs. Amateur on one axis, Non-Commercial vs. Commercial on the other]

Professionals have either a track record of selling titles and/or commercial interests, such as a seminar business, where the book is a component (but not the main source) of revenue. In the latter case, the author/publisher may be less concerned with the commercial success of the title but retain a strong desire to produce a quality published product in the traditional sense. This group is likely to understand the publishing business.

Amateurs may have significant misconceptions about the industry and their capacity to be successful. They will require significant education and (possibly) even motivation to complete their "product." They may develop a personal relationship with the publisher rather than a business relationship and will become more demanding of time and effort than the Professional.

Non-Commercial versus Commercial could be a choice of the publisher as well as a representation of the commercial potential of the product. For example, to a "pragmatist", a book could be a 'give-away' that supports some other aspect of their business and is thus 'non-commercial', but to an amateur the book may be 'non-commercial' because it doesn't have a market.

My client's customer base had expectations about the commercial merits of their products which often did not match reality, and this was important for my client's management to recognize. Most of our customers in the lower-left quadrant would place themselves much further to the right on the commercial spectrum than reality would dictate. We also recognized that placing customers into the lower-right quadrant could not be planned with any degree of accuracy and depended on the willingness of the client to promote and market their title aggressively. Realistically, we felt it was next to impossible to anticipate success in this quadrant. In the upper-right quadrant, we would most likely find established authors, professional speakers and back-in-print titles. (We didn't look at profitability in this exercise but that would be an obvious additional task.)
We then selected a spectrum of key attributes that we believed the publisher's customers valued: price, speed, contact, quality, control, product sales, community, education, ease of use, reputation. Using these attributes (which would be confirmed by research later), we attempted to plot how our customers in each quadrant valued each attribute. Importantly, we understood these drivers to be 'valued' differently by the customers in each quadrant.

Pragmatists: The resulting chart for Pragmatists, plotted for the client and one of their competitors, looked like this:

[Chart: value curves for the client and a competitor (black line) across the key attributes]

This draft profile suggests key areas of differentiation from one player to the other. The competitor (black line) operates at the top of the chart for the drivers that their customers view as critical and gives low consideration to (limits time and effort on) those that do not and which don't support their strategy. In my client's case, we believed customers valued education highly, but we also knew this aspect of the business cost a lot to deliver.

Dreamers: We also looked at the 'dreamer' segment and chose a different competitor which had made a conscious decision to build sales volume with clients in that quadrant. To support this strategy, their revenue model was partially driven by unit sales (of the finished book), and they determined that many of their authors did not care about quality in the same way a traditional publisher/author would. The competitor believed that 'Dreamers' were interested in receiving the end product as soon as possible. In contrast, my client sought to actively engage with the 'dreamer' to produce a better end product. Paradoxically, in the case of the competitor the 'dreamer' may remain blissfully ignorant but happy, while in the case of my client the customer may be dissatisfied because the process took longer, the interactions with staff were frustrating and the choices overwhelming. Same type of customer - "Dreamer" - but different approaches produce different customer experiences and expectations.
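The original charts did not survive the conversion of this document, so the following is a hypothetical reconstruction of what a value-curve (strategy canvas) chart looks like; the attribute scores are invented placeholders, not the client's or competitor's actual profiles.

```python
# Hypothetical reconstruction of a strategy-canvas value curve.
# Attribute scores (0-10) are invented for illustration only.
import matplotlib.pyplot as plt

attributes = ["Price", "Speed", "Contact", "Quality", "Control",
              "Product sales", "Community", "Education", "Ease of use", "Reputation"]
client     = [6, 4, 8, 7, 6, 3, 4, 9, 5, 6]   # invented
competitor = [8, 9, 2, 4, 3, 6, 2, 1, 8, 5]   # invented

plt.plot(attributes, client, marker="o", label="Client")
plt.plot(attributes, competitor, marker="s", color="black", label="Competitor")
plt.ylabel("Relative investment / perceived value")
plt.xticks(rotation=45, ha="right")
plt.legend()
plt.tight_layout()
plt.show()
```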
Strategy: As we discussed these 'straw-man' profiles, we recognized that, for our business, there was a lot of revenue in delivering services to the lower-left quadrant if we could get the business driver mix just right. Our challenge was to understand how to produce that revenue profitably. One obvious solution was to withdraw or eliminate costly services the author/customer is uninterested in. Over-delivering to this segment is pointless (a philosophy that one of our competitors practiced).

We also recognized that classic business strategy suggests that companies endeavor to move their customers in the direction of the upper-right quadrant. In the self-publishing market it would be virtually impossible to turn 'Dreamers' into 'Moneyed'; however, it may be possible to move a small number into 'Lotto Winners.' The assumption would be that these authors have a product with a 'hook' that is somehow unique, and that they are willing to work actively on the book to improve it and support it in the market. An added bonus would be an author willing and able to publish additional titles. Rather than expend effort building marketing, promotion and editorial services (add-ons) for clients in the lower left, one potential strategy would be to expend this effort on the select titles/authors that showed promise of moving to the right along the commercial spectrum.

Using the framework we hashed out over an afternoon, our next step was to confirm the key customer drivers by segment (Professionals, Amateurs), to plot our position and our competitor's, and then to identify our ideal profile. Once we defined this ideal profile, we would build a strategy focused on moving the company from the 'old' curve to the 'new' one. In implementing this approach it is important to recognize that customers dictate the drivers: research is likely to identify a new driver and to confirm that one or more suggested drivers are not important at all. Substitutions could occur, and research should be tailored to uncovering these 'unknown' drivers, not just confirming the ones the staff identifies. Lastly, communicating the strategy internally is important, and using a visual tool like this strategy map makes this easier. Once the 'big-picture' strategy is defined, the other tactical aspects of the strategy should be easier to define. This can be both a fun exercise and one critical to the future success of an organization.

580,388 Orphan Works – Give or Take

Clearly one of the most (if not the most) contentious issues regarding the Google Book Settlement (GBS) centers on the nebulous community of "orphans and orphan titles". And yet, through the entirety of the discussion since the Google Book Settlement agreement was announced, no one has attempted to define how many orphans there really are. Allow me: 580,388. How do I know? Well, I admit, I do my share of guesswork to get to this estimate, but I believe my analysis is based on key facts from which I have extrapolated a conclusion. Interestingly, I completed this analysis starting from two very different points, and the first results were separated by only 3,000 works (before I made some minor adjustments).
Before I delve into my analysis, it might be useful to make some observations about the current discussion on the number of orphans. First, when commentators discuss this issue, they refer to the 'millions' of orphan titles. This is both deliberate obfuscation and lazy reporting: most notably, the real issue is not titles but the number of works. My analysis attempts to identify the number of 'works'; titles are a multiple of works. A work will often have multiple manifestations or derivations (paperback, library version, large print, etc.) and thus, while the statement that there may be 'millions of orphan titles' may be partially correct, it is entirely misleading when the true measure applicable to the GBS discussion is how many orphan works exist. It is the owner (or parent) of the work we want to find.

To many reporters and commentators, suggesting there are millions of orphans makes sense because of the sheer number of books scanned by Google but, again, this is laziness. Because Google has scanned 7-10 million titles then, so the logic goes, there must be 'millions of orphans'. However, as a 2005 report by OCLC noted (a report which I understand they are updating), all types of disclaimers should be applied to this universe of titles, such as titles in foreign languages, titles distributed in the US and titles published in the UK, to name a few. Accounting for these disclaimers significantly reduces the population of titles at the core of this orphan discussion. These points were made in the 2005 OCLC report (although they were not looking specifically at orphans) when they looked at the overlap in title holdings among the first five Google libraries. (And if you like this stuff, this was pretty interesting.) Prognosticators unfamiliar with the industry may also believe there are millions and millions of published titles since, well, there are just lots and lots in their local B&N and town library.

The two methods I chose to try to estimate the population of orphans relied, firstly, on data from Bowker's BooksInPrint and OCLC's WorldCat databases and, secondly, on industry data published by Bowker since 1880 on title output. I accessed BooksInPrint via NYPL (Bowker cut off my sub) and WorldCat is free via the web. The Bowker title data has been published and referred to numerous times over the years and I found this data via Google Book Search; I also purchased an old copy of The Bowker Annual from Alibris.

In using these databases, my goal was to determine whether there are consistencies across the two databases that I could then apply to the Google title counts. In addition to the 'raw data' I extracted from the databases, OCLC (Dempsey) also noted some specific numbers of 'books' in their database (91mm), titles from the US (13mm) and non-corporate 'authors' (4mm). Against the title counts from both sets of data, I attributed percentages which I then applied to the Google universe of titles (7mm). (My analysis also 'limits' these numbers to print books, excluding, for example, dissertations.) In order to complete the analysis and determine a specific orphan population, I reduced my raw results based on best-guess estimates for non-books in the count, public domain titles and titles where the copyright status is known. These final calculations result in a potential orphan population of 600,000 works. I also stress-tested this calculation by manipulating my percentages, resulting in a possible universe of 1.6mm orphan works.
This latter estimate is (in my view) illogical as I will show in my second analysis.
An important point should be made here. I am calculating the potential orphan population, not the number of orphans. These numbers represent a total before any effort is made to find the copyright holder. These efforts are already underway and will get easier once money collected by the Book Rights Registry (BRR) begins to be distributed.

My second approach emanated from my desire to validate the first. If I could determine how many works had been published each year since 1924, then I could attribute percentages to this annual output based on my estimate of how likely it was that the copyright status would be in doubt. Simply put, my supposition was that the older the work, the more likely it could be an orphan. Bowker has consistently calculated the number of works published in the US since 1880 (give or take), and the methodology for these calculations remained consistent through the mid-1990s. According to their numbers, approximately 2mm works were published between 1920 and 2000. Unsurprisingly, a look at the distribution of these numbers confirms that the bulk of those works were published recently. If there were (only) 2mm works published since the 1920s, it is impossible to conclude there are millions of orphan works. To complete this analysis, I aggressively estimated the percentage of works published each decade since 1920 which could be orphan works. The analysis suggests a total of 580K potential orphan works which, as a subset of the approximately 2mm works published in the US during this period, seems a reasonable estimate. My objective of 'validating' my first approach (using OCLC and BIP data) was met: both approaches, using different methodologies, reach similar conclusions.

There are several conclusions that can be drawn from this analysis. Firstly, since the universe of works is finite, beyond a certain point the Google scanning operation will begin to find 'new' orphans at a decreasing rate. I don't know if this number is 5mm scanned titles or 12mm, but my estimate is 7mm because, according to WorldCat, there are 3mm authors to 12mm titles. If you apply this ratio to the Bowker estimate of total works published, the number is around 7-8mm titles. Secondly, publishing output accelerated in the latter part of the 20th century, which means that, while my percentage estimates of the number of latter-day orphans were comparatively lower than the percentages applied to the early part of the century, the base number of published titles is much higher, and therefore the number of possible orphans is higher. Common sense dictates that it will be far easier to find the parents of these later 'orphans'.

In the aggregate, the 600K potential orphans may still seem high against a "work" population of 2.2mm (25%). I disagree, given the distribution of the 'orphan' works (above paragraph) and because I have assumed no estimate of the BRR's effort to find and identify the parents. In my view, true orphans will be a much lower number than 600,000, which leads me to my final point. Money collected on behalf of unidentified orphan owners will eventually be disbursed to cover costs of the BRR or to other publishers. There has been some controversy on this point and it derives, again, from the idea that there are millions of orphans and thus the pool of undisbursed revenues will be huge. The true numbers don't support this conclusion. There will
not be a huge pool of royalty revenues to be ultimately disbursed to publishers who don't 'deserve' this windfall, because there won't be very many true orphans. The other point here is that royalty revenues will be calculated on usage and, almost by definition, true orphan titles for the most part are not going to be popular titles and therefore will not generate significant revenues in comparison with all other titles.

This analysis is not definitive; it is directional. Until someone else can present an argument that examines the true numbers and works in more detail, I think this analysis is more useful to the Google Settlement discussion than referring by rote to the 'millions of orphans'. The prevailing approach is lazy, misleading and inaccurate.
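For readers who want to see the shape of the second (decade-weighted) calculation, here is a minimal sketch. The post does not publish its per-decade inputs, so the output counts and orphan probabilities below are invented placeholders, chosen only to be consistent with Bowker's roughly 2mm-works figure:

```python
# Sketch of the decade-weighted estimate described above. Per-decade
# output counts and orphan probabilities are invented placeholders
# (the post's actual inputs were not published).

output = {   # decade: (works published, in thousands; assumed P(orphan))
    "1920s": (100, 0.70), "1930s": (110, 0.65), "1940s": (110, 0.60),
    "1950s": (150, 0.55), "1960s": (250, 0.40), "1970s": (350, 0.25),
    "1980s": (430, 0.15), "1990s": (500, 0.08),
}

total_works = sum(w for w, _ in output.values())            # 2,000 (thousands)
potential_orphans = sum(w * p for w, p in output.values())  # ~582 (thousands)
print(f"{total_works}K works, ~{potential_orphans:.0f}K potential orphans")
```

Under these assumed inputs the method lands near the 580K figure; different but still plausible probabilities move the total, which is why the analysis is directional rather than definitive.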
The ISBN Is Dead – August 4th, 2009

There are few greater supporters of the ISBN standard than I (and most of us are named "Michael" so we are easily identified); however, I am increasingly concerned about the future health of the ISBN. In its current form the ISBN is not yet dead, but therein lies the problem: 'in its current form.' In order to gain entry to the supply chain, most small and medium-sized publishers will continue to buy their ISBNs from agencies around the world as they have since the 1970s. (In contrast, most large publishers have reservoirs of ISBNs sufficient to last almost forever and only occasionally buy new prefixes to establish new imprints.)

Five years ago, I participated in the once-a-decade ISO ISBN revision process that resulted in the current ISBN standard. (Michael Healy ran this two-year process on behalf of ISO.) That revision included the expansion from 10 to 13 digits, but this was tame compared to the contentious issue of separate ISBNs for every eBook format. I support this position (although I did not have a vote in the revision) and agreed with others who viewed assigning separate ISBNs as consistent with the way ISBNs had historically been assigned to other title formats. Despite the passage of time, this issue continues to generate significant comment and has become (to me) one of several indications that the ISBN in its current form may not be sufficient to support the migration to a digital world.

A second problem the ISBN faces is driven by some down-stream suppliers who don't see the ISBN as relevant. The most prominent (egregious - pick your label) of these has been Amazon - and this is not just because no Kindle title carries an ISBN. Amazon has long been disdainful of the ISBN and, almost from the opening of the bookstore, they assigned "ASINs" to books. In his defining Web 2.0 article, Tim O'Reilly used the example of Amazon's ASIN as an indicator of Amazon's application of the principles of Web 2.0. At the time (while I was at Bowker in 2005), I took a more sanguine view in an email:

Amazon's ASIN creation was built out of expediency. If they received a title from a publisher that (for whatever reason) had no ISBN, they assigned a number just so they could get it in their system. (Don't laugh, we get frantic calls from publishers who are at their printer and don't have a number.) At first they were designating these as "ISBN"s, which we had them change. There was never an intention to take ISBN and make something better and different. So while I would agree on your point about extending the bibliographic content, in the case of ASINs Amazon were not looking to create additional value or take the identifier to some other more valuable place: they needed 10 digits to identify a SKU. Now they have polluted the supply chain with these numbers.

No other vendor has seen a requirement to create their own SKUs; there has never been a need, because the ISBN has been the most effective product identifier ever established. Hence, the lack of ISBNs on Kindle titles isn't really new at Amazon, although it was previously a fairly rare occurrence (albeit from a very large player). Others now new to the supply chain (including suppliers of print-on-demand titles) have decided not to use ISBNs. Some of these suppliers are using the Google Book settlement titles as their 'inventory' and thus, by definition, this issue becomes a significant challenge to the ubiquity of the ISBN.

A third issue concerns the rapid influx of new titles as a result of digitization programs. At this point, it's unknown whether any of these titles will be subsequently broken down into parts (although this seems inevitable), but that further compounds the issue of how ISBNs - or other identifiers - will identify this content. Some may argue that, as the supply chain compacts, the connection between producer and supplier becomes tighter and a specific item identifier isn't required. Maybe that's true; however, I believe it's far too early in the transition to digital content to make this judgment.

Unfortunately, if we shrug our collective shoulders at these issues, this non-action will set a precedent from which we as a publishing industry will be unable to recover. The ISBN standard united the industry from author royalty statement to store shelf and, while I emphasize the ISBN is far from dead, there are sufficient warning signs to suggest that the ISBN may be unable to thrive in the 21st century as it has over the past 40 years. As a community, we need to recognize that the ISBN may not be meeting its intended market need and that the future may make this deficiency even more stark. From an international perspective, ISO could help by reconvening a partial (or full) revision of the standard; it seems incompatible with the speed at which the industry changes that we can continue to live with a 10-year revision cycle. In my view, ISBN could benefit from an accelerated revision cycle, while the result of non-action could be increasing irrelevance. Into this mix I would also add that the ISBN can no longer stand generally independent of other identifiers, such as a work ID or party ID. For example, while assigning ISBNs to pre-1970 titles may make an ISBN agency's revenues bulge, it may not be the most effective proposal for the supply chain. A more appropriate approach may be a combination of work ID, party ID and ISBN and, for this, we require a cohesive methodology and possibly a 'merging' of these standards in a more formal way.
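As an aside for readers who never handled the mechanics of that revision: the 10-to-13 digit expansion prefixed existing ISBNs with 978 and replaced the old mod-11 check digit with the EAN-style mod-10 check. A short illustration (mine, not from the original post):

```python
# Illustration: converting an ISBN-10 to the ISBN-13 introduced by the
# 2005 revision, using the standard EAN check-digit arithmetic.

def isbn10_to_isbn13(isbn10: str) -> str:
    core = "978" + isbn10.replace("-", "")[:9]   # drop the old check digit
    # ISBN-13 check digit: weights alternate 1 and 3 across the 12 digits.
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(core))
    return core + str((10 - total % 10) % 10)

print(isbn10_to_isbn13("0-306-40615-2"))  # -> 9780306406157
```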
A Digital Concierge – May 21st, 2009

Authors, writers, illustrators, photographers, etc. all need to produce content for publishers, but doing so in a world increasingly dominated by technology becomes a challenge. The more
technology is interwoven into the creation and leverage of content, the clearer it becomes that proactively managing the intersection between content creator and technology represents an imperative for publishers. Publishers want their contributors to focus on content creation, not the help desk. As functional responsibilities change within publishing houses, we will begin to see the morphing of the roles of editorial, marketing and promotions assistants into something akin to a 'digital concierge'.

Functional responsibilities are changing within a publishing house not least because the publishing process is becoming less linear. It will no longer be typical that a book 'commissioned' or 'acquired' sits proudly at the front end of a long sequential set of steps that ultimately lands the book on a shelf somewhere. In the new model, a book may be the last item produced after what may look from today's perspective like a meandering route to publication. Truth is, there may not be 'a model' as publishers become more attuned to how consumers want to interact with content and as they experiment. Finding and engaging with an audience becomes both fractured and expansive, and options to interact can seem at odds: Facebook versus MySpace or Twitter versus FriendFeed. A publisher is unlikely to want their 'investment' (i.e., the author) to be distracted by those considerations. Not only will publishers build these relationships on their authors' behalf, they will see doing so as an additional content creation opportunity. The 'traditional book' may reside at the center of additional supporting material, from online chat to PowerPoint webinars to audio and video interviews. Of course, the book may also be a secondary rather than primary outcome of one of these publisher/author social communities.

Social networking is a catch-all phrase that can describe many things, but typically we use it to explain the concept of reaching customers via the web; whether the consumer takes specific action - commenting or emailing - thereby involving them with the content, or the creator (author and publisher) pushes interaction using tools like Facebook, Twitter and MySpace. This can all be overwhelming to an author and, left to their own devices, they are likely to be unsuccessful; hence, the concept of a digital concierge. The job of digital concierge grows in significance as more and more material is introduced to the market via the web. As mentioned above, the web community around an author almost becomes their studio, where new material is introduced, discussed and 'published'. The author will require a digital concierge who will marry and blend the appropriate technology tools so that they are not a distraction to the content producer and so that they complement the experience of the consumer. There is much to ponder here as trade book content moves to the web and the role of the publisher changes. While the job description for the digital concierge may not be written yet, I see this position as potentially critical to the successful migration from a trade print world to one dominated by social communities.

Silos of Curation - April 29th, 2009

Publishers curate content but they don't really do it well. And that's a shame, because curation will be a skill in high demand as attention spans waver, choices proliferate and quality is
diluted by a preponderance of spurious material. That's why companies such as LibraryThing, Goodreads, Shelfari, weread, BookArmy and others like them may be positioned to leverage communities of interest that mimic the 'siloed' offerings that larger and more mature publishing companies have successfully been offering their customers in law, tax and education for many years.

Trade eBook sales are only 1% of total revenues and, while they are growing rapidly, they may take as long as five years to reach 5%. In other publishing segments, this growth curve would be viewed as a failure, as information and education publishers actively manipulate the market to their advantage to force faster adoption of e-content. In pushing adoption of eBooks, trade publishers do not have many of the advantages that information and education publishers have. Generally speaking, information content is more transactional: looking up a reference, seeking a specific citation or definition or article. Educational content is modular, with material that is created for specific purposes but which can also be "rebuilt" for other purposes. In both cases, the producers of these products exert some control over their delivery and have impressed on their markets a platform for delivery. (I have discussed this platform approach here.)

There are several reasons information and educational publishers are able to pull off the platform approach. One is that they have successfully aggregated content around discrete segments: law, tax, financial, higher ed, K-12, etc. This, in turn, has enabled them to clearly identify their markets and build solutions that match their customers' needs precisely. That is not the case in trade publishing. Trade publishing can be anarchic, and it is not uncommon for a publisher of primarily gardening and lifestyle books to publish three or four mysteries as well. While aggregation into silos of content - becoming the science fiction or the Christian publisher (as West and Lexis became the legal content silos, for example) - is possible, it may not be likely. While this strategy may appear logical, I don't believe there is a sense of purpose within trade as there is in information and education. And as publishers continue to be preoccupied with their own corporate brands, the desire to focus on content silos becomes less apparent. Since content is the foundation of the 'platform' approach evidenced in the other segments, a similar strategy will not apply in trade.

Something similar to the platform approach may take shape in a different way, with intermediaries playing the role of curator. This is an approach that companies such as Publishers Weekly or The New York Review of Books might have adopted if they had been more prescient. The capability to guide consumers to the best books, stories and professional content within a specific segment (without regard to publisher or commerce) may come to define publishing in the years to 2020. (See Monday's post.) Expert curation can simplify the selection process for consumers, aggregate interest around topics and build homogeneous markets for commerce. As an added benefit to these intermediaries' customers, publishers will choose to focus intensely on each segment and offer specialized value-adds particular to that segment. As content provision expands - witness the delivery of all the books in the Google
Book project - readers will become increasingly confused and will be looking for help. It seems inevitable that intermediaries between publisher and e-commerce will meet that need.

BookArmy was launched by HarperCollins earlier this year, while the other companies have been around for a while. BookArmy has taken a (thus far) universal approach and lists all books rather than only those published by HarperCollins. BookArmy may or may not be successful, and it is intensely difficult to launch a site like LibraryThing or Goodreads that grips the imagination and passion of its audience, but that is what makes these sites ideal incubators for new thinking and new approaches to publishing. In their current state, most (if not all) of the curation is provided by the community and tends to be post-publication focused. Having said that, it would not be too difficult to see a new 'layer' of curators emerging who could provide direction and recommendations to readers on forthcoming titles. And, in addition, these curators could manage their subject "silo" to help readers better understand and explore their subject without regard to the publisher. Experiments in this area have started with, for example, LibraryThing working with publishers to provide a reviews program for forthcoming titles. Readers don't really care how many books are published in a year, but they do care about knowing which titles they should read based on their interests. Increasingly, help will be on the way, but it is most likely to be presented as agnostic of publisher and curated around logical subject classifications.

Who Wants to Pay for "Content"? - March 9th, 2009

Suggestions that newspapers charge for content ignore deeper questions about their value proposition, the fourth estate and democracy. Reports that the owners of Newsday plan to charge users of their web site for access have been received with equal parts hilarity and incredulity, but this is only one of many public displays of desperation on the part of newspaper owners over the past two or three months. Almost simultaneously with the deluge of bankruptcy filings and threats of closure that have run through Philadelphia, Miami, Chicago, San Francisco, Sacramento and Seattle since Christmas, newspaper owners have been openly discussing the idea of charging for online access. As most readers and users (aka customers) of online news sources know, that approach is not going to work because there is simply no value proposition presented by 99% of the incumbent newspaper businesses.

At a base level, newspapers failed to understand how their customers' needs had changed over the past twenty years. Instead, newspaper owners chose to focus on maintaining their margins and offering dividends at historically high levels, rather than reinvesting in the future. Like many businesses, they made a simple but tragic mistake: they thought it would go on forever. Many large publishing companies were content to pat themselves on the back for attaining economies of scale across their transnational companies, which made newspapers in Salem, Oregon and Newport News look and read virtually the same. In these mid-market locations,
while consolidation had made many cities one-newspaper towns, the genesis of what has become one of the biggest dangers to the survival of the newspaper industry emerged. Community reporting, with the diligence and aggression that supported the development and growth of newspapers all the way back to late 17th-century England, has been on the wane for years. Sadly, local journalism as we traditionally know it is disappearing and, with it, a measure of democracy - particularly as it relates to local, county and state government.

Last week, I was discussing this topic with an acquaintance living in a fairly affluent part of central New Jersey. He noted that, in a wide swath covering eight to ten townships and a number of counties, he wasn't aware of more than one journalist assigned to that market from the larger state-wide newspapers. In Hoboken (regional HQ for PND), where mayoral and city council budget incompetence has seen our property taxes increase 50% in the past six months, there is rarely any local media coverage nor any attendance at city business meetings by traditional media. And forget investigative reporting - even in a state where you could throw a rock in any direction and hit a shady politician. The lack of journalistic attention means that one of the mainstays of democracy (the fourth estate) is eroded, and this is seen starkly in Hoboken, where private citizens are forced (on their own initiative) to file freedom of information requests to gain access to basic public-interest materials such as meeting minutes and financial statements.

Recently, a number of New York and New Jersey newspapers announced they would be beginning a content-sharing network that might enable each to focus more on their local news reporting. However, MediaDaily believes cost-cutting is the focus:

The latest iteration of the new content-sharing model brings together The Record of Hackensack, New Jersey, The Star-Ledger of Newark, the Times Union of Albany, the Buffalo News, and New York Daily News, which apparently organized the consortium. According to the papers, the Northeast Consortium "will enhance each publication's coverage in the region by exchanging articles, photographs and graphics." But the club would probably be better described as a cost-cutting measure, given the dire circumstances of many of America's daily newspapers.

Few of the newspapers we currently recognize will survive in the US. It is just a fact, but the irony is that significant news and community markets exist. New entrants will address this market and, in numerous cases, they are already making inroads with particular market segments. This brings me back to the notion of charging for "content", which many newspaper companies are debating. Newsday might succeed, but only if they are able to establish community, service(s) and context around the reporting they do. The reporting will need to be far deeper and, almost by definition, unique: both in terms of its relevance to the user and the fact of its collection (after all, no one else is doing it). That's a tall order for an organization only thinking about slapping a fee on a product that looks increasingly generic. In their case (and others are thinking the same thing), asking readers to pay for "content" will fail.

About three years ago, a curious guy set up a website named Hoboken411.
He started going to all the council meetings and actually reporting, visiting and reviewing local restaurants, keeping
up with local happenings, generally mouthing off and adding photos. The web site, now a virtual town square, appears to be providing Perry (its founder) a decent wage, but its popularity is really evidenced by the number of comments each article receives. Almost every post (of substance) garners 50 comments and often many more. The 'discussions' are often vitriolic and opinionated, but every local politician and concerned citizen of Hoboken now visits the site to understand what's going on. Make no mistake - this is an 'unprofessional' site (all due respect) by old media's definition, but Hoboken411 is a precursor of the emerging local journalism of the near-term future.

Traditional newspaper media companies are still consumed by "the machine". Obstacles as intransigent as union rules preclude a journalist from carrying a video camera and recording equipment and delivering multimedia presentations on the newspaper's website. Perry and those like him have no such restrictions. Whereas actual newspapers could logically be considered a 'platform' for the delivery of content, these old-line publishers have no online equivalent. Real success in local reporting would require an ability to templatize and automate the presentation of their news, no matter how local the segment. This would allow the newspaper publishers to extend technologies including mapping, photo uploads, comments, polls, groups/community and other services similar to those offered by companies such as yelp.com and craigslist. (In fact, it is hard to understand why there is no local/community news on either of those sites.)

A few years ago, I predicted that the NYTimes would open their platform for other newspapers to use. In doing so, I saw that the NYTimes had the potential to build a revenue stream as a service provider, as well as gaining a wider pool of potential product-development ideas. My thought was not that the Times would license this technology to other large-city newspapers but, rather, that they would do so to medium- and small-sized news organizations. Application of this technology would enable these local news organizations to focus on gathering hyper-local content and building community, while giving the NYTimes a much wider profile. And, obviously, the Times would benefit from important stories that surfaced up from its wide variety of content partners. Last month, at an invitation-only open house for web developers, one of the attendees addressed this very issue, and it seems the Times may be thinking along these lines. How this idea develops will be interesting to watch. Regardless, the Times is something of a different beast even among large-city newspapers. In the US, the WSJ may be the only other paper that could do something similar. In February, the Times launched two 'hyper-local' websites which could represent their first step in developing a more local approach, and it remains to be seen how successful this will be.

We know 'viral' is impossible to bottle, and it is very likely that, when local journalism returns in some organized and coordinated way to the local communities of central New Jersey, its origins will be anarchic; however, if these 'journalists' (like Perry) have access to powerful tools and platforms such as those the NYTimes could offer them, we will see a revitalization of this significant cornerstone of democracy.
Presuming No Book – February 17th, 2009

Henry Ford said "They can have any color as long as it's black" and, in so doing, summed up what industrial production is all about. What we gain in scale, we ultimately lose in choice, becoming - in the process - beholden to the manufacturer to deliver to us what they believe we desire. Manufacturing has obviously come a long way since the age of Henry Ford, and a few industries have even become so flexible that consumers sometimes don't believe they are receiving specifically made products. I remember the Japanese bike manufacturer that had so improved their production processes that they could measure a client for any model of bike in one of their showrooms, build it and ship it to the customer the next day. The problem: customers didn't believe the bikes were made to fit since they received them so quickly. The solution: hold production for a week and then send out the bikes.

It would be a stretch today for anyone in publishing to agree to the proposition that the future of their business could depend on not publishing a 'finished' product. As the industry meanders forward, we are reaching the point when presuming (and, I think, limiting) how a consumer is going to use content will significantly restrict the potential market. Historically, a publisher would codify how their content was to be used by signing a series of agreements for versions in audio, large print, foreign language, book club, etc. Not only are those agreements increasingly cumbersome, they are identifiably restrictive as more and more potential consumers seek out content which is flexible to their particular situation and purpose, i.e., "I'll take the French, large print, audio version please." Only they can't - at least not legally. Making this concoction easy for a consumer to access may seem like a small market opportunity, but the point is much larger: let the consumer decide how they want to use and engage with your content.

There is a huge preoccupation with e-Books today. E-Books are a format and a distribution mechanism and, as such, not particularly interesting. To me, the issue is a little like discussing the capabilities of a new type of printer. The real interest lies in the changing production processes that enable the rendering of content on e-Readers. These are not perfect yet (by any means), and many publishers are still vandalizing print files to get to the format they need for the Kindle or what-have-you. Some may recall that, through the 1990s, consultants and publishing "know-it-alls" spoke about developing production processes that were independent of format, suggesting the end product could appear in any form. Automation and changed processes have, of course, been implemented, but no one reached the point where they could produce multiple versions of the same product in an infinite number of formats and combinations.

At the Start With XML conference (and subsequently at TOC), we witnessed the dawning of a new type of publishing. Fully customizable, adaptable and capable of matching the particular needs of the consumer, XML content will enable the type of flexible delivery of content that is coming. Companies such as SharedBook are positioned to facilitate the unanticipated, unique and creative ways a consumer may want to use your content. And, as a publisher, you should be comfortable with enabling the consumer to - in effect - make his or her own product.
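To illustrate the single-source idea (this sketch is mine, not SharedBook's or any publisher's actual workflow, and the element names are invented), one tagged XML source can be rendered into multiple outputs without touching the content:

```python
# Toy illustration of single-source XML publishing: one tagged source,
# multiple renderings. Element names (book/chapter/title/para) are
# invented for this sketch, not any publisher's actual schema.
import xml.etree.ElementTree as ET

source = """<book>
  <title>A Business Book</title>
  <chapter><title>Pricing</title><para>Charge what the market will bear.</para></chapter>
  <chapter><title>Bundling</title><para>Sell ten thrillers for $29.95.</para></chapter>
</book>"""

book = ET.fromstring(source)

def render_text(book):
    """Plain-text 'business brief' rendering of the source."""
    lines = [book.findtext("title").upper()]
    for ch in book.findall("chapter"):
        lines.append(f"* {ch.findtext('title')}: {ch.findtext('para')}")
    return "\n".join(lines)

def render_html(book):
    """HTML rendering of the very same source, untouched."""
    body = "".join(f"<h2>{ch.findtext('title')}</h2><p>{ch.findtext('para')}</p>"
                   for ch in book.findall("chapter"))
    return f"<h1>{book.findtext('title')}</h1>{body}"

print(render_text(book))
print(render_html(book))
```

The point is the same one the post makes: the source does not presume the output.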
As an example, a publisher can make content available to consumers during what historically may
have been considered the production process: consumers can comment, add their own notes and links, perhaps add their own content and, at the point the consumer is satisfied they have a product they want, they can 'publish' it. That point of publishing may or may not coincide with the publisher's date and, in fact, the publisher may not ever 'complete' their books in a traditional sense but allow them to live and breathe and enable any future consumer to decide when they want to 'publish'. Experiments such as those at Future of the Book have enabled a dialog between reader, author and other readers that may allow a 'user' to choose to 'publish' their book whenever they are ready, even as the conversation continues online. When the consumer does publish this title, they do it their way.

As a publisher, I may be investing in an XML-based content warehouse, but if I continue to think of the traditional output (i.e., printed books where I select cover, trim size and paper, or eBooks where I select the formats and 'extras') then my XML investment is underutilized. I will also be short-changing my customers. As publishers, we need to experiment (and educate authors), but I don't need to (and can't) presume how my customer is going to use the content; all I need to do is ensure that they have the ability to use it any way they want. Across all media, we haven't reached that spot yet. We still make presumptions, based on the limitations of the technology we had at hand, about what our consumers will purchase and how they will use the material. Certainly, there are 'degrees' of XML, and perhaps some publishers only have title and chapter-header tags for their titles, but others are experimenting and building more sophisticated products. Any buyer of a business book, for example, should be able to access the full content, read it and hear it, translate it, create a presentation in PowerPoint or download a 'business briefs' version. If they want, the user should be able to gain access to the document so that, if they want to print 75 copies of selected chapters for their executive outing and bundle the content with other material, they can do so.

In the near future, publishers may decide not to print or 'publish' a finished product themselves. Aside from selected titles - maybe most front-list, where (only as an example) volume may play a role - a publisher may not print any books. Rather, they will enable retailers to print their titles based on demand. In this scenario, Barnes & Noble prints all the titles they carry in their stores, for example. They are effectively on-demand with very little, or no, inventory carry. B&N has the ability to use the 'content' however they want in print form: from premium boxed and signed versions to mass market. In another example, Michaels, the crafts store, enables its customers to integrate 'how to' content from a publisher such as F&W into their scrapbook projects or print selected chapters from books at an in-store kiosk. But these are a small subset of the breadth of content distribution that could occur if publishers stop limiting themselves to thinking about the traditional 'book'.

As described, the application of flexible publishing using XML as a base is easier to comprehend for some publishing segments than others. However, even novels can be read as comic books: long form is always singled out as 'oh, but we can't do that with Christie'. Perhaps, but maybe your
customers can, and enabling that will generate more revenue and more customer satisfaction. A real challenge would be to turn a comic into a long-form novel. Lastly, some argue that the publishing industry is doomed as the tools to reach customers become easier to access and use. That presupposes that publishers and the industry remain static. The future isn't static and, while some publishers are going to fail, those publishers that presented at the Start with XML conference, such as Cengage, Simon & Schuster, Hachette and others, prove that innovation and experimentation are alive and our future may not be so dire.

Pimp My Print - December 10th, 2008

Many pundits pontificate on the demise of publishing (myself included, and some others I could mention) and, while many of these versions of the future are well-intentioned, they often lack substance. Today in Computerworld - an obvious organ of reasoned strategic discussion about book publishing - is a perspective 'from technology' that decries the effort by Penguin and some others to launch their content on mobile platforms as 'painful'. The author's wider point seems to be that publishers need to place their full content - not just snippets - in as many places as possible so that readers/consumers can access it with as little difficulty as possible. Music publishers did not do that and became the victims of rampant piracy; some have argued that it was precisely because electronic access to music content was so limited that piracy took hold. Had there been easy access and easy payment options, perhaps the music industry would be in a different place now. But that is 20/20 hindsight and, at the time, you would have to have been a certified genius to have seen that.

Publishers have a different issue. Reading is immersive: we are active readers and passive (music) listeners. 'Pimping' the content so that it appears on a smartphone or a web browser or a flat panel will only ever have limited success. It is tactically important to do this with the current inventory of content that a typical publisher will own, but that's not going to sustain the future of the business. Any publisher whose digital policies and activities are focused entirely on retroactive conversions and the migration of their historic product packaging to an electronic environment will see their market wither. It is possible that some publishers may make a choice to cash-cow the existing content and sell it on every available electronic platform they can. That makes some sense, but not if in doing so they believe that model will sustain their future publishing programs built on delivering readers a 250-page novel or a 12-chapter business book with an index limited by the number of blank pages left in the last folio.

Pimping the print compounds an issue publishers have faced for a long time (forever?): they don't really know what consumers want. To paraphrase Wanamaker, 'I know only 50% of what I publish sells, I just don't know which 50%' (he said it about advertising). The publisher of the future is going to spend more time understanding the consumer and fulfilling their needs (marketing 101: a need is filled, not created) than transferring the current model to phones, screens and digits. If I were heading a publishing house, I would hire a band of 25-30 year old editors/writers, give them a budget to acquire content and have them build a new 'publishing'
operation unfettered by print runs, business models and pub dates. Their responsibility would be to create content a target market valued enough to use, to experiment with how to monetize the content and to be able to replicate the model. With guidance – not oversight – from the many experienced managers in a typical publishing house, the team won't fail. And yes, I would do this TODAY. So forget pimping existing print and think about delivering content consumers need.

Death of the Big Box - December 3rd, 2008

Travel up Route 17 in northern New Jersey and you traverse the spectrum of retailing. These stores – from Ikea to Kmart – represent the shop windows of late 20th-century retailing but, in contrast to their apparent ubiquity, the days of the stand-alone big box may be numbered. A number of years ago, I saw some old photos of Route 17 and was shocked to learn it used to be a four-lane (two each side) parkway with a wide grass median strip bisecting its length. Today, it is a clogged eight-lane shopping aisle and just one of many similar examples across the US, from Rockville Pike in MD to Beach Boulevard in Orange County, CA. Barnes & Noble, on their call a week ago, noted that many of their leases are coming due and will be renegotiated at lower rates. While this sounds like good news to shareholders, the current dire economic situation, coupled with the Borders situation, will result in a significant reduction in superstore locations. Project current physical retailing trends forward and many current locations become simply unprofitable, even at significantly lower rents. We may be witnessing the demise of the suburban book superstore, and suburban consumers may be indifferent. Online retailing is going to be the huge winner across all retail segments, but particularly in book retailing. We have a perfect storm: an excess of media options reducing the time traditionally spent reading books, the economic slowdown reducing all spending, the increasing acceptance of and comfort with online retailing among virtually all consumers, and the advent of the online superstore, which encourages a cost-conscious basket approach to consumption. Increasingly, all of us – not just those of us who have been checking our bank accounts and buying airline tickets online for years – will be buying everything online, at the best combination of pricing and free delivery, and all without the expense and hassle of traveling. Multi-store malls will live on for many years. In contrast, we will see many large, empty retailing boxes punctuating the sides of our traditional highway shopping aisles. Already this year, the big-box retailing environment is dire, with a range of store liquidations and bankruptcies from Linens 'n Things to Circuit City. In years past, other retailers would fill these spaces with their new formats or new concepts, but those days are gone, never to return. Retailing innovation – to the extent that it exists – is emerging on the web, not in physical retailing. The big losers will be the real estate owners who won't be able to find tenants (there are only so many ice rinks or roller rinks you can have in any one community).
Superstore physical book retailing, particularly its suburban version, may be a casualty. For a strong retailer like Barnes & Noble there will be plenty of time to adapt, but others will fail. The current recession is going to change many things, and some business segments just won't recover as consumers transfer all their shopping online. The economic crisis will push retailing over an imaginary Rubicon: more physical stores become unprofitable, so they close, which reduces consumer access and pushes the consumer online. The cycle repeats itself, and big-box book retailing will be no different. Ironically, big-box retailing made shopping convenient for suburbanites, and retailers chased the consumer diaspora with vigor. The convenience that suburbanites sought is now the undoing of the same retailers that promoted it. Physical can't compete with virtual. Tant pis. But perhaps it's not all bad news. Mitigation may come from a population migration back into city centers, most apparent in big cities like NYC, Washington and even Los Angeles. Couple this urban population growth with the daily office crowd and we have the re-genesis of an old phenomenon: Main Street shopping, which doesn't attempt to compete with the web stores' abundance but serves deeper consumer needs. This is retailing on a small scale, operating with smaller inventories that turn rapidly – call it 'scarcity' merchandising, the notion that if you don't buy it now it will be gone, which is the philosophy of The Gap, The Limited and some others.

Rack Jobbing the E-Book - July 16th, 2008

A change equivalent to the launch of the mass-market paperback just took place, but did you notice? Months in advance of the expected release of the new iPhone, thoughts ran wild on the potential for an Apple iBooks store, as much for its potential impact on sales as for its counterpoint to Amazon.com. With the launch of the 3G iPhone, publishers have been found wanting, sadly waiting for the market to be gifted to us rather than proactively setting out to define it. This post from Kassia Krozser sums it up perfectly:

On a weekend when headlines were there for the grabbing and customers were searching for both toys and content, the publishing industry, perhaps practicing summer hours, was curiously silent. Not a single major initiative, announcement, horns-blaring call to check out these great offerings on iTunes. Call me crazy, but I'd expect an industry that salivates over moving 150,000 units to be all over the potential for reaching seven million "mobile is the future" customers. Are you not out there, listening to readers, gauging their interest? They want, you have, and you're still hiding the goods. I get this isn't the largest market you have, but is that an excuse to sit on the sidelines?

Publishers are again about to have a market dictated to them, even as they continue to complain about the market power of the online retailers. Now $9.99 may become a de facto RRP for eBooks and, as volume increases via the prodigious iPhone App Store, publishers won't know whether to laugh or cry. When mass-market paperbacks gained market acceptance at Woolworths in the 1930s, publishers gained access to a market they never would have developed on their own.
Books were suddenly available for a dime and, as publishers stood on the sidelines, it wasn't until years later that they entered the market directly or bought up the main suppliers. Will history repeat itself, with publishers buying eBook app suppliers like Fictionwise or building their own applications? Hopefully at least one or the other. Traditionally, we think of distribution and content development as separate disciplines within publishing companies, but in the e-publishing world they co-mingle. Content optimization becomes the normative state, where the end user builds their own product out of a content repository created by the publisher, without limitation on how the end product is rendered. The 'distance' between publisher and end user (where distribution as a function currently sits) is wide today but becomes virtually non-existent in the future state. To bring us back to the iPhone circumstance: as long as publishers continue to think in terms of traditional functional silos, roles and responsibilities, they limit their ability to leverage their assets. In contrast, witness Amazon, which has never considered any aspect of the publishing value chain to be off limits; more publishers need to think in this manner if they want to redress some of the advantages Amazon and others retain (or new competitors develop) in the marketplace.

Amazon The Monopoly – March 28, 2008

Trouble at mill. Manufacturing of old had it that the mill owner owned the means of production and the mill workers toiled within an inch of their lives, lived in company barracks, spent scrip at the company store and, if they had anything left, banked it at the company bank. Amazon is a latter-day mill owner. The company is attempting to tie its client/POD publishers to it – to the exclusion of other relationships those publishers may have – through Amazon's web of administrative, financial, distribution and content tools. As a practical matter, it is becoming harder (and may be financially impossible) for many small POD publishers to maintain separate relationships with Amazon and all the rest of the publishing community. The blog world is enraged at the moment over Amazon's new policy on POD. The company is effectively telling POD customers that if you want to sell your POD products via the Amazon store, you need to be on its platform using its tools; if that means all your titles need to be converted, then that's your problem. This is not a situation where these POD publishers can say 'I'll just go some place else'. Amazon has sucked them in with all the wonderful tools it offers publishers and, of course, the sales penetration. In announcing the BookSurge/CreateSpace merger in August 2007, Amazon's senior VP, North American retail, Jeff Wilke said, "The new CreateSpace Books on Demand service removes substantial economic barriers and makes it really easy for authors who want to self-publish their books and distribute them on Amazon.com." As it turns out this is true, but there are some significant caveats. The Wall Street Journal was kind (and misleading) in its assessment of this Amazon initiative:
"Amazon.com Inc., flexing its muscles as a major book retailer, notified publishers who print books on demand that they will have to use its on-demand printing facilities if they want their books directly sold on Amazon's Web site. The move signals that Amazon is intent on using its position as the premier online bookseller to strengthen its presence in other phases of bookselling and manufacturing."

Amazon hasn't been merely a book retailer for some time. While many in the industry – PND included – can't help but admire this company, it has amassed a level of market influence across the publishing value chain that should concern everyone. Today, the issue is focused on a small (ardent and vocal) minority of POD publishers whose entire livelihood, in many cases, is dependent on the Amazon retail expanse. The WSJ should know better. Without being too dramatic, the release of Windows 3.1 heralded a period of intense exclusion at Microsoft: if you didn't play ball with them, you essentially had no marketplace. Perhaps at first blush the publishing industry doesn't appear to have any correlation to the software world but, with the migration to 'platform'-based publishing (a publishing version of iTunes, for example), we are seeing the germination of a world in which there are only one or two legitimate channels to the consumer. If their actions in the POD world over these past two months are anything to go by, then Amazon definitely has monopolistic tendencies.

Munich: February 6th 1958 - February 6, 2008

Today is the 50th anniversary of the air crash that killed eight members of the Manchester United football team, among 23 who died when their plane crashed on take-off. It was the aircraft's third attempt to gain altitude, but the snow and ice that had accumulated on the plane, and the slush at the end of the runway, ensured it never achieved the lift necessary for take-off. The plane clipped a fence at the end of the runway and split open on impact. The team members who died were Roger Byrne, Billy Whelan, Tommy Taylor, Duncan Edwards, Mark Jones, Eddie Colman, Geoff Bent and David Pegg. It is hard to overestimate the impact the tragedy had on Manchester and England at the time. All were members of a youthful team dubbed the Busby Babes, after the team's manager. Many of the dead not only played club football but had already been named to the full England team despite their youth. There was a wider context: the crash occurred only 13 years after the end of WW2, and this team somehow represented a new generation free from the expectation of deprivation and war. In contrast to the US, the nation was only just coming out of the war years, and rationing had only just ended. Duncan Edwards, 'the young colossus', was the soul of the team. At 21, he survived the actual crash but died from his injuries 15 days later. Perhaps the intervening years have added to his mystique, but even in his day he was considered a special footballer. Bobby Charlton, who survived the crash and went on to a phenomenal club and international career, has said Edwards was "his hero and a beautiful, beautiful footballer" and that he has never seen a better one. Bobby played with George Best and against Pele, Eusebio and Beckenbauer, among other great players of the
1960s and 1970s. Family legend has it that some of the team were billeted in a rooming house my grandfather owned near the ground, and that my father had a kick-around with Duncan and the other team members from time to time. Manchester United is a world club, just like the Yankees but bigger. The Munich disaster punctuates any discussion about the team, even today, whether the fan is in Japan, China or England. It is one of those club facts that a new fan – or, in my case, a young fan becoming more aware – is confronted with. The ground, despite all its changes in the years since, still has a clock set to the time and date of the crash. No one visiting the ground can fail to see it. No one knows what the Busby Babes could have accomplished. This team, with an average age of about 22, had already won the league title twice and, on the night of the crash, had just reached the semi-final of the European Cup for the second straight year. Sir Matt Busby, who almost died in the crash, went on to rebuild the team around the nucleus of the remaining players. It took another ten years until the team, led by Bobby Charlton and another Busby wunderkind named George Best, conquered Europe. Today and this weekend there will be commemorations of this event, for 'the young players with the world at their feet – suddenly no more', lest we forget them.

Brands to Publish – January 13th, 2007

Nancy Drew has always held a fascination for me, not because I clamor for a good girlie mystery but because of how the Nancy Drew series evolved. Established by Edward Stratemeyer, the Drew books were written by a number of 'house' writers (Mildred Benson among them), and the books were never dependent upon one author for their success. While the publisher of the titles was little recognized, the Drew series grew to become a strong branded product line and, as such, represents a model today's publishers may want to emulate. Corporate branding exercises little impact in the publishing world: we all know this and, while some publishers have tried to create brand strength (e.g., Paramount Publishing), success has been sparse and probably – in truth – not aggressively sought. There are exceptions. I used to start my Intro to Publishing courses at Price Waterhouse by asking the group to name a publisher. I stopped doing this when a partner once popped up and said HARLEQUIN! While some consumers might be able to identify Harlequin or Hungry Minds or Fodor's, they would be hard-pressed to cite HarperCollins or Simon & Schuster with any relevance. Consumers have little emotive connection with publishing trademarks (a fundamental facet of brand awareness), and publishers are unlikely ever to achieve this connection with consumers. So, in an age in which the author transcends the publisher (Patterson, Grisham, Ludlum, Courtenay), what is a publisher to do? Investing in a branding campaign would be expensive and ultimately pointless, but embarking on a strategy similar to the one that produced the Drew books might be more constructive.
My extrapolation of the Drew example led me to wonder why publishers don't establish their own character-based brands. More publishers will do what Nelson has done and drop imprints, but will they also start to develop their own character-based franchises? Clearly, it is hard to 'bottle' what makes John Grisham a popular writer, but there are examples where existing characters have been extended in new ways. There is a cottage industry of TV soap-opera lovers who create stories, novelizations and back-stories for the characters that appear in the soaps. George MacDonald Fraser took a minor character out of Tom Brown's School Days and created the Flashman series of satirical historical novels. The book packager Alloy Entertainment (which got caught up in a plagiarism charge last year) also operates on a Nancy Drew model. There must be many others. Publishers don't have to look far to see how powerful character-based publishing could be: the comic book industry has been doing it for 50 years. There, the corporate brands (Marvel, DC Comics, etc.) have benefited from the reflected brand identity that characters such as Superman, Spider-Man, Aquaman and others have created in the minds and behavior of consumers. In book publishing, the opportunities to create character franchises are there for the asking. James Patterson has embarked on developing an author/character franchise and, if publishers were smart, they would be thinking about creating contracts that gave them the ability to broadly leverage the characters their authors create. This would include (with the author's permission) ghost-written books and stories featuring the main characters, and the development of derivative story lines out of the books (as in the Flashman example). The opportunity to expand the content output and publish to a 'template' would generate higher revenues for publisher and author, stable and consistent output, and content consumers could enjoy. The above scenario still carries some level of risk for publishers that the 'powerful' author may go off on his or her own. Given the examples in the music industry of late, some have suggested that major authors will do what Radiohead has done and walk away from the traditional publishing model. Some may, but it will hardly be an avalanche, and this threat is no worse for a publisher than losing an established author to a rival house. The bigger question is how publishers can maintain a consistent funnel of marketable branded content. I believe publishers should be attempting to develop their own proprietary content franchises by building character properties in the same way the Nancy Drew series was created. There are several ways to do this. First, publishers can simply buy out an author's work so that they own it in total and can leverage it any way they want. Second, they can license characters from other media: who wouldn't want to read a hard-boiled procedural featuring Law & Order's Lennie Briscoe, for example? As publishers travel down this road, they could evolve into character-based enterprises similar to Disney and Marvel. This, in turn, would make them less susceptible to the whims of authors and the corresponding limitations of their contracts. HarperCollins is owned by News Corp, which owns Fox. Assume that Fox owns the character Dr. House: why don't you see a series of House mysteries written to a formula by 'house' (sorry)
authors whose job it is to churn these out every two weeks? And there is no need to limit the books to Dr. House; any of the characters in the show should be fair game. Publishers who focus on their publishing brands have things backwards: they should see things from the consumer's point of view, and that view is more than likely focused on either an author or a character. Build the product pipeline with a character-based publishing approach and the publisher may grow in the ascendancy. Obviously, authors are a critical component of a publishing house's viability but, as distribution flattens, barriers to entry drop and the industry changes generally, publishers need to reassess their content-acquisition strategies to ensure they have access to revenue-producing assets that will remain with them for an extended period of time. Perhaps the Drew model will become more widespread.

New Model Army of Self-Publishers – September 19th, 2007

The news that AuthorHouse and iUniverse.com were merging was not entirely unexpected, but it is interesting to me that the publishing community basically ignored the event. While it was reported in Publishers Lunch and Publishers Weekly, the PW report focused on the question of job cuts, which may reflect a limited interest in the strategic ramifications this segment poses for mainstream publishers. Led by Lulu.com, this publishing segment is exploding, and the last thing being considered will be job cuts. Just look at the capabilities on offer at Lulu. AuthorHouse and iUniverse complement each other: a number of years ago, iUniverse.com made the strategic choice to add an extensive selection of professional editorial services to its suite, services that surpass those offered by AuthorHouse (and others in the market). Tactically, I think the two companies will slot together like jigsaw pieces. Random House has a relationship with Xlibris and is alone among the major publishing houses in building formal relationships with the self-publishing marketplace. I would expect other major publishers to jump into this space, in the short term, through acquisition. The leverage these companies achieve over their technology, employees and fixed expenses, the processes they have established and the market they have built make them appealing. Ironically, there is a 'democratization of access' underway in publishing which, to date, most 'publishers' have not participated in; but this will change as traditional publishers look to the self-publisher market as a natural product extension. In the case of AuthorHouse and iUniverse.com, each produces over 5,000 titles per year with a total staff of approximately 100. In terms of titles per month and titles per employee, they shame a traditional large publisher. Everyone will argue that the quality of the content produced by self-publishers is poor, but this is no more true than the statement that all content produced by traditional publishers is exemplary. How often has a traditional publisher invested significantly in a title's success only to watch it sell 300 copies? For the self-publisher – with an author-pays model, no inventory and no promotion expense – there is only upside if a title takes off unexpectedly (and sells 300 copies).
I am not suggesting that the self-publishing business model will be adopted anytime soon by a major publishing house, but there are lessons to be learned from the success the self-publishing industry has built over the last 10 years. Enabling technology has produced this 'democratization of access' and, while it is hard to imagine there is that much content to produce, the numbers prove the case. Lulu is producing 4,000 new titles per week for a total of 300,000 newly released titles, AuthorHouse has over 30,000 authors and 40,000 titles, and iUniverse says it has sold over 5 million books. Amazon has invested in this area (B&N is getting out via iUniverse.com), and I see some convergence between the traditional publishing model and self-publishing. The content quality issue is irrelevant: first, because good content will always find its market and, second, because quality in the self-publishing segment depends not on the content but on the service the author receives. Get ready to see traditional publishers adopt some of the practices of the self-publishing industry.

The New Publishing Experience: Build Your Own Book - July 10, 2007

Traveling to a new location for vacation (and sometimes business) can be an exciting event, and generally a lot of planning goes into making the best use of your time. Building your ideal itinerary may necessitate the purchase of several travel guides (or, in my case, diligent note-taking in the cafe at B&N), and I can only imagine that this situation is even more relevant if you travel as a family. Having had a great time – and probably seen only half of what you thought you would – you leave the travel guides behind in the hotel room because they don't fit in the bags. What if you were able to build a specific guide before you left that you could either print out or carry with you as an electronic e-book? This is an idea that Penguin publishing unit DK is experimenting with, allowing users to select content from its travel guides and build their own guide. I found the site a little clunky, but the idea is sound and, as an electronic platform, DK could be in a position to offer far more content than appears in its DK travel books. If Penguin has other travel-related content, this could also be integrated with the DK travel content to create a distinct product with more breadth than a user could get short of buying multiple books. Travel (book) related websites are generating – or have the potential to generate – decent advertising revenues. Since a travel guide is a glorified directory, it will not be long until the web is the primary mode of distribution for this content, as has been the case with traditional data-driven directories (e.g., BooksInPrint). As e-products, integration with content from other publishers, map applications, photos, video and podcasting is not far away. For example: I want to visit Boston, and I build a travel book that includes a history of and background information on Boston, a walking tour of North Boston, a satellite map and restaurant recommendations in and around the walk; after lunch I want to go to the Museum of Fine Arts, where I buy admission tickets, add the highlights-of-the-collection tour and download the MP3 audio tour.
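As a rough illustration of the assembly such a platform implies – a hypothetical sketch, not DK's or anyone else's actual system – the mechanics amount to selecting content modules from a publisher's repository, mixing in the user's own material and packaging the result for whatever output the user wants:

```python
from dataclasses import dataclass
from typing import Sequence

@dataclass
class Module:
    title: str
    kind: str   # e.g. "history", "walking-tour", "museum", "user-note"
    body: str

# A hypothetical repository of publisher content modules, keyed by destination.
REPOSITORY = {
    "boston": [
        Module("Boston: A Short History", "history", "..."),
        Module("North Boston Walking Tour", "walking-tour", "..."),
        Module("MFA Collection Highlights", "museum", "..."),
    ],
}

def build_guide(city: str, wanted: Sequence[str],
                user_modules: Sequence[Module] = ()) -> list[Module]:
    """Assemble a personal guide from the publisher modules the user selected,
    plus anything the user adds themselves (photos, lunch notes, etc.)."""
    chosen = [m for m in REPOSITORY[city] if m.kind in wanted]
    return chosen + list(user_modules)

# The assembled guide could then be rendered as a printable file, an e-book or
# a coffee-table memento; the selection step stays the same regardless of output.
guide = build_guide("boston", ["history", "walking-tour"],
                    [Module("Lunch in the North End", "user-note", "Loved it.")])
print([m.title for m in guide])
```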
Ultimately, I want this 'packaged' so that I can print it out and/or retain it as an e-book or e-collection for future use. But wait a minute: does the interaction end there? Conceivably, I will be taking pictures and forging my own impressions of the visit. And perhaps I want to include experiential things, like what I had for lunch and whether I liked it. So the publishing platform I use to create my travel book of Boston should be something I can edit outside the confines of the publisher-supplied content. As such, the DK application is not so functional, but options are starting to appear elsewhere – and in the future there may be nothing to stop DK from adding this functionality. One such application has been developed by SharedBook, a software company in lower Manhattan. SharedBook works with content owners who want to extend their relationship with their customers, enabling those customers to self-select content and build their own book, adding their own content in the process. SharedBook works with clients who may not seem like publishers, such as Regent Cruises and Legacy.com, but the functionality is similar to what I describe above. Clients of Regent cruises are able to select some core content to create their book while also adding their own specific content: pictures, annotations or full-length essays on their cruise experience. A surprising number of customers take advantage of this program, since it serves as a high-quality memento of their journey. SharedBook has a relatively easy-to-implement solution, and its model has enabled 'non-publishers' to treat as 'content' assets that would otherwise remain one-dimensional marketing or promotional material. In the case of traditional publishers, the SharedBook platform can allow publishers to engage their customers directly – and perhaps forge a stronger link, because the publisher's content travels along with the customer's positive experience. Obviously, customers pay for the privilege of creating their unique books, but the prices are both reasonable and set by the content owner. Back to my Boston example: using SharedBook, I could have a coffee-table book produced with all the elements I selected before I left, those I added during my trip and those I added after returning home. Once home, I could scan the MFA ticket stub and the restaurant menu and add photos with annotations. Then I have my own memento of my trip. Models such as those described above will become more prevalent as publishers see the value in opening up their content repositories and allowing consumers to interact with their content. It is a trend worth following.

Hail the Death of the Book Review Section – April 30th, 2007

Over the past several months, there has been a lot of hand-wringing and wailing regarding the demise of newspaper book review sections. The prevailing view is that if books are not supported by reviews in these publications, then books will be less read. This is nonsense. I am a
staunch supporter of newspapers, but they are locked in a vortex of decreasing print circulation of which the review sections are just a part. Perhaps it should be no surprise that publishers do not want to believe a paper-based medium is fast becoming irrelevant but, rather than try to buck a trend, publishers should be evacuating this medium just as other advertisers have already done. If I advertised on a billboard at an intersection made redundant by a bypass, I would be a mug to continue advertising on the same billboard rather than seek space closer to the bypass. 'Advertising' has to morph into something different. Word of mouth is incredibly powerful, as is replicating some of the in-store benefits (excerpts and chapters), which is how book-focused sites and bloggers can support publishers' efforts. It is doubtful that the existing print display ads work at all. They are not frequent enough and are obviously one-dimensional. Other than for identifiable authors, it is unlikely that one of these ads will hit a reader on precisely the right occasion. These print ads also expire virtually the moment the paper is read. On the web, however, a 'body of work' can develop around a title – multiple reviews plus supporting material from the publisher – that will grow in depth and value over time. Think about how this supports the 'long tail' of publishing. Publishers have more, not fewer, options when it comes to supporting their titles via review sites; however, they do not seem to be using them aggressively. Currently, both publishers and book review editors seem locked into presenting reviews in antiquated ways. The branding and site traffic that newspapers exhibit on their web pages could be better leveraged by publishers to support their titles. In Sunday's NYT review section, Claire Messud reviewed Edith Wharton. There were no first chapters (excerpts), no similar titles previously reviewed by the NYT, no reviews of or books written by Messud, and no purchase option. For many years, UK national newspapers have offered more functionality and purchase options (not via Amazon.co.uk) on their book sites, and this is a lead US publishers should encourage. (The Times.) The suggestion that eliminating review sections from major newspapers will reduce exposure to books is uninformed. There has been an explosion in the number of sites dedicated to providing good, authoritative reviews of books. Most of the work of identifying the best of these has been done, and some sites have become strong 'brands' themselves. It may require more administration to deal with these sites/reviewers versus the metro newspapers and Publishers Weekly, but increasingly the people who buy books are looking to these types of sites to aid their purchase decisions. Not surprisingly, these sites should represent an increasing part of a publisher's promotion plan, suggesting that galleys and pre-publication materials should be circulated to these influencers to support book launch activities. (Librarian's Place, Grumpy, TheMillions, to name three.) A hidden third benefit derives from the breadth of the web itself, which begets a wide expanse of coverage. Reviews of obscure titles can be found, supported by proficient and/or professional writing, and readers do not have to wait until Sunday for the NYT or LAT to tell us what is important or not worth reading.
Additionally, these reviews are supported and linked to by other reviewers and, together with comments by consumers, they further 'legitimize' the review (and reviewer). Hence, over time, a 'body of work' builds to support the titles and continued sales
down the long tail. There is so much more the web affords in support of reading and books that it is tragic so much attention is paid to propping up a delivery mechanism that is not only sub-optimal but in its death throes.

Why Don't Libraries Have Publishing Programs? - April 9th, 2007

My introduction to Charles Bukowski occurred via the display cases inside the Boston University library lobby, and I was drawn to them because I happened to be working in the library's special collections department at the time. The special collections department at BU is quite renowned and was established by Dr. Howard Gotlieb, who recently died. (Gotlieb actually wrote one of my recommendations for business school.) My job was less intellectual than hired muscle: the library was becoming so overwhelmed with boxed submissions that they needed someone to unload the stuff and place the materials in uniform boxes on shelves. I didn't have much time to peruse the material in these boxes, but I do recall a wealth of material from Herbert Swope and Fletcher Knebel, whose boxes were filled with photos of JFK and his family while they were all in the White House. Some of the material deposited wasn't quite so moving or important (at least to my eyes), and in many cases it was clear that entire desk drawers had been upended into a box and sent off to BU. These boxes often included things like gum, blank paper, pens, pennies, paper clips and other detritus of minimal residual value to scholars. BU did have several archivists responsible for cataloging the vast amount of stuff that was deposited. They seemed to work fairly methodically (slowly) to identify the important material and provide tables of contents for scholars. Increasingly, the material in formal special collections libraries like BU's, and in local libraries, is being digitized, and there is little doubt this will accelerate. Books constitute some of this material and are included in scanning projects, but the bulk of material in these collections is in non-book formats: documents, letters, posters, artwork, banners, etc. Displaying this material is regarded as an important activity at libraries; after all, they have expended the effort to collect and catalog the material, and they want people to know they have it. Hence the display cases at BU and in the lobbies of many other libraries. On a sales call to a small public library in Redlands, CA a number of years ago, our meeting was held in the special collections room, which contained the library's collection of local southern Californian historical material. Much of this material probably doesn't exist anywhere else, and sadly patrons had to ask for permission to enter the room. The glacial progression towards digitization of this material does mean that patrons will eventually have more access to it online, but it will take some time. Digitization will enable more opportunities for the library to benefit commercially from this material, and I am curious why more libraries are not recognizing these opportunities. Two of them are the electronic version of the traditional display case and traditional publishing. Both require the touch of the archivist/curator to prioritize, explain and make relevant
the chosen material. Not everything in a collection will be important or interesting enough for the average patron, and the editing function remains important to ensure that the patron's interest is held through the presentation. The electronic version of the display case is the computer terminal and/or online access enabling self-directed exploration of the material, and these are showing up in some libraries. In an electronic collection, this material should be available to other institutions that want access to it, where it could add to or enhance material they may also be featuring. The network aspect of intermingling collections and expertise is nascent in the library world but could become a very exciting area of study. Increasingly, much like museums, libraries will be able to develop programs and special events that feature their special collections content, not only at a reasonable cost but also as a revenue generator. Traditional publishing can also support and enhance the display and exploitation of library special collections. Many of us are familiar with the museum shop experience, which can be irritating because it often appears overly commercial; however, the reason these shops exist is basic economics: the products sold are a material support to the institutions. Virtually all museums retain extensive publishing programs for everything from books and exhibition catalogs to greeting cards, postcards and posters. Digitization will allow even small libraries to leverage their content in revenue-producing ways. Ideally, the most savvy library administrators are going to realize that the revenue opportunity could actually pay for the digitization. After I graduated from BU, I became the book buyer at the Museum of Fine Arts in Boston, and that experience showed me how critical an intense focus on leveraging the collection in all commercial aspects is to retail revenues, special events and shows, and donor participation. Obviously, the correlation between your typical local library and the MFA is tenuous, but the lessons are there to be learned in how to build new and recurring revenue streams that can be channeled back into the library. Once the content is in digital (leverageable) format, it is no slam dunk that your typical librarian is going to be able to produce a printed book, but today there are more readily available options for print production. All of the 'photo book' providers, such as Blurb.com and Picaboo.com, offer templates and functionality that could provide the basis for a publishing program – at least something a library could test without too much downside. The drawback of these providers is that the retail price point for their products would probably be too high to create much demand. On the other hand, the self-publishing programs offered by Lulu.com, Xlibris and iUniverse may be the answer, especially as they become more sophisticated about format and color. Even now, quality from these vendors is high enough that patrons would pay for the books. As any museum publisher will tell you, the number of units sold annually of in-house titles – published to support specific events and to showcase the collection – would amaze you. I am convinced there is a business opportunity or consulting practice here for someone to help libraries build publishing programs or digital collections that will enhance their revenue base.
Not every library is going to have a collection worthy of digitization, but those that do will increasingly see revenue opportunities from catalogs or a publishing program. Who knows, perhaps BU will get around to publishing their Charles Bukowski collection.

Borders Strategy Plan: What They Could Have Said - March 26, 2007

Last week George Jones, the recently appointed CEO of Borders Group, Inc., released his strategic vision for the next three years. There was little in the document to inspire, and it was replete with suggestions that the route to success for Borders was to travel the road already trod by their stronger competitors rather than develop a set of bold new ideas. Coupled with this mediocre set of objectives was a time frame that seems embarrassing given the critical issues Borders and the retail book industry are facing. Borders' sales per store and per square foot, which lag the competition, are declining; the company has embarked on a diversification program that continues to draw attention away from the core products; and it proposes to withdraw from the international market, which appears to produce 50% more revenue per store than the domestic business. What, then, might George Jones have said?

Dear shareholders,

Borders is a company in transition in an industry in transition. Borders customers now find the products we sell in more non-traditional outlets and at lower prices. Our customers have more entertainment options limiting the time they spend on reading, and the changes in the music and DVD industries are fundamentally changing the way our customers use and purchase these products. It may be only a matter of time before it becomes unprofitable for Borders to sell physical units of music and DVD products. These are not issues that Borders faces exclusively but, over the past three years, the company has failed to proactively address these marketplace changes. While our in-store experience has grown confused and directionless, missteps in our internal operations now limit our ability to support an effective platform for growth. We have to admit that continued investment in our store management and merchandising technology will not produce or enable the rapid changes in operating efficiency that are required to effectively implement our strategic goals. The Borders brand remains highly valued both domestically and internationally and, as we consider our strategic options, we must resist the urge to adopt 'me-too' or duplicative retail models in this crowded marketplace. To that end, we will focus on maintaining a unique value proposition for the Borders brand and retail experience. Our goals over the next three years are to:
• Lead the industry in sales per store, sales per square foot, fill rates and inventory turn, while maintaining growth in new store openings.
• Aggressively eliminate non-core expenses in operations via strategic partnerships across the supply chain.
• Revamp the Borders retail experience by redesigning our stores, implementing state-of-the-art technology and integrating web retail into the stores.
• Expand our international retail operations in combination with partners and expand our Seattle's Best Coffee relationship, while devolving our investment in Paperchase.

Over the next three years the company will focus its improvement program on three areas: (1) store improvement and better merchandising, (2) operational improvements and efficiencies and (3) expanding retail internationally and via the web. Each of these initiatives is addressed as follows.

Improving the In-Store Experience: We are in the process of simplifying our in-store product mix and plan to temporarily reduce the amount of in-store product by 25-35% over the next six months. At the same time, we are aggressively revising and reassessing how we use our store-level sales data and have established a task force, with an aggressive time horizon, to identify an effective management reporting package so that the company can better plan store inventory and product mix. (We also plan technology improvements at the store level, discussed below.) Once we have better management information, we will begin to experiment with incremental and regional additions to store mix that we expect will support store profitability. Selling books, music and movies is our strength. Music and DVDs are important, but the long-term viability of these product segments is suspect as on-demand, downloading and other direct-to-consumer distribution patterns become predominant. Frankly, we are not a music and DVD destination store; we recognize that we sell these items as add-ons to book purchases, and they do improve average revenue per customer, but selling the products in their current form is not a long-term strategy. Borders will continue to experiment with different mechanisms for selling music and movie content that will enable these segments to remain important revenue sources for us. We also recognize that our in-store layout and retailing environment has grown stale and boring. As previously mentioned, we are in the process of redesigning the in-store concept, and this is a matter of significant importance for the company. We expect to launch the new store concept no later than the fourth quarter of 2007. In this important initiative, we do not expect a 'me-too' design that simply replicates the store features of our competitors; rather, our objective is to develop a unique approach to book retailing that combines an increased awareness of the correct product mix for our stores with market research and marketing statistics to determine the store features that
resonate with our customers. Initial research suggests our current stores are bland and confusing to customers, who often leave without finding the books they seek. As previously disclosed, the company will reduce the number, and revamp the product mix, of our Waldenbooks mall stores. Critical to this effort are effective management reporting metrics enabling correct executive management decisions to close or significantly revise specific Walden stores. We expect to close 25% of our Walden stores over the next 18 months. While our small mall retailing business has declined, we still believe that mall-based retail outlets represent legitimate opportunities for Borders to retail our products. Alongside our Walden rationalization plan, we plan to test and launch a 'mini-POD' bookstore concept. These small stores will be located in high-traffic areas such as medium-to-small mall spaces, public spaces, high-traffic retail space and potentially within other retailers' spaces. These 'mini-POD' stores will sell fewer than 200 titles (all best sellers, based on our store POS data), be staffed by one clerk and cover less than 200 sq. ft.; they can be either permanent or temporary fixtures, depending on context. If the tests prove successful, we expect to have over 1,000 of these mini-POD stores in place by the end of 2008. Our airport store growth and re-branding effort has been resoundingly successful, and we will continue to expand this program in the North American market. As part of our international expansion, we will consider opportunities to extend the model into the developing air transport markets of Asia. The Paperchase acquisition has been an interesting experiment, and the company's stores continue to do well under Borders management. Regardless, Borders' future is in the sale of entertainment products, and Paperchase will be carved off as a separate business and eventually sold to its management. We believe there is a future for this line of business, but the synergy with entertainment products, with our internal processes and between our vendors and those of Paperchase is tenuous at best. We believe we can generate higher sales per square foot from our traditional product mix and Seattle's Best Coffee.

Operations Review: Borders must rationalize our internal operations so that we can focus on our core expertise. We are not proficient at software development, distribution, fulfillment or logistics, and over the years these areas have diverted too much management time, resource and money away from merchandising, retailing and brand development. It is our goal to seek strategic partners to further outsource our warehouse, fulfillment and distribution operations, and to seek efficiencies in our logistics operations – particularly store fulfillment. Lastly, under discussion is the possibility of outsourcing the management information systems that support our store-level point-of-sale systems and connect those store systems to our merchandising systems. We believe the only way Borders can achieve the state-of-the-art technology critical to our success is to
partner with a provider (or providers) who is simply better at implementation and IT management than we are. Discussions are advanced in these areas. Coupled with this operations review, we will work with our vendors to implement Radio Frequency Identification (RFID) throughout our supply chain. We believe this initiative will have particular value at the store level. Experience in other businesses indicates that this will be an expensive initiative, but it will lead to the following material benefits:

• RFID on all book product and in all stores within 18 months (on 85% of in-store products, and 100% by end 2009)
• 100% location data for all products in store: the ability to locate any item
• Virtual elimination of theft
• Increased in-store fill rates: we expect incremental sales increases of 5-10% of current store revenue ($250,000-$500,000 per store annually)
• A 10-20% reduction in out-of-stocks
• A reduction in the time needed to stock new stores to 10% of today's (two days, from two weeks)
• Removal/reallocation of slow-moving stock: rapid, even immediate, understanding of stock mix versus sales
• Daily inventory counts, eliminating the need for a physical inventory
• A faster product receipt and returns process

We expect to lead the book industry in this initiative, and we also expect the initiative to pay for itself in increased sales, better merchandising and higher customer loyalty within a 36-month period. Complementary store-level technology enhancements will also enable wireless couponing, self-checkout and cross- and up-selling opportunities. These are the types of critical success factors that will lead to the industry-leading revenues per store and per square foot to which we aspire.

Expanding Our Retail Outlets: Finally, the company has questioned the continued development of our international operations, but we remain committed to expanding this business via merger with a leading non-US retailer. Together, we will aggressively expand the superstore concept to Asia and the Middle East – particularly China and India – and to South and East Africa. Additionally, we will seek a similar partnership arrangement in South and Central America, where we believe the superstore market for books, music and DVDs is largely untapped. The development of these markets is expected to take place through a combination of franchising and store-owned operations. We expect to announce our first Borders stores in Buenos Aires and Santiago by the end of the year. The UK book retail market is currently in turmoil, and we will seek to take aggressive advantage of this, leveraging our strong position in that market with assertive merchandising and product discounting to drive traffic to our stores. We will close underperforming stores and expand the superstore concept in the UK and Ireland as results dictate. All told, we
expect our international operations to grow to over 300 stores by the end of 2012, and we believe our continued success internationally will derive from our steep investment in our US store operations. Lastly, we have commenced development of a Borders.com web retailing site, which we believe will fill a missing link between Borders and its customers. We will use this site as a marketing and customer service solution to strengthen the bond between Borders and our customer base (particularly the 17,000 members of the Borders awards program). There will be close coordination between the redesign of this program and the development of the website. The success of this site will be determined by the strength of the connection we can make between the physical stores and the web experience. We expect to make the Borders website a destination site enabling social networking, user tagging and customer retailing options similar to those available on auction sites. We recognize this strategic plan represents a series of ambitious but attainable goals. Importantly, across our business we will focus on our core competency: selling entertainment products in a conducive retail environment supported by strong merchandising and management information. Supporting this development will be an aggressive efficiency program that will realign our operations to reduce the time and expense spent on activities we are not good at, while investing in technology solutions that support our long-term goals. Many of these initiatives are in process, and we look forward to bringing you up to date with them over the next three months.

Gift Registry for Libraries – January 19, 2007

Mrs PND and I got married at city hall, and our marriage was announced to friends some time later, mainly because we worked in the same department at the same company. As a result, we didn't get too many wedding presents because, strictly speaking, we didn't have a wedding. But I digress. I have always thought that the concept of the bridal registry could be extended to many other business areas, in particular libraries and charities. A number of years ago, I suggested to our BooksInPrint development team that we add this type of feature to BIP. My idea was that, similar to a bride and groom selecting the gifts they would like to receive, a library could list the books, materials and other items it would like patrons to purchase for addition to its collection. As patrons selected the items they wanted to buy, the transaction would run through the library's acquisition module, so that the purchase would take advantage of any vendor discounts the library received (Baker & Taylor, Brodart, etc.) and the books/materials would be delivered shelf-ready, as though the librarian had ordered them. (The supplied MARC record could even indicate who purchased the items via a 'fund code'.) The patron would be charged via credit card for the purchase, and, as with a bridal registry, the items would be removed from the list as they were purchased.
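A minimal sketch of that flow – hypothetical names throughout, not BooksInPrint's or any ILS vendor's actual API – might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class WishlistItem:
    isbn: str
    title: str
    list_price: float

@dataclass
class LibraryRegistry:
    vendor_discount: float                      # e.g. 0.40 via Baker & Taylor
    wishlist: list[WishlistItem] = field(default_factory=list)

    def purchase(self, isbn: str, donor: str) -> dict:
        """A patron buys an item off the registry: the order runs through the
        library's acquisition module at the library's discounted price, the
        item drops off the list and the donor is recorded as a MARC fund code."""
        item = next(i for i in self.wishlist if i.isbn == isbn)
        self.wishlist.remove(item)               # like a bridal registry
        charge = round(item.list_price * (1 - self.vendor_discount), 2)
        return {
            "order": {"isbn": item.isbn, "price": charge},
            "marc_fund_code": f"GIFT-{donor}",   # credits the purchaser
            "charge_donor_card": charge,
        }

registry = LibraryRegistry(
    vendor_discount=0.40,
    wishlist=[WishlistItem("9780000000001", "Local History of Redlands", 40.00)],
)
print(registry.purchase("9780000000001", donor="PND"))
```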
How a list is created is a detail that could require some work and maintenance by librarians, unless there was only one purchase list. Suppose patrons saw the title acquisition or selection list that librarians were proposing to buy from, and were able to select from it the titles they wanted to buy for the library? Simple. There may be other benefits that could result from something like this, but three are obvious: first, the library gets a cost-effective way of adding titles to its collection without spending over its budget; second, the library gets titles it wants rather than donated titles of dubious value; and third, the librarian retains 'control' of title selection and collection development. Lastly, the concept doesn't need to be limited to books and materials but could include donations for anything the library spends money on, from a new roof to a bookmobile. With access available via the web, patrons wouldn't necessarily need to be local to take advantage of the program. What reminded me of this idea was the announcement today of a program launched by Amazon.com named "Wish for Lit." While I suppose I have to laud the gesture, it is pretty small beans given the volume of business libraries must push Amazon's way, and it is a one-off. Also, isn't it generally accepted that libraries are in need? So why have them jump through hoops to enter a contest for a measly $5,000? Why not pick names out of a hat? Offering a tool similar to my idea of a 'donation registry' at a network level, as part of BooksInPrint or WorldCat, would be easy to implement and painless to integrate with selection and purchase modules.

Just Trying to Keep My Customers Satisfied - January 18, 2007

I feel like a total monkey when I walk into Starbucks; that is, since I read "Breaking the Trade-Off Between Efficiency and Service" by Frances X. Frei in the November 2006 edition of the Harvard Business Review. The article is about how service businesses struggle with the impact customers have on their daily operations: the fact that customers interfere with the smooth running of those operations. Who hasn't heard someone in the service business lament, "…if it wasn't for the customers, this business would work perfectly"? Service businesses by definition rely on customer interaction. The problem is that this interaction is often unmanaged, and unmanageable, by the service provider. The impact is often seen in inconsistent service, and the not-insignificant task of service providers is to deliver a consistent level of service despite the level of interruption by the customer. Frei goes on to discuss five types of customer variability:

• Arrival – you can't always anticipate when customers will show up
• Request – sometimes they want it their way
• Capability – perhaps the customer knows a lot, and sometimes they are clueless, which is especially relevant if they play an active role in the process
 Effort – the customer may be more or less willing to participate in the process
 Subjective Preference – is the customer happy with hand-holding or embarrassed by it?
In managing this variability, the manager faces a choice between accommodating it and attempting to reduce it. This is a trade-off that could bankrupt the organization if it goes to extremes in either direction: offer no flexibility and customers leave; offer too much and it costs too much. The actual solution is more nuanced, and Frei discusses a number of options and companies that have been able to maintain expected service levels without going broke.
Which is where the Starbucks reference comes in. I sometimes get a 'bar' drink at Starbucks rather than a regular coffee. As I stand in line I find myself reciting the proper syntax so that when I get to the counter I don't embarrass myself by getting the order wrong. (The Economist recently recited a Starbucks bar drink order in an article, got it wrong, and it was corrected in a letter to the editor – which they dutifully printed.) Mine is a Grande, 2% Extra Hot, Whip, Hot Chocolate. Starbucks does this by design: notice that the employees call out each drink in full every time. This teaches you, the customer, to remember the syntax so you can get it right next time, which reduces variability, speeds up the line and spares you embarrassment.
This article was given to me in connection with some consulting work I was doing, but it is interesting reading for anyone in the service business who needs to ensure a consistent, cost-effective customer experience.
My Education Space: 'Ed-Space' - October 17th, 2006
Did you ever wonder what it would be like to revisit some of the projects and papers you wrote in college, or recall some of the essays you either wrote or read for books you are now re-reading? If you are like me, you probably don't care about everything you studied in school, but for some of the material it could be fun to experience again what is still meaningful to your interests. When we experience life we generally do not take time to gather the detritus that reminds us, years later, of the experience or enables some recent connection to it. As time goes on we often regret not being more careful about some of this stuff. At least I do.
The social networks that My Space, Friendster and others have created have not yet reached their potential in terms of the functionality and services these sites could deliver. One area in particular where I believe we will see more application of the My Space experience is education. In the not too distant future I believe students at universities will have their own 'Ed-spaces' that will be hosted by their institution and will provide access to all university services, course content, testing and comprehension applications, lecture notes, text material and other ancillary services such as administration modules. Additionally, this 'Ed-space' will also host all the content the student produces - test papers, essays, writing assignments, presentations, etc. - during their education.
The textbook material will be maintained as an electronic bookshelf which the student can access for as long as they retain the relationship with the institution.
The establishment of this university 'Ed-space' will create a long-term compact that ties the student to the institution. In effect, the educational institution becomes an accessible repository for the student, which in turn supports a long-term, mutually beneficial relationship between student and institution. Perhaps the student retains some limited functionality or access immediately after graduation, but as they age they are able to participate at different levels that enable greater functionality and access to more content and services. Once the student graduates, this 'Ed-space' will become the basis for all alumni relations, social networks with classmates, job and message boards and the like. For the institution this becomes a powerful tool for lifetime learning, alumni relations and fundraising. As the student's interests develop and grow over the ensuing years, the 'Ed-space' would allow access to educational content, library materials and academic experts provided by the institution. The student would also benefit from relationships with other ex-students interested in similar subjects. The community would also enable new services the university could sponsor, such as conferences, field trips and webinars particular to alumni interests, all of which would strengthen the relationship with alumni and generate additional revenue for the institution.
This model would also mean that educational institutions could wrest control of the student relationship away from publishers, who are also trying to establish long-term relationships with students. Publishers would be able to market their life-long learning materials and perhaps engage in specific community development, but it would all be within the confines of the institutional 'Ed-Space' paradigm. Naturally, students would be suspicious of aspects of this model, but a degree of freedom, a single access point to a personal content repository, and built-in access to content and a social network would be material benefits to them.
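To make the tiered-access idea above a little more concrete, here is a minimal Python sketch. It is purely illustrative: the tier names, the service list and the entitlement rules are assumptions of mine, not a description of any institution's actual system.

```python
from enum import Enum

class Tier(Enum):
    """Hypothetical stages of the student-institution relationship."""
    STUDENT = 1
    RECENT_GRADUATE = 2
    ALUMNUS = 3

# Illustrative entitlement map: which 'Ed-space' services each tier can reach.
ENTITLEMENTS = {
    Tier.STUDENT: {"courses", "testing", "lecture_notes", "bookshelf", "admin"},
    Tier.RECENT_GRADUATE: {"bookshelf", "job_boards", "classmate_network"},
    Tier.ALUMNUS: {"bookshelf", "job_boards", "classmate_network",
                   "library_materials", "conferences", "webinars"},
}

def can_access(tier: Tier, service: str) -> bool:
    """Check whether a member at a given tier may use a service."""
    return service in ENTITLEMENTS[tier]

# A graduate keeps the electronic bookshelf but loses course-admin access,
# and gains richer alumni services as the relationship matures.
assert can_access(Tier.RECENT_GRADUATE, "bookshelf")
assert not can_access(Tier.RECENT_GRADUATE, "admin")
assert can_access(Tier.ALUMNUS, "webinars")
```

The design point is that the entitlement map, not the content, encodes the lifelong compact: the institution widens or narrows access as the relationship changes, without ever deleting the student's repository.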
Books sell exceptionally well online, but their merchandising could adapt to smaller-format retailing. Urban book retailing will continue to be dominated by chains; the costs of the sophisticated merchandising and supply chains this type of retail environment will require are simply prohibitive for independents. Regardless, store size will shrink as the inventory mix skews to movie-style 'openings' and 'events' designed to bring in a volume of buyers in a short time frame. A publisher once told me that if he owned a store he would stock only 40 best sellers. That concept (or a variation) will become the next phase in physical book retailing. Will it be the last hurrah?
The Textbook in the 21st Century – July 13, 2006
As I have mentioned before, I believe educational publishing - particularly in college - to be an industry ripe with opportunity and therefore very interesting. Challenges obviously exist, but educational publishers have a chance, facilitated by the internet, to build a virtuous circle connecting the publisher/author, student, educator, advisor and institution. (Even the parent could be part of this grouping.) Traditional publishing content remains the 'glue' within this grouping, but publishers are also building 'platforms', adding sophisticated testing and evaluation modules, administrative modules and, ultimately, a social networking component that will facilitate a level of communication among these groups heretofore unheard of. I have spoken a little about this in an earlier post.
There will be many changes resulting from this different publishing paradigm, not least to the content itself. Few publishers would contest the notion that the existing construct of the traditional textbook will change, and soon. In the not too distant future the course textbook is going to have more in common with an online newspaper than it will with a physical print product encased in board. Editorial and authorship may become more important than they currently are, since the product will become dynamic: subject to ongoing news events, reinterpretation and feedback from users. The incorporation of audio and video, blogging and chat also adds a 'real time' component that will require monitoring and management.
What is interesting about this article in the NYT today is that it highlights the significant fallibility of the textbook unit when viewed from today's 'instant update' environment. The article points out a number of things, including the surprising similarity across texts, the apparent lack of motivation to change - evidenced by continuing to publish 'name' authors in updated editions even after they were deceased - and the clear lack of feedback from user to publisher/author, which allowed continued publication of the same material year after year. In many ways the article makes publishers out to be dummies, but there may have been an important reason to stick with a known author for many years: profits. Institutions, professors, etc. act conservatively and go with what they know. Rebuilding around a new author increases the risk that the customer will go to a competitor. In addition, from a publisher's point of view it is easier to tweak an existing text than to start over with a new one. (Although in reading this article you may gain the impression that none of them ever starts from scratch.)
All the big educational publishers - Wiley, Pearson, Harcourt, McGraw Hill - are building online educational content that is, or will represent, a fully interactive educational product. What these publishers are really after is direct access to the student: the ability for a student to build an online bookshelf of educational material they can carry with them forever, and a relationship that continues after the student leaves college. Exciting stuff if you are a publisher. And great benefits for students and educators as well.
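As a closing illustration of the 'textbook as online newspaper' idea, here is a small, hypothetical Python sketch of a chapter that layers timestamped updates over a base text and collects reader feedback for editorial monitoring. None of the names correspond to any publisher's real platform; this is only one way the dynamic-content model might be structured.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Update:
    """A dated revision: a news event, reinterpretation or correction."""
    posted: datetime
    note: str

@dataclass
class DynamicChapter:
    """A chapter that behaves more like a news page than a bound book."""
    title: str
    body: str
    updates: list = field(default_factory=list)
    feedback: list = field(default_factory=list)  # reader comments, for editorial review

    def post_update(self, note: str) -> None:
        # Revisions layer on top of the base text, newest first.
        self.updates.insert(0, Update(datetime.now(timezone.utc), note))

    def latest_view(self) -> str:
        """Render the chapter with its most recent updates on top."""
        banner = "\n".join(f"[{u.posted:%Y-%m-%d}] {u.note}" for u in self.updates)
        return f"{self.title}\n{banner}\n\n{self.body}"

chapter = DynamicChapter("Monetary Policy", "Base narrative text ...")
chapter.post_update("Revised discussion following this week's rate decision.")
chapter.feedback.append("Figure 2 contradicts the new paragraph - please reconcile.")
print(chapter.latest_view())
```

The sketch captures the two properties the post predicts: the base text stops being the whole product, and the feedback channel becomes something the publisher must actively monitor and manage.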