Content Solution Quick Start (June 2014)



An annotated slide deck from a webinar hosted by Stilo International and conducted on June 24, 2014.

The talk introduces tactics for moving a content solution project forward quickly while also attending to essential details.


  1. 1. Copyright © Joe Gollner 2014 Content Solution Quick Start Program @joegollner
  2. 2. Commentary: Introduction This presentation was delivered as a webinar hosted by Stilo International on June 24, 2014. Initially titled a “DITA Quick Start Program,” this talk is in fact more general than that. The talk does touch on how the Darwin Information Typing Architecture (DITA) encourages and supports quick start programs. The goal of this talk was to introduce some tactics that have proven useful in getting content solution projects off the ground quickly…
  3. 3. An Acronym for All Seasons “A good plan violently executed now is better than a perfect plan executed next week” - George S. Patton It should still be a good plan…
  4. 4. An Acronym for All Seasons: ASAP – Analyze, Survey, Articulate, Prototype
  5. 5. A Left-Right Combination: Left Jab. Analyze: • Analyze current content & processes • Identify & prioritize improvement opportunities. Survey: • Solicit stakeholder inputs on opportunities & risks • Gain insights into the “political dynamics” at work
  6. 6. A Left-Right Combination: Right Hook. Articulate: • Document the change steps to be taken • Explain the business drivers behind the changes. Prototype: • Illustrate new capabilities and key benefits • Make the improvement plan tangible & compelling
  7. 7. Commentary: The ASAP Acronym Adopting an acronym will always make things appear somewhat artificial. Hopefully it also makes them more memorable. In this case, ASAP reminds us that each wave of activity should include an element of analysis (where we try to understand the needs & goals) and an element of engagement (where we try to get stakeholders involved in the process). Hence Analysis is balanced by Survey (asking for inputs) & Articulate is balanced by Prototyping (showing what is possible).
  8. 8. Analyze: Adopt a Content Life Cycle Model Content Acquisition Content Management Content Delivery Content Engagement
  9. 9. Commentary: The Content Lifecycle Executive management is typically familiar with quadrant models. This content lifecycle model works with this common structure for setting out the activities governing content lifecycles. Quadrants on the left are “internal” and those on the right are “client facing”. The upper two are focused on the content itself, while the lower two focus on data & actions applied to content. See The Content Lifecycle
  10. 10. Radical Element: Content Engagement • Content Engagement stands out as the most novel element in this Content Life Cycle Model • It focuses on how content is used & how the user community can become actively engaged in a process of continuous & constructive change
  11. 11. Analyze: Apply an Evaluation Framework (quadrant chart: Content Delivery, Content Acquisition, Content Management & Content Engagement, each scored on an axis from 1 to 10)
  12. 12. Commentary: Evaluation Criteria Each quadrant is amenable to measurement and therefore improvement. It is possible to overlay an evaluation framework where each quadrant can be evaluated and assigned a score between 0 (non-existent capability) and 10 (excellent). The trick is to identify evaluation criteria that can be improved over time (increasing their objective nature) and that can be used to describe target capability in a meaningful way.
  13. 13. Analyze: Rate Capabilities & Targets (quadrant chart, each axis from 1 to 10) As Is quadrant scores: 2.4, 4.2, 2.8, 4.0; To Be quadrant scores: 8.3, 8.7, 7.8, 8.8. As Is Score: 45; To Be Score: 281
  14. 14. Analyze: Evaluation Considerations • Normally started in an “information vacuum” • Identifying what can be, or should be, measured is a start • On one project:
     Marking Scheme (Score Assigned / Capability Level): 0 – No Score / None; 2 – Poor / Minimal; 4 – Weak / Inadequate; 6 – Fair / Adequate; 8 – Good / Competitive; 10 – Excellent / Industry Leading
     Optional Weighting Scheme Applied to Evaluation Criteria: 0.5 – Less important; 1.5 – More important. Every criterion weighted as more important must be balanced by one that is weighted as less important.
     Calculating a Total Score: Scores are assigned to each criterion for each quadrant. Scores for criteria are averaged & plotted on an axis from 0 to 10 for each quadrant. The area of the polygon that results is the total score.
     Evaluation Criteria (What is being Evaluated): Competitiveness – benchmark comparisons against comparable organizations; Consistency – the consistency of content details across the collection (a measure of reuse); Responsiveness – the extent to which new demands can be met quickly & affordably; Maintainability – maintainability & supportability of the overall solution; Measurability – completeness & quality of the measurement data provided; Usability – efficiency & intuitiveness of all user interactions (supporting user success)
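The scoring arithmetic described on this slide can be sketched in a few lines. This is an illustrative sketch only: the criterion scores, weights, and the quadrant ordering below are assumptions, not values recorded from the project, and the polygon-area total is computed as the sum of products of adjacent quadrant axes (which is proportional to the true area and consistent with the totals of roughly 45 and 281 shown on the preceding slide, allowing for rounding).

```python
def quadrant_score(criteria):
    """Weighted average of criterion scores for one quadrant (0-10 scale).

    `criteria` is a list of (score, weight) pairs, where weights follow
    the optional 0.5 / 1.0 / 1.5 scheme described on the slide.
    """
    total_weight = sum(w for _, w in criteria)
    return sum(s * w for s, w in criteria) / total_weight


def total_score(quadrant_scores):
    """Area-style total: sum of products of adjacent quadrant axes.

    With four perpendicular axes, the area of the resulting polygon is
    proportional to this sum, so it serves as the single total score.
    """
    n = len(quadrant_scores)
    return sum(quadrant_scores[i] * quadrant_scores[(i + 1) % n]
               for i in range(n))


# "As Is" quadrant scores from the slide (axis ordering assumed here)
as_is = [2.4, 2.8, 4.2, 4.0]
print(round(total_score(as_is), 2))  # prints 44.88 (rounded to 45 on the slide)
```

Because the total is a product of adjacent axes, improving two neighbouring quadrants together raises the score faster than improving one in isolation, which matches the slide's emphasis on balanced capability.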
  15. 15. Commentary Different organizations will have wildly different management cultures & wildly different views on what constitutes meaningful measurement. The approach introduced here is open to adopting whatever measurement strategies an organization will accept. Note that different measurement criteria can be used for different quadrants. Also, multiple criteria can be aggregated (e.g., averaged) into the measurement for a given quadrant.
  16. 16. Analyze: The Key Points • Apply a structure to organize information • About the current state (as is) – limitations & problems • About the future state (to be) – improvement opportunities • Establishes the basis for future refinements • In what is measured • In how it is measured • In how measurements can be converted into financial terms • Analysis results usually need corroboration • Engaging stakeholders with influence and / or insight • Leads to the need to survey…
  17. 17. Survey: Putting the Analysis into Context • Formal & structured approach to gathering inputs • Designed to collect information and insights in as authoritative a way as possible • Analysis results can be provided with the survey • A way for people to “say what they really think” • Surveys designed to support both quantitative and qualitative research: people provide an initial response from choices, then elaborate
  18. 18. Survey: Ask Questions • Online survey tools (e.g., Fluid Surveys) • Anonymous responses • Independent coding • Kept brief • Can be tailored to different stakeholder groups & research questions • Sound methodology • Project example: Research Questionnaire
     1. Based on your understanding, why is your organization interested in adopting a CCMS?
     2. From your perspective, are there advantages in adopting a CCMS?
     3. From your perspective, are there risks and challenges in adopting a CCMS?
     4. How would you describe the culture of your organization?
     5. How would you describe the culture of your particular work group?
     6a. Do you think the CCMS will change the culture in your work group?
     6b1. If Yes to 6a: How do you think the CCMS will change the culture of your work group?
     6b2. If No to 6a: Please elaborate on why you think the CCMS will not change the culture in your work group.
     6c. Do you think other members of your work group will be receptive to the change?
     7. How would you describe the division of roles and responsibilities in your work group?
     8. How would you describe your role and responsibilities within your work group?
     9. Do you think the transition to the CCMS will affect your role and responsibilities within your work group?
     10. During the CCMS transition, what role and responsibilities would you like to take on?
     11. After the CCMS transition, what role and responsibilities would you like to take on?
     12. How would you describe the status of your work group in your organization?
     13. Do you think the CCMS will change the status of your work group in your organization?
     14. How do you think other work groups in your organization view the CCMS initiative?
     15. Likert-like scale (five levels from negative to positive, with the middle value being neutral) applied to five perceived attitudes (respected, trusted, understood, valued, appreciated)
     16. Are you comfortable with the prospect of learning and using new CCMS technology?
     17. What are your expectations for a CCMS solution?
     18. Do you have concerns about the transition to a CCMS?
  19. 19. Survey: Leverage Responses • Adds a dimension to the analysis results • Can highlight issues that call for specific planning measures • In this case, users highlighted transition challenges due to resource overloading • Measures were taken (Chart: positive, neutral & negative codes applied to responses from two organizations, “TechnoCorp” and “EduOrg”, across categories: Work Group Worth, Transition Challenges, Expected Outcomes, Reactions to Organizational Change, Reactions to Technology and Processes, Anticipated Organizational Change)
  20. 20. Commentary: Survey Analysis More information about this particular survey, and its analysis, is provided in an article in an upcoming issue of the IEEE Transactions on Professional Communication. The analysis of the survey responses involved the “coding” of responses using a coding scheme established using a Grounded Theory Methodology – meaning the coding criteria are drawn from an analysis of the responses themselves. This means that the general categories on the preceding slides emerged from the responses.
  21. 21. Commentary: Survey Analysis – 2 In this particular example, the survey responses were analyzed & coded in a way that helped stakeholders to see what was most germane in the inputs that the survey had collected. The center of the diagram just happened to exhibit a higher than usual number of responses that conveyed a “negative” disposition. This negativity turned out to be very specific. People were concerned about the transition challenges associated with changing tools and processes. The survey highlighted this.
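The tallying step behind this kind of analysis can be sketched simply. The categories and coded responses below are invented for illustration (they are not the project's actual data); the point is only how positive, neutral, and negative codes aggregate into a per-category disposition that makes clusters of negativity visible.

```python
from collections import Counter

# Hypothetical coded survey responses: (category, code) pairs, where
# the code is +1 (positive), 0 (neutral), or -1 (negative), assigned
# by the researchers during qualitative coding of free-text answers.
coded_responses = [
    ("Transition Challenges", -1),
    ("Transition Challenges", -1),
    ("Transition Challenges", 0),
    ("Expected Outcomes", +1),
    ("Work Group Worth", +1),
]

# Net disposition per category: summing the codes shows where negative
# responses cluster (here, around transition challenges).
net = Counter()
for category, code in coded_responses:
    net[category] += code

for category, total in sorted(net.items()):
    print(category, total)
```

In the project described above, exactly this kind of aggregation is what surfaced the concentrated negativity around changing tools and processes, which the team could then address with specific transition measures.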
  22. 22. Survey: The Key Points • Surveys provide valuable insights into stakeholder domains • When used with internal team members: • Taps into in-house tribal knowledge • Can let people see that others share the same concerns • Can show management what concerns need addressing • When used with external stakeholders such as customers: • Highlights what changes are most important • Highlights where improvements might have far-reaching impacts • Gets some managers very excited…
  23. 23. Articulate: Setting out the Plan • Built on the findings from the Analyze & Survey tasks • Documents a Plan of Action • Justifies the investments needed • Takes a number of forms • Written document • Presentation deck • Short scenarios highlighting benefits • Elevator stories • Choice customer inputs. Note: I get teased a lot because I write “long form” documents. This teasing comes sometimes from colleagues & sometimes from customers. Then I find, months later, that these customers have very well-worn printed copies of these plans, all covered with bookmarks and annotations. All forms are necessary.
  24. 24. Articulate: Framing a Real Content Strategy (quadrant chart: Content Delivery, Content Acquisition, Content Management & Content Engagement, each on an axis from 1 to 10, framed by the Content Strategy)
  25. 25. Commentary: Content Strategy In less than a perfectly charitable way, I have been heard to say that most of what passes as Content Strategy these days has almost nothing to do with Content and even less to do with Strategy. This inflammatory statement can be defended once certain definitions of Content and Strategy are adopted. A Content Strategy must be a plan of action that focuses on improving how content is acquired, delivered, used, and managed as well as clarifying what content is needed and why.
  26. 26. Prototype: A Plan by itself is not enough • The Plan of Action needs to be made tangible • It needs to be something people can get excited about • This is important on several levels • Executives need to see something to understand it • Team members want to see how their efforts will pay off • Customers want to see how things will be better • Prototyping can play different roles • The first one: the business prototype, or an “information prototype” • Demonstrating how things will be different for key stakeholders
  27. 27. Prototype: A Recent Experience – Canadian Health Product Register (HPR) Prototype Development & Evolution Process
     Reference data sources: Drug Product Database, Clinical Trials Register, Adverse Reaction Dataset, Part III Drug Monographs (yielding Drug Product Data, Part III Details, Adverse Event Reports & Clinical Trials)
     • Select data items around each drug product that will be useful to citizens within identified scenarios
     • Aggregate data and organize to support key user tasks
     • Export an XML representation of the data for use as input to the prototype interface
     • Develop a mock-up of the User Experience (UX) interface using the Canadian Government Web Experience Toolkit, enhanced for responsive design
     • Confirm UX controls with the first iteration of the prototype; automate the build from XML data & templates to generate the next prototype iterations
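The "XML data plus templates" build step at the heart of this pipeline can be sketched as follows. Everything here is invented for illustration (the element names, the template format, and the output markup are assumptions; the actual HPR prototype generated pages in the Web Experience Toolkit's markup, not this simplified HTML).

```python
import xml.etree.ElementTree as ET
from string import Template

# Hypothetical exported XML for one drug product (invented structure).
export = """<products>
  <product>
    <name>ExampleDrug</name>
    <trials>2</trials>
    <adverse-events>5</adverse-events>
  </product>
</products>"""

# Hypothetical page template; a real build would emit toolkit markup.
page = Template("<h1>$name</h1><p>Trials: $trials, "
                "Adverse events: $events</p>")

# The automated build: parse the export, fill the template per product.
root = ET.fromstring(export)
pages = []
for product in root.findall("product"):
    pages.append(page.substitute(
        name=product.findtext("name"),
        trials=product.findtext("trials"),
        events=product.findtext("adverse-events"),
    ))

print(pages[0])  # prints: <h1>ExampleDrug</h1><p>Trials: 2, Adverse events: 5</p>
```

Because the data export and the templates are independent, each prototype iteration only has to change the templates (or add data items), which is what makes the "automate the build" step quick to repeat.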
  28. 28. Commentary: Information Prototyping Information Prototyping is the technique of helping business stakeholders to see how their information resources might be leveraged more effectively. An information prototype helps these stakeholders to see how they might operate differently in the future and how they will be able to effectively engage their stakeholders. Information prototypes can be explosively effective. They can push a key “hot button” for executives – the prospect of a quick hit.
  29. 29. • Lay the groundwork: Left Jab • Analyze – current state & improvement opportunities • Survey – engage stakeholders to gather inputs • Make the case: Right Hook • Articulate – frame your Content Strategy as a Plan of Action • Prototype – Make your plan something tangible & compelling • Some Benefits of this Approach • Encourages a combination of speed & substance • A “good plan” • Encourages the involvement of all stakeholders • Management, team members, customers, others… Review of the ASAP Approach
  30. 30. The ASAP Quick Start Program & DITA • This approach has thrived on the Darwin Information Typing Architecture (DITA) • Among the key attractions of DITA is that it encourages this type of implementation approach • Available models for common information types • An evolving Open Toolkit that supports rapid prototyping • An extensibility framework for tailoring what is available to address what is needed • The nuts & bolts of DITA work pretty well, so the challenges that remain are not technical • The quick start program must tackle these challenges head-on
  31. 31. The Cycle Repeats: Content Evolution Content Acquisition Content Management Content Delivery Content Engagement Content Strategy
  32. 32. • What stands out about the ASAP approach is that it precedes tool selection • One goal of the ASAP approach is to equip organizations with what they need to effectively engage the technology marketplace • Another goal of the ASAP approach is to align & bolster the support of all stakeholders so that implementation efforts are tackled effectively • SaaS / On-demand technology offerings can be effectively leveraged as part of this overall approach Key Observations
  33. 33. Commentary: Content Technologies Among the main drivers behind content standards in the first place was the desire to separate the long term optimization of content assets from the limitations historically imposed once technologies have been adopted. The ASAP approach tries to follow in this original spirit by quickly equipping business stakeholders with a tangible understanding of their content needs in a way that fully leverages open content standards – so that they can be fully independent of any one technology.
  34. 34. Questions, Contributions & Conclusions This ASAP approach is the latest formulation of tactics (some might say dirty tricks) that have evolved over the last 25 years. The trend has been towards increasing both the formality of the analysis conducted and the tangibility of future state prototyping. Elements of this are touched upon in “The Reason & Passion of XML” (reason-and-passion-of-xml.html). It can be difficult to maintain the balance between speed & substance.