We hear those doubters all the time: “Quality documentation can’t be written in agile”, “We technical writers are not appreciated”, “We don’t have the budget to do that”.
I’d like to share how, at doc-department, we’ve dispelled these myths by using the information and advice we learned at TCUK, through the ISTC, and from the progressive tech comm community, which is always willing to share its knowledge.
I shall share how we applied the lessons we learned in areas such as content strategy, agile writing, and DITA to address common issues, and in the process created a complete product information platform that enables us to deliver content to many areas of a business within the constraints of an agile development cycle.
Delivered at Technical Communications UK - 2015
This afternoon I’d like to share with you how doc-department have been able to achieve this by developing a complete product information solution using information that we learned from forums such as this conference, from the ISTC’s Communicator magazine, and from the wider technical communication community.
With the additional capabilities that such a solution provides, we have been able to break out of the traditional tech comm boundaries and become an integral part of our client’s processes.
In the first part of the talk I’ll share with you some of the information that helped doc-department lay the foundations of the solution.
In the second part I’ll share some of the lessons we learned while deploying the solution.
And to wrap up, I’d like to highlight some areas that doc-department can now support because we have a process that can consistently deliver quality content within our client’s two-week agile cycle.
Through the course of the talk I’ll also lay some common tech-comm myths to rest.
…
The challenge
Why did we need to create such a product information solution in the first place?
Our client, Resilient plc, provides a hosted telephony service called smartnumbers.
The services are accessed through a mobile app and desktop browsers.
They have a 2 week development cycle
Their application platform is deployed, monitored and supported by their operations team.
Their customers are supported by Help Desk agents
Customers have administrators who control the services for users in their organisation.
As the service matures, it will be resold by channel partners, who will also provision and support their own customers.
In addition, for compliance reasons, each release requires a set of PDF documentation to be archived.
So you can see that we have several distinct audience groups we need to deliver the documentation to:
Operations
Help Desk
Channel partners
Administrators
Users
And multiple channels that the services are accessed through:
Mobile app
Desktop browser
PDF
And all of these need to be updated every 2 weeks.
Part I: Foundation
So where did we start?
Before embarking on any tool selection or process development JoAnn tells us that we need to understand the people who will use the information, the audience.
Once we understand their needs, we can work back from there to build our requirements.
One of the methods we used to help us understand the audience was personas.
Personas
For those of you who are unfamiliar with personas, they are representations of groups of individuals that are used to humanise those groups.
Using Personas During Design and Documentation
- http://www.uxmatters.com/mt/archives/2010/10/using-personas-during-design-and-documentation.php
- Niranjan Jahagirdar
These are examples of how Mailchimp depict their personas.
New MailChimp: User Persona Research
- https://blog.mailchimp.com/new-mailchimp-user-persona-research/
In the process of creating our personas, we learned not only about the type of information the different audience groups require to do their job, but also how they access this information.
For example
How much time they are willing to devote to learning about a product
And the devices they commonly use to access information
This gives us not only an understanding of what information is required, and how it should be written
But also how it should be delivered.
Using personas to create user documentation
- http://www.cooper.com/journal/2004/12/using_personas_to_create_user
- Steve Calde
And here are some of the personas we created when planning our solutions.
As you can see, a simple representation can convey a lot – so no need to get hung up on style.
We continue to use these personas whenever we are creating content so that we have that clear image in our minds about who we are communicating with.
Content Strategy & Structured writing
Now that we have a clear idea of the type of information required and how it should be delivered
How do we go about organising it?
Sarah and Noz inform us that a content strategy provides the over-arching structure into which we fit content.
We used this knowledge to build a content model.
Content Strategy 101
Sarah O’Keefe & Alan Pringle
The incremental steps towards dynamic and embedded content delivery
TCUK 2012 Talk
Noz Urbina
Our content model details the types of things we describe, for example, services, components, and capabilities.
And what information is required to comprehensively describe each of these things.
We built up our content model from our understanding of the personas, and the requirements of our client’s business.
It is important to understand that the content model is not a rigid structure; rather, it is a framework to which we can add new elements as the product evolves.
Having a detailed content model in this way allows us to break down the information into distinct topics, and the topics into distinct sections.
This allows us to manage each element individually, and is a key requirement for automation.
More on that later
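To make the idea concrete, here is a minimal sketch of how such a content model can be captured as data that tooling can consume; the thing types and topic names are invented for illustration and are not our actual model:

```python
# A hypothetical, simplified content model: each type of "thing" we
# describe (service, component, capability) lists the topic types
# required to cover it comprehensively.
CONTENT_MODEL = {
    "service": ["overview", "configuration", "troubleshooting"],
    "component": ["overview", "data-flow", "monitoring"],
    "capability": ["overview", "user-tasks", "admin-tasks"],
}

def required_topics(thing_type):
    """Return the topic types needed to fully describe a thing."""
    return CONTENT_MODEL[thing_type]

print(required_topics("component"))  # ['overview', 'data-flow', 'monitoring']
```

Because the model is explicit data rather than convention in writers’ heads, a build process can check that every required topic exists.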
Part II: Implementation
So with the help of Steve and Niranjan we know who we need to communicate with, and how.
And with the help of Sarah and Noz, we have been able to figure out what information our audience require and how we are going to structure that information.
How did we turn this understanding into a solution that can deliver quality content consistently every 2 weeks?
For the past few years a number of people have talked about DITA at the TCUK. Among them was Andrew Westfold.
He has spoken a couple of times to describe his ongoing experiences implementing DITA in an agile development environment.
Among the key benefits he described were:
Automation to cut out costly manual layout efforts.
Easier to solicit reviews using smaller stand-alone chunks of content
Faster, more consistent creation of deliverables
Documentation processes are aligned with agile development processes.
All of which lead to a more efficient, streamlined workflow
These seemed like just what we needed in order to meet our challenge.
So with such benefits, and with our content model requiring structured authoring, we chose to go with DITA.
DITA is there; this is what it was designed for; so why reinvent the wheel?
Having created a detailed content model, it was easy to map the content elements in the model to the DITA structure.
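For anyone who hasn’t seen DITA source, a topic is plain XML, which is what makes the mapping mechanical. This sketch (the topic id and text are invented for illustration) parses a minimal DITA concept topic with Python’s standard library:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical DITA concept topic. In practice each element
# in the content model maps to a topic (concept, task, or reference)
# or to a section within one.
TOPIC = """\
<concept id="smartnumbers-overview">
  <title>smartnumbers overview</title>
  <conbody>
    <p>smartnumbers is a hosted telephony service.</p>
  </conbody>
</concept>
"""

root = ET.fromstring(TOPIC)
print(root.tag, root.get("id"))   # concept smartnumbers-overview
print(root.find("title").text)    # smartnumbers overview
```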
But we all know that DITA implementation is expensive.
This figure is from a 2009 survey conducted by Scriptorium, and things have changed since then.
But this is still a figure you will see quoted a lot.
Calculating the ROI of DITA
- Sarah O'Keefe
- http://www.scriptorium.com/2011/04/calculating-the-roi-of-dita/
A significant amount of this cost is to implement a component content management system.
In our example we have a manageable number of files, and there is no requirement to translate.
So in terms of content components we are not dealing with great complexity.
Flare is based on the same topic-based, single-sourcing principles as DITA, yet MadCap don’t insist that you need a $40k CCMS to use Flare.
As our documentation grows in complexity, then maybe we will need to invest in a CCMS.
DITA without a CMS: Tools for Small Teams
http://drmacros-xml-rants.blogspot.co.uk/2014/01/dita-without-cms-tools-for-small-teams.html
ELIOT KIMBER
But for now we don’t need a component content management system.
And so our DITA implementation did not cost us $100,000
When talking about producing documentation in an agile development environment, we hear all the time that it can’t be done,
Documentation needs to be one sprint behind
Is everyone familiar with agile development and sprints?
I’m not going to go into the Agile methodology in this presentation, but suffice it to say that it involves developing small chunks of functionality in short time frames, often called sprints.
This approach enables priorities to be easily rearranged and product updates to be deployed frequently.
So the key challenges typically cited by technical authors working in an agile environment are:
Frequently changing priorities
Not enough time
Agile development is often associated with continuous deployment, which means that new features are released to customers as and when they are completed.
So the reality is that the documentation just has to be done – one sprint behind is too late.
If you believe that the documentation is necessary (if it is not, then there is no need for it anyway), then it just has to be done within the two-week sprint.
In our example, when the sprint is complete, the platform is updated.
The new features are available to users, they are being supported by operations engineers, and the customers are being supported by the help desk.
Not having documentation means support staff can’t do their job.
How would it look if you phoned a support desk and knew more about a new feature than the agent did?
So it is a strategic imperative for an organisation such as our client to ensure that the documentation is done by the end of the sprint.
So how did we do it?
Back in 2012 Ellis Pratt started talking about applying Lean principles to documentation.
A core component of the Lean methodology is the build-measure-learn feedback loop.
The Lean user guide
http://www.cherryleaf.com/blog/2012/07/the-lean-user-manual-2/
Ellis Pratt
The first step in this process is developing a minimum viable product.
In our example, we only have 2 weeks to document a new feature, but 2 weeks later we will be able to update the documentation.
So we should not get hung up on delivering all the content in 2 weeks
If we can focus on delivering only that content which is absolutely required in those first 2 weeks we are more likely to achieve that goal.
That is what we refer to as the Minimum Viable Documentation.
So how did we decide what is the minimum viable documentation?
Well, the solid foundations we laid to understand the personas and their requirements allowed us to identify the key information they need to sufficiently understand a feature to do their job.
This is the Minimum Viable documentation.
And because of the structure provided by our Content Model, all we needed to do was flag those topics that must be created each time a new feature is developed.
So now, at the beginning of each sprint we analyse the requirements detailed in the development story and assess what topics and content are required for the minimum viable documentation.
If a story is an update to an existing feature, then maybe no documentation is required.
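That sprint-start assessment can be sketched as a simple lookup; the story types and topic names here are hypothetical, invented purely to illustrate the idea:

```python
# Hypothetical rules mapping a story type to the topics that the
# Minimum Viable Documentation must include. An update to an existing
# feature may need no new topics at all.
MVD_RULES = {
    "new-feature": ["overview", "user-tasks", "support-notes"],
    "update": [],
}

def mvd_topics(story_type):
    """Topics that must exist before the story can be counted as done."""
    return MVD_RULES.get(story_type, [])

print(mvd_topics("new-feature"))  # ['overview', 'user-tasks', 'support-notes']
print(mvd_topics("update"))       # []
```

Encoding the rules this way keeps the decision consistent from sprint to sprint, whoever is doing the planning.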
Great, we have written the Minimum Viable Documentation in the first week of the sprint.
But as we all know, getting content reviewed and approved is the biggest cause of delays.
In Agile, for a story to be closed it needs to be done. That means all associated tasks, including testing, need to have been completed.
So if we can ensure that the documentation is part of the definition of what is meant by done, then our goals will be aligned with the goals of the development team.
They will be committed to providing the information and the time to complete the documentation tasks, so they can complete their work by the end of the sprint.
In our example we made the case for including the documentation as part of the definition of done.
And we did this by demonstrating that it is a business imperative:
As I mentioned earlier, the operations team need to have information to support the platform, and the Help desk agents need to have information to support customers
Without the documentation, the product and service would not be delivered to the quality expected by product management
Now that the documentation task is part of the process, it is second nature to development and QA teams to review the content in a timely manner.
Agile Principle 7: Done Means DONE!
- http://www.allaboutagile.com/agile-principle-7-done-means-done/
- Kelly Waters
When talking about agile, a lot of emphasis is put on the work done during the 2 weeks of the sprint.
But features and stories need to be sufficiently defined before the sprint even starts.
So in fact, much of the conceptual content can be created before the start of the sprint with the details finalised during the sprint.
In our example, we have access to the development planning tools, and we are involved in some of the planning.
This enables us to review stories in advance of the sprint and make preparation prior to the start of the sprint.
Agile Principle 4: Agile Requirements Are Barely Sufficient
- http://www.allaboutagile.com/agile-principle-4-agile-requirements-are-barely-sufficient/
- Kelly Waters
So, as we know from standard documentation processes, the more that can be done earlier in the product development cycle, the more likely we are to meet our deadline.
And by focusing on the minimum viable documentation, we save additional time during the sprint, making it possible to document new features by the end of the sprint.
Part III: Liberation
Our focus at the outset of this project was to meet the requirements for the technical documentation for Resilient’s smartnumbers service.
To recap:
Provide content for multiple audiences
Deliver that content to multiple devices and in multiple formats
Update the content every 2 weeks
Having implemented the solution we realised that we had the capabilities to do more:
What we had created was a content production solution that delivers quality content, consistently and in sync with product development.
These capabilities open up opportunities for us to do more with the content that we create.
They could liberate us from the typical content silos and enable us to start providing product information for other parts of the business.
Changing our mindset for the future of content work
- http://www.slideshare.net/nozurbina/rejigging-your-mindset-for-the-future-of-content-work-istc13 | http://technicalcommunicationuk.com/index.php/archives/1713
Noz Urbina
TCUK 2013
Embedded content
One area we could now commit to was delivering Embedded Content.
Is everyone familiar with embedded content?
We had proved that we could deliver the content in the timeframe of the sprint. So we could move to persuade product management that embedded content was a benefit to the product.
But more importantly that we could deliver on it.
As Andrew Westfold told us: Automation cuts out costly manual layout efforts and helps create an efficient, streamlined workflow.
To enable us to automate our build and publishing processes, doc-department teamed up with a software development company that takes care of the XML transforms for us.
Is everyone familiar with XML transforms and their role in DITA?
Having automated the build process, we are able to do extensive content manipulation from our DITA source.
This means we can leave all the mundane heavy lifting to the build and publishing process, and so can benefit from a more efficient production process.
Our authors can focus on creating the content for the audience within the framework of the Content Model, and the build process takes care of the rest.
So having automated the build and publishing process, we could develop a method of creating embedded content within the standard documentation.
This approach simplified the typical content production and file management required for embedded content.
This works by authors applying attributes to the DITA tags to indicate content that is to be embedded.
Then these content fragments are extracted during the build process.
From this point on, the process is automated, so there is no additional file management requirements. So no need for a costly CCMS.
The content fragments can then be pulled from our standard CMS into the UI as required.
In this screenshot from a demonstration site, the same content fragment has been used for the inline content as for the tool tip.
The key to this capability is the automated build and publishing process which is made possible by having a well-defined content model which is one of the foundations of the solution.
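The flag-and-extract step can be sketched in a few lines of standard-library Python. The attribute value and topic content here are invented for illustration (DITA’s `otherprops` attribute is simply a convenient place to hang such a flag); our production build uses its own conventions:

```python
import xml.etree.ElementTree as ET

# Hypothetical DITA source: elements flagged with otherprops="embed"
# are destined for the UI as embedded content; everything else stays
# in the standard documentation only.
SOURCE = """\
<concept id="divert-calls">
  <title>Divert calls</title>
  <conbody>
    <p otherprops="embed">Diverts incoming calls to another number.</p>
    <p>Full background detail that stays in the help centre only.</p>
  </conbody>
</concept>
"""

def extract_fragments(xml_text):
    """Return the text of every flagged element, keyed by topic id."""
    root = ET.fromstring(xml_text)
    return {root.get("id"): [e.text for e in root.iter()
                             if e.get("otherprops") == "embed"]}

print(extract_fragments(SOURCE))
```

Because the fragments are derived from the single DITA source at build time, there are no extra files for authors to manage.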
Support staff
I mentioned at the beginning that there are several audience groups in the chain that require information.
Now that we were able to produce quality content consistently at the end of each sprint, we were in a position to help them by providing them with the information they need to do their jobs.
With new features released every 2 weeks, traditional training is not practical. So we stepped in to meet this requirement by producing content targeted for each audience.
So for the Operations team which support the platform:
We produce high-level information that helps them understand the new features being deployed
And also detailed data flow diagrams that help them troubleshoot issues by understanding interactions within the system.
For the help desk agents who support customers:
Again, the high-level information ensures that they know about the new features that customers are using.
And we will also be providing them with embedded user assistance to help them use the Management Portal to manage the service for customers.
Marketing communications
We have all been told that Marketing departments are challenged to produce enough content.
2015 Benchmarks, Budgets, and Trends
- Content Marketing Institute
- http://contentmarketinginstitute.com/wp-content/uploads/2014/10/2015_B2B_Research.pdf
And that content produced by technical communicators is good for marketing.
So doc-department are currently working with Resilient’s Marketing team to provide them with content.
We have refined our content model to ensure topics are aligned with marketing requirements so that marketing can reuse or refer to content in their campaigns
For example, on-boarding or major releases.
We are also looking at using the embedded content capability to deliver marketing messages directly in the UI.
- IBM Technical Communications Body of Knowledge
Again, the main reason we are able to engage with Marketing is because we have proved that we have a process that consistently delivers quality content in sync with the product development.
So Marketing is able to rely on us to provide content so they can focus on strategy
Conclusion
So I hope that over the last 30 minutes you have seen that, by using the information we have learned from forums like these, we were able to lay the solid foundations that made it possible for us to develop a complete product information solution that delivers quality content within a two-week agile development cycle.
Having put such a process in place, we could address broader strategic issues for our client by identifying and delivering quality product information where it is a requirement.
So I feel that, at least for one company, we have been able to liberate technical communications by changing its perception from being the last, unappreciated step in a process to being a key part of the product cycle.
And this has been achieved by putting a robust process in place that provides quality information to other processes in the business.
I’d like to leave you with the following thought from Noz, which I think summarises doc-department’s approach to addressing the challenge I have outlined this afternoon.
Additional parting note that summarises the philosophy behind this case study.
Validate Your Dream
TCUK 2015
Chris Atherton
Version 1 of Help Centre: desktop and mobile versions