Jane Wilson, VMware
November 1, 2019

It’s always frustrating to hear this about our deliverables: “The documentation is bad.” Beyond the fact that it’s negative, that simple statement is not actionable. There are no specifics about which part of the documentation is “bad,” much less what’s wrong with it or what “bad” even means. Follow-up questions might lead to a little more detail: “I just hear it’s bad,” or “A customer told me it’s bad.” Obviously, this is not helpful in trying to address, much less fix, the problem.

If we keep pressing for more specifics, something that could be actionable, the original “doc is bad” comment could lead anywhere. It could be that customers are not finding the information they need, so we know we should look at findability, organization, and SEO. Perhaps the specific use case the user needs is not clearly defined in the documentation. Or maybe the feedback is misdirected: it could be a problem with content that our team does not own (in our case, VMware offers knowledge bases and white papers owned by support and tech marketing), in which case we need to pass along the feedback and work to create better consistency across all customer-facing content. Sometimes, the problem may not be with the content at all. It could be that a customer is dealing with a difficult custom installation or a less-than-optimal interface, and the documentation becomes a catch-all for an all-around bad customer experience.

Sometimes, though, the problem is with our documentation itself. We can’t, and shouldn’t, ignore the feedback. If there is a problem, we need to identify it and address it. A big part of being able to do that is owning the definition of what quality looks like for our content.

To address this problem, the Information Experience (IX) team at VMware took on the task of defining quality for our documentation. We began a campaign in 2019 to look at different aspects of quality and how we address them within our team. At VMware, we often turn to the Flawless Execution methodology when we need to change a process or develop a new paradigm, especially in matters of process and performance management.

The quality campaign has been a strong team effort, requiring a big commitment from everyone involved. We identified fourteen missions that are meant to define quality standards at VMware. Across these fourteen missions, we have approximately 140 participant roles. For a team of around one hundred writers and managers, that means some of our team members are dedicating time to multiple missions simultaneously to help with this undertaking.

The missions themselves range from the specific (Defining the Standard for Effective Visuals, Ensuring Technical Accuracy, Writing Localization-friendly Content, SEO Standards) to broader, more high-level missions that will allow us to set priorities for how we address any quality issues (How Do Internal Stakeholders Measure Quality?, How Do Customers Measure Quality?, How Are We Applying Editorial Standards?). As a whole, they were chosen to cover the standards of quality that our team identified, as well as to uncover elements of quality that might be important to our stakeholders.

Flawless Execution was developed by the military to address and break down complex situations with discipline and strategy. In Flawless Execution, a large issue is identified as a campaign; in our case, the campaign was Documentation Quality. The campaign is broken down into multiple missions, each with a single focus. Missions should be simple and measurable, with clear boundaries and a clear objective. Each mission team is led by an Ace, who facilitates the work, but every member of the mission team has an equal voice and is responsible for participating. The tenets of Flawless Execution can be boiled down to a simple cycle: Plan – Brief – Execute – Debrief.

The teams are self-organizing. In the Planning stage, the team determines the actual goal or objective of the mission and identifies any threats that might stand in the way of completing it successfully. This is also where the team identifies the resources they have and any other resources they might require to complete the mission. During planning, the team also develops a course of action for achieving the objective.

During the Briefing stage, the team presents its proposed course of action to a Red Team – a group of interested stakeholders or people with experience in the subject area of the campaign. The Red Team cannot approve or reject the course of action; they can only offer feedback in the form of “have you considered…” statements.

Finally, the team executes the course of action and then debriefs the larger team on its findings. In the case of our quality campaign, a mission might result in changing a process for the team, acquiring a new tool, or updating our Content Standards. The Debrief includes presenting findings to the entire IX team through emails or team meetings, as well as education through brown-bag sessions or workshops.

One example of an early successful mission in our campaign was the mission to define how we present use cases and scenarios in our user documentation. The team was formed with an Ace and fifteen team members. The team first defined its objective: to produce a detailed workflow for creating a use case topic, which would later be added to the Content Standards website.

The mission team then moved on to its course of action. They agreed to accept a definition of “use case” that was already included in our Content Standards (but without instructions for how to create one). Their ultimate goal was to develop a workflow for creating use case content that would include criteria for what makes an acceptable use case, steps for developing the content, tips for working with subject matter experts to define a use case, DITA guidelines, and options for testing. As part of the course of action, they proposed that each team member create a use case for their respective product team. These sample use cases were themselves a product of the mission, but the experience of creating them would also inform the workflow. They agreed that the measure of the new workflow would be how well it matched each contributing team member’s experience in creating a sample use case. Once the results went through Red Team review, the workflow was added to the Content Standards, and the entire IX team was educated on it.

The Use Case mission team also identified the resources needed to complete the mission, primarily the members’ expertise as technical communicators and their knowledge of their own product teams. In addition, they required assistance from their product teams in the form of subject matter expertise and customer expertise. Identified risks included choosing sample use cases that were low-impact or obsolete, and a lack of support from subject matter experts. As a contingency, it was determined that any mission team member who could not write a use case would instead serve on the Red Team.

The result of the mission was a usable workflow that is already helping the writers on our larger IX team identify use cases and scenarios that will better serve our customers’ needs. That workflow provides a framework for creating those use cases and incorporating them into our customer-facing documentation.

The Use Case mission is just one small part of our quality campaign. The larger focus of that campaign is to help us determine what our quality standards are at VMware so that we can improve our content to meet them. This effort will also allow us to begin applying metrics to this fledgling campaign so that we can provide analytics on the quality standards we have defined. That will enable us to uncover what’s really behind a simple statement like “the documentation is bad” and turn it into useful, actionable feedback.