CIDM

Spring 2024


The more things change…

Dawn Stevens, Comtech Services


As a Comtech consultant and the director of CIDM, I regularly get inquiries from individuals asking for metrics that will help them make a business case for a change or improvement they want to make in their department; for example, people are looking for metrics that:

  • Illustrate actual productivity improvements after implementing DITA
  • Show real translation cost savings resulting from reuse of content
  • Predict the number of people required to meet the content demands
  • Demonstrate user demand for video, topic-based content, PDFs, or similar
  • Tie a reduction in support costs to improvements in technical content
  • Prove that content is more readily found with a solid taxonomy
  • Justify attendance at a conference or membership within CIDM

I find that at the root of these questions is the underlying need to prove department value so the company continues to invest in technical documentation. For any change or improvement, there’s a need to demonstrate return on investment (ROI). ROI is a simple enough formula; we just need to show that the return is larger than the cost of the investment.
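For reference, the standard ROI formula (a general definition, not one tied to any particular department’s accounting) is:

  ROI = (return − cost of investment) / cost of investment

Purely as an illustration, with made-up numbers: if a $50,000 investment in, say, a content reuse initiative yields $75,000 in combined savings and new revenue, the ROI is ($75,000 − $50,000) / $50,000, or 50 percent.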

We should be able to easily quantify the investment side of the equation, at least at the department level, if not for individual products we create. What is the annual budget for the team, including items like:

  • Salaries and benefits
  • Infrastructure
  • Translation
  • Printing
  • Training

What’s not so easy is quantifying the return on that investment, how the company benefits from our efforts:

  • Does our work differentiate the company’s products or services, resulting in higher revenue?
  • Does our work positively impact the company’s reputation?
  • Does our work lessen the workload or costs of other departments (such as support)?
  • Does our work reduce risk (of accidents or litigation)?

The truth is that it’s hard to gather such data:

  • We don’t know what to gather.
  • We don’t know how to gather it.
  • We don’t have time to gather it.
  • We don’t have budget to gather it.
  • Data about technical communication is not a priority within the larger organization.
  • We don’t have the support to prove that we should have support.

Yet we need this data to justify our very existence, to gain respect from other organizations in the company, and to get budget for more headcount, infrastructure improvements, and other special projects.

As a result, in trying to prove ROI for any technical communication need, we end up focusing on the easy part of the formula. We don’t know or can’t get data on the benefits, so we make business cases based on cost reduction, for example, showing how a new resource will increase productivity or decrease production costs. In effect, we ultimately make the long-term case for fewer people, lower budgets, and less time, rather than showing how continued investment in our work will result in happier customers and increased revenue.

Time is not on our side. Costs are incurred before content gets to users; benefits are realized only after it does. And companies tend to have a short-term viewpoint, with an eye to quick wins rather than long-term investment. We can’t wait to start proving our value until the moment we need to prove it; we must gather data now so it is readily available.

But don’t take my word for it. At ConVEx 2024 in Minneapolis, I raised this issue and asked participants to chime in by indicating their agreement with the following statements:

  • Technical communicators need to demonstrate their value proposition to the business.
  • It is easy to demonstrate the value proposition of technical communication.

The same statements were also put to 634 technical communicators outside the conference, and the results from the informal polling and the formal survey were virtually identical. On a normalized scale, both groups rated importance at 9 out of 10 but rated ease at less than 5 out of 10.

Just because it’s difficult doesn’t mean metrics aren’t being gathered, so my presentation and the formal survey also looked at what organizations are measuring.

First, expecting that high-quality content is central to achieving positive value, we looked at the quality control processes in place within respondents’ organizations. With the exception of editorial reviews, the two data sets again tracked very similarly.

 

Quality control process     Group                  Never    Sometimes   Always
Peer reviews                Conference attendees   8.2%     44%         47.8%
                            Survey respondents     10%      42.3%       47.7%
Editorial reviews           Conference attendees   24.8%    39.4%       35.8%
                            Survey respondents     7.6%     31.7%       60.6%
Technical reviews           Conference attendees   4.6%     25.7%       69.7%
                            Survey respondents     6.2%     20.6%       73.2%
Pre-release user contact    Conference attendees   45.9%    48.6%       5.5%
                            Survey respondents     40.9%    50%         9.1%
Post-release user contact   Conference attendees   47.7%    38.5%       13.8%
                            Survey respondents     49.4%    44%         6.6%

 

We then looked at four categories of specific benefits:

  • The perception of the value of good content. For example:
    • How much content counts in decisions to buy
    • How much more would customers pay for useful content
  • Satisfaction with current content, including
    • CSAT scores
    • Feedback forms
    • Content ratings, such as likes/dislikes
  • Avoidable costs. In contrast to saving money from what is currently spent, these measures look at what would have to be spent if quality technical content didn’t exist:
    • Costs of creating and releasing updates for missing, unclear, or incorrect information
    • Costs of technical support
    • Costs for other teams to create this content instead of professional technical communicators
    • Costs to customers when their staff can’t do their jobs
  • Actual outcomes. Data that shows a direct impact from technical content, such as
    • Reduction in the amount of time for a user to complete a task
    • Increased user productivity
    • Increased sales

 

Benefit category measured          Conference attendees   Survey respondents
Perception of value                28.4%                   32.6%
Satisfaction with current content  71.6%                   56.4%
Avoidable costs                    29.5%                   17.7%
Actual outcomes                    38.9%                   11%

 

Survey participants were also asked how useful and how difficult each measure was. Satisfaction received the highest usefulness rating as well as the highest ease rating, actual outcomes received the highest difficulty rating, and perception of value received the lowest usefulness rating.

We have to wonder, then: when the majority of respondents are measuring what is considered both the most useful and the easiest metric, why is there still an overarching perception that technical communication isn’t valued and that we don’t get what we need?

The question becomes even more poignant than it initially appears when I add the fact that although the conference poll was conducted just last week, the survey data was actually gathered in the same year as:

  • The Channel Tunnel was opened to connect Britain and France.
  • Nancy Kerrigan was attacked at a practice session for the upcoming Winter Olympics.
  • Steven Spielberg won his first Oscar (for Schindler’s List).
  • Yahoo was founded.
  • Nelson Mandela was elected president of South Africa.
  • OJ Simpson’s white Bronco was chased by police and televised live on national and international news.
  • The Lion King and Forrest Gump topped the box office.
  • The World Series was cancelled for the first time in 90 years due to a players’ strike.
  • The sitcom “Friends” premiered.
  • Harry Styles, Justin Bieber, and Dakota Fanning were born.
  • The George Foreman Grill, the PlayStation, and the Zip drive were introduced.

1994.
30 years ago.

 

For 30 years, the majority of technical communication departments have gathered at least user satisfaction data. In addition, for 30 years, innovations have made our jobs easier and less costly, theoretically making ROI easier to improve as the cost of creating content has decreased:

  • The primary method of delivery is no longer print-based; we don’t have to pay for printing.
  • Reusability of content reduces the effort to recreate, maintain, and translate duplicate content.
  • Public access to the World Wide Web, which began in 1993, one year before the survey, has increased our users’ access to content so we don’t have to create it all.
  • Computers and software have gotten faster and gained functionality (for example, automatic spell checking was added to Microsoft Word in 1995, one year after the survey).

But in 30 years, overall perceptions have not changed. It’s encouraging that today’s numbers are higher than they were 30 years ago in terms of what is being measured; however, we still struggle to prove ROI. Is our plight simply inevitable? Are we the poster child for the mantra “the more things change, the more they stay the same”?

Or are we just not investing enough in the right things? In 1994, an entrepreneur named Jeff Bezos founded a little company called Amazon. If you had invested $1,000 at the time of its IPO, when its stock price was $18 per share, you would have $2.4 million today. If we had invested a little time and money in more difficult and costly measures 30 years ago, where might we be today? More importantly, if we continue as is, where will we be 30 years from now? And if we do learn from the past and invest in proving our worth, where could we be 30 years from now?

It’s time to ask ourselves, what are we investing in today to provide our value-add tomorrow?


About the Author:

Dawn Stevens is CIDM’s Director and President of Comtech Services. She has over 30 years of practical experience in virtually every role within a documentation and training department, including project management, instructional design, writing, editing, and multimedia programming.