Dr Amanda Patterson, Comtech Services
October 1, 2024
At a recent CIDM roundtable we discussed the role of metrics in technical communication. While many technical communication practitioners acknowledge the need for and value of metrics, few are actually collecting and using them. Metrics allow tech comm teams to share their goals, successes, and progress with their stakeholders, leaders, and cross-functional teams.
Measuring within tools
Several people at the roundtable noted that one of the easiest ways to start collecting metrics is to use in-tool analytics. For example, tools like Zoomin and DCatalog have built-in analytics features that capture various metrics about your content’s performance. Each tool will offer different measures, usually specializing in what the tool is designed for.
Another popular tool is Google Analytics. Yes, you can get all the SEO and click-type metrics, but for technical communication the group questioned the value of those click metrics. Given that most technical communicators are working on product documentation, how many times (or even whether) a document gets clicks or downloads does not provide clear direction on whether that content should exist. It might provide insight on:
- how the customer journey is unfolding (the path people are taking to access the content)
- how audiences are referring to your products (the use of synonyms or even common versus formal names and the like)
- the accuracy of your metadata tags or titling (whether your content is surfacing in your documentation portal where it should)
Basically, it could provide insight into how your information is being located by your customers. Unlike marketing collateral and campaigns, our product documentation, particularly in regulated industries, is required regardless of how few clicks and downloads the content gets, which makes click-type metrics less insightful. With that said, if you have content that you know should be highly sought after (new product release content, for example) and it is not getting the expected volume of traffic, that is a good indication it is not being found in your portal and something is amiss in your publishing process.
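If you want to make that check routine, a short script can compare the pages you expect to be popular against an export from your analytics tool. The sketch below is purely illustrative: the CSV file name, its columns, the page paths, and the views threshold are all assumptions you would replace with your own portal’s export and benchmarks.

```python
# Illustrative sketch: flag new-release pages whose traffic falls below an
# expected threshold, using a CSV export from your analytics tool.
# Assumes a file "pageviews.csv" with columns "page_path" and "views",
# and a list of paths you expect to be highly sought after.
import csv

EXPECTED_MIN_VIEWS = 100          # assumption: your own benchmark per reporting period
NEW_RELEASE_PAGES = {             # assumption: paths for the latest release content
    "/docs/release-notes/v4-2",
    "/docs/getting-started/v4-2",
}

views_by_page = {}
with open("pageviews.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        views_by_page[row["page_path"]] = int(row["views"])

for page in sorted(NEW_RELEASE_PAGES):
    views = views_by_page.get(page, 0)
    if views < EXPECTED_MIN_VIEWS:
        print(f"Check findability and publishing for {page}: only {views} views")
```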
Measuring production
When measuring documentation production, the roundtable agreed it is important to de-emphasize individual performance and focus on team efforts. There are still robust production metrics you can collect at the team level to reflect your work. The simplest of these is how long it takes to produce content.
There are several easy-to-use, affordable time tracking tools available, such as Clockify, Timely, and Toggl Track. Time tracking software like this will give you an idea of how much time it takes to complete projects and will also allow you to quote time for future projects with greater accuracy.
Another production-related metric is the number of pieces of content created or updated. While much of the roundtable discussion was about work that was unplanned or unscheduled, the same metric can be applied to planned projects. These metrics include counting things like:
- calls deflected
- tickets or requests closed
- number of maps, topics, or other pieces of content updated and/or created
Any of these will help teams build the story of their content creation because they provide numbers that help leadership understand the workload that projects require. They can also be counted without specialized software, which makes this an opportunity to start showing the value of the technical communication team and a first step toward asking for additional funding or other resources.
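As one example of counting without specialized software, if your content lives as files in a folder (or a docs-as-code repository), a small script can tally how many pieces were touched in a reporting period. The sketch below is a rough illustration: the folder name and file extensions are assumptions, and file modification time is only a proxy for “updated.”

```python
# Illustrative sketch: count content pieces touched in a reporting period
# without specialized software. Assumes topics live as files under ./docs
# and uses file modification time as a rough proxy for "updated".
from collections import Counter
from datetime import datetime, timedelta
from pathlib import Path

DOCS_ROOT = Path("docs")                     # assumption: your content folder
EXTENSIONS = {".dita", ".ditamap", ".md"}    # assumption: your file types
since = datetime.now() - timedelta(days=30)  # last reporting period

counts = Counter()
for path in DOCS_ROOT.rglob("*"):
    if path.suffix in EXTENSIONS:
        modified = datetime.fromtimestamp(path.stat().st_mtime)
        if modified >= since:
            counts[path.suffix] += 1

for ext, n in counts.items():
    print(f"{n} {ext} files updated in the last 30 days")
print(f"Total: {sum(counts.values())}")
```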
GenAI Metrics
As much as Generative AI (GenAI) has started to revolutionize our industry, it has not fundamentally changed metrics. As with any new technology, there are a few things that can be measured about GenAI, but they all rely on the same kinds of metrics that have historically been collected, such as efficiency and accuracy.
The biggest selling point of GenAI is the gain in efficiency being seen at nearly every step of the content creation process. SME information dumps are being summarized faster, first drafts of short descriptions and release notes are being created in record time, and even conversion from one structured framework to another is happening faster than ever. Individually these may not seem like huge gains, but they are measurable efficiencies if project time is being tracked, particularly at a team level. Anecdotally, these efficiencies are freeing up writers’ critical thinking bandwidth to spend more quality time on high-effort projects (which should also be a place where measurable time savings can be demonstrated).
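Turning those anecdotes into a metric is mostly arithmetic on the time you are already tracking: compare the team’s average hours per deliverable before and after adopting GenAI assistance. The figures below are made up purely to show the calculation; substitute your own time-tracking exports.

```python
# Illustrative sketch: express a team-level efficiency gain from tracked time.
# The hour figures are invented examples, not real data.
baseline_hours = [6.0, 5.5, 7.0, 6.5]   # hours per first draft before GenAI assistance
current_hours = [4.0, 3.5, 4.5, 4.0]    # hours per first draft after GenAI assistance

baseline_avg = sum(baseline_hours) / len(baseline_hours)
current_avg = sum(current_hours) / len(current_hours)
savings_pct = (baseline_avg - current_avg) / baseline_avg * 100

print(f"Average before: {baseline_avg:.1f} h, after: {current_avg:.1f} h "
      f"({savings_pct:.0f}% time saved per deliverable)")
```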
There will also be metrics on accuracy. These can be collected by using scenarios or test cases for chatbots, where known question-answer pairs are given to the chatbot and the answers are human verified. If the chatbot answers the questions correctly and references the correct document(s), then the chatbot is rated at some percentage of accuracy. Several participants discussed how they have set up 50-200 scenarios to run this kind of testing.
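A minimal harness for that kind of testing can be scripted. The sketch below is only an outline of the approach described at the roundtable: the scenarios, the `ask_chatbot` placeholder, and the pass rule (human-verified answer plus the expected document cited) are all assumptions to adapt to your own chatbot and content.

```python
# Illustrative sketch of a scenario-based accuracy check. ask_chatbot() is a
# placeholder for whatever interface your chatbot exposes; the scenarios and
# the scoring rule are assumptions.
scenarios = [
    {"question": "How do I reset the device to factory settings?",
     "expected_docs": {"troubleshooting-guide"}},
    {"question": "What ports does the gateway require?",
     "expected_docs": {"installation-guide"}},
    # ... typically 50-200 of these, per the roundtable
]

def ask_chatbot(question: str) -> dict:
    """Placeholder: replace with a real call to your chatbot.
    Should return the answer text plus the IDs of the documents it cited."""
    return {"answer": "stub answer", "cited_docs": []}

passed = 0
for scenario in scenarios:
    result = ask_chatbot(scenario["question"])
    cites_ok = scenario["expected_docs"] <= set(result["cited_docs"])
    verdict = input(f"Is this answer correct? {result['answer']!r} [y/n] ")
    if cites_ok and verdict.strip().lower() == "y":
        passed += 1

print(f"Accuracy: {passed}/{len(scenarios)} = {passed / len(scenarios):.0%}")
```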
The other kind of accuracy metric that could be collected concerns the output the AI generates: is the chatbot answering in a manner that meets the tone and style of your brand? Again, this would be scenario tested, but the answers would be scored against an established writing style and style guide. This is particularly important for technical communication because so many companies are using their product documentation to train private large language models (LLMs). If the product documentation is of high quality and closely adheres to the style guide, then the LLM has a much better chance of being accurate in tone and style.
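Some of that style scoring needs human judgment, but a few style-guide rules can be checked automatically before a reviewer ever looks at the answer. The rules in the sketch below are invented examples; a real rubric would come from your own style guide.

```python
# Illustrative sketch: a lightweight automated check of generated answers
# against a few style-guide rules. The rules shown are assumptions.
import re

BANNED_TERMS = {"please note", "simply", "utilize"}   # assumption: terms your guide avoids
MAX_SENTENCE_WORDS = 25                               # assumption: readability limit

def style_issues(answer: str) -> list[str]:
    issues = []
    lowered = answer.lower()
    for term in BANNED_TERMS:
        if term in lowered:
            issues.append(f"avoid '{term}'")
    for sentence in re.split(r"[.!?]+\s*", answer):
        if len(sentence.split()) > MAX_SENTENCE_WORDS:
            issues.append(f"long sentence: {sentence[:40]}...")
    return issues

answer = "Please note you can simply utilize the admin console to reset the password."
print(style_issues(answer))
```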
Conclusion
Metrics do not require expensive tools to start collecting, and the sooner you start collecting and establishing benchmarks, the sooner you will be able to tell the story of your team. Additionally, look to your leadership for what metrics they are interested in (how do they measure progress on their own teams?) to understand what you should be measuring. Use the metrics to make data-driven decisions about priorities and to advocate for the resources your team needs. It is harder for leadership to deny your requests if you can show them a clear return on investment (ROI).