Improbable Defence

Technical writing: making the (unique) case for guerrilla docs testing


The technical writing team at Improbable Defence is focused on simplifying complex technology, advocating for the needs of different users so they can use our tech successfully.

A day in the life might involve writing for military planners weighing their decisions against a mountain of known and unknown factors. Or scientific modellers who want to simulate the results of different pandemic mitigation techniques. We write for warfighters using virtual training to learn how to operate effectively as a unit, and network engineers tasked with keeping critical national infrastructure up and running. One thing is always true: the content we provide is only valuable if it supports their aims.

It’s hard to know exactly what users need when it’s common for technical writers to be a step or two removed from their audience. Anne Edwards, Lead Technical Writer, and Alexandra Hayes, Senior Technical Writer, talk through how they’re starting to bridge that gap.

Anne Edwards, Lead Technical Writer

Alexandra Hayes, Senior Technical Writer

Imagine you’ve just got a new job as a food critic and restaurant reviewer. You’re excited to get started and are busy planning which places you’re going to visit first. But then your boss tells you you’re not allowed to go there yourself. You’re not even allowed to talk to people who’ve eaten there before. It’s not all bad, though – you can talk to Sam in the Sales team, who knows someone who ate there once…

In most fields of writing this would seem ridiculous, but as Michael J. Metts and Andy Welfle point out in Writing is Designing: Words and the User Experience, not having direct contact with the users we’re writing for and instead having to make do with people who are one or two steps removed from them is a surprisingly frequent scenario for technical writers.

As technical writers, we’re focused on providing our users with the resources they need to use the tech we create successfully. This could be documentation, in-app help text, or even video tutorials. We create these resources in collaboration with subject-matter experts, who we rely on to explain functionality and answer all of our (many) questions. We know we’re not the user, so we can’t simply aim to create content that would help us. Instead, we somehow need to put ourselves in our users’ shoes and work out what will be most beneficial to them. The best way to do that is to talk to the users themselves.

The challenges of getting user feedback

Often, this is easier said than done. Our products are technically very complicated, and although we’re a small team, we devote a lot of time and care to getting the documentation right. With so much content to write, and so little time, user research sometimes gets pushed to the bottom of the list.

Other challenges are more specific to the defence industry. We’ve sometimes worked with defence subject-matter experts on secondment from clients, but they tend to rotate frequently. While they’re on secondment, everyone wants a piece of them, so it can be hard to make the case for prioritising technical documentation. And as writers working on a software platform aimed at multiple users with different needs, we need to look beyond basic feedback mechanisms or analytics that are geared more towards getting people to buy stuff or stay on a website for as long as possible. We’re more concerned with taking lots of complex information from different sources and making it all comprehensible to a wide range of different users. We can’t take a ‘one size fits all’ approach, and our content needs to reflect that.

We could rely on our own experience, follow ‘best practice’ in our writing, and adhere to standards in information architecture, accessibility and peer reviews. But without feedback from real users on what they actually want and need, our documentation will never be as good as it could be. And if it falls short of its purpose, without structured ways of gathering that feedback we might never even know it’s failing.

Don’t be afraid to try new things and start small

We’ve mentioned several challenges, but there is a way forward. We’re part of a fast-growing startup, which means we’re encouraged to try out new things on a small scale. In fact, it was our manager who suggested we kick off some ‘guerrilla’ user testing (low-cost, fast testing on a small number of users, to an agreed plan). And because we’re part of a wider Design, Research, and Content team, we were able to lean on the product researchers to help us get started.

We already had an idea of what we wanted to find out: are we writing at the right level of detail? Can people find what they’re looking for easily? Is what we have both clear and useful? What do users expect from our documentation? Are there any gaps?

First, we asked the product researchers to validate our initial ideas and come up with a testing plan. Amy, our researcher colleague, gave us advice on the kinds of research methods that would be useful, the inputs required, and the outputs we could expect. We eventually settled on a series of one-to-one interviews, to explore the needs and motivations of different groups of users, run in tandem with a diary study over a couple of weeks.

For the interviews, we decided that our US-based colleagues were as close as we could get to “real” external users. They’re sufficiently distant from the creation of the tech, and the documentation is often their first port of call. We set up individual meetings with three volunteers and asked questions to discover their pain points with the documentation website.

For the diary study, we set up a Google Form to go out automatically to six people every weekday for two weeks. In the end, three of those responded regularly, which was enough for us to get started.
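
For anyone wondering what ‘automatically’ involved here: not very much. The sketch below is a rough, hypothetical Python equivalent of that kind of weekday prompt – the form URL, mail server and recipient addresses are placeholders rather than our real set-up, and a simple scheduler (cron or similar) would run it once a day.

    # Rough sketch only: email a diary-study form link to participants on weekdays.
    # FORM_URL, SMTP_HOST and RECIPIENTS are placeholders, not real values.
    import smtplib
    from datetime import date
    from email.message import EmailMessage

    FORM_URL = "https://forms.gle/example-form-id"
    SMTP_HOST = "smtp.example.com"
    RECIPIENTS = ["participant1@example.com", "participant2@example.com"]

    def send_diary_prompt():
        # weekday() returns 5 for Saturday and 6 for Sunday: skip weekends.
        if date.today().weekday() >= 5:
            return
        msg = EmailMessage()
        msg["Subject"] = "Docs diary study: today's entry"
        msg["From"] = "docs-team@example.com"
        msg["To"] = ", ".join(RECIPIENTS)
        msg.set_content(f"Two minutes, one form: {FORM_URL}")
        with smtplib.SMTP(SMTP_HOST) as server:
            server.send_message(msg)

    if __name__ == "__main__":
        send_diary_prompt()  # scheduled once a day, e.g. via cron: 0 9 * * *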

What we learnt

Perhaps unsurprisingly, people in different roles did use the documentation in different ways and for different reasons. It’s important to note that our internal test subjects are in different roles from our external users, but they can still tell us something useful. What we saw confirms we’re right in thinking a ‘one size fits all’ approach isn’t the way forward.

We learned three key things from guerrilla docs testing:

  1. Technical writing saves time – and money. When users couldn’t find answers on the documentation site, they’d ask their colleagues for help – which undercuts the aim of good documentation to speed up the user experience, increase satisfaction and confidence, and reduce support costs.
  2. We can be wrong. It turns out no-one noticed that the documentation site didn’t have a search feature, even though we’d assumed it would be a priority user requirement.
  3. Gaps matter more than polish. When users couldn’t achieve their goals with the help of documentation, it wasn’t because the existing content was too vague or confusing, but because the content wasn’t aimed at their specific need. Thankfully, what we’d already written was useful and at the right level of detail.

Next steps

User testing has to be a process, not a one-off event, so we’re keen to keep the momentum going and build on what we learnt. We gained useful information on where the pain points are for our users in terms of access to, and visibility of, the documentation site. Most importantly, we’re going to continue to push for as much access as possible to real users in real roles, because even small-scale testing backed up how important that is.

Guerrilla testing proved to be a practical and efficient way to get valuable insights that will help shape the direction of our documentation. A small amount of up-front effort helped us to verify our hunches (and, in some cases, prove them wrong!). Having them tested in the real world can only improve our content, and by extension improve outcomes for our users.

As a technical writer tasked with making sense of intricate, nuanced technology, it’s satisfying to glean feedback in this way. To produce content off the back of it that not only solves problems for users but lets them work faster and more efficiently is the icing on top of a very complex, very rewarding cake.

Ready for a similar challenge? At Improbable Defence, we work as one – and we’re growing. Join us to work at the very edge of technology, research and simulation and help users plan and train to meet the world’s most complex security threats.

Check out our open roles to find your perfect fit.