Does usability testing work for documents? The answer from Ginny Redish and me is a resounding yes.
In this article, we’ll give you three techniques for having people try out documents or any other stand-alone content. These techniques apply whether your document is on paper or online — for example, as a web page or a PDF. They work for both in-person and remote usability testing, especially moderated remote testing.
What is a document?
We’re talking about functional documents that provide information to people — not fiction or poetry. Functional documents include informative banners — such as the ones on many websites about how an organisation is dealing with COVID-19 — legal documents, manuals, notices, official letters, press releases, privacy policies, terms and conditions, and more.
Although a document might just deliver information, it could also require action after someone has read it. An online document might link to other content or provide a way to add comments or give other feedback.
Why test documents?
If your document has a purpose, you need to find out whether it works for that purpose. (If it has no purpose, consider whether it is necessary at all!)
If a functional document does not work, there will be problems — for example:
- At the mildest, the reader might become confused and misread a message.
- More seriously, people might miss out on a benefit because the document was too dense to read.
- At the worst, there could be lawsuits, fines, penalties, people getting sick, or someone making others sick because, for example, a message about how a virus spreads wasn’t clear enough.
Usability testing lets you see problems in a document. It gives you insights about how to fix the problems before the document reaches a wider audience.
How can I test a document?
Here are three ways to test the usability of documents:
- “Tell me in your own words.” — This is called paraphrase testing.
- “Mark positives and negatives.” — This is the plus–minus testing technique that Dutch researchers Menno de Jong and Peter Jan Schellens devised.
- “Find an answer or do something.” — This is known as task-based testing.
For each test session using these techniques, you’ll need a facilitator — probably you — and a participant — one of the people who would be likely to use the document. If possible, it’s great if you can also have a notetaker, as you would for any other usability testing.
1. Tell me in your own words
In a paraphrase test, each participant goes through the document a bit at a time. After reading each bit, the participant tells you, in their own words, what that bit said.
Paraphrase testing is a great way to find out whether people understand the messages in a document such as an email message, letter, notice, jury instruction, lease, warranty, or a web page whose aim is to convey information.
This method also works for longer documents that people must read from beginning to end — such as a contract or legal brief.
You’ll learn whether:
- the document’s organisation makes sense to your participants
- your paragraphs and sentences are short enough for people to get all the parts
- the words you use are ones your participants understand.
You’ll also hear whether participants use different words from those in the document.
Before you start the test, decide where the breaks between bits will be. A bit might be a sentence, a short paragraph, a list, or one provision of a contract. You want each bit to be both meaningful and small enough for your participants to grasp.
In a one-on-one interview — whether in person or remotely — ask a participant to open or go to the document.
Start by asking a few general questions such as:
- “What’s your first impression of this document?”
- “Who is it from?”
- “What do you think it’s about?”
Then, ask the participant to read whatever you’ve decided is the first bit. For example: “Please read just the first paragraph, then stop.”
We often ask participants to read out loud. That lets us hear which words the participant reads easily and which words they stumble over.
You might tell the participant that it’s okay to be unsure how to say some of the words out loud. If your participants are not comfortable reading out loud, you can have them read to themselves. The important part is having them tell you, in their own words, what they just read.
At each stopping point, ask the participant, “Please tell me, in your own words, what that bit you just read means to you.” Take notes on these points:
- what they understood correctly
- what they misunderstood
- what they left out
- words they used that are different from the words in the document.
You and the participant continue in this way through the document — or the part of the document you’re interested in — with the participant reading and then paraphrasing.
At the end of this test session, you might ask a few more questions, such as the following:
- You might probe for the participant’s understanding of specific words. For example, “The letter uses the words gainfully employed. What does gainfully mean to you here?” or “What would be an example of being gainfully employed?”
- “What would you do now that you’ve read this letter?”
- “How do you feel now about the organisation that sent this letter?”
2. Mark positives and negatives
De Jong and Schellens, who developed the plus–minus testing technique, describe it this way:
“Participants are asked to read a document and put pluses and minuses in the margin for positive and negative reading experiences. After that, the reasons for the pluses and minuses are explored in an individual interview.”
Plus–minus testing is a good way to get people’s reactions to a document. You can choose what plus and minus mean for your document, depending on its purpose and what you want to learn.
You can use plus–minus to:
- probe for your participants’ opinions about what is clear and what is not clear to them
- get people’s emotional reactions to a document
- investigate some other attribute such as confidence in the organisation.
We find that, when we ask for positive and negative reactions in general, we often get comments about wider topics such as the tone of a document alongside comments about whether the document makes sense.
You can also probe more deeply for specific emotional reactions. For example, you might want to find out how well the document transmits a brand’s values. You could ask participants to write a plus sign where the document makes them feel positive about the brand and a minus sign for negative feelings.
Anything that has both positives and negatives can be the focus of a plus–minus usability test. In his blog post A Simple Technique for Evaluating Content, Pete Gale describes using plus–minus to explore how a document did and did not make people feel confident about a government service.
Decide what you want to focus on for plus–minus testing. Write a short set of instructions so participants know what plus means for this usability test and what minus means. Give or send the instructions and the document to each participant.
If you are doing this testing in person or sending people the document on paper, you might ask participants to use pens with different colours — perhaps yellow to mark pluses and blue to mark minuses. (We avoid red and green because they cause problems for people who have colour-deficient vision.) Ask participants to write comments about their pluses and minuses as they mark up the document.
For participants who are working online, we’ve found that it’s easier for them to type + or – at the start of each comment. Most editing programs seem to make using highlighting a bit tricky.
If you send the document to the participants to mark up on their own time, we recommend arranging a brief interview to go over their general reactions and their specific markings.
3. Find an answer or do something
You may be working on a document that people won’t read from beginning to end. For example, people rarely read through annual reports, benefits handbooks, insurance policies, manuals, and many other types of documents. They refer to them only to get the answer to a question, check a specific fact, or follow some instructions.
Task-based testing of a document, where finding information is as important as understanding it, works much the same whether the document is on paper or digital — on a website or in an app. Watch and listen as your participants try to use the document to find and understand the information.
Usability testing of websites typically uses techniques that were developed for usability testing of software and software documentation. Both of us were doing task-based usability testing for printed manuals and other paper documents before websites existed.
The reader here is really a user who wants to read only what is necessary. Finding the right place is critical. The reader first finds where the information is; then perhaps skims and scans to read just enough to get the answer, check the fact, or follow the instructions.
Just as in typical usability testing for websites, think about what you want to learn. You might decide to write scenarios for specific tasks for your participants. Or you could ask each participant what task they might have that is relevant to this document. Give or send the document to each participant or have participants go to it online.
As with other techniques, you might start with the following general questions:
- “What’s your first impression of this document?”
- “Who is it from?”
- “What do you think it’s about?”
Give the participants the scenarios, one at a time, or ask them to tackle each of the tasks they’ve identified themselves.
Watch and listen — whether remotely or in person — as they use the document. If the task is to get an answer or check a fact, the participants tell you what they found. If the task is to act — or follow some instructions — you’ll watch and listen as they act — or explain how they would act if it’s not something you can have them do then and there.
You might end with the same types of questions or surveys that you use when you’re conducting usability testing for websites, products, or services.
How do I choose which technique to use?
Let’s return to the point about purpose. As Table 1 shows, the intended purpose of the document can help you decide how to test it.
Table 1. The purpose of a document and the best technique for testing it

| Purpose of document | Best technique |
| --- | --- |
| To explain something in detail — whether to read now, refer to later, or act on after reading | Paraphrase testing |
| To give a general understanding of a topic; to create an emotion — for example: reassure, create confidence, or build trust | Plus–minus testing |
| To give answers or instructions | Task-based testing |
Can I use these techniques remotely?
Yes, certainly. Both of us have tested the usability of documents in person and remotely.
Remote testing has the advantage that participants are in their own space. But that also means you’re a virtual visitor in their space.
The UK Government Digital Service has thoughtful advice about conducting remote user research in their article, Conducting User Research While People Must Stay at Home Because of Coronavirus. Their advice will remain useful even after the pandemic is over.
You might want to look into tools that are specifically for remote usability testing — such as those in Justinmind’s list of usability tools. However, you might not need a specialised tool. You can do live, remote testing using a free chat or video-conferencing tool. We’ve used Skype, Google Hangouts, and WhatsApp calls. They all offer enough video for a usability test.
Ideally, find out what tool each participant is most comfortable using and be prepared to learn that tool for that particular test.
You can share a document with your participants by screen-sharing or by creating a permanent or temporary location for a document that they can use online. If you need to keep a document confidential, secure it with a password or some other identification process.
One government organisation refused to let Caroline use most of the popular tools on her government-issued notebook computer. But they let her use a separate, stripped-down computer that was not linked to their network. That way, she could do usability testing with the tools that the participants were most likely to want to use.
For more ideas, see the crowd-sourced tools and tips for remote research that user researchers at the UK Government Digital Service have put together.
Testing remotely with paper
Some participants might not use digital technology. You can still include them by using mail, paper, and your phone. Here’s how:
- Put the document into an envelope with the instruction: “Please do not open this envelope until our phone call.”
- Write a short letter with the arrangements for the phone call — including the time you’ll call them, how long the interview will take, the purpose of the call, and how you will use the results.
- Put the letter and the envelope containing the document into a larger, outer envelope.
- Mail the whole package to the participant.
You won’t be able to see these participants as they work through the document. During your interviews, ask each participant to have their document in front of them. Make sure that you have an exact copy in front of you, too. Explain to participants that they need to help you by telling you what they’re looking at, so you can keep up with them.
If you opt for a plus–minus test, once you and a participant have finished your discussion about the document, ask the participant to mail the marked-up copy back to you. Make the mail-back part of this process easy for participants by including a stamped, self-addressed envelope in the package.
Bottom line: testing documents is necessary and powerful
Documents often have very serious legal, economic, or medical consequences.
Usability testing helps you find out how well your document works for the people who use it.
Don’t stop testing just because you have to test remotely.
Testing documents is fascinating. You’ll learn what really works — and what doesn’t — directly from people who need to use the document in real life.
Most of all, testing documents is powerful. When we create documents, we can become too familiar with them. The results from testing cut through that familiarity and help us — and our colleagues — make better decisions about what to change and what to keep.
We would like to thank Carol Barnum, Ella Botting, and Benjy Stanton for helping us by commenting on this column.
This article first appeared in UX Matters, May 2020.
Menno de Jong and Peter Jan Schellens. “Toward a Document Evaluation Methodology: What Does Research Tell Us About the Validity and Reliability of Evaluation Methods?” IEEE Transactions on Professional Communication, October 2000. Retrieved May 3, 2020.