The Perfect Task: Optimizing Usability Tasks And Questions


by Sabina Idler

Automated remote usability testing is a very simple and efficient way to gather feedback on digital interfaces – if done correctly. When you do usability testing automated AND remotely, keep in mind that some common communication channels are missing. You ask participants for feedback, but you can’t see their faces, and your participants can’t ask questions or express discomfort or uncertainty. Once you’re aware of these limitations, you can compensate for them by carefully designing your test questions.

In our post Craft better content with 3 basic communication models we talked about different levels of communication and the impact of their absence on the message we communicate. In this post we go a step further and take a look at test questions: how to make sure our test participants understand the questions we ask and feel comfortable answering them.

Task or question

Usabilla offers two kinds of test questions: one-click tasks and open questions. A one-click task includes only one answer option. One-click tasks can be used for A/B testing, measuring performance, or answering questions like: How do people approach my website? Where do they expect to find information about xy? Where do they click for action xy?

Open questions, on the other hand, contain multiple answer options, are more general, and allow participants to add notes to share their opinion. Open questions can be used to collect feedback on questions like: What do users think about my website? Do they trust me? Do they like my design?

Short and sharp

There are two formal aspects to consider when designing test questions. First of all, try to keep your test questions as short and precise as possible. No one likes to read a novel when asked to participate in a short survey. So, keep your participants motivated by sparing them any unnecessary reading. Secondly, be specific. The broader your questions are, the more diverse the answers will be. Know what you want to test and formulate specific questions to get the information you are looking for.

Focus on your participant

As with anything that you do for your clients, a Usabilla test should be user-centered. Keep in mind the missing communication channels and the fact that your participants cannot ask questions halfway through the test. The first advice we give here is to keep it simple. Use language that is easy to understand and try to avoid jargon. Jargon might either scare participants off or skew your test results because questions were misunderstood.

Be aware of your participants’ background and foreknowledge. Don’t expect them to be experts in your field, unless, of course, they are. If you feel that certain information is needed to understand your question, offer it. If you are uncertain about your participants’ level of foreknowledge, you might want to offer optional extra information and mark it as such. This way participants can choose for themselves if they want to read or skip it.

Last but not least, be courteous, polite, and kind. Remember that you want something from your participants; keep that in mind when designing your questions, introduction, and thank-you pages.

Prevent priming

Priming is increased sensitivity to a stimulus due to prior exposure to a similar stimulus. Don’t worry, this sounds way more complicated than it is. Let us explain: often we want to test whether, or how easily, users find certain elements on our website. For example, we want to find out how long it takes participants to find the ‘sign up’ button. Now if we ask “Where do you click to sign up for our product?”, we prime our participants with the words “sign up”. This means they already know exactly what to look for and are likely to find the button faster than if we had avoided the exact wording. An alternative, neutral question would be: “Where do you click if you want to start using our product?”. Try not to prime your participants: choose words for your questions that do not occur on your test interface.

In short…

To sum up: not every test question is a good test question. We have identified several aspects to keep in mind when setting up your next Usabilla test. Good user-centered tasks should be short and precise, simple, and should prevent priming as much as possible. It’s always a good idea to do a few test runs with a selected group of users, colleagues, or friends and double-check your questions before launching the test for a bigger audience. Our preview mode lets you dry-run a test without storing any data. Ask your guinea pigs for feedback on spelling, writing style, and comprehensibility.

Further reading

In the chapter on Automated Research in their book Remote Research, Bolt & Tulathimutte offer very nice examples of task specificity and priming.

Article by

Sabina Idler

Sabina is community manager, technical writer & UXer @ Usabilla. She is interested in Usability, User Experience, Design, and everything that makes the Web a better place. Follow Sabina on Twitter.
