Tag Archives: tasks

How-tos

The Perfect Task: Optimizing Usability Tasks And Questions

Automated remote usability testing is a simple and efficient way to gather feedback on digital interfaces – if done correctly. When you do usability testing automated AND remotely, keep in mind that the participant is missing some common communication channels. You ask participants for feedback, but you can’t see their faces, and your participants can’t ask any questions to express discomfort or uncertainty. When you’re aware of these limitations, you can compensate for them by carefully designing your test questions.


Theory

Some statistics on the use of different tasks

When you set up your Usabilla test, you can pick one of our suggested tasks or add your own. Most tasks in Usabilla tests are predefined tasks in one of the 20 languages we offer. I was curious which tasks are most popular, so I analyzed the tasks and results in our database. In this post I want to share a selection of my findings about how our users use tasks and how participants respond to them.

Which tasks are the most popular?

When we launched our first beta, we didn’t provide any suggestions for tasks, and most of our early users were completely lost without guidelines. Based on tests with our prototypes and the input of usability pros, we made a small list of predefined tasks. We hoped these tasks would inspire our users to add their own research questions. The following chart shows the ‘market share’ of each of the 9 predefined tasks (out of all predefined tasks).

Chart: Which predefined task is the most popular?
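
If you want to reproduce this kind of breakdown on your own data, a minimal Python sketch could look like the one below. The field names and sample records are hypothetical, not our actual database schema; only the idea of counting each predefined task’s share is taken from the chart above.

    from collections import Counter

    # Each record holds the task text and whether it was one of the
    # predefined tasks (field names are made up for this example).
    tasks = [
        {"text": "What is your first impression of this page?", "predefined": True},
        {"text": "Mark the elements you like on this page.", "predefined": True},
        {"text": "Where do you click first?", "predefined": False},
        # ... in practice, every task stored in the database
    ]

    predefined = [t["text"] for t in tasks if t["predefined"]]
    counts = Counter(predefined)

    total = len(predefined)
    for text, count in counts.most_common():
        print(f"{count / total:6.1%}  {text}")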

Roughly 80% of all Usabilla tasks are one of the predefined tasks; the remaining 20% are custom tasks added by users. Our users are experimenting more and more with different types of tasks. Here is a selection of custom tasks defined by users, to inspire you:

  • ‘Test’
  • ‘Where do you click for more information about …’
  • ‘Mark the things that you find confusing and tell us why.’
  • ‘Where would you click to continue after reading the content?’
  • ‘Mark your three least favorite elements of this page.’
  • ‘Take a look at this page and give us some comments about layout and styling.’
  • ‘You would like to add a comment to an article you’re reading. Where would you go?’
  • ‘Where do you click if you want to know more about …’
  • ‘Where do you click first?’
  • ‘Click on the search box.’
  • ‘Where do you click to select a date.’
  • ‘Where would you click to learn about … ?’

How many points do participants add?

We offer two types of tasks in Usabilla. A standard task lets participants answer a question or task by placing zero or more points, and they can attach notes to these points to share additional feedback. A one-click task can be answered with only one point, and participants can’t add any notes. One-click tasks are especially useful for measuring first impressions and task performance (including time). All our predefined tasks are standard tasks, where participants can add multiple points and comment with notes. The chart below shows how many points participants add on average for each of the predefined tasks.

Chart: How many points do participants use to answer a task?
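
To make the difference between the two task types more concrete, here is a small Python sketch of a possible data model and of how the averages in the chart above could be computed. The class and field names are illustrative assumptions, not our real data model.

    from dataclasses import dataclass, field
    from statistics import mean
    from typing import List, Optional

    @dataclass
    class Point:
        x: int
        y: int
        note: Optional[str] = None   # one-click tasks never carry a note

    @dataclass
    class TaskResponse:
        task_text: str
        one_click: bool
        points: List[Point] = field(default_factory=list)

    def average_points(responses: List[TaskResponse]) -> dict:
        """Average number of points placed per task, keyed by task text."""
        by_task = {}
        for r in responses:
            by_task.setdefault(r.task_text, []).append(len(r.points))
        return {text: mean(counts) for text, counts in by_task.items()}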

The positive predefined tasks seem to make participants more trigger-happy: on average, participants answer positive tasks with 12 points, but use only 10.5 points to answer negative tasks.

Chart: How many points do participants use for negative and positive tasks?
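
The same comparison can be expressed in a few lines, building on the kind of response records sketched above. The sentiment labels below are assumptions for the sake of the example; only the 12 versus 10.5 result comes from our data.

    from statistics import mean

    # Hypothetical sentiment labels for (equally hypothetical) predefined tasks.
    SENTIMENT = {
        "Mark the elements you like on this page.": "positive",
        "Mark the elements you dislike on this page.": "negative",
    }

    def average_by_sentiment(responses):
        """responses: iterable of (task text, number of points placed) pairs."""
        groups = {}
        for task_text, point_count in responses:
            label = SENTIMENT.get(task_text)
            if label:
                groups.setdefault(label, []).append(point_count)
        return {label: mean(counts) for label, counts in groups.items()}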

What works for you?

We would love to hear which tasks work best for you. We’re planning to expand the list of predefined tasks in the near future and to add options that make it easier to set up different types of tests (for example measuring calls to action, attention, recognition, credibility, findability, information scent, or collecting design feedback). Feel free to share your experiences and wild ideas in the comments, on Twitter (@usabilla), or by mail (support@…).