Tag Archives: usecase

Test before you spend: simple early stage user testing

An astonishing design, comprehensive content, and innovative interactive elements may be of little use if they are not focused on the future user. The actual users' prior knowledge, needs, and interests must be met to offer both a satisfying source of information and a positive user experience. This sounds complicated, but really, user-centred design is just a matter of the right approach. The key is to start user testing early…

Some statistics on the use of different tasks

When you set up your Usabilla test, you can pick one of our suggested tasks or add your own. Most of the tasks in Usabilla tests are predefined tasks in one of the 20 languages we offer. I was curious which tasks are most popular, so I analyzed the tasks and results in our database. Here is a selection of my findings on how our users use tasks and how participants respond to them.

Which tasks are the most popular?

When we launched our first beta, we didn't provide any suggested tasks, and most of our early users were completely lost without guidelines. Based on tests with our prototypes and the input of usability pros, we made a small list of predefined tasks, expecting that they would inspire our users to add their own research questions. The following chart shows the 'market share' of each of the nine predefined tasks (out of all predefined tasks).

Which predefined task is the most popular?
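For readers who want to run the same kind of analysis on their own data: computing such a 'market share' boils down to counting task texts and dividing by the total. The sketch below is a minimal illustration; the task records and field names are made up, not Usabilla's actual schema.

```python
from collections import Counter

# Hypothetical task records; texts and the "predefined" flag are
# illustrative, not Usabilla's actual database schema.
tasks = [
    {"text": "Which elements do you like?", "predefined": True},
    {"text": "Which elements do you like?", "predefined": True},
    {"text": "Which elements do you dislike?", "predefined": True},
    {"text": "Test", "predefined": False},
    # ... one record per task in the database
]

predefined = [t["text"] for t in tasks if t["predefined"]]
total = len(predefined)

# 'Market share' of each predefined task, out of all predefined tasks only.
for text, count in Counter(predefined).most_common():
    print(f"{count / total:6.1%}  {text}")
```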

Roughly 80% of all Usabilla tasks are predefined tasks; the remaining 20% are custom tasks added by users. Our users are experimenting more and more with different types of tasks. Here is a selection of user-defined custom tasks, for inspiration:

  • ‘Test’
  • ‘Where do you click for more information about …’
  • ‘Mark the things that you find confusing and tell us why.’
  • ‘Where would you click to continue after reading the content?’
  • ‘Mark your three least favorite elements of this page.’
  • ‘Take a look at this page and give us some comments about layout and styling.’
  • ‘You would like to add a comment to an article you’re reading. Where would you go?’
  • ‘Where do you click if you want to know more about …’
  • ‘Where do you click first?’
  • ‘Click on the search box.’
  • ‘Where do you click to select a date.’
  • ‘Where would you click to learn about … ?’

How many points do participants add?

We offer two types of tasks in Usabilla. A standard task allows participants to answer a question or task with zero or more points, and participants can attach notes to these points to share additional feedback. A one-click task can be answered with only one point, and participants can't add any notes; these tasks are especially useful for measuring first impressions and task performance (including time). All our predefined tasks are standard tasks, where participants can add multiple points and comment with notes. The chart below shows how many points participants add on average for each of the predefined tasks.

How many points do participants use to answer a task?

The positive predefined tasks seem to make participants more trigger-happy: participants answer positive tasks with 12 points on average, but use only 10.5 points to answer negative ones.

How many points do participants use for negative and positive tasks?
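The comparison itself is just a grouped average. A minimal sketch, with made-up response records rather than real Usabilla results:

```python
from statistics import mean

# Hypothetical per-participant responses; sentiment labels and point
# counts are invented for illustration, not real test data.
responses = [
    {"sentiment": "positive", "points": 14},
    {"sentiment": "positive", "points": 10},
    {"sentiment": "negative", "points": 11},
    {"sentiment": "negative", "points": 10},
]

for sentiment in ("positive", "negative"):
    points = [r["points"] for r in responses if r["sentiment"] == sentiment]
    print(f"{sentiment}: {mean(points):.1f} points on average")
```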

What works for you?

We would love to hear which tasks work best for you. We're planning to expand the list of predefined tasks in the near future and to add options that make it easier to set up different types of tests (for example, measuring call-to-actions, attention, recognition, credibility, findability, design feedback, information scent, etc.). Feel free to share your experiences and wild ideas in the comments, on Twitter (@usabilla), or by mail (support@…).

Underdogs beat Expedia in usability showdown

The international travel site Expedia (Alexa rank 816) is beaten by its competitors Hotwire, Priceline, and Travelocity on basic usability tasks. Expedia performed the worst in a usability showdown between the four major international travel sites. A total of 148 people participated in this usability test, each trying to perform three basic tasks on one of the four websites.

Alexa rank for Expedia, Hotwire, Travelocity, and Priceline

Book a room in Amsterdam

The 148 participants tried to book a hotel room in Amsterdam within a certain price range on one of the four big travel websites. They had to find a way to search for a hotel room (task 1), narrow their search by price (task 2), and book a room in the correct price range (task 3). Task performance was measured by success rate and time per task.
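Measured this way, success rate and time per task reduce to simple aggregates over the raw results. Here is a minimal sketch; the result records and field names are assumptions for illustration, not the study's actual data format.

```python
from statistics import mean

# Hypothetical per-participant results for one site; "success" and
# "seconds" values are invented, not the study's real measurements.
results = [
    {"task": 1, "success": True,  "seconds": 34.0},
    {"task": 1, "success": False, "seconds": 80.5},
    {"task": 2, "success": True,  "seconds": 41.2},
    {"task": 3, "success": True,  "seconds": 62.8},
]

for task in (1, 2, 3):
    rows = [r for r in results if r["task"] == task]
    if not rows:
        continue
    success_rate = sum(r["success"] for r in rows) / len(rows)
    avg_time = mean(r["seconds"] for r in rows)
    print(f"task {task}: {success_rate:.0%} success, {avg_time:.0f}s on average")
```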

Expedia homepage

Expedia, Hotwire, Travelocity, and Priceline
