Tag Archives: usabilla


My time as an intern at Usabilla

Suzanne helped Usabilla a great deal this summer. Her design skills proved invaluable, most of all in creating user flows and thinking about the user experience of our upcoming backend. It was great to have her as an intern, both on a professional and on a personal level! Suzanne was so kind as to write a bit about how she experienced her time here, which you can read below. All the best, Suzanne! — The Usabilla team



Different Ways To Approach User Centred Design

User testing. Everyone knows it, everyone does it, or at least knows they should be doing it when creating user interfaces. Over time, many different kinds of user testing have evolved, such as classic in-lab user testing, remote user testing, and automated user testing. They are all based on the same idea: user centred design. And they all have their advantages and disadvantages. Let’s look into different approaches to user centred design and how the saying ‘many a little makes a mickle’ applies to automated remote user testing.



Perfectly worded hyperlinks equal better usability and conversion

A little while ago I devoted myself to the wording of hyperlinks. I set up a case study to find out whether wording influences our users’ actions, success rates, and their perception of our website. We tested three versions of the ‘About NESCAFÉ’ page, with generic, informative, and intriguing wording. The results show that generic and informative wording increased the chance of finding information, while the intriguing wording was more catchy and appealing.

Figure 1 - Informative version, interesting elements



Usability Testing With Children: A Lesson From Piaget

Children are becoming an increasingly important target group on the web. Good usability and a positive user experience are crucial for a successful website, and early, repeated user testing is the way to get there. If we address children on our website, we need to focus on what they want. We need to include children as a target group in our user testing.

In this post I’d like to take a look at usability testing with different age groups. First, let’s have a look at the question-answer process to understand the importance of cognitive abilities. I will then briefly introduce Piaget’s theory of cognitive development and explain how it can be useful for usability testing with children. What can we learn from a widely recognized scientist from the beginning of the twentieth century?



The perfect hyperlink: choose your words carefully

People often don’t read webpages, but scan them. Good experience designers know this and take care to provide the user with a clear headline and a prominent call to action. Great experience designers go a step further and adjust their copywriting and links to aid that scanning. By striking a balance between informative and intriguing wording, people will be enticed to keep reading or to explore the rest of the site. Some people will even do both!

Figure 1 - Subject of study - Nescafé

We wanted to test this for ourselves. How will users react to differently worded hyperlinks on an otherwise identical website? Have a look at the remote test we set up.



The Perfect Task: Optimizing Usability Tasks And Questions

Automated remote usability testing is a very simple and efficient way to gather feedback on digital interfaces – if done correctly. When you do usability testing both automated AND remotely, it’s good to keep in mind that the participant is missing some common communication channels. You ask participants for feedback, but you can’t see their faces, and your participants can’t ask any questions to express discomfort or uncertainty. When you’re aware of these limitations, you can compensate for them by carefully designing your test questions.



Test before you spend: simple early stage user testing

An astonishing design, rich content, and innovative interactive elements may be of no use if they aren’t focused on the future user. The actual users’ prior knowledge, needs, and interests must be met to offer both a satisfying source of information and a positive user experience. This sounds complicated, but really, user centred design is just a matter of the right approach. The key is to start user testing early…



3 hours, 4 tools, 1 test: +19% conversion rate

There are many, many online tools out there that help you test and improve almost any aspect of your website. It can be very convenient not only to look at these tools separately, but to combine their advantages into one single test. Matthew Niederberger, a specialist when it comes to online optimization, shares his experience with such a ‘hybrid’ test case on his blog actualinsights.com. We are very happy to get some great insights into how Matthew set up a complete usability test in only three hours, combining Usabilla with Wufoo, Kampyle, and Mailchimp. The aim of the test was to find out about users’ preferences for different design variations.



Customize your tests, your tasks, and even more fun

A few moments ago we released a major update of Usabilla. We know for sure this new release will make usability testing with Usabilla even easier for both you and your participants. We’ve improved the usability of our test interface and added some interesting new features to extend your testing possibilities.

A short overview of the most important new features in this release:

Easier feedback with an improved test interface

We’ve simplified the test interface for your participants. We’re sure the new interface makes it even easier to participate in a test and share feedback. We removed unnecessary clutter from the toolbar and combined points and notes. Users can simply add a point to answer your questions and attach a note to this point if they want to share some additional comments. Usabilla tests currently have an average conversion rate of more than 30 percent. With these new improvements we hope to improve the conversion of your tests even more.

Test interface: Add a note

Help participants by providing additional context per task

You can now add a short introduction for each task. This short intro can be used to provide some additional context for your participants.

Some examples:

  • You want to book a trip to Amsterdam in October. You’re searching for a hotel in the price range of 80 to 100 dollars. Where would you click to search for a hotel?
  • We would love to know what you think of our new homepage. Please let us know what you think of this page. Click on the screen to add a note.

Give users some additional info for each task

Customize your introduction and thank-you page

Your tests need a clear introduction. You can now improve a test by customizing the introduction. You can use basic HTML for markup and some simple tags to automatically include page thumbnails, the number of tasks, etc. The thank-you page can also be customized. This page is shown after a participant finishes a test. You can use it to thank your participants for their input or to point them to other pages or forms.


Combine one-click-tasks and standard tasks in one test

A few weeks ago we launched ‘One-Click-Tests’. A one-click-test is a very simple test in which users can use only one point to answer a question. You can use this method, for example, to test first impressions or to measure the time it takes to accomplish a certain task. You can now combine one-click-tasks and standard tasks in one test, and define for each task whether it’s a standard task or a one-click-task.

Add an introduction per task

Improved results and task timers

We gave the results page a small facelift. The layout of the page has changed a bit, and we added some new features to analyze your test results. You can use our selector to select a specific area of your page and check the stats for that area (how many users clicked in it?). We now also track time per task, which helps you compare different tasks and pages.


Upgrade your plan

We’re going to launch paid plans in the near future. We’ll offer different plans, varying in price between $49 and $950 per year.

You can continue to use Usabilla for free, but your reports will be limited to 25 participants per test.

Share interesting cases

Please let us know if you need any help with a test case or want to share your findings. We would love to learn from your questions and usability challenges.

Contact us by mail (info@usabilla.com), via Twitter (@usabilla), or leave a message at our customer support page at GetSatisfaction (http://getsatisfaction.com/usabilla).