Category Archives: Demo UX Cases


3 hours, 4 tools, 1 test: +19% conversion rate

There are many online tools out there that help you test and improve almost any aspect of your website. It can be very convenient not only to look at these tools separately, but to combine their advantages into one single test. Matthew Niederberger, a specialist in online optimization, shares his experience with such a ‘hybrid’ test case on his blog actualinsights.com. We are very happy to get some great insights into how Matthew set up a complete usability test in only three hours, combining Usabilla with Wufoo, Kampyle, and Mailchimp. The aim of the test was to find out users’ preferences for different design variations.




Boost Your Web Credibility: Learn From The Pro’s

As promised, we’re back with the results from the web credibility test case that we previously introduced. We are very pleased with the results, as they seem to be in line with previous findings on web credibility by experts in the field. The findings of this test case support the assumption that different aspects of a website can either increase or decrease its perceived credibility.

Figure 1 – heatmap showing elements that were perceived to increase the website’s credibility




Help us to unravel credibility on web pages

In many cases credibility is the most important driver of conversions. Recently we’ve been experimenting with new tasks to determine factors that influence credibility on webpages. A demo case with Mint helped us determine how people react to questions about the credibility of a webpage. Based on this pilot we’ve set up another study to measure drivers of credibility on six other websites.

Test introduction - Click to participate (3 minutes)



Mashable: “5 Essential Web Apps for the Lean Small Business”

Donna Wells, CEO of Mindflash.com, named Usabilla one of five essential web applications for small businesses in an interesting article on Mashable. The user-focused online training startup Mindflash uses Usabilla to collect online feedback from its users. Mindflash recruits participants with Ethnio:

“Ethnio allows us to grab and vet potential candidates through website pop-ups. Usabilla then pulls the best candidates from the Ethnio session to work through questions, product concepts and sends feedback and analysis to the business.”

“Set up properly, this nifty duo works almost entirely hands-off, allowing the site to motor along while continually mining data. And it keeps our top research and product talent focused on interpreting the results – not collecting them.”


Mindflash reinvented itself with a little external help from Adaptive Path. Paula Wellings (Adaptive Path) and Cameron Gray (Mindflash) shared insights about their journey to become more customer focused at UX Week 2010. A video of their presentation about ‘Pizzability’ (who can spot the Usabilla screenshot?):

UX Week 2010 | Paula Wellings & Cameron Gray | Turning a Developer-driven Organization into a UX Company from Adaptive Path on Vimeo.

We’re happy to facilitate the dialogue between Mindflash and their customers and are proud to be mentioned by their CEO, Donna Wells, as one of the most essential web apps.

Do you like the article on Mashable? Please retweet it, share it on Facebook, or post it on LinkedIn using the buttons next to the article.


Why do you trust Mint? Takeaways from a visual survey

Two weeks ago we published four tiny visual surveys to help inspire Usabilla users. Our users and visitors participated in these surveys and shared their feedback on different webpages. We will share the results of these test cases to give you an idea of how short visual surveys can help you understand your users. As requested, we will start with a selection of the results from the Mint demo case: “Click on the things that make you trust Mint. Please explain why.”

We analyzed the results of 177 participants in this survey. Together they answered the question with 596 points (3.4 points per participant) and explained their choices with 140 notes (0.8 notes per participant). We will share four takeaways based on a selection of their feedback.
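As a quick sanity check, the per-participant averages follow directly from the totals reported above (596 points and 140 notes over 177 participants):

```python
participants = 177
points = 596
notes = 140

# Averages as reported in the case, rounded to one decimal.
points_per_participant = round(points / participants, 1)  # 3.4
notes_per_participant = round(notes / participants, 1)    # 0.8
```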

Heatmap



Mashable redesign: What draws attention?

Mashable made a fresh start to the new year by launching a redesign. The intention of the new design was to put more focus on the stories, remove clutter, and divide the content into sections (Home, Social Media, Mobile, Web Video, Entertainment, Business, Tech, and Jobs). In the past week more than 150 people commented on the blog post about the new design. Most reactions on Mashable seem positive about the new look and feel: ‘Fresh & clean’, ‘I like the sections’, ‘More professional’, and ‘Clean and Simple’. What are the most important changes in this design iteration, and what can we learn from the feedback? We asked 60 social-media-savvy participants for feedback.

Mashable: What draws attention?


Read this case on UX Booth


Using Usabilla for simple A/B testing

A/B testing is a popular method to optimize conversion on a website. Basically, you set up two or more variants, measure the differences in conversion rates between these variants, and select a winner based on your test results. The winning variant is the one with the highest conversion rate. A/B testing is a great way to improve your webpages one step at a time. Unfortunately, implementing tests is not always as simple as it sounds, even if you use, for example, the nifty interface of Google Website Optimizer. To set up a test, you need access to the source code and someone who is able to adapt it, plus a live website with visitors or a working prototype. We’ll show you an example of how Usabilla’s One-Click-Tasks can be used as an interesting and low-budget alternative to A/B testing.
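For readers curious about the "select a winner" step, here is a minimal sketch in Python of how two variants are typically compared. The visitor and conversion counts are hypothetical, and the two-proportion z-test shown is one common way (not the only one) to check whether the difference is larger than chance:

```python
import math

def conversion_rate(conversions, visitors):
    """Conversion rate of a variant as a fraction."""
    return conversions / visitors

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard errors apart
    are the conversion rates of variants A and B?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: variant A converts 120 of 2000 visitors,
# variant B converts 156 of 2000.
z = z_score(120, 2000, 156, 2000)
# |z| > 1.96 corresponds to roughly 95% confidence that B really wins.
```

With these made-up numbers, B's higher rate clears the 1.96 threshold, so B would be declared the winner; with smaller samples, the same rate difference might not.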



Call-to-Action: benchmarking 10 web services

The sign-up button or link is an important call-to-action on the homepages of most web services. In a recent demo case Usabilla compared the sign-up flows on the homepages of 10 different web services. Users found the sign-up button on the Twitter homepage in 1.8 seconds. Animoto was a good runner-up with 2.3 seconds. On average it took participants 3.5 seconds to find a way to sign up for these web services.

The differences in performance between these websites on this important task are big. But what makes the Twitter homepage stand out in this test? Why do the sign-up buttons at Animoto, Vimeo, and MyNameisE catch attention faster than those of Wakoopa, Basecamp, and PayPal? We would love to hear your opinion about these test results.



Underdogs beat Expedia in usability showdown

The international travel site Expedia (Alexa Rank 816) gets defeated by its competitors Hotwire, Priceline, and Travelocity on basic usability tasks. Expedia performed the worst in a usability showdown between the four major international travel sites. A total of 148 people participated in this usability test and tried to perform three basic tasks on one of the four websites.

Alexa rank for Expedia, Hotwire, Travelocity, and Priceline


Book a room in Amsterdam

The 148 participants tried to book a hotel room in Amsterdam within a certain price range on one of the four big travel websites. They had to find a way to search for a hotel room (task 1), narrow their search by price (task 2), and book a room in the correct price range (task 3). Task performance was measured by success rate and time per task.
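Success rate and time per task are straightforward to compute from raw observations. A minimal sketch with hypothetical per-participant data (not the actual results of this test):

```python
# Hypothetical observations for one task on one site:
# (succeeded, seconds_to_complete)
results = [(True, 12.4), (True, 9.8), (False, 31.0), (True, 15.2)]

# Success rate: share of participants who completed the task.
success_rate = sum(1 for ok, _ in results if ok) / len(results)

# Time per task: averaged over successful attempts only, since a
# failed attempt's duration measures giving up, not completing.
success_times = [t for ok, t in results if ok]
mean_time = sum(success_times) / len(success_times)
```

Averaging time over successes only is a common convention; including failures would be a different (and debatable) design choice.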

Expedia homepage

Expedia, Hotwire, Travelocity, and Priceline
