Credibility is in many cases the most important driver of conversions. Recently we’ve been experimenting with new tasks to determine the factors that influence credibility on webpages. A demo case with Mint helped us determine how people react to questions about the credibility of a webpage. Based on this pilot we’ve set up another study to measure drivers of credibility on six other websites.
Donna Wells, CEO of Mindflash.com, named Usabilla as one of five essential web applications for small business in an interesting article on Mashable. The user-focused online training startup Mindflash uses Usabilla to collect online feedback from its users. Mindflash recruits participants with Ethnio:
“Ethnio allows us to grab and vet potential candidates through website pop-ups. Usabilla then pulls the best candidates from the Ethnio session to work through questions, product concepts and sends feedback and analysis to the business.”
“Set up properly, this nifty duo works almost entirely hands-off, allowing the site to motor along while continually mining data. And it keeps our top research and product talent focused on interpreting the results – not collecting them.”
Mindflash reinvented itself with a little external help from Adaptive Path. Paula Wellings (Adaptive Path) and Cameron Gray (Mindflash) shared insights about their journey to become more customer-focused at UX Week 2010. A video of their presentation about ‘Pizzability’ (who can spot the Usabilla screenshot?):
We’re happy to facilitate the dialogue between Mindflash and their customers and are proud to be mentioned by their CEO, Donna Wells, as one of the most essential web apps.
Do you like the article? Please retweet it, share it on Facebook, or post it on LinkedIn using the buttons next to the article on Mashable.
Two weeks ago we published four tiny visual surveys to help inspire Usabilla users. Our users and visitors participated in these surveys and shared their feedback on different webpages. We will share the results of these test cases to give you an idea of how these short visual surveys can help you understand your users. As requested, we will start with a selection of the results from the Mint demo case: “Click on the things that make you trust Mint. Please explain why.”
We analyzed the results of the 177 participants in this survey. These participants answered the question by placing 596 points (3.4 points per participant) and explaining their choices in 140 notes (0.8 per participant). We will share four takeaways based on a selection of their feedback.
How much do you really know about the people visiting your website? Take a look at these 30 second demo cases to get inspired on how you can use short Usabilla surveys to learn from your users and improve your website.
Mashable made a fresh start to the new year by launching a redesign. The intention of this new design was to put more focus on the stories, remove clutter, and divide the content into sections (Home, Social Media, Mobile, Web Video, Entertainment, Business, Tech, and Jobs). In the past week more than 150 people commented on the blog post about the new design. Most reactions on Mashable seem to be positive about the new look and feel: ‘Fresh & clean’, ‘I like the sections’, ‘More professional’, and ‘Clean and Simple’. What are the most important changes in this design iteration, and what can we learn from the feedback? We asked 60 social-media-savvy participants for their opinions.
A/B testing is a popular method to optimize conversion on a website. Basically, you set up two or more variants, measure the differences in conversion rates between these variants, and select a winner based on your test results. The winning variant is the one with the highest conversion rate. A/B testing is a great way to improve your webpages one step at a time. Unfortunately, implementing tests is not always as simple as it sounds, even with a nifty interface like that of Google Website Optimizer. To set up a test, you need access to the source code and someone who’s able to adapt it, a live website with visitors or a working prototype, and so on. We’ll show you an example of how Usabilla’s One-Click-Tasks can be used as an interesting and low-budget alternative to A/B testing.
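To make the “select a winner” step above concrete, here is a minimal sketch of how one might compare two variants’ conversion rates with a standard two-proportion z-test. The function name and the visitor/conversion numbers are hypothetical, chosen purely for illustration; they do not come from any of the studies described here.

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / n_a          # conversion rate of variant A
    p_b = conv_b / n_b          # conversion rate of variant B
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: variant A converts 40 of 1000 visitors,
# variant B converts 65 of 1000.
z = z_test_two_proportions(40, 1000, 65, 1000)
print(round(z, 2))  # 2.51 -> |z| > 1.96, significant at the 5% level
```

Only when |z| clears the significance threshold should the better-converting variant be declared the winner; otherwise the observed difference may be noise.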
The sign-up button or link is an important call-to-action on the homepages of most web services. In a recent demo case, Usabilla compared the sign-up buttons on the homepages of 10 different web services. Users found the sign-up button on the Twitter homepage in 1.8 seconds. Animoto was a close runner-up with 2.3 seconds. On average, it took participants 3.5 seconds to find a way to sign up for these web services.
The performance differences between these websites on this important task are substantial. But what makes the Twitter homepage stand out in this test? Why do the sign-up buttons of Animoto, Vimeo, and MyNameisE catch attention faster than those of Wakoopa, Basecamp, and PayPal? We would love to hear your opinion about these test results.
The international travel site Expedia (Alexa Rank 816) gets defeated by its competitors Hotwire, Priceline, and Travelocity on basic usability tasks. Expedia performed the worst in a usability showdown between the four major international travel sites. A total of 148 people participated in this usability test and tried to perform three basic tasks on one of the four websites.
Book a room in Amsterdam
The 148 participants tried to book a hotel room in Amsterdam in a certain price range on one of the four big travel websites. They had to find a way to search for a hotel room (task 1), limit their search by price (task 2), and book a room in the correct price range (task 3). Task performance was measured by success rate and time per task.