A/B testing is a popular method for optimizing conversion on a website. You set up two or more variants, measure the difference in conversion rates between them, and pick the winner: the variant with the highest conversion rate. A/B testing is a great way to improve your webpages one step at a time. Unfortunately, implementing tests is not always as simple as it sounds, even with a nifty tool like Google Website Optimizer. To set up a test you need access to the source code, someone who can adapt it, a live website with visitors or a working prototype, and so on. We'll show you how Usabilla's One-Click-Tasks can serve as an interesting, low-budget alternative to A/B testing.
The original Firefox A/B test
The website abtests.com collects examples of A/B tests and their results. To demonstrate how Usabilla can quickly measure differences between multiple variants, we picked a case submitted by Firefox (original writeup at the Mozilla blog). Firefox tested two variants of the call-to-action button on their homepage that prompts visitors to download the latest Firefox version: "Try Firefox 3" and "Download Now – Free".
In the original A/B test conducted by Mozilla, 9.73% of the roughly 150,000 visitors who saw variant A ('Try Firefox 3') downloaded Firefox. With variant B ('Download Now – Free'), 10.07% of roughly 150,000 visitors downloaded Firefox. In relative terms, variant B converted about 3.5% better than variant A.
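For readers who want to check the arithmetic, here is a minimal sketch in Python that reproduces the relative lift and adds an approximate two-proportion z-test. The visitor counts are the rough ~150,000 per variant from the post, not exact figures, so the z value is only indicative:

```python
from math import sqrt

# Approximate figures from the Mozilla test; the post only gives
# ~150,000 visitors per variant, so these counts are assumptions.
n_a, n_b = 150_000, 150_000
p_a, p_b = 0.0973, 0.1007  # conversion rates for variants A and B

# Relative lift of B over A
lift = (p_b - p_a) / p_a
print(f"relative lift: {lift:.1%}")  # ~3.5%

# Two-proportion z-test with a pooled standard error
p_pool = (p_a * n_a + p_b * n_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
print(f"z = {z:.2f}")  # ~3.1; |z| > 1.96 means significant at the 5% level
```

With samples this large, even a 0.34 percentage-point difference is comfortably significant.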
The Usabilla test
We set up two similar tests in Usabilla in about 5 minutes, using screenshots of variants A and B. We created one test (let's call it 'A') with a screenshot of the Firefox homepage showing the call-to-action button 'Try Firefox 3', and another (test 'B') with a screenshot of variant B ('Download Now – Free'). In both tests we asked participants to perform a single task: 'Where do you click to download Firefox?'. To compare performance between Test A and Test B we focused on 'time per task', which is a standard Usabilla feature.
We sent about half of the participants to Test A and the other half to Test B. We invited participants via Twitter and, at the end of the test, asked them to help spread it further. We used Usabilla's 'Redirect URL' feature to send participants to Twitter afterwards to post a message about this case.
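Usabilla itself does not split traffic between two separate tests, so we simply alternated the links we handed out. If you want to automate an even split, a coin flip per participant is enough; here is a minimal sketch, where the two test URLs are hypothetical placeholders rather than our real links:

```python
import random

# Hypothetical test URLs -- substitute your own Usabilla test links.
TEST_A = "https://example.com/usabilla-test-a"  # 'Try Firefox 3'
TEST_B = "https://example.com/usabilla-test-b"  # 'Download Now - Free'

def assign_variant() -> str:
    """Send each new participant to Test A or Test B with equal probability."""
    return TEST_A if random.random() < 0.5 else TEST_B
```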
| | Test A: 'Try Firefox 3' | Test B: 'Download Now – Free' |
|---|---|---|
| Geo Mean (seconds) | 5.30 | 4.84 |
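Time-per-task data is typically skewed by a few slow outliers, which is why the table reports a geometric rather than an arithmetic mean. A minimal sketch of the computation (the times below are made-up placeholders, not our actual samples):

```python
import math

def geo_mean(times):
    """Geometric mean: the n-th root of the product, computed via logs for stability."""
    return math.exp(sum(math.log(t) for t in times) / len(times))

# Illustrative click times in seconds (placeholder data, not the real samples)
sample_times = [3.9, 5.1, 4.4, 6.8, 4.0]
print(f"{geo_mean(sample_times):.2f} s")
```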
Usabilla as an alternative to A/B tests
In this demo case we set up two simple Usabilla tests in about 5 minutes as a quick-and-dirty alternative to A/B testing. We measured task performance (time) on the task 'Where do you click to download Firefox?' for two variants of the Firefox homepage, with a total of 388 participants split between the two groups. Participants in test A ('Try Firefox 3') took a geometric average of 5.3 seconds to click the download button; participants in test B ('Download Now – Free') clicked it in 4.8 seconds, roughly 9% faster.
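We did not publish the raw per-participant times, so no significance test is reported here. If you run a test like this yourself, one reasonable check is Welch's t-test on log-transformed times, which matches the geometric-mean summary above; a sketch, assuming you have the two lists of raw click times:

```python
import math
from statistics import mean, variance

def welch_t_on_logs(times_a, times_b):
    """Welch's t statistic on log-transformed task times.
    Logging roughly symmetrizes skewed duration data, and the difference
    of mean logs corresponds to the ratio of the geometric means."""
    la = [math.log(t) for t in times_a]
    lb = [math.log(t) for t in times_b]
    va, vb = variance(la) / len(la), variance(lb) / len(lb)
    return (mean(la) - mean(lb)) / math.sqrt(va + vb)

# Usage (times_a / times_b would hold the 388 participants' raw click
# times, which are not published in this post):
# t = welch_t_on_logs(times_a, times_b)
```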
We demonstrated that simple comparative Usabilla tests like this one can help you improve your conversion. The results of quick tests can guide you when you want to improve your website, without worrying about development work, a release protocol, server access, and other hurdles. Use mockups or (edited) screenshots to test multiple variants in a short time span.