
Tests Prove It: The Single Best Call To Action Color Is… Green!

This is a guest post by Guido Jansen.

Hopefully, you were preparing to start shouting at me after reading the title. If you weren’t, you might be making some really wrong decisions when optimizing your webshop and losing a lot of money. Let me explain to you what is so wrong with the title…

Warning: Only read this if you’re serious about investing in optimizing the user experience and conversion of your shop. If you’re happy with suboptimal results, go ahead and try to copy Zappos and Amazon like the rest of the web.

User Interface

When the interface of a webshop is the discussion topic, I regularly get questions like “Which color should we use for…”, “How big should this button be to…”, “What do users expect when…” or “But on Amazon.com they…”.

Since I am the psychologist among a group of technical people, I am of course considered thé person to have an answer to these questions.

Having an answer would be great, but let’s get that illusion out of the door right away: the golden egg does not exist. Or better: the answer is different for every website, and sometimes even for every page. This is actually a good thing; otherwise all websites would look exactly the same.

1001 variations

Why does a certain text, image or color increase conversion on one site, while that same text, image or color decreases conversion on another? There are a lot of variables that influence the effectiveness of a user interface.

Let’s take a look at some examples:

Site variables:

  • What kind of products or services do you provide?
  • What kind of (brand) image do you have (business, cheap, durable, fast, traditional, …)?
  • Your design
  • How does your site work in different browsers and on different devices (mobile, tablet, pc; Chrome, Firefox, Safari, …)?

Visitor variables:

  • Who are your visitors (age, sex, business, consumers, …)?
  • How much money do they have or are they willing to spend?
  • When are they visiting your site (day, evening, weekends)?
  • Where do your visitors come from before they reach your site (search engines, comparison websites, ads, newsletters, …)?

External variables:

  • The weather (especially when you have a travel agency)
  • Pricing at competitor websites (for example the interest rate if you’re a bank)
  • Advertising campaigns of your competitor.
  • Economic situations
  • If the national team won an important match the day before (yes I’m still very serious)

These are just some examples, but I could go on for a while. The point is that every variable requires a different strategy for what works and what doesn’t work for your customers in a certain situation. And the worst part for you is that these variables can also influence each other (a large button might work best when it’s blue, while a small button might work better when it’s red).

For a complete site, there are far too many variables for any expert or “best practice” to tell you exactly what will and won’t work. Of course there are guidelines, heuristics and best practices, but to really find out what works best, you will have to test it.

Setting up tests

Persuasion marketing, interface design and a scientific approach are indispensable when creating a proper test. Every online business owner should have at least 1 FTE on their team working on this, or hire an external professional to do it (hey, that’s me ;) ).

But please don’t make the mistake of taking advice and “best practices” as the one and only truth and implementing every tip right away. That something works great for Amazon really doesn’t mean it will work well for you (see the variables above)! The same goes for collecting user feedback: if 2, 5, 10, or 50 users in your qualitative research study tell you to change something, you still have to test whether that change actually works better for the majority of your site visitors.

Good online marketers/testers/psychologists spend a lot of time on qualitative research. You can use that experience to save (a lot of) time generating ideas for what to test, implementing testing procedures, testing properly and interpreting the results. But you still have to verify all assumptions about your (potential) customers to see if they actually hold.

You can’t throw some bad ideas into your testing machine and expect brilliant results…

For A/B and Multivariate testing in early design stages, check out Usabilla Survey, or take a look at www.whichmvt.com for a wide list of testing tools.
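Whatever tool you use, once a test has run you still have to check whether the difference between your variants is real or just noise. As a minimal sketch (not from the article, and with made-up visitor and conversion counts), here is a standard two-proportion z-test in plain Python, using only the standard library:

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: variant A converts 120 of 2400 visitors (5.0%),
# variant B converts 150 of 2400 visitors (6.25%).
z, p = ab_test_significance(120, 2400, 150, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these numbers the p-value comes out just above the conventional 0.05 threshold: a 25% relative lift that still isn’t statistically significant at this sample size. That is exactly why “it looked better for a week” is not a test result.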

So about Green…

To get back to the title of this article: Chris Goward of Widerfunnel did a nice meta-analysis of around 20 A/B tests of CTA buttons and collected all the winning variants.

This was the result…


Chris’s complete presentation can be found on Slideshare.


Guido Jansen is Senior Conversion Consultant at Online Dialogue. Trained as a cognitive psychologist, Guido enables e-commerce companies worldwide to sell more by improving the customer's experience and the company's optimization process in a more scientific way.

3 comments

  1. Bart

    Nice post Guido!

    I have 2 admit something. I have a client that I helped gaining customer & conversion insights for a couple of years. However, they used nada-zero-niente of these learnings in a redesign project recently (which screwed conversions of course, duh)…

    => How come that we psychologists and webanalists are so bad in selling our value on board level (where data web and brand agencies are heard)? I guess Kahneman has the answer again…?

    p.s. I love the slide by Chris Goward too. Especially the pink glowing CTA (the winning one was actually animated (#young-girls :-) ).
    p.p.s. When I told my old man that I became a consultant, he said: “Don’t worry, it’s ok son, just never, never use best practices…”
    p.p.p.s. When clients ask me these questions, I tend 2 say “are you the type of guy that used your best friends pick-up lines hoping 2b score the same girls?
    ;-)

  2. Michael

    Funny how I’ve seen all kinds of colors as “the best”. I just saw orange as the best last week, and now green.

    It does come down to testing. A button can be tested with the text, too. So keep in mind it might not be the color.

  3. Guido Jansen

    @Michael: sure, the ‘color’ part in the above example is just a silly example. There are many other (more relevant) things to test :)
