Anna Rymer | Mon Sep 08 2014 CET | Cleeng Nuts & Bolts
If you look over the countless articles available about this subject on the web, you will probably get the feeling that you have found the Holy Grail. Especially if you’re a designer, you might hope to find some impartial data to replace the long discussions about ‘the color and size of the button,’ as well as the tools to objectively test your ideas on actual human users.
The reality, however, is that if you dig a little deeper into the web (or simply start testing some of what is being presented yourself), you'll find that it's not all rainbows and unicorns. True, if you've never tested before, you'll likely find several 5-20% increases to your bottom line.
So yes, A/B testing can be a very powerful tool that provides you with important data, but keep in mind that not every test will end as conclusively as the articles suggest. In fact, most of them won't. Here's a visualization of an A/B test we ran using Optimizely. We tested whether changing the color of a registration button to a more visible one would affect the number of registrations and boost conversion. We were quite convinced the color change would do the trick, but it turned out to make no measurable difference.
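If you want to sanity-check an outcome like that yourself, the math behind it is a standard two-proportion z-test: did variant B convert at a rate that chance alone couldn't plausibly explain? Here's a minimal Python sketch; the visitor and registration counts are invented for illustration and are not our actual numbers.

```python
import math
from statistics import NormalDist

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is B's conversion rate different from A's?"""
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return z, p_value

# Invented counts for illustration only -- not Cleeng's data.
z, p = two_proportion_ztest(conv_a=480, n_a=10_000,   # A: original button
                            conv_b=505, n_b=10_000)   # B: more visible color
print(f"z = {z:.2f}, p = {p:.3f}")
# p is about 0.41 here, far above the usual 0.05 threshold: the color
# change cannot be called significant, i.e. "not relevant" in practice.
```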
A/B testing was very helpful when we were improving the performance of the landing page for our Cleeng Live! solution. We noticed that after adding a 'Learn more' button at the top of the page, engagement with the page went up, and so did conversion. Compare versions A and B (we stuck with the latter, as confirmed by the test results).
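One reason so many tests end inconclusively is that they're stopped before enough visitors have seen each variant. A rough power calculation tells you how much traffic you need before you can trust a winner like version B. The sketch below uses the standard sample-size formula with invented baseline numbers, not Cleeng's data.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(base_rate, relative_lift, alpha=0.05, power=0.8):
    """Rough visitors needed per variant to detect a relative lift."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)     # significance level
    z_beta = NormalDist().inv_cdf(power)              # statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil(variance * (z_alpha + z_beta) ** 2 / (p2 - p1) ** 2)

# Invented example: 5% baseline conversion, hoping to detect a 10% lift.
print(sample_size_per_variant(0.05, 0.10))  # about 31,000 visitors per variant
```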
In my experience, to make the most of what A/B testing has to offer, there are a few important guidelines to remember before you begin testing:
4 tips to improve A/B testing:
Don’t forget that this is just the tip of the iceberg. And, to follow the advice of Jakob Nielsen, “you also need to complement A/B split tests with user research to identify true causes and develop well informed design variations”.
That’s my experience, but I’d be interested to hear your perspective too. Please share your tips as well, so we can all make A/B testing better.