Tuesday, September 27, 2016

A/B Testing vs Multiple Variant Testing: And the Winner Is…?

During the 2016 Rio Olympic Games, Mahe Drysdale rowed 2,000 meters (1.24 miles) in just 6 minutes and 41 seconds.


However, despite his impressive performance, the world record-holder nearly lost the race.


Drysdale Rows to Nail-Biting Single Sculls Win in Rio 2016 Olympics


In one of the closest finishes in Olympic history, Drysdale won by mere millimeters.



In contrast, Great Britain's Men's Eight team rowed the same distance in just 5 minutes and 29 seconds, over 70 seconds faster than Drysdale's time!


What's more, the Brits won by more than a half second.


[Image: Great Britain's men's eight winning in Rio]


That might not seem like a huge margin, but in the Olympics, a half second is a big deal.


So, why was Britain's team so much faster than Drysdale?


The answer is simple: they had more oars in the water.


Now, at this point, you might be thinking, "This is all well and good, Jake, but what does rowing have to do with online marketing?"


Well, it turns out that conversion rate optimization (CRO) is a lot like rowing.


The more oars you have in the water, the faster you'll make it to your goal and the more likely you are to beat out the competition.


The Secret is Testing Multiple Variants


Over the years, CRO seems to have become synonymous with A/B testing in the minds of many marketers.


Now, there's nothing inherently wrong with this. A/B testing is a form of conversion rate optimization. You have a page and you want it to perform better, so you change something and see if it improves your results.


But here's the thing: A/B testing isn't the only way to do CRO.


It might not roll off the tongue as nicely as “A/B testing”, but if you've got enough traffic, A/B/C/D/etc. testing can allow you to produce meaningful results much more quickly.


For example, Optimizely recently studied and reported on the factors that defined the world's best testing companies.


Guess what the 4 biggest factors were?



  1. Testing the things that drive the most revenue

  2. Testing every change

  3. Testing to solve real problems

  4. Testing multiple variants simultaneously


Does #4 surprise you?


Apparently, the most effective CRO doesn't come from A/B testing; it comes from testing multiple variants.


Essentially, A/B testing is like the Mahe Drysdale of CRO. It works and it can even deliver amazing results.


But it's only two oars in the water; there's no way it can compete with an 8-man team.


To put this in more concrete terms, according to Optimizely, just 14% of A/B tests significantly improve conversion rates. On the other hand, tests with 4 variants improve conversion rates 27% of the time.


So, if you test 4 variants, you're roughly 90% more likely to improve your conversion rate than if you run a simple A/B test. However, 65% of CRO tests are, you guessed it, A/B tests!
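If you want to check that math yourself, here's a quick sketch of the arithmetic, using the 14% and 27% success rates from the Optimizely study cited above:

```python
# Success rates reported by Optimizely (see above).
ab_test_success_rate = 0.14       # share of A/B tests that significantly improve conversions
four_variant_success_rate = 0.27  # share of 4-variant tests that do

# Relative improvement in your odds of finding a winner in a single test.
relative_lift = (four_variant_success_rate - ab_test_success_rate) / ab_test_success_rate
print(f"{relative_lift:.0%}")  # ~93%, in line with the roughly 90% figure above
```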


Why Testing Multiple Variants Works Better


Basically, there are two reasons why multiple variant testing outperforms A/B testing: 1) it's faster and 2) it allows you to test more variants under the same testing conditions.


Multiple Variant Testing is Faster


Sure, you can test the same things with a series of A/B tests as you can with a multiple variant test; it just takes a lot longer.


When you run an A/B test, you can really only learn one thing from your test: your variant either performs better than, the same as, or worse than your original.


And that's it, that's all you can learn.


Now, if you're smart about your A/B testing strategy, your results can teach you a lot about your audience and make your future tests smarter, but you're still only learning one thing from each test.


On the other hand, with multiple variant testing, you can try out several ideas at the same time. That means you can simultaneously test multiple hypotheses.


So, instead of just learning that a hero shot with a smiling woman outperforms a shot of a grumpy man, you can also see if a grumpy woman image drives more results than the grumpy man pic or if a happy man outshines them all.


Or, you can try multiple combinations, like a new headline or CTA in combination with either the smiling woman or the grumpy man.


Running all of these tests simultaneously will allow you to optimize your page or site much more quickly than you could with a long series of A/B tests.
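To make that concrete, here's a rough sketch of how you might enumerate those combinations and split traffic evenly between them. The element names and the hash-based bucketing are illustrative assumptions, not any particular testing tool's API:

```python
import hashlib
from itertools import product

# Hypothetical page elements to combine (based on the examples above).
headlines = ["original headline", "new headline"]
hero_shots = ["smiling woman", "grumpy man"]

# Every combination becomes one variant in a single, simultaneous test.
variants = list(product(headlines, hero_shots))  # 4 variants in total

def assign_variant(visitor_id: str) -> tuple:
    """Deterministically bucket a visitor so they always see the same variant."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("visitor-12345"))
```

Hashing the visitor ID (rather than picking a variant at random on every page view) keeps each visitor's experience consistent for the life of the test.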


Plus, running a test with multiple variants will greatly improve the odds that a single test will deliver at least one positive result, allowing you to start getting more from your website sooner.
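To see why, suppose each challenger variant independently had roughly the same 14% chance of significantly beating the control that Optimizely reports for A/B tests. Independence is a simplifying assumption (variants on the same page rarely behave independently, which is partly why the observed 27% figure above is lower), but it shows the direction of the effect:

```python
# Assumed probability that any single challenger significantly beats the control.
p_single_win = 0.14

for num_challengers in (1, 2, 3, 4):
    p_at_least_one = 1 - (1 - p_single_win) ** num_challengers
    print(f"{num_challengers} challenger(s): {p_at_least_one:.0%} chance of at least one winner")
```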


Multiple Variant Testing is More Reliable


Another problem with successive A/B tests stems from the fact that the world changes over time.


For example, if you are in eCommerce and run your first A/B test during October and your second test during November, how do you know your results aren't being skewed by Black Friday?


Even if your business isn't seasonal, things like changes in your competitors' marketing strategies, political shifts or a variety of other variables can make it difficult to directly compare the results of separate A/B tests.


As a result, it can sometimes be hard to know whether a particular A/B test variant succeeded (or failed) because of your change or because of factors outside your control, or even your awareness. The more tests you run, the murkier your results may become.


However, with a multiple variant test, you are testing all of your variants under the same conditions. That makes it easy to compare apples-to-apples and draw valid, reliable conclusions from your tests.
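Because every variant collects its data under identical conditions, you can also compare them all in one statistical analysis instead of stitching together separate tests. Here's a minimal sketch using a chi-square test of independence; the visitor and conversion counts are made-up numbers for illustration:

```python
from scipy.stats import chi2_contingency

# Hypothetical results from one multiple-variant test run under the same conditions.
# Each row: [conversions, non-conversions] for a variant.
results = [
    [320, 9680],  # control
    [355, 9645],  # variant B
    [410, 9590],  # variant C
    [390, 9610],  # variant D
]

chi2, p_value, dof, expected = chi2_contingency(results)
print(f"p-value: {p_value:.4f}")  # a small p-value suggests at least one variant converts differently
```

From there, you'd follow up with pairwise comparisons against the control to see which variant is actually responsible for the difference.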


What Does Testing Multiple Variants Look Like in Real Life?


To show you just how testing multiple variants can improve your CRO results, let me share an experience we recently had with one of our clients.


The client wanted to get site traffic to their “Find Your Local Chapter” page, so we decided to add a “Find Your Local Chapter” link to the client's footer. That way, the link would be seen by as many people as possible.


Makes sense, right?


So, we put together something that looked like this:


[Image: footer variant 1, the plain "Find Your Local Chapter" link]


At first, we figured we would just put the link in the footer and run a test to see if the link made a difference.


But then, we started wondering if there was a way to make the link even more noticeable. After all, getting traffic to this page was a big deal to the client, so it made sense to emphasize the link.


With that in mind, we added color to the link:


[Image: footer variant 2, the colored "Find Your Local Chapter" link]


Now, this idea seemed logical, but at Disruptive, we believe in testing, not gut instinct, so we figured, “Hey, we've got enough traffic to test 3 variants, let's take this even further!”


The problem was, the client's site was a designer's dream: modern, clean and seamless. To be honest, we had a bit of trouble selling them on the idea that creating a page element that interrupted that seamless flow was worth testing.


But, eventually, we convinced them to try the following:


[Image: footer variant 3, the more attention-grabbing "Find Your Local Chapter" element]


It was very different from anything the client had tried on the page before, but we decided to run with the idea and include it in our test.


A few weeks and 110,000 visitors later, we had our winner:

[Image: results of the footer variant test]


Not surprisingly, adding the "Find Your Local Chapter" link increased page visits by over 60% for every variant. That's an awesome win, right?


But here's the thing. With our original, strict A/B test, we would only have discovered that adding the link increased traffic by 63%.


On the other hand, by including a couple of extra variants in the same test, we were able to discover that, contrary to the client's belief, the more our link "interrupted" the site experience, the more traffic it drove to the chapter page.


Sure, we might have reached the same conclusion with several more tests, but we achieved these results much more quickly and reliably than we would have with an A/B testing series.


Should You Test Multiple Variants?


When it comes to testing multiple variants, there's only one real reason not to use it: your boat is too small.


[Image: rowing fail]


Think about it: if the entire British men's eight had tried to cram onto Mahe Drysdale's single scull, they never would have made any forward progress.


The same idea applies to CRO.


As great as multiple variant testing is, if you don't have enough traffic, a test could take months or years to complete.


In fact, for true multivariate testing, where you test how a large number of subtle changes interact to affect your conversion rate, you want at least 100,000 unique visitors per month (for more information on multivariate testing, check out this great article).


On the other hand, you need far less traffic to simultaneously test multiple page variants.


To see how long a multiple variant test will take on your site, try out the free sample size and test duration calculator from VWO. If the time frame makes sense for your business, go for it!
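If you're curious about the math behind calculators like VWO's, here's a back-of-the-envelope sketch based on the standard two-proportion sample-size formula. The baseline conversion rate, minimum detectable lift and traffic figures below are placeholders to swap for your own numbers, and real calculators adjust for multiple variants in more sophisticated ways than this simple version does:

```python
from math import ceil
from scipy.stats import norm

def visitors_per_variant(baseline_rate, min_relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant for a two-proportion test."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Example: 3% baseline conversion rate, aiming to detect a 20% relative lift,
# testing a control plus 3 challengers on a site with 40,000 visitors per month.
n = visitors_per_variant(0.03, 0.20)  # roughly 14,000 visitors per variant
num_variants = 4
months = num_variants * n / 40_000
print(f"{n:,} visitors per variant, roughly {months:.1f} months of traffic")
```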



Conclusion


Whether it's Olympic rowing or CRO, the more oars you have in the water, the better your results will be.


Although it may be tempting to limit CRO to A/B testing, testing multiple variants will allow you to improve your conversion rates more quickly and reliably than you could with a series of A/B tests.


You've heard my two cents, now it's your turn.


Have you tried multiple variant testing? What was your experience like? Did any of the data in this article surprise you?


About the Author: Jacob Baadsgaard is the CEO and fearless leader of Disruptive Advertising, an online marketing agency dedicated to using PPC advertising and website optimization to drive sales. His face is as big as his heart and he loves to help businesses achieve their online potential. Connect with him on LinkedIn or Twitter.



