Web Analytics Special Report: Why Multivariable Testing Beats A/B

Imagine you just got your driver's license and want to buy your first vehicle. You go into a showroom, and there are two choices: a nice new 2005 automobile with all the features you'd expect in a modern-day car, or an 1889 horse and buggy. Which would you choose?
Unless you are a "Little House on the Prairie" fanatic, my guess is you would take the car.
But surprisingly, when faced with this decision in the world of online testing, many companies think they need to start with the horse and buggy, otherwise known as A/B testing, rather than the modern-day car, otherwise known as multivariable testing. In almost any circumstance, multivariable testing is the way to go, for a variety of reasons.
Multivariable testing is faster. A/B testing lets you test only one factor at a time. If you want to test headlines, offers, layouts and copy via A/B, the process could take months. With multivariable testing, you can test all of these factors simultaneously and reach decisions on the best overall Web page in a fraction of the time.
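To make the "simultaneously" concrete, here is a minimal sketch of how a multivariable test plan covers every factor at once. The element names and options below are purely illustrative, not from any real test:

```python
from itertools import product

# Hypothetical page elements under test; names and options are illustrative.
elements = {
    "headline": ["Save Time", "Save Money"],
    "offer":    ["Free trial", "20% off"],
    "layout":   ["single column", "two column"],
    "copy":     ["short", "long"],
}

# A full-factorial multivariable test covers every combination in one test.
recipes = list(product(*elements.values()))
print(len(recipes))  # 2 * 2 * 2 * 2 = 16 page versions, tested together

# Testing the same four factors via A/B would mean four separate tests,
# run one after another, each waiting on the previous result.
```

One test exercises all four factors at once, while the A/B route strings the decisions out sequentially.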
Multivariable testing is more robust. If you test two headlines using A/B, you may learn that headline A is better than headline B, but you can be sure of this only if everything else on the page stays constant. With multivariable testing, the same two headlines are tested against a wide variety of changes to the rest of the page. As such, if headline A wins in a multivariable test, you can be far more confident that when the other elements of the page inevitably change, headline A will remain your best bet.
Multivariable testing requires a far smaller sample. If you want to test eight different variables, A/B testing requires eight separate A/B tests, meaning roughly eight times the sample size of one A/B test. Multivariable testing lets you test all eight variables simultaneously. In most cases, with multivariable testing you can test the same eight variables with the same sample size required for just one A/B test.
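The sample-size claim can be illustrated with a small simulation. The idea: if each visitor is randomly assigned an on/off setting for every variable, each variable's effect can be estimated from the same pool of visitors, so one A/B-sized sample serves all eight variables. The visitor count, base conversion rate, and per-variable lifts below are invented for illustration:

```python
import random

random.seed(0)
N = 20000  # total visitors -- roughly one A/B test's worth of traffic
K = 8      # number of on/off page variables under test

# Hypothetical true lift each variable adds to a 5% base conversion rate.
base_rate = 0.05
true_lift = [0.010, 0.0, 0.005, 0.0, 0.0, 0.008, 0.0, 0.0]

# Each visitor gets an independent random setting for every variable,
# so one sample exercises many combinations at once.
visitors = []
for _ in range(N):
    settings = [random.randint(0, 1) for _ in range(K)]
    p = base_rate + sum(l for s, l in zip(settings, true_lift) if s)
    converted = random.random() < p
    visitors.append((settings, converted))

# Estimate each variable's effect from the SAME N visitors:
# conversion rate with the variable on minus the rate with it off.
for k in range(K):
    on  = [c for s, c in visitors if s[k] == 1]
    off = [c for s, c in visitors if s[k] == 0]
    effect = sum(on) / len(on) - sum(off) / len(off)
    print(f"variable {k}: estimated lift {effect:+.4f}")
```

Every one of the eight estimates draws on all N visitors (about half on, half off for each variable), which is exactly the split a single A/B test would use, whereas eight sequential A/B tests would each need their own N visitors.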
So why do people still use A/B testing? Some companies shy away from multivariable testing because they think it's too complicated. It is true that the math behind the experimental design and analysis required to conduct multivariable testing is much more complex than that of A/B testing.
But this is not a reason to avoid it, because technologies and consultants exist that can manage the process for you. After all, you probably don't know how a modern-day car works, but that doesn't stop you from driving one. And if you are a true do-it-yourselfer, plenty of Web sites and books can teach you how to conduct multivariable tests.
Another reason companies avoid multivariable testing is that they think developing the inputs for such a test is too burdensome. An A/B test needs only two versions of a page, while a multivariable test might require 10, 20 or even more. Certainly, if you asked your creative department to create 20 versions of your home page, you would get a lot of dirty looks.
Fortunately, new technologies let you generate these versions dynamically so that one multivariable test requires the same or even fewer creative resources than one A/B test.
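A minimal sketch of what "generate these versions dynamically" means in practice: the creative team supplies one list of options per page element, and the testing engine assembles each variant on the fly from a recipe. The template, element names, and copy below are illustrative assumptions, not any vendor's actual API:

```python
# Hypothetical option lists -- one per page element, supplied once by creative.
elements = {
    "headline": ["Plan Your Retirement Today", "Retire on Your Terms"],
    "button":   ["Subscribe Now", "Start My Trial"],
    "image":    ["chart.png", "family.png"],
}

# One shared page template with a slot for each element under test.
template = "<h1>{headline}</h1><img src='{image}'><button>{button}</button>"

def render(recipe):
    """Assemble a page variant from a dict mapping element -> chosen option."""
    return template.format(**recipe)

# The engine picks a recipe per visitor; creative builds one option list
# per element rather than hand-producing every full page version.
page = render({"headline": "Retire on Your Terms",
               "button": "Subscribe Now",
               "image": "chart.png"})
print(page)
```

With two options for each of three elements, eight distinct pages come out of six pieces of creative work, and the gap widens as more elements are added.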
Does it work? As more companies are learning, the answer is a resounding "yes." For example, The Motley Fool used multivariable testing to simultaneously test 13 variables on its Rule Your Retirement landing page. The result: a 39.5 percent increase in subscriptions in less than three months.
Palo Alto Software used multivariable testing to simultaneously test 11 variables on its Business Plan Pro landing page. Online sales rose 41.3 percent in just over a month.
Like the horse and buggy, A/B testing was a quaint little solution long ago, before modern technology and engineering. But multivariable testing, like today's automobile, gets you where you want to go much faster and with far less effort.