Rules of Thumb for A/B and Multivariate Tests
I was recently interviewed on the Unbounce blog as a ‘conversion hero’. In the interview, I shared a few rules of thumb for A/B and multivariate testing that you may find helpful. I developed these heuristics while observing and advising on hundreds of tests created by VWO users. So, in this post, I will paraphrase and expand on some of the things I shared in the interview.
A/B or multivariate: which test methodology should you choose?
Three main criteria will help you choose between A/B testing and multivariate testing:
- Traffic on the test page: MVT requires a lot of traffic to reach statistically significant results
- Design resources available: MVT requires fewer design resources
- Objectives of the test: MVT is used to optimize an existing design, while an A/B test is used to optimize conversions by testing a completely new design
Quoting from the interview, here is an elaboration on these three factors:
The eligibility criterion for each method is, of course, traffic. You should not attempt MVT if you don’t have enough traffic on the site. But assuming traffic isn’t a constraint, MVT works best when you are hyper-optimizing, that is, when your aim is to squeeze the last drop of conversion-rate juice from your existing design. A/B testing, on the other hand, should be used when you want to test completely different designs and ideas. Ideally, an organization should run lots of MVT tests followed by a few large A/B tests.
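To put a rough number on why traffic is the gating factor, here is a minimal sketch using the standard two-proportion power calculation. The base rate, the target lift, and the 2×2×2 factorial layout are all illustrative assumptions of mine (they are not from the interview); the point is simply that a full-factorial MVT multiplies the per-variation sample size by the number of combinations.

```python
import math

# z-scores for a two-sided alpha = 0.05 and 80% power (illustrative defaults)
Z_ALPHA = 1.96
Z_BETA = 0.84

def visitors_per_variation(base_rate: float, lift: float) -> int:
    """Approximate visitors needed per variation to detect a relative
    lift with a two-proportion z-test (standard power calculation)."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    p_bar = (p1 + p2) / 2
    n = ((Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
          + Z_BETA * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return math.ceil(n)

n = visitors_per_variation(base_rate=0.05, lift=0.20)  # detect a 20% relative lift
print(f"A/B test (2 variations): ~{2 * n} visitors total")

# A 2x2x2 MVT (three elements, two variants each) has 8 combinations,
# so the same per-variation requirement multiplies across combinations.
combos = 2 * 2 * 2
print(f"Full-factorial MVT ({combos} combinations): ~{combos * n} visitors total")
```

With a 5% baseline and a 20% relative lift, this works out to roughly 8,000 visitors per variation, so the eight-combination MVT needs about four times the traffic of the two-variation A/B test.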
MVT typically requires fewer design resources than large-scale A/B test changes. Moreover, as I said, if the objective is to optimize an existing design, MVT (or a single-element change) is the way to go. But if you want to make radical changes to the page (say, a layout change, theme change, etc.), you should go with A/B testing.
Best methodology to start with?
Undoubtedly, if you are just getting started with testing and conversion rate optimization, you should begin with a simple A/B test. Multivariate testing is a complex methodology, and it is easy to draw erroneous conclusions from it. From the interview:
For starters, I always recommend beginning with small-step changes in order to truly appreciate the value of testing. Ideally, they should pick a sweet spot on their page (ideal candidates: call-to-action, headline, and image) and optimize it with a simple A/B test. Only once they get the hang of the whole process should they attempt MVT or a large-scale A/B test.
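For readers curious about the mechanics, here is a minimal sketch of how a simple A/B test might bucket visitors into variations. The hash-based assignment and the function name are illustrative assumptions on my part, not a description of any particular tool’s internals.

```python
import hashlib

def assign_variation(visitor_id: str, experiment: str, variations: list[str]) -> str:
    """Deterministically bucket a visitor into one variation.

    Hashing (experiment, visitor_id) gives each visitor a stable
    assignment, so they see the same variation on every visit.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variations)  # uniform split across variations
    return variations[bucket]

# Example: a simple A/B test on the call-to-action text
print(assign_variation("visitor-42", "cta-test", ["Start Free Trial", "Sign Up Now"]))
```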
What to test and what not to?
Of course, what you test on a page depends on the specific site and the objectives of the test. But if you are looking for some rules of thumb about the most common page elements that can be tested, here they are:
- The King: Call-to-Action (your main button)
- The Queen: Headline
- Others: Text copy, images, number of form fields, number of steps in the funnel, required vs. optional steps, number of elements on the page, amount of text on the page, layout (e.g., left vs. right placement tests)
As far as what not to test is concerned, it is best to avoid testing:
- Pricing: It is very risky and potentially illegal. You shouldn’t offer the exact same service/product at different price points.
- Trivial elements on the site: Every element you include in a test should come with a hypothesis for why it is there. For example, you shouldn’t add page elements (say, a footer or header) to the test without a specific reason and expect the conversion rate to improve magically! You need to be convinced that a particular site element has a high chance of impacting the conversion rate.
What kind of surprises can you expect while doing A/B testing?
Technically, no winning variation in a test should come as a surprise, because there should have been a specific hypothesis for including it in the test. Nevertheless, sometimes the test results are contrary to what was expected; that is, a variation wins hands-down when you expected it to lose significantly (or vice versa). Here are a few real-world examples of such surprises:
One recent test was very surprising: it found that removing a secure icon from the page actually increased conversions by 400%. Another surprising result was that simply adding a human photo to a homepage can potentially double the conversion rate.
One of the test results on our own homepage goes against the standard advice of having a ‘Signup’ button featured prominently on the homepage. We found that a ‘Signup’ button actually decreased eventual sign-ups, and ‘Watch a short video’ worked much better, because after watching the video, visitors were sure of what they were signing up for. (We had a ‘Signup’ button on the video page, by the way.)
I hope you liked the interview snippets.
If you want to read the full interview, head over to Conversion Heroes Part 3: Split Testing – An Interview with Paras Chopra.