And the winner was…

Note: This post is over 4 years old. It was first published in November 2008

OK, so I’m following up on last week’s post about Maxymiser’s MVT research. The findings are revealed below, and a few of you got your predictions wrong! It just goes to show that when you’re putting the final polish on live page designs (i.e. sweating over 0.1% improvements in your conversion rate), multivariate testing is a really useful method that can yield quite unexpected findings.

If you didn’t read my previous post, you may want to take a quick look and read the comments. Put briefly, Maxymiser recently released the findings from a multivariate testing project. Multivariate testing is, in case you don’t know, a method that allows you to test different combinations of page designs on your live user base, and select the most effective by monitoring your analytics metrics. The method has very good ecological validity, as you know for sure that you are measuring the behaviour of your real users in their natural environment. In fact, they don’t even know they are being researched, making it almost the polar opposite of traditional lab-based usability testing.

In this study, Maxymiser tested 5 different versions of the shopping basket page for the Laura Ashley online store. Bear in mind that the percentages shown below are percentage uplifts on the conversion rate, rather than the conversion rate itself. For example, if you had a 10% conversion rate and then had an uplift of 9%, you’d end up with a conversion rate of 10.9%.
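That relative-uplift arithmetic can be sketched in a couple of lines of Python (the function name here is mine, purely for illustration, not anything from Maxymiser):

```python
def apply_uplift(conversion_rate: float, uplift_pct: float) -> float:
    """Apply a relative percentage uplift (or downlift, if negative)
    to a baseline conversion rate, both expressed in percent."""
    return conversion_rate * (1 + uplift_pct / 100)

# The worked example from the text: a 10% baseline with a 9% uplift.
print(round(apply_uplift(10.0, 9.0), 2))  # 10.9
```

The same function handles the downlifts below: a negative `uplift_pct` shrinks the baseline instead of growing it.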

First place with an 11.02% uplift on the default: Version 3:

Second place with a 3.88% uplift on the default: Version 5:

Third place, the default design (with 0% uplift): Version 1:

Fourth place with a 1.92% downlift: Version 2:

Fifth place with a 7.85% downlift: Version 4: