Retailer split testing in focus

Split testing: Why your split tests are not driving a change in revenue or margin

Split testing (also called A/B testing) is a means of measuring variations in performance between two pages: the control page (representing the default page on a site) and a challenger page. This article highlights the challenges of split testing but also reaffirms its significance in performance change.
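The mechanics of the control/challenger split can be sketched in a few lines. This is a minimal illustration, not any particular tool's API; the function name, the test name and the use of a hash for bucketing are our own assumptions. The key property is that assignment is deterministic, so a returning visitor always sees the same page.

```python
import hashlib

def assign_variant(visitor_id: str, test_name: str = "homepage_test") -> str:
    """Deterministic 50/50 split between control and challenger.

    Hashing the visitor and test name together means the same visitor
    is always bucketed the same way for a given test.
    """
    digest = hashlib.md5(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return "control" if int(digest, 16) % 2 == 0 else "challenger"
```

In practice the testing tool handles this for you, but the principle is the same: each visitor is randomly but consistently shown either the default page or the variation.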

We looked at a broad selection of retailers who spoke at the RBTE Technology Expo 2016 and found that the majority use a split testing tool on their website, a trend reflected across the e-commerce landscape. However, whilst organisations are clearly investing in the tools to build split-testing capability, they are not seeing results. Econsultancy has reported that many companies say they failed to capitalise on their investment due to a lack of resources. We would challenge this assessment: having researched and worked with some of the UK's largest retailers, we find the issue is not a lack of resources but, more often, that tests are based on 'expert' opinion; in other words, teams do not know what they are testing for. This view is supported by some of the world's most successful e-commerce analytics companies, such as Kissmetrics and CrazyEgg.

Driving a better commercial outcome

Split testing should form the basis of every change to a website. It is the way you should de-risk any sales and marketing changes before mainstreaming.

The objective of a split test is to run it long enough to declare the control or challenger page the 'winner' at 95% statistical significance; in other words, you can be 95% confident that the uplift in performance is genuine and not due to chance alone.
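The significance check described above is typically a two-proportion z-test on conversion rates. Here is a minimal sketch using only the standard library; the function name and the visitor numbers are illustrative, not from the article.

```python
import math

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is the challenger's conversion rate
    genuinely different from the control's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: control converts 200 of 10,000 visitors,
# challenger converts 260 of 10,000.
z, p = z_test(200, 10_000, 260, 10_000)
significant = p < 0.05  # the 95% threshold described above
```

If `p` falls below 0.05, the uplift clears the 95% bar; if not, the test needs to keep running or the result should be treated as inconclusive.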

If split testing is performed correctly, its value is reflected straight away: in almost all of the tests we conduct, we see significant increases in site revenue and improvements in overall customer usability.

Succeeding with split testing: detail, planning and objectives

To test effectively, eCommerce teams need to establish how to decide the order in which potential changes are tested. Our belief is that the initial driver should be the potential commercial value of the proposed change. There is always a balance between commercial value, test complexity and the difficulty of platform change should a test succeed. So, having established commercial priority, overlay the other two factors; against these criteria you can build a test-and-learn roadmap that reflects a sensible order of priority.
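The weighing of the three factors above can be sketched as a simple scoring exercise. The candidate tests, the 1-10 scales and the weighting are all our own illustration, not a method the article prescribes; commercial value leads, with complexity and change difficulty counting against it.

```python
# Each candidate: (name, commercial value, test complexity, platform-change
# difficulty), all scored 1-10. The names and scores are hypothetical.
candidates = [
    ("Checkout copy rewrite", 9, 3, 2),
    ("New navigation layout", 7, 8, 9),
    ("Product image zoom", 5, 4, 3),
]

def priority(test) -> int:
    name, value, complexity, difficulty = test
    # Commercial value is the initial driver; the other two factors are
    # overlaid as costs.
    return value * 2 - complexity - difficulty

# A test-and-learn roadmap in a sensible order of priority.
roadmap = sorted(candidates, key=priority, reverse=True)
```

A real roadmap would use whatever scoring the team can defend, but the principle holds: rank by commercial value first, then temper the order with effort and deliverability.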

Achieving the most accurate results from testing depends on building the right capabilities.

Driving successful tests requires a range of new capabilities and reinforces the importance of others you already have. Whilst analytics is a marketing cornerstone today, so is a good understanding of ethnography. We often say that, to be really effective, a website needs to be as good as the best salesperson in the company. Given that, we believe copywriting is a core capability: words are one of only two things that convert interest into engagement and then into a transaction; the other is images.

The tools

There are many split testing tools on the market: their core functionality is similar, but they differ in additional features and vary widely in ease of use. Tool selection is therefore very much a personal decision, shaped by the requirements of the business and the reporting it needs. We advise that you shop around before committing to testing software: these are expensive tools to invest in, some particularly so, and, unlike some other purchases, a high price does not necessarily correlate with a better quality tool.



Later in this series, we will look at website search functionality and the essence of world-class execution.

James Hammersley is the co-author of Leading Digital Strategy, a guide to eCommerce strategy published by Kogan Page. He is a founding partner and director of Good Growth, a digital change consultancy which has worked with organisations such as The Economist, The Co-operative, O2 and Manchester United.