4 things I've learned from A/B testing websites

Document everything #

A/B testing tools let you set up a test and add basic information such as a title and sometimes a description. I like to go further and record my assumptions and supporting data outside the tool. This makes it easier to go back later and see what was done, when, and why. It also shares the key information with the rest of your team without them needing an account for the tool you're using.

In my testing documents, I include:
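What you record is up to you. As a hedged sketch of the idea, a test log entry could be modeled as a small record; every field name below is my own illustration, not a prescribed schema, and the values are made up:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ABTestRecord:
    """One entry in an A/B testing log. Field names are illustrative only."""
    title: str
    hypothesis: str               # what you expect to change, and why
    start_date: str
    end_date: Optional[str]       # None while the test is still running
    variants: list                # short description of each variant
    primary_metric: str           # e.g. a conversion or click-through rate
    assumptions: str              # context the testing tool won't capture
    result: Optional[str] = None  # filled in after the test concludes

# Hypothetical example entry
record = ABTestRecord(
    title="Homepage hero CTA copy",
    hypothesis="A benefit-led CTA will lift clicks to the pricing page",
    start_date="2024-03-01",
    end_date=None,
    variants=["Control: 'Sign up'", "Variant: 'Start saving today'"],
    primary_metric="CTA click-through rate",
    assumptions="Traffic mix stays unchanged during the test window",
)
```

Keeping entries in one consistent shape like this makes it easy to scan past tests and compare what was assumed against what actually happened.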

Volume #

You need enough traffic in your A/B test to get meaningful results. Look at page traffic data in Google Analytics to find the pages that will deliver meaningful results the quickest. If you have a low-traffic website or page, you may need to run the test longer to reach meaningful results.

If you don't have enough traffic, there are other ways to improve your site, including Google Ads experiments, user feedback, and site analytics.

A few online tools can help you estimate how much traffic, and for how long, you need to test to get meaningful results.

Further reading - A/B Testing Tech Note: determining sample size
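Those calculators typically implement the standard two-proportion sample-size formula. Here is a rough Python sketch of it; the baseline rate, minimum detectable effect, significance level, and power are all inputs you'd choose yourself, and the numbers in the usage example are assumptions for illustration:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_detectable_effect,
                            alpha=0.05, power=0.80):
    """Approximate visitors needed per variant for a two-proportion z-test.

    baseline_rate: current conversion rate, e.g. 0.05 for 5%
    min_detectable_effect: absolute lift you want to detect, e.g. 0.01
    alpha: two-sided significance level; power: chance of detecting the lift
    """
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_effect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Detecting a 1-point lift on a 5% baseline takes thousands of visitors per arm
n = sample_size_per_variant(0.05, 0.01)
print(n)  # on the order of 8,000 per variant
```

Dividing the per-variant number by your page's daily traffic gives a rough idea of how many days the test needs to run, which is exactly the traffic-versus-duration trade-off mentioned above.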

No clear winner #

Some A/B tests will end in a draw or with no clear result, even after running long enough. The important part is to take this and understand why.

It's worth writing down your thoughts on why the results came back flat.

Pair no-result tests with other metrics (heat maps, event analytics, page recordings) to further refine where you should be testing.
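Before calling a test flat, it's also worth checking the numbers directly. A quick two-proportion z-test sketch follows; the visitor and conversion counts are made-up examples, not real data:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # rate under "no difference"
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: 500/10,000 vs 520/10,000 conversions
p = two_proportion_p_value(500, 10_000, 520, 10_000)
print(round(p, 3))  # well above 0.05: no clear winner at this volume
```

A large p-value like this doesn't mean the variant is worthless; it means the difference, if any, is too small to detect at this traffic level, which loops back to the volume point above.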

Always be testing #

A/B testing should be a continuous improvement initiative that is baked into your analytics and user testing processes. Google Analytics is a good place to start when identifying the best pages to test, and the results of a previous test are a great way to decide which tests to run next.