5 Big A/B Testing Mistakes You Need To Avoid

The following is a guest post by Matt Knee, Founder and President of MyNewCompany.com. MyNewCompany.com, started in 2001, makes starting and running a business simple, fast, and inexpensive for entrepreneurs and their advisors.


A/B Testing is one of the best ways to ensure your site is ready for primetime – and an even better way to optimize your marketing. But it’s only helpful to you if you do it right.

Creating content for the web isn’t just a matter of throwing together a site and tossing it to the mercy of your users. It’s a painstaking process, with countless little details you need to tweak and modify to maximize your conversions. And here’s the thing – just because you’re running your own business, that’s no excuse to skimp on that process.

One of the best ways to determine whether or not a change to a website is beneficial is through something known as A/B Testing (or split testing). How it works is pretty simple. Users are shown two or more variants of a single page, and their reaction to each page is measured.

Once the variants are compared against one another, the owner can determine which one best fits their goals.
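To make that concrete, here’s a rough sketch in Python of what a testing tool does behind the scenes: each visitor is bucketed into a variant (deterministically, so they see the same page on every visit) and their conversions are logged against that variant. The experiment name, goal, and user IDs here are just placeholders, not tied to any particular tool.

    import hashlib

    VARIANTS = ["A", "B"]  # the page variants under test (placeholder names)

    def assign_variant(user_id: str, experiment: str = "homepage-test") -> str:
        """Deterministically bucket a visitor so they always see the same variant."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    # Record which variant each visitor was shown...
    exposures = {user_id: assign_variant(user_id) for user_id in ["u1", "u2", "u3"]}

    # ...and, later, whether they completed the goal action (e.g. signed up).
    conversions = {"u2": True}

    print(exposures, conversions)

Hashing on the user ID instead of rolling the dice on every page load keeps the experience consistent for each visitor, which matters once you start measuring what they do across repeat visits.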

This process is a bit more complicated than it sounds, mind you. There are a lot of things you can do wrong. Here are some of the major mistakes – and how you can avoid them.

Using A Faulty Testing Tool

There’s a ton of free and low-cost A/B testing software on the web, but the truth of the matter is that you get what you pay for. A lot of free testing software isn’t particularly well-optimized for larger sites, and can eat up resources and slow things down enough that you’ll see your conversions suffer no matter what you do. And that, naturally, will skew the results of your test.

It’s worthwhile to spend a little extra money on the tool you use for your A/B tests. And before you start using it, it’s also important to run an A/A test, where both groups are shown the exact same page. If the platform you’re using is having an impact on your site’s performance, you’ll know right away.
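If you’re not sure what an A/A test looks like in practice, here’s a rough sketch using made-up numbers and SciPy’s chi-square test: both arms get the same page, so any “significant” difference that keeps showing up is a red flag about the tool or your tracking, not about the page.

    from scipy.stats import chi2_contingency

    # Hypothetical counts from an A/A test: both arms were served the *same* page.
    converted     = [412, 398]
    not_converted = [9588, 9602]

    chi2, p_value, _, _ = chi2_contingency([converted, not_converted])

    # With identical pages, "significant" p-values (< 0.05) should only show up
    # about 5% of the time. If they appear far more often across repeated A/A
    # runs, suspect the testing tool or your instrumentation, not the page.
    print(f"A/A comparison p-value: {p_value:.3f}")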


Calling The Tests Early

Have you seen a significant result from a test you’re running? An upturn in the conversion rate on one of the two pages? That’s great – keep testing. Stopping your tests the first time you see something of significance isn’t going to gain you anything.

Likely as not, you’ll wind up with a false positive, and your tests will get you nowhere.

Ideally, you want a result to hold consistently over the course of a testing period. If, for example, Page A generates more conversions than Page B for only a week, and then the two even out, it’s unlikely Page A is objectively better than Page B. If, however, Page A brings in more traffic and conversions than Page B over the course of the entire testing period?

Then and only then can you conclusively say Page A is the better choice.
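As a rule of thumb, “conclusive” means the difference still holds up statistically once the full, pre-planned testing period is over, not the first time a dashboard lights up. Here’s a bare-bones sketch of the kind of check most tools run under the hood, a two-proportion z-test on made-up full-period totals:

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Two-sided z-test for a difference in conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return p_a, p_b, p_value

    # Hypothetical full-period totals: only evaluate once the planned window ends.
    p_a, p_b, p_value = two_proportion_z_test(conv_a=530, n_a=12000, conv_b=465, n_b=12000)
    print(f"A: {p_a:.2%}  B: {p_b:.2%}  p-value: {p_value:.4f}")

Run a check like this once, at the end of the window you committed to up front; peeking every day and stopping on the first small p-value is exactly how the false positives described above sneak in.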

Focusing On The Wrong Details

Conversions aren’t the only thing that A/B tests can measure. Long-term business results are just as important, if not more so. Don’t just focus on vanity metrics like whether or not people are led to a landing page. Focus on what they do when they get to that landing page.

Focus on the leads you generate, not just the potential conversions you gain. Who are you looking to maintain long-standing relationships with? What do you want your customers to do after they purchase from you – how frequently do you want them to return?

Do you want them to share their experience with their friends, colleagues, and family? Sign up for a newsletter? Purchase a particular product or set of products?
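One way to keep yourself honest is to line the “vanity” number up next to the longer-term ones for each variant. The figures below are invented, and the 90-day repeat-buyer and newsletter columns would come from your own analytics or CRM rather than the A/B tool itself, but the point stands: the variant that wins on sign-ups isn’t automatically the one that wins on the metrics you actually care about.

    # Hypothetical per-variant tallies; the post-conversion numbers come from
    # your analytics/CRM, not from the A/B testing tool itself.
    results = {
        "A": {"visitors": 10000, "signups": 520, "repeat_buyers_90d": 140, "newsletter": 310},
        "B": {"visitors": 10000, "signups": 610, "repeat_buyers_90d": 95,  "newsletter": 220},
    }

    for variant, r in results.items():
        signup_rate = r["signups"] / r["visitors"]
        repeat_rate = r["repeat_buyers_90d"] / r["signups"]   # what converts do *afterwards*
        newsletter_rate = r["newsletter"] / r["signups"]
        print(f"{variant}: sign-up rate {signup_rate:.1%}, "
              f"90-day repeat-buyer rate {repeat_rate:.1%}, "
              f"newsletter opt-in {newsletter_rate:.1%}")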

Testing For Too Short A Timeframe

An A/B test takes a long time to generate significant results: two weeks at the very least. Stopping it at any point before then functionally defeats the purpose of running one in the first place. It may also be worth your while to test a radical change every now and then; minor, incremental changes aren’t always the best choice.

I’d recommend testing for at least a month before you call any of your results conclusive. Hey, these things take time; a successful A/B test doesn’t happen overnight. As for more radical changes, try testing one every few months. You might be surprised at where your results take you.
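How long is long enough depends on your traffic and on how small a lift you care about detecting. A back-of-the-envelope estimate like the one below (the standard two-proportion sample-size formula, with made-up baseline and traffic numbers) usually shows why a week of data rarely cuts it:

    from math import ceil, sqrt
    from statistics import NormalDist

    def sample_size_per_variant(baseline_rate, min_lift, alpha=0.05, power=0.8):
        """Rough visitors-per-variant needed to detect a relative lift in conversion rate."""
        p1 = baseline_rate
        p2 = baseline_rate * (1 + min_lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        p_bar = (p1 + p2) / 2
        n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar)) +
              z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
        return ceil(n)

    # Hypothetical numbers: 3% baseline conversion, hoping to detect a 10% relative lift.
    n = sample_size_per_variant(0.03, 0.10)
    weekly_visitors_per_variant = 2500
    print(f"~{n} visitors per variant -> about {ceil(n / weekly_visitors_per_variant)} weeks of traffic")

Plug in your own baseline conversion rate and weekly traffic; for most small sites the answer lands at several weeks per test, which is exactly why patience pays off here.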

Testing Without A Purpose

What’s your end goal in running an A/B test? If you don’t know the answer to that question, don’t run one. Like any other process in the marketing and design of your site, A/B testing should always be done with a clear end goal in mind, such as:

  • Increasing the conversion rate for a particular subset of visitors (e.g. people referred by an email blast campaign; there’s a quick sketch of this below the list).
  • Increasing the number of people who purchase product X.
  • Increasing the number of people who sign up for service Y.

You get the idea. Don’t A/B test without cause.
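Taking the first goal on that list as an example, here’s roughly what it looks like when the goal drives the measurement: you slice the results down to the visitors you actually care about (the event rows and referral labels below are invented) and judge the variants on that segment alone.

    # Hypothetical raw event rows: (user_id, variant, referral_source, converted)
    sessions = [
        ("u1", "A", "email_blast", True),
        ("u2", "B", "email_blast", False),
        ("u3", "A", "organic",     True),
        ("u4", "B", "email_blast", True),
    ]

    def segment_conversion_rate(rows, variant, source):
        """Conversion rate for one variant, restricted to one traffic segment."""
        segment = [r for r in rows if r[1] == variant and r[2] == source]
        return sum(r[3] for r in segment) / len(segment) if segment else 0.0

    for v in ("A", "B"):
        rate = segment_conversion_rate(sessions, v, "email_blast")
        print(f"Variant {v}, email-blast referrals: {rate:.0%}")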

Converting for Success

A/B testing is a valuable tool, but only if you understand how to use it. You need to set out with a clear goal in mind, make sure you use the right tool, and make sure you’re testing for the right reasons. Otherwise, you’re basically just a monkey throwing things at the wall to see what sticks.
