How A/B Testing Can Help Your Small Business Increase Conversions and Revenues

A/B testing: fork in the road for your customers: which one is best? (photo credit: Richard Croft)

We all have times when we wonder, “what if?”

What if I had done this instead of that, or chosen this item instead of the other? What would have happened?

We might not have the ability to rewind and replay our lives, but we do have this ability with our marketing campaigns, websites, and apps. It’s called A/B testing (or split testing).

At its most basic, A/B testing is when you take two variations of a website design or copy and try each one with random customers.

For example, you can A/B test two different designs for your website’s landing page by randomly serving each one to different site visitors and tracking the results. You use A/B testing to determine whether a change or addition will improve conversions.
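To make the mechanics concrete, here is a minimal sketch in Python of how a site might split visitors between two variants. The function name and visitor-ID format are illustrative, not part of any particular tool:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Assign a visitor to variant "A" or "B".

    Hashing the visitor's ID (rather than flipping a coin on every
    page view) means a returning visitor always sees the same
    variant, which keeps the test results clean.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Each visitor is bucketed once and stays in that bucket on every visit.
print(assign_variant("visitor-42"))
```

Real testing platforms handle this bucketing (plus cookies, traffic allocation, and reporting) for you, but the underlying idea is this simple: consistent, roughly 50/50 random assignment.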

Highrise ran A/B tests of different headlines and subheadlines to see how they affected registrations. The one on the right, which noted how quick the sign-up process was, raised registrations by 30%.

Some of the biggest companies have used A/B testing for years to help them make decisions. In fact, most companies with an online presence leverage A/B testing.

Companies A/B test because they know that increasing the amount of traffic to their website landing page or mobile app isn’t the only way to increase revenues. Smart business owners can also increase revenue by increasing conversion rates (getting more prospects who visit your website or use your mobile app to become customers).

Amazon constantly A/B tests different versions of its homepage, product pricing, and much more.

Netflix is known to A/B test many aspects of its service, from genre names to screen layouts and featured content.

A/B testing can provide businesses of any size with actionable data to improve conversions and revenues.

You can test almost anything, and there are many services and tools available to simplify A/B testing.

So let’s look at how you can integrate this powerful tool into your business.

There are four components of successful A/B testing:

  • Know what your goal is
  • Know what defines success or failure
  • Know what your baseline is
  • Know how to interpret the results

Yuppiechef tested two different variations of their homepage: one with a top navigation bar, and the other without. The one without boosted conversions by a staggering 100%.

Know what your goal is

Before you can start testing, you need to know what your goal is. By the end of your test, what do you want to know? There are many different things that test well using this method:

  • Calls to action – Is what I’m doing to encourage behavior clear and actionable?
  • Content – Is the content I’m using communicating what I want? Is the message clear?
  • Copy – Are the words I’ve chosen effective? Do they help drive conversions?
  • Design – Is this layout adequate? Is the focal point of this design clear?
  • Funnels – Are the steps a customer needs to take helping or hindering conversion?
  • Usability – Is this navigation usable? Can customers get from point A to point B?
  • Pricing – Does the price of my products meet customers’ expectations?

Business objectives are another key component of any A/B test.

Use DUMB objectives: Doable, Understandable, Manageable, Beneficial. Your objectives should be clear and easy to understand.

Your business objectives inform your goals.

If your business objective is to “build sales of product X by 20% year over year,” your testing goal might be “increase clicks from the homepage to the product page” or “decrease the number of people who fail to complete their purchase.”

Examples of less effective goals might be, “Is this product design appealing to millennials?” or “Is this color the best color for this button?”

Your goals are your priorities expressed as simply as possible, so keep yours clear and concise.

Server Density A/B tested their pricing model and discovered that the new, value-priced model made total revenue jump 114%.

Know what defines success or failure

Once you have your goals and objectives clearly stated, you need to establish what your key performance indicators (or KPIs) are.

KPIs are measurements that help you understand how you’re doing against your objectives.

If your objective is to sell widgets, one KPI could be the number of widget sales on your website.

One essential thing to remember: make the goal of your test as specific as possible. It’s possible to test for multiple outcomes using A/B testing — this is called multivariate testing — but to start, keeping things simple makes interpreting the results of your test a lot easier.

Know your baseline

Before you can start testing, you need to know the lay of the land as it is right now.

If you’re testing sales, what are the current sales numbers? If you’re testing how effective your call to action is on your landing page, how many conversions are you getting?

A common way to establish your baseline is to make sure you’ve set up your website or app with an analytics package. Google Analytics is a very popular go-to, but there are a lot of other analytics services to choose from, like Kissmetrics, Mixpanel, and Statcounter.

If you’re not already tracking usage numbers, make sure you give yourself enough time to build up a meaningful amount of data. You want to understand the current state, and the best way to do that is to collect numbers over as long a period as possible.

Virgin Holidays ran an A/B test of different email subject lines to see which one resulted in a higher email open rate.

How do you implement A/B tests on your website? Mobile?

Now that you understand what and where you’ll test, let’s talk about the actual testing. There are numerous applications that can help you implement A/B testing. Some of the more popular options:

  • Google Tag Manager – Google Tag Manager is a tag management system that allows users to update tags and code snippets on your website or mobile app. What’s a tag? A tag is a snippet of code that sends information to a third party (like Google).
  • Five Second Test – Five Second Test shows your design to customers for five seconds. When the time is up, they’re asked questions about what they remembered.
  • Optimizely – Optimizely bills itself as an “Experience Optimization” platform, meaning it helps you optimize between two different variations of a webpage. You can also run tests that let you test many different items on a page – known as multivariate testing. (We use Optimizely at crowdspring and recommend it.)
  • Unbounce – Unbounce offers the unique benefit of 80+ pre-designed, optimized landing page templates. And that’s not all! An Unbounce subscription also lets you perform A/B testing on your own landing pages. (From time to time, we’ve used Unbounce for quick pages and recommend it.)
  • KISSmetrics – This widely used testing tool focuses on the human aspect of data. Reports and testing data are connected to real customers, and their interactions with your website are consistently tracked as they peruse a site.

All of these choices offer a basic A/B testing process plus a varying set of additional features. The one you choose might depend on your development skills, pricing, or any other factors especially pertinent to your business.

Testing… 1,2,3

You have your goals and objectives set, you’ve collected buckets full of baseline data, and you’ve got a solid A/B testing system in place. Now it’s time to run your test!

One of the most critical things to remember when you run an A/B test is to make sure you don’t end it prematurely.

A/B testing is like a science experiment: you have a hypothesis on a way to increase conversions, and the A/B test helps validate (or invalidate) that hypothesis.

You want to be thorough, diligent, and detail-oriented, and you want your sample size to be large enough to ensure your data is solid. If you end your test too early, it’s possible your data may be compromised because it’s not a true picture.

How long should you run a test?

Unfortunately, there are no hard-and-fast rules. It depends on what your objectives are, how much traffic your site or app receives (low-traffic sites need to run tests longer to gather enough data), and the complexity of the test itself.

Fortunately, there are tools available that make this part of the test easier, such as the aforementioned Optimizely, whose Stats Engine automatically determines what the right sample size is based on a number of criteria. They also offer a Sample Size Calculator that might provide at least a starting point for your test. VWO also has an A/B Split and multivariate test duration calculator that may help as well.

Statistics is a science, and because A/B testing leans on it heavily, things can get very complex quite quickly. There are some very technical articles out there on how to mathematically determine the right sample size and duration for tests.
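For the curious, the textbook math behind those calculators can be sketched in a few lines of Python. This uses a standard two-proportion sample-size formula, not the exact method any particular tool implements, and the default significance and power values are common conventions rather than requirements:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_detectable_effect,
                            alpha=0.05, power=0.8):
    """Rough per-variant sample size for a two-proportion A/B test.

    baseline_rate: current conversion rate (e.g. 0.05 for 5%)
    min_detectable_effect: absolute lift you want to detect (e.g. 0.01)
    alpha: significance level; power: chance of detecting a real effect
    """
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_effect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    n = ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# Example: 5% baseline conversion, hoping to detect a 1-point lift.
print(sample_size_per_variant(0.05, 0.01))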

Your best bet is to take advantage of some of the tools we’ve mentioned that help make A/B testing a much easier process.

Behave.org has a big collection of A/B test case studies that lets you try to guess “which version won?”. It’s a great way to see many different types of tests and their results.

We have a winner!

Once you’ve run your tests and gathered enough data, you should be able to figure out which one of the two options was more successful (or not).
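If you want to sanity-check the numbers yourself rather than rely on a tool’s dashboard, a two-proportion z-test is one common (simplified) way to judge whether the difference between variants is statistically meaningful. The figures below are made up for illustration:

```python
from statistics import NormalDist

def ab_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test p-value.

    conv_a / n_a: conversions and visitors for variant A; same for B.
    A small p-value (conventionally < 0.05) suggests the difference
    is unlikely to be random noise.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical results: A converted 200/4000, B converted 260/4000.
p = ab_test_pvalue(200, 4000, 260, 4000)
if p < 0.05:
    print("The difference looks statistically significant")
```

Testing platforms generally run more sophisticated statistics than this, but the principle is the same: declare a winner only when the gap is too large to be chance.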

Depending on the outcome, you might want to tweak your test and run it again, or choose the winner and make that test a permanent part of your product or site.

Improvements are often iterative, with small tweaks over time making a dramatic difference.

Once you’ve successfully run your first test, you’ll probably think of other insights you want to glean using split testing. As the label says, rinse and repeat.

What’s next

There are many different tests you can run that provide you with equally valuable information to help improve your site or product.

Clickmaps (or heat maps) have been around for a long time, and they can be used in conjunction with an A/B test or on their own.

As the name suggests, clickmaps are tests that show you where customers are clicking on your site. Measuring this type of activity can show you how people use your site or app, and help you find places to make changes.

Some of the newest tools on the market are session tracking apps like FullStory or Pendo, which show you in great detail how customers use your site. You can watch, often in real-time, as users click, scroll, and navigate through your site.

These types of tests are invaluable as they not only show where users click, they also show how users interact with the content on your site, and capture things like “dead clicks.”

Dead clicks show where users click on something that they think is clickable but actually isn’t. You can also track “rage clicks”, where users click repeatedly on a location, usually out of frustration.

A/B testing provides insights

Companies of all sizes use A/B testing to reduce the amount of guessing they do about how their customers use their sites and apps.

A/B testing provides valuable insights and can be a very powerful tool to help you improve your site, but as with anything, it’s important to know when it’s appropriate and when it’s not.

While working at Google, ex-Yahoo! CEO Marissa Mayer famously tested 41 different shades of blue to find which one resulted in more clicks. (Photo credit: JD Lasica.)

Google has, at times, taken A/B testing to the point of ridicule, such as when it used testing to determine the color of the links that appeared in ads in Gmail.

Probably the most infamous example of Google’s design-by-testing approach was the “41 Blues” — Google’s engineers apparently couldn’t decide on two shades of blue for showing search results, so they tested 41 of them to see which attracted the most clicks. (They eventually settled on a blue that is basically the average of all the blues used in hyperlinks across the web.)

Be sure to draw direct connections between your business objectives and the theories you’re using A/B tests to validate. Through diligent data gathering and analysis, you can turn even the simplest A/B test into something that truly makes a difference.

The opposite of Google’s 41 shades of blue might be Jared Spool’s famous “$300 million button,” where testing showed that one tiny change to a form could make a massive difference:

The designers fixed the problem simply. They took away the Register button. In its place, they put a Continue button… The results: The number of customers purchasing went up by 45%. The extra purchases resulted in an extra $15 million the first month. For the first year, the site saw an additional $300,000,000.

Not every test will end up that valuable, but you never know until you try. As the saying goes: “if you’re not testing, you’re guessing.”

A/B testing is a great way to help you stop guessing and start making decisions based on real, quantifiable data.

To get started with your own A/B testing, you’ll need multiple designs to test with. Crowdspring offers many types of design projects that can be set up to award multiple creatives, allowing you to choose as many designs for testing as you like.

One last word of advice?

Don’t get complacent. Make sure you routinely test new designs to help your brand stay fresh.

You can’t A/B test without design choices. Start a graphic or website design project to enlist the help of thousands of designers who can help convey your brand’s message. Design projects get plenty of entries, making it easy to select the perfect options to help your brand stand out.