Knowing what resonates best with your audience is often a guessing game. This is where A/B testing comes into play, helping you optimise content and recognise what works well and what doesn't. Whether you're struggling to reach campaign goals or looking to improve your current marketing strategy, A/B testing is an effective tool for tackling these challenges.
What is A/B testing?
A/B testing, also known as split testing, involves displaying two different variations of your content to your online visitors. This means that 50% of viewers are shown variation A (known as the control) and the other 50% are shown variation B (known as the variation). The performance of both versions is measured, then evaluated and used to optimise your final content. The aim is to determine which variation performs better for a given conversion goal.
Today, it’s a common feature in most martech software and is used across various channels, from email and e-newsletters to landing pages and homepages. Some tools also allow you to test more than one variant against the control.
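The 50/50 split described above can be sketched in a few lines of code. This is a minimal illustration, not the implementation of any particular tool; the function name and visitor IDs are hypothetical. Hashing the visitor ID rather than picking at random means the same visitor always sees the same variation on repeat visits.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to variation A (control) or B.

    Hashing the visitor ID yields a stable, roughly 50/50 split:
    the same ID always maps to the same variation.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

For example, `assign_variant("visitor-42")` returns the same letter every time it is called, so a returning visitor never flips between variations mid-test.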
How to test
In general, A/B tests are done by testing one specific variable at a time. Testing a single element, such as an image or a piece of copy, makes it easier to draw conclusions. For example, you could test whether the copy on a landing page's CTA button has an effect on your conversion rate. A CTA reading “Buy now” could resonate better with your online visitors and convert more than “Go to purchase”.
The elements you choose to test will depend on the campaign at hand. Here are a few examples of what you could test:
- CTA button (text, colour, position)
- Email subject line
- Headlines and subheadings
- Ad copy
- Forms (location, form-fill elements)
Some argue against testing single variables, claiming it can limit design improvement, a problem known as getting stuck at a local maximum. The hypothesis is that testing two radically different variations can actually produce a better design outcome. This technique is often applied in larger-scale website design projects, but split testing a specific element still seems to be the preferred method among marketers.
When experimenting, it’s important to keep in mind that traffic is imperative for conclusive results: you need a decent number of online visitors to engage with your content or campaign. Devoting time to A/B testing is also something to consider. Tests can last from a few days up to a few weeks, again depending on the amount of traffic you’re getting.
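"Conclusive" here has a precise statistical meaning: the observed difference in conversion rates should be unlikely to be due to chance. A common way to check this is a two-proportion z-test, sketched below with the standard library only (the figures in the usage note are made up for illustration).

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing two conversion rates.

    conv_a / n_a: conversions and visitors for the control,
    conv_b / n_b: the same for the variation.
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the "no difference" hypothesis
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With hypothetical numbers such as 200 conversions from 2,000 control visitors versus 260 from 2,000 variation visitors, the p-value comes out below the conventional 0.05 threshold, so the lift could be called conclusive; with only a tenth of that traffic, the same percentage lift would not be.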
A/B testing is an effective tactic if you want meaningful results quickly. Some key benefits are:
- Optimising conversion rates
- Understanding online visitor behaviour and customers better
- Driving better engagement
- Improving your bottom line
Optimising conversion rates is usually the main goal of A/B testing and remains a hot topic among marketers. We recently attended the Technology For Marketing Expo in London, where this was a key topic of discussion. One particularly interesting presentation, by Workbooks and Communigator, showed how fewer fields on a landing page form increase the number of form fills. Through an A/B test, they found that a short variation requiring the user to fill in only their email address generated 33% more form fills than a form with multiple fields.
Another interesting example comes from one of President Barack Obama’s fundraising campaigns, which raised an additional $60 million using A/B testing. By testing two different elements, the media (image vs. video) and the CTA button copy, the team made the most of every website visitor and increased the sign-up rate by 40.6%.
A/B testing isn’t a new tactic, but it is clearly an effective one, helping you figure out what works best for your business. So if you haven’t already, it’s about time you stopped guessing and started optimising! You can read more about our new A/B testing tool here, which makes it possible to run A/B tests for proactive automated messages and interactive content on your website.