How to incorporate A/B tests into everyday campaign optimisation
A/B Testing, also known as Split Testing, is a method of comparing two variants of the same communication to determine which one performs better against your business goals.
You can test almost anything with this methodology: From Names, From and Reply-To addresses, Subject Lines, or any element of the actual content. Each of these tests will have a greater impact on different metrics (open, click and conversion rates).
A/B Testing enables marketers to make data-informed decisions rather than guessing at what they believe will work better, allowing companies to systematically incorporate client feedback into their marketing efforts. The positive impact on results makes the additional effort worthwhile: finding the variants that help you make the most of your marketing automation platform and investment.
The process can be divided into a few basic steps:
- Decide what you would like to test, and what metric you will base your decision on
- Define a hypothesis of what you believe will perform better
- Create the variants – testing one element at a time helps to better understand the impact of each of these variables
- Select random equal samples and present the different versions
- After a defined period of time, analyse the results and compare
- Present the winning version to the rest of the audience
- Keep on testing, continuously improving your campaigns with each iteration
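The steps above can be sketched as a simple simulation. This is a hypothetical illustration, not WCA code; the function and metric names are assumptions, and the metric here is a stub standing in for a real open or click rate:

```python
import random

def ab_test(audience, variants, sample_fraction, metric):
    """Split a random sample evenly across variants, pick the winner by metric."""
    random.shuffle(audience)  # randomise so segments are unbiased
    sample_size = int(len(audience) * sample_fraction)
    sample, remainder = audience[:sample_size], audience[sample_size:]
    per_variant = sample_size // len(variants)
    results = {}
    for i, variant in enumerate(variants):
        # Equal random segment for each version
        segment = sample[i * per_variant:(i + 1) * per_variant]
        results[variant] = metric(variant, segment)
    winner = max(results, key=results.get)
    return winner, remainder  # send the winning version to the remainder

# Hypothetical usage with a stubbed metric:
audience = [f"contact_{i}" for i in range(1000)]
winner, remainder = ab_test(
    audience, ["A", "B"], sample_fraction=0.2,
    metric=lambda variant, segment: random.random(),  # stand-in for a real rate
)
```

In practice the metric would be measured after the defined waiting period, but the shape of the process is the same: random equal samples, one comparison metric, winner to the rest.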
The idea is simple: plan your variables, create a sample, divide it into equal segments for each test, determine a winning version based on a particular metric and then use this version for the rest of your campaign. But is it as simple to implement? Having the process systemised and automated in your marketing platform saves marketers a lot of time in executing these campaigns.
IBM Watson Campaign Automation (WCA) provides built-in A/B Testing functionality for Email and Mobile Push Messages, allowing marketers to test up to four versions of each communication and automatically send the most successful version to the rest of the audience.
So how does A/B testing work?
When deploying your email campaigns, go to the Delivery Options tab to enable A/B Testing. There you can define how many variants to test (from two to four) and the content for each version. Then select the parameters for your test: the size of the testing sample, the winning metric (open rate, clickthrough rate, conversion rate or effective rate) and the time until the test is considered complete. Similar functionality is available for Simple, Inbox Only and In-App Mobile Push Messages.
WCA automatically creates random segments of your selected audience for testing, sends the different versions and, once the testing time has elapsed, sends the winning version (based on the metric you selected) to the remaining contacts. Alternatively, if the A/B Test is configured not to send the winning version automatically, WCA sends a notification email instead. This can be useful when testing multiple changes at the same time: for example, manually comparing open rates between versions A/B and C/D, and click-to-open rates between A/C and B/D, to determine the best combination. Live data and metrics for your A/B Test campaigns are available in WCA on the A/B Test Status page, under Sent Mailings.
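The manual comparison mentioned above can be made concrete. Suppose a 2x2 test where versions A and B share subject line S1, C and D share S2, while A and C share CTA placement P1, and B and D share P2. The figures below are invented for illustration:

```python
# Hypothetical reported metrics for four versions of one mailing.
opens = {"A": 0.22, "B": 0.21, "C": 0.18, "D": 0.19}  # open rates
ctor  = {"A": 0.10, "B": 0.14, "C": 0.11, "D": 0.15}  # click-to-open rates

# Subject line drives opens: compare the S1 pair (A/B) with the S2 pair (C/D).
best_subject = "S1" if opens["A"] + opens["B"] > opens["C"] + opens["D"] else "S2"

# CTA placement drives clicks: compare the P1 pair (A/C) with the P2 pair (B/D).
best_cta = "P1" if ctor["A"] + ctor["C"] > ctor["B"] + ctor["D"] else "P2"

print(best_subject, best_cta)  # the combination to use going forward
```

With these sample figures the subject line from A/B wins on opens while the CTA placement from B/D wins on click-to-open, so the best combination is one that may not have been any single version tested.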
How about automated marketing programs?
In 2017, IBM introduced Percentage Splits in the New Programs Experience. This functionality allows users to divide the contacts who reach a step in a Program into random segments, enabling marketers to send different versions of a communication to these segments. As Programs are multi-channel, this can be used for Email, SMS or Mobile Push Messages.
Tip: WCA Programs do not automatically re-route based on winning metrics at the time of writing, but the effectiveness of each version can be compared in the reporting data to determine the best variant. The Program can then be adjusted, for example, by modifying the split and sending 100% of the audience to the most effective touchpoint.
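A percentage split of this kind is straightforward to picture in code. This is a minimal sketch of the routing idea only (the function name and branch labels are hypothetical, not WCA's API):

```python
import random

def percentage_split(contacts, weights):
    """Randomly route each contact to a branch according to percentage weights."""
    branches = {name: [] for name in weights}
    names = list(weights)
    probs = [weights[name] for name in names]
    for contact in contacts:
        branch = random.choices(names, weights=probs)[0]
        branches[branch].append(contact)
    return branches

# Initial test: a 50/50 split across two versions of the communication.
splits = percentage_split(range(1000), {"version_a": 50, "version_b": 50})

# After comparing reporting data, adjust the split so everyone gets the winner:
all_to_winner = percentage_split(range(1000), {"version_a": 100, "version_b": 0})
```

Setting a branch's weight to zero is the code equivalent of the manual adjustment described in the tip: the losing touchpoint stays in the Program but receives no contacts.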
Here are some suggestions to get your A/B Testing started:
- Personalisation in the subject line: are your clients more inclined to open your mailings if they see their name in their inbox?
- Header in email body: does a different headline capture your customers' interest, so that they spend more time reading your communication and are more likely to convert?
- Different hero image: nowadays, pictures make the first impression. Try different hero images for your campaign and determine which one works best with your audience.
- Location of the CTA (call to action): does having a single button at the top, or two buttons in different areas, make a difference to clickthrough rates?
For more information on how Purple Square could help you with A/B testing or anything else, get in touch today!