In marketing utopia, everyone agrees on an email campaign concept. As a group, you all collectively decide on a design, message, call to action, recipients, and follow-ups. There are high fives exchanged and stock photo smiles all around.
Back here on earth, there are disagreements over new concepts, subject lines, and even body content. A/B testing, also referred to as A/B split testing, lets email recipients tell you what’s working and what isn’t. No need to pick one concept and abandon another – try both! Whichever garners the best results can be used again.
A/B testing allows you to design and send one email with two separate subject lines (A and B) or two separate email bodies (A and B).
- As the email is delivered, your marketing automation software will divide your email recipient list in half – the first half receiving email A and the second receiving email B.
- As recipients open and click through their version, their engagement is monitored by the MA software, testing A and B against each other.
- Many MA platforms, including SalesFUSION (becoming Sugar Market), email both versions to about 30 percent of your recipient list. Then, as open and click analytics come in, the rest of the list receives the version that had the higher rate of engagement with the test subjects.
- In SalesFUSION, this is referred to as ‘Advanced A/B Testing’ and is an additional option when using the A/B function.
- Customers still have the option to test all email recipients – 50 percent receiving A and 50 percent receiving B.
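The split-and-send logic described above can be sketched in a few lines of Python. This is an illustrative sketch only, not SalesFUSION's actual implementation; the function names and the 30 percent default are assumptions made for the example.

```python
import random

def advanced_ab_split(recipients, test_fraction=0.3):
    """Split a recipient list for 'advanced' A/B testing:
    a test_fraction sample is divided evenly between versions
    A and B; the remainder is held back for the winning version.
    (Illustrative sketch -- not SalesFUSION's actual logic.)"""
    shuffled = recipients[:]
    random.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    test_group = shuffled[:test_size]
    holdout = shuffled[test_size:]
    half = len(test_group) // 2
    return test_group[:half], test_group[half:], holdout

def pick_winner(opens_a, sends_a, opens_b, sends_b):
    """Compare open rates from the two test groups and return
    the version to send to the held-back recipients."""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    return "A" if rate_a >= rate_b else "B"
```

Setting `test_fraction=1.0` reproduces the classic 50/50 split, where every recipient is part of the test and no one is held back for a winner.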
Before beginning A/B testing, like any experiment, know what you're testing. You can, of course, test subject lines and email bodies in the same experiment. However, best practice is to stick to testing just one element at a time. This will give you a better understanding of what led recipients to engage with your email.
Elements of an email campaign you can test with A/B testing:
- Subject lines: There are many subject line variations you can create. Try different approaches by asking a question, including dynamic content, creating a sense of urgency, or making a bold statement.
- Calls to action: Test wording and sentence structure by creating an exclamation in version A and providing more detailed instructions in B. For example, “Download now!” versus “Click here to access our whitepaper.”
- Layout: Do recipients respond better to multiple columns or one? Images across the top of an email as a banner or along the right-hand side? This lets you know how recipients want to read, instead of how you want them to read.
- Personalization: This may seem granular, but people respond differently to their formal titles versus their names or nicknames. Create an email implementing dynamic content in which version A uses titles (Mr. Jones) and version B uses names/nicknames (Bobby).
- Headlines: Do your recipients prefer short, snappy headlines at the top of their emails or more detailed explanations of the message below? Create one headline between 3-6 words and another between 12-15 words.
Be sure to define the parameters of a successful campaign for your company. If your open rate usually hovers around 15 percent, then an open rate of 20 percent or more would certainly be an improvement. Don’t decide that you’ll only consider a campaign successful if engagement rates double – you’re going to be A/B testing for a while waiting for that one.
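To make "define success" concrete, here is a quick back-of-the-envelope calculation using the hypothetical numbers above (a 15 percent baseline versus a 20 percent test result). The figures are assumptions for illustration, not real campaign data.

```python
def open_rate_lift(opens_a, sends_a, opens_b, sends_b):
    """Return each version's open rate and B's relative lift over A.
    Hypothetical helper for sizing up A/B results."""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b
    lift = (rate_b - rate_a) / rate_a
    return rate_a, rate_b, lift

# 150 opens on 1,000 sends (15%) vs. 200 opens on 1,000 sends (20%)
rate_a, rate_b, lift = open_rate_lift(150, 1000, 200, 1000)
```

A move from 15 to 20 percent is a relative lift of about 33 percent, which is a far more realistic bar than waiting for engagement to double.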
Testing is a great, and relatively risk-free, way of experimenting with new marketing tactics. You're not locked in on a controversial strategy, but you still get to dip your ever-conscious toe in the icy pool of email marketing strategies.