Time for a pop quiz! How quickly and easily can you answer these three questions about your company’s email marketing channel?
1. Does your audience respond more positively when your email’s subject line is personalized with their name?
2. Are text links or image links more effective as CTAs in your emails?
3. Which type of offer does your audience respond to more eagerly: a discount on product price or free shipping on a total order?
Time’s up – how did you do? If you weren’t able to answer confidently and immediately, perhaps it’s time to incorporate some regular A/B testing (sometimes called “split testing”) into your email process.
A/B testing gathers valuable data about how the emails you send affect your audience's behavior. Through regular testing of various aspects of email construction, you will begin to develop a set of "best practices" specific to your company's email campaigns and audience, and you will see email become a richer source of revenue.
In an A/B test, you send out two versions of an email. These two versions should be identical with the exception of one variable – the thing you are testing. So what sorts of things can you test this way? The three questions at the beginning of this post are fairly common examples of A/B test variables: subject lines, types of links, and kinds of offers can all be compared to see whether your audience clearly prefers one variant over the other. Other often-tested variables include the "From" name that appears in the recipient's inbox, the length of text sections, one-column vs. two-column layouts, and larger vs. smaller images – virtually anything that can be altered in an email's content or structure can be A/B tested to determine whether, and how, it affects your response rates.
Setting up an A/B test is simple. First, determine what variable you want to test. Make sure only one variable changes between the two versions of your email, so you can be certain that any difference in performance is attributable to that variable and nothing else.
Take the database of contacts to whom you will be sending your test, and randomly split it into two segments. One segment should be about 75% of your list, the other about 25%. The larger group should receive email “A” (the control) and the smaller group should receive email “B” (the variant).
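If you manage your own contact list, a minimal Python sketch of that random 75/25 split might look like the following (the contact list, function name, and seed here are hypothetical, and most email platforms can also handle this segmentation for you):

```python
import random

def split_for_ab_test(contacts, control_share=0.75, seed=None):
    """Randomly split a contact list into a control (A) and a variant (B) segment.

    `contacts` is assumed to be a list of email addresses or contact records;
    the default 75/25 split mirrors the ratio described above.
    """
    rng = random.Random(seed)
    shuffled = contacts[:]           # copy so the original list is untouched
    rng.shuffle(shuffled)
    cutoff = int(len(shuffled) * control_share)
    group_a = shuffled[:cutoff]      # ~75% receives the control email "A"
    group_b = shuffled[cutoff:]      # ~25% receives the variant email "B"
    return group_a, group_b

# Example usage with a made-up contact list:
contacts = [f"subscriber{i}@example.com" for i in range(1000)]
group_a, group_b = split_for_ab_test(contacts, seed=42)
print(len(group_a), len(group_b))    # -> 750 250
```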
If you are tracking results in Google Analytics, you will want to set up your UTM codes on your email links so that you can clearly identify which segment your data is coming from. A common way to do this is by simply appending either “_a” or “_b” to the Campaign field in your UTM code.
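As an illustration, here is a small Python sketch of that tagging convention using only the standard library (the source, medium, and campaign names are placeholders; adjust them to match your own UTM scheme):

```python
from urllib.parse import urlencode

def tag_link(base_url, campaign, segment):
    """Append UTM parameters to an email link.

    `segment` is "a" or "b"; following the convention above, it is appended
    to the campaign name so Google Analytics can separate the two groups.
    """
    params = {
        "utm_source": "newsletter",          # placeholder source name
        "utm_medium": "email",
        "utm_campaign": f"{campaign}_{segment}",
    }
    return f"{base_url}?{urlencode(params)}"

# Example: the same link, tagged for the control and the variant
print(tag_link("https://www.example.com/offer", "spring_sale", "a"))
# https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale_a
print(tag_link("https://www.example.com/offer", "spring_sale", "b"))
```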
Give the email at least 2–3 business days after it has been sent before you begin comparing data in earnest; that allows the majority of those who are going to respond time to do so. Look at the data and determine whether any differences in performance jump out at you. If you see virtually no difference, or if your "A" list performed better, then you can assume your variant is not a beneficial practice. If, however, the "B" list performs better, you have one more step to take in the test. Resend the same two emails to the same database, but this time send the "B" variant to the larger group. If the better performance of the "B" email holds true, you have uncovered a potential "best practice" for your email process!
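If you want a quick sanity check on whether a difference between the two segments is large enough to mean anything, a rough calculation like the Python sketch below can help (the numbers and function are illustrative, and the z-score rule of thumb is only a rough guide, not a required part of the process):

```python
from math import sqrt

def compare_rates(clicks_a, sent_a, clicks_b, sent_b):
    """Compare click rates for the control (A) and variant (B) segments.

    Returns each segment's rate and a rough two-proportion z-score;
    a |z| of about 2 or more suggests the difference is unlikely to be noise.
    """
    rate_a = clicks_a / sent_a
    rate_b = clicks_b / sent_b
    pooled = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_b - rate_a) / se if se else 0.0
    return rate_a, rate_b, z

# Example with made-up numbers: 750 emails sent in A, 250 in B
rate_a, rate_b, z = compare_rates(clicks_a=60, sent_a=750, clicks_b=30, sent_b=250)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}")
```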
When should you be A/B testing? Always! Constantly! The more often you see a particular variant outperforming its control, the more certain you can be that you have a valid result. Also remember that time of year can sometimes affect results, and audiences can be fickle. Your “best practices” will be an evolving list, so when something that worked for months suddenly seems to no longer produce the results it once did, you’ll know it’s time to test it again!
At Synapse, we’re happy to help you set up and track all your A/B email testing. If you don’t have an email strategy in place for your marketing, we’ll help you with content, coding and deployment as well!
Start a conversation with us today!