Not happy with your eBlast conversion rates? Let A/B tests be your guide to finding out what's working and not working.
Is A/B testing uncharted territory for you? Not to worry: we're here to help.
"A/B Testing" may sound complicated and daunting, but it's actually fairly easy to implement when you know what you want to test. A/B testing is essentially just an experiment where marketers test the performance of two different versions of one piece of content. Versions are sent to two similar audiences, allowing you to see which approach works better. You can A/B test landing pages, CTAs or eBlasts. A/B testing is a critical component for any email marketing strategy.
The key to A/B testing is to first establish what you'll be measuring. Is it copy, text size, text color, CTA, placement, graphics, or some other element you're interested in?
Let's say you're a solar firm reaching out to bottom-of-the-funnel leads. The purpose of the eBlast is to offer a free on-site residential solar system estimate. You're comfortable with your email copy — that won't change across the two test versions. You're also happy with the placement of your logo and the overall design of the email. Your primary area of concern is your CTA, so you decide to test the performance of your eBlast where only the copy of the CTA changes across the two versions. Version A — known as "the control" — will say, "Start saving money today!" Version B will say, "Click to set up your free consultation." (You'll note the differences are subtle — Version A is a bit more aggressive than B — but it's nonetheless something worth testing.)
Next up, determine how you'll measure success. This part is actually pretty easy: Simply track the number of clicks on the CTA in Version A vs. Version B. (Of course, you can also drill deeper to see which leads filled out contact forms on the landing pages.)
Now comes the fun part: sending out the eBlasts to a large enough sample size — we're thinking at least 100 recipients per blast. And remember to keep the recipient audiences consistent across the eBlasts. As we previously noted, our example was geared towards bottom-of-the-funnel leads. Don't send either version to low-scoring or top-of-the-funnel leads, or you won't be making "apples to apples" comparisons.
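Keeping the audiences comparable usually comes down to random assignment. Here's a minimal sketch in Python of one way to do it — the recipient list and the 50/50 split are illustrative assumptions, not a feature of any particular email platform:

```python
import random

def split_recipients(recipients, seed=42):
    """Randomly split a recipient list into two comparable groups
    so Version A and Version B reach similar audiences."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # seeded so the split is reproducible
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Hypothetical bottom-of-the-funnel lead list
leads = [f"lead{i}@example.com" for i in range(200)]
group_a, group_b = split_recipients(leads)
print(len(group_a), len(group_b))  # 100 100 -- at least 100 per blast
```

Because the split is random rather than, say, alphabetical or by lead score, any difference in results is more likely due to the CTA change than to the audiences themselves.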
(One quick tangential note. The purpose of this post is to explore the nuts and bolts of actually conducting A/B testing. If you're looking for more tips around things like commonly tested elements beyond CTAs — things like subject lines, images, and promotions — check out HubSpot's thoughts here.)
Once the results are in, you can measure performance. Say, for example, Version B generated 12% more CTA clicks than Version A. That's a good thing to know. But can your click-through rate be further optimized? The answer is "most likely," and thus begins the next round of A/B testing.
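To make a figure like "12% more CTA clicks" concrete, here's a hedged sketch of the arithmetic in Python. The click counts (100 vs. 112 clicks from 1,000 sends each) are made-up numbers chosen to produce a 12% relative lift, and the significance check is a standard two-proportion z-test via the normal approximation — one common way to sanity-check a result, not something this post prescribes:

```python
import math

def ctr_lift(clicks_a, sends_a, clicks_b, sends_b):
    """Relative lift of Version B's click-through rate over Version A's."""
    ctr_a = clicks_a / sends_a
    ctr_b = clicks_b / sends_b
    return (ctr_b - ctr_a) / ctr_a

def two_proportion_p_value(clicks_a, sends_a, clicks_b, sends_b):
    """Two-sided p-value for the difference in click rates
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical counts: 1,000 sends per version
lift = ctr_lift(100, 1000, 112, 1000)
print(f"Lift: {lift:.0%}")  # Lift: 12%
print(f"p-value: {two_proportion_p_value(100, 1000, 112, 1000):.2f}")
```

Notably, with these made-up counts the p-value comes out well above 0.05 — a reminder that a 12% lift on modest samples can still be noise, which is exactly why the next round of testing (or a bigger send) is worth running.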
See? The fun never ends.
What do you think? Is A/B testing part of your email marketing strategy? What tweaks to your eBlast — subject lines, graphics, CTAs — tend to "move the dial" the most? What other metrics do you use, besides clicks, to A/B test?
Looking to roll out a killer email marketing strategy? We'd love to work with you.