

Email Marketing: How to A/B Test Your eBlasts

DATE PUBLISHED: February 16, 2015
Category: Marketing Strategy
 

Not happy with your eBlast conversion rates? Well, have you conducted some A/B testing recently?

If your answer is, "not exactly," then you really should keep reading.

While the term "A/B testing" may sound complicated and scientific, it's anything but. A/B testing is simply an experiment where marketers test the performance of two different versions of one piece of content. The versions are sent to two similar audiences, enabling you to see which approach works better. You can A/B test landing pages, CTAs, or, as we'll soon see, eBlasts. A/B testing is a critical component of any email marketing strategy.

The key to A/B testing is to first establish what you'll be measuring. This is an important step because, as we all know, eBlasts have many variables — copy, text size, text color, graphics, and so on. The more elements you alter across the test, the more complicated your analysis becomes. Therefore, let's start with a basic example.

Let's say you're a solar firm reaching out to bottom-of-the-funnel leads. The purpose of the eBlast is to offer a free on-site residential solar system estimate. You're comfortable with your email copy — that won't change across the two test versions. You're also happy with the placement of your logo and the overall design of the email. Your primary area of concern is your CTA, so you decide to test two versions of the eBlast where only the CTA copy changes. Version A — known as "the control" — will say, "Start saving money today!" Version B will say, "Click to set up your free consultation." (You'll note the differences are subtle — Version A is a bit more aggressive than B — but it's nonetheless something worth testing.)

Next up, determine how you'll measure success. This part is actually pretty easy: Simply track the number of clicks on the CTA in Version A vs. Version B. (Of course, you can also drill deeper to see which leads filled out contact forms on the landing pages.)
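
Here's a minimal Python sketch of that measurement step. The send and click counts are hypothetical placeholders; in practice they'd come from your email platform's reports:

```python
# Hypothetical results pulled from your email platform's reporting.
sent_a, clicks_a = 500, 50   # Version A: "Start saving money today!"
sent_b, clicks_b = 500, 56   # Version B: "Click to set up your free consultation."

# Click-through rate: CTA clicks divided by emails sent.
ctr_a = clicks_a / sent_a
ctr_b = clicks_b / sent_b

print(f"Version A CTR: {ctr_a:.1%}")                     # 10.0%
print(f"Version B CTR: {ctr_b:.1%}")                     # 11.2%
print(f"Relative lift: {(ctr_b - ctr_a) / ctr_a:.1%}")   # 12.0%
```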

Now comes the fun part: Sending out the eBlasts to a large enough sample size — we're thinking at least 50 recipients per blast. And remember to keep the recipient audiences consistent across the eBlasts. As we previously noted, our example was geared towards bottom-of-the-funnel leads. Don't send Version B to low-scoring or top-of-the-funnel leads because you won't be generating "apples to apples" comparisons.
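
If you're splitting the list yourself rather than letting your email platform do it, a random split is the simplest way to keep the two audiences comparable. Here's a quick sketch (the `ab_split` helper and the sample addresses are hypothetical):

```python
import random

def ab_split(recipients, seed=42):
    """Shuffle one audience and split it into two similar halves."""
    shuffled = list(recipients)            # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # fixed seed makes the split reproducible
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Hypothetical list of 100 bottom-of-the-funnel leads.
leads = [f"lead{i}@example.com" for i in range(100)]
group_a, group_b = ab_split(leads)  # 50 recipients each, matching our minimum
```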

(One quick tangential note. The purpose of this post is to explore the nuts and bolts of actually conducting A/B testing. If you're looking for more tips around things like commonly tested elements beyond CTAs — things like subject lines, images, and promotions — check out HubSpot's thoughts here.)

Once the results are in, you can measure performance. Say, for example, that Version B generated 12% more CTA clicks than Version A. That's a good thing to know. But can your click-through rate be optimized further? The answer is "most likely," and thus begins the next round of A/B testing.
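
One caveat worth flagging: a 12% lift can be statistical noise on small samples. A two-proportion z-test is one common way to sanity-check the result before declaring a winner; the counts below are hypothetical and match the 12% relative lift from the earlier sketch.

```python
import math

def two_proportion_z_test(clicks_a, sent_a, clicks_b, sent_b):
    """Two-sided z-test for a difference between two click-through rates."""
    p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
    p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

z, p = two_proportion_z_test(clicks_a=50, sent_a=500, clicks_b=56, sent_b=500)
print(f"z = {z:.2f}, p-value = {p:.2f}")  # z = 0.62, p-value = 0.54
```

Here the p-value is well above the conventional 0.05 cutoff, so a 12% lift on 500 recipients per version could plausibly be chance. With smaller lists you'd need an even bigger difference before trusting the result, which is one more reason to keep testing.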

See? The fun never ends.

What do you think? Is A/B testing part of your email marketing strategy? What tweaks to your eBlast — subject lines, graphics, CTAs — tend to "move the dial" the most? What other metrics do you use, besides clicks, to A/B test?