A/B testing. The phrase itself sounds complicated and time-consuming—something that can really wait for another day. But it doesn't need to be that way. Modern web tools, like the 25 email marketing apps we recently reviewed, are often equipped with features that make A/B testing (also called split testing) as simple as scheduling a campaign or customizing a template.
Though there are all sorts of A/B tests you can run—mainly on the pages of your website—this post focuses on tweaking what ends up in your customers' inbox. Specifically, how to A/B test with the help of an off-the-shelf email marketing app such as MailChimp, AWeber or Campaign Monitor.
Before we show you exactly where to access testing features in your email marketing software—including A/B testing links for 20-plus apps—let's examine the basics of A/B testing: how it works, guidelines to follow, and email characteristics to test.
Pretend you own a restaurant—the Chatterbox Café—and after a few months of flat revenue, you're tempted to change up your menu to see if it has an impact on sales. You create a new menu where, instead of just text, you include five pictures of your highest-priced items. That's the only change—everything else about the menu stays the same. You then make 25 copies of the new menu, placing them alongside the 25 copies of the original all-text menu, and then split your café down the middle—patrons on one side get the new menu, and everyone else gets the old menu.
After two days, you tally up the revenue and compare the two sides, pleasantly finding that the menus with the five pictures increased the average table ticket by 15%. You have a winner. Next stop: the copy machine.
That is an A/B test. You have variant "A", the menu with text only, and variant "B", the menu with the five pictures. Variant "A" is also often referred to as the "control"—the variant you've been using and keeping stats on. Returning to the example above, it's only because you knew the historical sales of your café's control menu that you were able to confidently say the new picture menu increased sales by 15%.
That example is an A/B test in a café, probably the last place internet marketers would think to apply an A/B test today. You're likely more familiar with testing your company's homepage—slightly changing the copy, picture or button color—with an aim to increase sign-ups or sales inquiries. The same can be done for your marketing emails, too, and it's not as complicated or laborious as it might seem.
Let's start with an honest admission: Before I dug into the A/B testing features provided by email marketing software, I fully expected I'd be manually setting up, executing and analyzing A/B tests. That is, I'd split my email list down the middle, send a control to one group, a variant to another, then watch the results come in.
But it's not like that at all. A/B testing just means adding an extra bit of text to your email body copy, filling in an extra subject line field or setting up a second email and marking it as variant "B". Once you do that, your email marketing service takes care of the rest.
MailChimp is one of the most widely-used email services in part because of its friendly brand that permeates the software. Its A/B testing tool, officially called "A/B Split Campaign", falls right in line with that branding: it's a convenient shortcut to better versions of your emails.
While some tools will make you conduct an A/B test on your full list, MailChimp lets you pull aside a sample—say, 3,400 emails from your 10,000 subscribers—and conducts an A/B test with just those folks. What percentage should you test? After reviewing seven years of testing results, the team at MailChimp recommends a test segment of 20%-50%—the smaller your list, the higher the percentage should be.
Once you've selected a percentage, MailChimp has you choose which metric it'll use to select the best version, which will be sent to the remaining email addresses on your list after your test reveals a winner.
Finally, you need to enter the details of your test. For this example, let's test the from name (the email sender's name). We want to find out which is better: using the company's name or a person's name?
After you've filled out the fields, send the test and let MailChimp take care of deploying the winning email—or use the "Manual" option to pick it yourself. Remember, the only difference between the two emails you've sent will be the "From Name"—a small but possibly important variation. If you had changed anything else as well, such as the subject line, the results would be a mix of the two changes—they'd be confounded, and you couldn't tell which one drove the outcome. Your stats are only good for that specific setup.
AWeber is another popular email app among marketers because of its ability to easily add contacts via WordPress, PayPal and Facebook. Once you've accumulated a list of more than 100 contacts, the email marketing tool lets you start A/B testing your communications—or, as they call it, conducting a "Broadcast Split Test." 100 contacts, they say, is the minimum needed to get statistically significant results.
Unlike MailChimp where the ability to A/B test a campaign is offered after you've written and designed your email, AWeber has you select Broadcast Split Test before you begin setting up your campaign.
From there, they give you the ability to create up to four variants—since you're A/B testing though, you'll want to select "2 messages"—and then enter the send percentage for each.
After doing so, you'll indicate which message editor you're going to utilize—the app's "drag and drop email builder", "plain text message" or "code your own HTML"—and then click "Save Split Test". You've now set up your split test structure; all that's left is your campaign's copy, design and scheduling.
At this point—writing and designing your email—you'll want to keep everything the same except for your call to action (CTA) link or button. If you're emailing your restaurant patrons about this month's special, for example, and your aim is to have them click-through and make a reservation, one email's call to action might say "Book a reservation" while the other says "Save a table for me!"
After you send the email, you'll see stats for the two versions roll in, and since you're concerned with subscribers visiting your reservations page, you'll want to pay attention to the "clicked" count. You can use these stats to decide which text is more enticing for your next email.
Like MailChimp, Campaign Monitor does the work for you—after selecting what to test, enter the two versions, select the email content, and then decide on a subset of your list to send this test to. Campaign Monitor recommends sending to 20-30% of your list—in the screenshot below, for example, the marketing team plans to send emails to 17,820 respondents before a winner is named.
Testing subject lines with Campaign Monitor is quite easy, too. After selecting it from the interface above, you'll enter two subject lines in the two provided fields. That's it. Everything else about your email—from the sender name to the CTA text—will remain the same.
When you see the results, you'll want to home in on open rate—in the test below, "Version A" is the winner.
In their A/B testing documentation, Campaign Monitor stresses that it's best to test your emails by either viewing a preview or sending yourself a campaign email.
"It's very important to test and check for errors of any kind before you start an A/B test because you cannot make any campaign changes when an A/B test is in progress," reads their documentation. With Campaign Monitor, that's an easy step to heed since before they allow you to send the emails, they show you a campaign snapshot—similar to a travel itinerary when you purchase a plane ticket—that includes these test options.
This is a good rule to follow when A/B testing, but it isn't the only one. Here are 10 A/B testing guidelines to consider before jumping into a test of your own.
What are you trying to improve about your marketing emails? Ask yourself that simple question before proceeding. At Zapier, we'd like to increase the click-through rate on our blog newsletter emails and amp up the open and click-through rates on our new user onboarding emails. If we hit those goals, we'd likely see more unique pageviews on our blog and a higher level of engagement with our app. So our "what" is increasing open and click-through rates, and our "why" is increasing engagement.
For the "what you're testing" question, you have two options—each has been mentioned in this chapter, but let's clearly define them:
Open Rate: This stat tells you how many of your subscribers open a particular email. It's expressed as a percentage and calculated by dividing the number of emails opened by the number of emails sent minus the number of emails bounced—bounced meaning the subscriber's email is no longer valid. A 30% open rate, for example, would mean that if 102 emails were sent and 2 bounced, 30 were opened.
Pay attention to the open rate when you're testing subject lines, sender names and message preview.
Click-Through Rate or Click Rate: Look to this number to learn how many of your subscribers clicked a link in the body of your email to visit your site. This number is also expressed as a percentage and is calculated by dividing the number of subscribers who clicked a link by the number of emails sent minus the number of emails bounced.
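Both formulas boil down to the same calculation: events divided by delivered emails. Here's a minimal Python sketch—the function name and the click figure are illustrative, and the open-rate numbers match the 30% example above:

```python
def email_rate(events, sent, bounced):
    """Open or click-through rate as a percentage of delivered emails."""
    delivered = sent - bounced
    return 100 * events / delivered

# The example above: 102 sent, 2 bounced, 30 opened -> 30% open rate
open_rate = email_rate(events=30, sent=102, bounced=2)
# A hypothetical 12 clicks on the same send -> 12% click-through rate
click_rate = email_rate(events=12, sent=102, bounced=2)
print(f"open rate: {open_rate:.1f}%  click rate: {click_rate:.1f}%")
```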
Some email marketing software—such as Campaign Monitor (below)—lets you take this a step further, too, and home in on a specific link in your body copy. This can be helpful if you have multiple links, such as a text link in the intro and a button link at the bottom, for the same landing page—now you can see which link is most effective. If your email marketing service doesn't offer this feature, rely on Google Analytics to tell you which link gets the most clicks by giving each link unique UTM parameters.
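If you go the Google Analytics route, tagging each link with its own `utm_content` value takes only a few lines. Here's a sketch using Python's standard library—the URL, campaign name and content labels are made-up examples, not a required convention:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_link(url, campaign, content):
    """Append UTM parameters so analytics can tell two links apart."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({
        "utm_source": "newsletter",
        "utm_medium": "email",
        "utm_campaign": campaign,  # e.g. the newsletter issue
        "utm_content": content,    # distinguishes the two links
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Same landing page, two distinguishable links
intro_link = tag_link("https://example.com/offer", "march-special", "intro-text-link")
button_link = tag_link("https://example.com/offer", "march-special", "footer-button")
```

In your reports, clicks on the two links then show up as separate `utm_content` values even though they point to the same page.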
Pay attention to the click-through rate when you're testing call to action text and buttons along with body copy.
For the "why you're testing" question, it's really up to you. An increased click-through rate on a call to action in your onboarding email could mean higher revenue next quarter. An improved open rate for your emails could turn into a higher active user count, leading to more feedback on your product. It’s also important to make sure that metric will actually move the needle if influenced.
Also ask yourself what you'd like to learn from your tests. This list of outcomes offered by Campaign Monitor in an A/B testing case study offers some great questions to start with:
No matter the objective, just be sure to identify it at the start of your testing.
Hold up, you want to A/B test your "Happy Holidays!" email? Think again: you'll be waiting 12 months until you can put your results to good use.
A/B testing works best for perfecting frequently sent emails—at Zapier, our frequent emails include blog newsletters, new user onboarding emails, and error alerts. We send hundreds of thousands of those emails weekly; if we can increase the open or click rate by just 1%, that's a major win. So back to that holiday greeting idea—sure, you might send it to your full email list of 55,000 email addresses, but what are you going to do with the results? Email services and clients change so frequently that it's unlikely what you learn can be applied to whatever you send next year. Even with frequent emails, the key challenge is making sure your sample sizes are large enough for the inference you're trying to make; Evan Miller offers a helpful calculator to find your minimum sample size per group.
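If you'd rather compute it than use a calculator, the standard two-proportion power formula—the same calculation behind sample-size tools like Evan Miller's—fits in a few lines of Python. The baseline open rate and lift below are illustrative assumptions:

```python
from math import sqrt, ceil
from statistics import NormalDist

def min_sample_per_group(p1, p2, alpha=0.05, power=0.8):
    """Minimum subscribers per variant to detect a change from rate p1
    to rate p2 with a two-sided test (standard two-proportion formula)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p1 - p2) ** 2
    return ceil(n)

# Detecting a lift from a 20% to a 22% open rate takes thousands per group
n = min_sample_per_group(0.20, 0.22)
print(n)
```

Notice how quickly the requirement grows as the detectable lift shrinks—which is exactly why one-off holiday emails rarely yield usable results.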
If your email marketing software doesn't offer A/B testing—or you want to spin up a manual test on your own—remember to split your subscribers randomly. One way to do this is to download your list as a CSV and randomly sort it using Excel. Or, you can simply arrange it in alphabetical order with any spreadsheet software and slice it up from there—less rigorous than a true random sort, but often close enough. Luckily, most email marketing apps will handle this step for you.
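For a manual test, a truly random split takes only a few lines of Python. This is a sketch that assumes your export is a CSV with a header row:

```python
import csv
import random

def split_list(csv_path, seed=42):
    """Shuffle a subscriber CSV and return the header plus two halves."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header, rows = next(reader), list(reader)
    random.Random(seed).shuffle(rows)  # fixed seed makes the split repeatable
    middle = len(rows) // 2
    return header, rows[:middle], rows[middle:]
```

Variant "A" goes to the first half and variant "B" to the second; the shuffle ensures neither group is biased by signup date or alphabetical order.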
The struggle with A/B testing in general is having a large enough sample size. There is, however, a lever you can pull to remedy this problem: the larger the difference in effect between emails, the smaller the sample size can be. Therefore, test highly divergent variants. For example, plain-text versus HTML emails, or "one prominent link" versus "everything is a link." Try to learn the lay of the land by turning "big knobs"—save finer adjustments for later.
Since sample size is our biggest challenge, we want to avoid anything that inflates the number of participants required. Sample size requirements balloon as variants are added, so unless you're consistently testing over 100,000 subscribers in each variant, keep it to two variants.
In our experience at Zapier, a single email's effect declines sharply, petering out around day four or five after it's sent. If your email has had no effect after five days of waiting, it's unlikely to ever make a meaningful difference. Four to five days is a good rule of thumb; calling a winner any earlier can ruin your experiment.
You dispatch your A/B test to your 65,000 subscribers and it comes back that 5,400 of them opened the "A" version and 5,500 of them opened the "B" version. B's the winner, right? Not so fast. Just as news reporters have a tough time calling a political race on election night, you too should be slow to call a winner in your A/B test. The best way to check if your results are statistically significant is to use one of the many free calculators online. Visual Website Optimizer, an app for A/B testing your website, has a simple free significance calculator (pictured) that serves up a quick answer. When using it for email, consider "visitors" as "subscribers".
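If you're curious what those calculators do under the hood, here's a sketch of the standard two-proportion z-test in Python, run on the 65,000-subscriber example above:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(opens_a, n_a, opens_b, n_b):
    """Two-sided p-value for the difference between two open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 65,000 subscribers split in half: 5,400 vs. 5,500 opens
p = two_proportion_p_value(5400, 32500, 5500, 32500)
# p lands well above 0.05, so the 100-open gap could easily be noise
print(f"p-value: {p:.3f}")
```

A p-value below 0.05 is the conventional bar for calling a result significant; this example doesn't clear it, which is exactly why "B's the winner" would be premature.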
Statistical significance is a statement about the repeatability of your result, not the extent to which it moves the needle. It's very important to ask yourself: will this change really impact an aspect of the business you care about? It's possible to achieve statistical significance while having no real effect on the bottom line—real impact might instead come from changes like landing pages with more relevance for customers or a better UX on the landing page.
And the significance calculator says… "yes", your results are statistically significant. Moreover, they're meaningful because you were careful to check that they could be ahead of time. Score! Since hopefully your variants were divergent, you might have learned something big about email marketing. Zapier's Peters suggests trying to replicate the experiment to see if the effect is repeatable. If you reach this point, you're really onto something. You can install your divergent variant—this is now your "champion." Now repeat with a new "challenger" that is again divergent from the new champ. You're well on your way!
Customer behavior never stops changing, and neither should your A/B tests. If you've increased your click-through rate, challenge yourself to spike your open rate. If you bumped your open rate with a better subject line, consider changing up the message preview. If your message preview is perfected, change up the sender.
Out of testing ideas? That's what we'll cover next.
HubSpot saw a higher email click-through rate when it sent a campaign using a team member's name as the from name—"Maggie Georgieva, HubSpot"—versus their company name.
"Our control generated a 0.73% CTR (click-through rate), and the personalized version generated a 0.96% CTR. With a confidence level of 99.9%, we had a clear winner," says marketer Bryan Harris in a video on the HubSpot blog. "Our conclusion after conducting this A/B test was that emails sent by a real person are more likely to be clicked on than emails sent from a company name."
From Name / Sender Name Variants to Test
CoSchedule, a content planning and scheduling tool, used Campaign Monitor to identify the best title length for their popular newsletter, "The Content Marketing Update". What they found, as told in a Campaign Monitor blog post, is that longer titles received higher open rates. Specifically, the blog post stated, "shorter subject lines were less successful for (CoSchedule) than longer ones," and that between 40 and 50 characters was their sweet spot.
So 40-50 characters? That's a pretty long subject line.
<— this is 47 characters
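If you want to check where your own drafts fall relative to that 40-50 character sweet spot, a couple of lines of Python will do it. The subject lines below are made-up examples:

```python
subject_lines = [
    "Our new menu",
    "This month's special at the Chatterbox Café",
]

# Flag drafts that fall inside the 40-50 character sweet spot
for s in subject_lines:
    in_range = 40 <= len(s) <= 50
    print(f"{len(s):>3} chars  {'in range' if in_range else 'out of range'}  {s}")
```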
Subject Line Variants to Test
The message preview (highlighted above) is an easy-to-overlook part of an email campaign, one that I'll admit we've missed making the most of on several occasions. The little snippet of text that shows in your inbox—also called the email preheader—is often customizable in your email marketing client. But beware: your email subject line and message preview could butt heads, depending on the email client your customers are using. Gmail, for example, puts the title first with the message preview after it on the same line. Mobile email apps, on the other hand, will list the title with the message preview below it. Either way, you don't have many characters to work with.
A/B testing this could be cumbersome, since it might require manual testing, but it will be worth the effort. Think to yourself: how often do you read that little piece of copy to help decide if a new newsletter is worth reading or not?
Message Preview / Preheader Variants to Test
Sometimes simple is refreshing. That can be the case when you get an email in plain-text format, rather than a flashy, over-designed newsletter—plain-text emails feel like something you would get from a friend.
Unbounce points out that the plain-text format worked best for the Obama 2012 team.
"I think it gave the impression that there were real people writing these emails, that it wasn’t focus-grouped to death. It was something more off the cuff," says Amelia Showalter, the campaign's director of analytics.
Hey reader! If I knew your name, I'd have inserted it there. Unfortunately, I don't, which is one reason I've been cautious of personalizing emails in the past—not all subscribers supply their first and last names. But besides testing emails with names, think about testing personalization in other ways, too.
Email marketing app Vero, for example, personalized an email by employing a term they felt had a better chance of striking their audience.
"The word 'converts' resonates more with Vero’s (conversion rate optimization)-focused audience and resulted in a 15% increase in opens and a 50% increase in clicks," writes marketer Siddharth Bharath on the Vero blog.
Personalization Variants to Test
"Cut the length of your email copy in half. Now cut it in half again."
That's the top-voted piece of advice on a GrowthHackers thread titled, "Ask GH: In two sentences or less, what’s your single best email marketing tip, trick, hack or piece of advice?"—it was offered by Morgan Brown, a marketer at GrowthHackers.
It shouldn't be a surprise to see that recommendation in the top spot. It's a safe bet that you—like the rest of us—don't often care for long emails.
But what else can you do to your email body copy beyond length to start A/B testing it? How about trying a new format? That's what the team from Flightfox did—instead of the usual chunks of copy, they used a question-and-answer format.
The result: Flightfox doubled their conversion rates, according to a case study by Vero, a lifecycle email marketing app.
"The original email had relatively high open and click through rates but was converting at a rate of 1.6%. This was an excellent start but there was certainly room for improvement," Vero states.
"With their new copy, Flightfox have been able to increase their conversion rate to 3.2%. A fantastic result that has an instant effect."
Body Copy Variants to Test
Which of the above emails draws you in? If you're like the users of Unhaggle, the company behind the featured emails, it's the one on the right. After A/B testing these two, click-through rates increased by 378%, according to a case study by Vero.
"Here’s a snapshot direct from Unhaggle’s email provider," Vero's founder Chris Hexton writes on their blog. "The changes they made in the second campaign helped Unhaggle achieve a massive 32.4% click through rate. This is one of the highest click-rates we’ve seen recently."
Hexton credits three items for the success of the email—images that stir up emotions, punchy, short body copy and clear prices to see the savings.
While this is a success story in images, it's also an A/B test case study that serves as an example of what happens when you change too many items in an email at once. If Unhaggle had changed only the images, they could have confidently attributed the lift in click-through rates to them. As it stands, their results leave them to keep testing to determine which of the three characteristics most influences customer decisions.
Image Variants to Test
3-4 seconds. That's it. That's the length of time Litmus says you have to grab a customer's attention when they open up your email. So what better way to grab it than with a design that appeals to their eye?
Moreover, your design needs to be ready to be viewed on mobile, too. Email marketing app Movable Ink found that an astonishing 61% of brand emails are now viewed on smartphones and tablets. So testing responsive design is almost a must now; larger text and clearer calls to action are good places to start.
Which one do you want to click? That's your goal here—how many different ways can you write a call to action? And not only write it, but present it, too: should it be a text link or a button? Should it go at the top of the email or the bottom? Or both?
Email call-to-action A/B tests conducted by live chat app Zopim found four factors that were present in winning CTAs. Abhiroop Basu, the company's content strategist, explained these characteristics in a blog post.
Call to Action Copy and/or Button Variants to Test
What if you received the Zapier blog newsletter every Sunday night at 8 p.m. EST? Would you be more or less likely to open up that message and click the link to read our latest post?
Testing for the optimal day and time is hard—especially as online businesses cater to customers and subscribers around the globe—but the question of when it's best to send emails continues to plague marketers. So there are two routes to take: stick to the stats or follow your gut.
If you listen to the stats, a good place to start is GetResponse's recent look at 300 million messages. It showed that Tuesday is not only the most popular day to send emails, but also the day with the highest open and click-through rates, as well.
If you follow your gut instead, you're in good company. On the Crazy Egg blog, marketer Neil Patel offers this advice:
Everyone wants to know, but precious few actually test it. Save yourself some time, spare yourself some grief, and give yourself a break.
You see, there’s no stock answer to the question “what time is best?” Like everything else in marketing (and life), the true answer is it just depends.
If, for example, you send your email at 7:30 a.m., Patel says, you might think you'll get higher click-through rates because professionals are checking their email. But there's a catch, he warns. "They feel rushed, open your email, but may not have time to respond to your offer. Sure, you get high CTRs, but your conversions are awful."
So instead try 4:30 p.m., when Patel says employees are bored at work, winding down and possibly looking for a distraction. "They may see your email, open it, and be more likely to convert. Fewer opens? Maybe. Higher conversions? Yes."
Delivery Date and Time Variants to Test
You should now have the confidence to go out and conquer any customer's inbox… well, maybe not conquer, but at least A/B test for increased open and click-through rates. To get you started fast, here are quick links to A/B testing features provided by some of the more popular email marketing services.
Now that you've got an email marketing tool and a strategy to make great emails that get opened, it's tempting to get right to work and stop reading right here. But don't leave just yet, or otherwise you might end up making one of the many common mistakes that are made with email marketing. In chapter 11, we'll look at 21 of the worst email marketing mistakes, and how you can avoid making them in your own work.
Transactional Email: The 7 Best Services to Send 1000s of Emails Daily
Experts Weigh In: 21 Email Marketing Mistakes to Avoid