
What Should I A/B Test First?

4 Minutes

Aug 13, 2020

Nearly 10 years ago, working as a Marketing Coordinator, I was tasked with deploying a direct mail campaign: pamphlets sent to various postal codes within Vancouver, BC. While researching marketing appeals, I decided to create two unique variations of the brochure. My first A/B test was ready – experimenting with unique messaging in each brochure to determine which version better resonated with customers. 500 pamphlets later, we learned some valuable lessons that are still being applied to the campaigns we manage for clients today.

The primary question we looked to answer was: what proposed benefit or value would be the most impactful for our customers? This question is often neglected in A/B testing when there’s so much focus on design – but copy testing remains one of the fastest forms of A/B testing, and it can provide insights not just for your ad campaign but for your entire marketing plan.

Most Marketing Managers Approach A/B Testing with Little Structure

Marketing managers often resort to testing visual elements first.

Green vs. Orange Button A/B Test

Or if you get more creative… Dogs vs. Human Faces? It is called “Face”-Book after all.

Dog A/B Test
Faces A/B Test

But before investing design resources into these visuals, it’s worth reviewing which benefit or value proposition matters most. Most businesses have a pretty good idea of what their customers are looking for, so their ad copy will typically reflect their value propositions. However, we also know that the messaging of your core service offering can resonate with your audience in various ways.

The Case for Starting With Your Messaging

Take for example the A/B test below with one of our clients, Rosemary Rocksalt. The ad samples were created using various messaging concepts to better understand what drives online purchases.  The brand is heavily involved in the local community so we wanted to ensure at least one of our ad variations brought that to the forefront.

We tested three core messaging angles in our ads:

A)  Ingredient-centric: Craving real bagels? Always fresh, never frozen. Come by on Tuesday after 2pm and get ½ dozen bagels for FREE when you buy a dozen.


B)  Promotional: What’s better than bagels? FREE bagels! Come by on Tuesday after 2pm and get ½ dozen bagels for FREE when you buy a dozen.


C)  Community-centric: We’re so grateful to be in a community that supports local business! As a thank you, come by on Tuesdays after 2pm and get ½ dozen bagels for FREE when you buy a dozen.

Variant A - Ingredients
Variant B - Promotional
Variant C - Community

Initial Hypothesis

Internally, we felt confident that version B (Promotional) would be our top-converting ad variant. With “FREE” in all-caps, it was our anticipated winner. However, we were surprised when we evaluated the actual conversion rates.

Results and Learnings

Conversion Rates

Variant A (Ingredient-centric): 8.99%
Variant B (Promotional): 6.05%
Variant C (Community-centric): 4.13%

A few learnings can be derived from this initial test. First, from a copy standpoint, “Craving real bagels? Always fresh, never frozen” (also one of Rosemary Rocksalt’s slogans) was effective in driving results.

The lowest-performing variant’s conversion rate (4.13%) was 54.1% lower than Variant A’s (8.99%). Over 1,000 website visits, that would equate to a difference of roughly 48 purchases between these two ad variations. This may be attributed to the inquisitive nature of the question: asking customers if they crave a fresh, never-frozen bagel induces a mouth-watering reaction that the word FREE simply can’t hold up against.
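
If you want to sanity-check that arithmetic, a few lines of Python will do it. This is just a sketch of the math in the paragraph above; the 1,000-visit figure is illustrative, and the rates come straight from the table.

```python
# Compare the winning variant (A) against the lowest-performing variant,
# using the conversion rates reported in the table above.
winner_rate = 0.0899   # Variant A
loser_rate = 0.0413    # lowest-performing variant

relative_drop = (winner_rate - loser_rate) / winner_rate
purchase_gap_per_1000 = (winner_rate - loser_rate) * 1000  # illustrative traffic figure

print(f"Relative drop: {relative_drop:.1%}")                            # ~54.1%
print(f"Purchases lost per 1,000 visits: {purchase_gap_per_1000:.1f}")  # ~48.6
```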

Regardless of the reason, we can feel confident that the results of this test were significant. How? Running your numbers through one of the many statistical significance calculators online will either confirm that your ad test had a large enough sample size, or show that it needs more time and should continue gathering data.
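
If you prefer to see what those calculators are doing under the hood, the check is typically a two-proportion z-test. Here’s a minimal sketch in Python; the sample sizes below are hypothetical placeholders, since the visit counts per variant aren’t reported here, and only the roughly 9% and 4% rates mirror the results above.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_a - rate_b) / std_err
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical sample sizes -- the article does not report visits per variant.
z, p = two_proportion_z_test(conversions_a=90, visitors_a=1000,
                             conversions_b=41, visitors_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests the gap isn't just noise
```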

From here, we can take advantage of this learning opportunity by applying similar messaging to web copy, email campaigns, and future ad campaigns.

How You Can Approach A/B Testing

It’s important to give yourself enough time to collect data: your results are only meaningful once you’ve reached an appropriate sample size.

Online tools such as Optimizely’s “Minimum Detectable Effect Template” (MDE) let you project the conversion improvement you’d need to detect against your baseline conversion rate, at varying degrees of statistical significance. You can use their MDE template to add your own inputs and estimate how long your experiment will take.
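
For a back-of-the-envelope version of the same projection, the standard power-analysis formula for a two-proportion test works too. The sketch below is not Optimizely’s template; its defaults assume 95% confidence and 80% power, and the baseline rate, minimum detectable lift, and daily traffic are placeholder inputs you’d swap for your own.

```python
from math import ceil

def sample_size_per_variant(baseline, mde_relative, z_alpha=1.96, z_beta=0.8416):
    """Approximate visitors needed per variant to detect a given relative lift.

    baseline     -- current conversion rate, e.g. 0.04 for 4%
    mde_relative -- minimum relative lift worth detecting, e.g. 0.20 for +20%
    Defaults correspond to 95% confidence and 80% power.
    """
    p1 = baseline
    p2 = baseline * (1 + mde_relative)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(((z_alpha + z_beta) ** 2 * variance) / (p2 - p1) ** 2)

# Placeholder inputs: 4% baseline, +20% detectable lift, 250 visitors/day per variant.
n = sample_size_per_variant(baseline=0.04, mde_relative=0.20)
days = ceil(n / 250)
print(f"~{n} visitors per variant, roughly {days} days at this traffic level")
```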

A/B Test Example Structure

There’s likely room for improvement when it comes to your conversion rates. Regardless of whether you’re testing copy on your ads or landing pages, once you’ve identified a winning variant, continuous improvement means testing it against new variants.

Making copy the focus of your next A/B test gives you a simple way to measure the effectiveness of your value proposition and gain insights that can make a meaningful impact. By understanding which messaging resonates most with customers, you can take those learnings to better inform your overall marketing strategy, from email to web content.