Let the Customer Decide Which Marketing is Best with A/B Testing

As marketers, we like to think we know what customers are looking for, but do we? Do we really?

It’s impossible for a marketer to understand the nuances of every segment of every target market across a wide range of products. Consumer preferences shift with demographics, location, age and myriad other factors. A lack of understanding in a specific market forces us to make “gut decisions” based on our personal experiences, allowing our own bias to interfere with the marketing message.

We can minimize these unconscious biases using data. A data-driven approach improves the marketing message and provides a roadmap to reaching marketing and business goals. When creating and refining marketing strategies with data and analysis, we better understand customer preferences and can produce more effective marketing messages.

How to use data to make marketing decisions

Any marketing campaign begins and ends with goals. For more on defining goals, see Mike Reed’s most recent blog post: Analytics: Not Just for Geeks Anymore. Once we understand our marketing and business goals, a great way to start removing unconscious bias is to gather data with A/B testing. In an A/B test, we create two versions of a marketing asset and present each to a portion of customers to see which more effectively produces the desired result. We learn about consumer preferences through customers’ actions, removing guesswork and bias from the campaign by backing strategic decisions with data.

When creating an A/B test, we start with a hypothesis to test. Hypothesis testing may be something everyone remembers from 11th-grade science, but it isn’t usually associated with a marketing campaign. A sample marketing hypothesis could be “a contact form with four fields will have a higher conversion rate than a form with eight fields.” The null hypothesis, the claim we try to disprove, is that the shorter form makes no difference. One common A/B testing strategy pits an incumbent against a challenger: the current version serves as the control, we run both versions side by side, compare their results and declare a winner. Then we create a new hypothesis to test, aiming for an incremental improvement toward our marketing goals with each cycle.
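The incumbent-versus-challenger comparison described above boils down to a two-proportion test: did the challenger convert at a meaningfully higher rate than the control, or could the difference be chance? A minimal sketch follows; the conversion counts are made up for illustration, and dedicated testing tools handle this calculation for you.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic comparing two conversion rates using a pooled proportion."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers:
# Incumbent (A): 8-field form, 200 conversions out of 5,000 visitors (4.0%)
# Challenger (B): 4-field form, 260 conversions out of 5,000 visitors (5.2%)
z = two_proportion_z(200, 5000, 260, 5000)

# |z| > 1.96 means the difference is significant at the 95% confidence level
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

If the result isn’t significant, the honest move is to keep the incumbent and test a new hypothesis rather than declare a winner on noise.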

Why test if we’re already getting good results?

With a specific schedule and deliberate testing, incremental improvements add up over time to provide a boost in campaign performance. Even a small increase in conversion rate can compound into a large increase in results. For example, in 2007, the Obama campaign increased its email signup rate on a landing page from 8.26% to 11.6% using A/B landing page tests. Across the entire campaign, that difference in conversion rate added up to an additional 2,880,000 email addresses collected. Extrapolated at an average donation of $21 per email address, the difference amounted to roughly $60 million in additional donations, just by optimizing the landing page for conversion with A/B testing!
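The arithmetic behind that example can be checked in a few lines. The visitor count below is an assumption back-solved from the reported lift, not a figure from the source; the donation total follows directly from the numbers in the paragraph above.

```python
# Reconstructing the Obama-campaign landing page arithmetic
baseline_rate = 0.0826    # original email signup rate
improved_rate = 0.1160    # signup rate after A/B-tested redesign
extra_emails = 2_880_000  # additional addresses reported
avg_donation = 21         # average donation per email address

# Implied traffic needed to produce that many extra signups (an assumption)
visitors = extra_emails / (improved_rate - baseline_rate)

# Additional donations attributable to the lift
extra_donations = extra_emails * avg_donation

print(f"Implied visitors: {visitors:,.0f}")
print(f"Additional donations: ${extra_donations:,.0f}")  # about $60 million
```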

How do we do A/B testing?

Martech is continually evolving, and we’re lucky to have new technologies available for testing on a larger scale than ever before. There are new technologies available that can help create hypotheses, simultaneously handle multiple tests, split advertising traffic between creative versions and report results. We can test almost any element of an integrated marketing campaign to find opportunities for improvement, including tests for: 

  • Landing page design
  • Call to action
  • Form length and fields
  • Ad copy
  • Email design
  • Subject lines
  • Images vs. videos
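The traffic splitting mentioned above is commonly done with deterministic hashing, so a given visitor always sees the same version no matter how many times they return. A minimal sketch, with hypothetical function and experiment names:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")):
    """Deterministically bucket a visitor so they always see the same version."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor lands in the same bucket on every page load
assert assign_variant("visitor-42", "landing-page") == assign_variant("visitor-42", "landing-page")

# Different experiments hash independently, so one visitor can be in
# bucket A for the landing page test and bucket B for the subject line test
print(assign_variant("visitor-42", "landing-page"),
      assign_variant("visitor-42", "subject-line"))
```

Commercial testing platforms do this for you, but the principle is the same: a stable, unbiased split of traffic between versions.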

When starting a new campaign, we follow industry best-practice data. However, each campaign and client is different, and this is where the data we collect through A/B testing can help us continually optimize a campaign while it’s running. Using a data-driven approach and scientific method to gather our data, we can create more actionable insights, make better decisions and ultimately deliver better results on our clients’ marketing and business goals.

Adam Wingate is a digital analyst at Dixon Schwabl, responsible for SEO and web analytics. To learn more about our martech capabilities, contact Adam at Adam_Wingate@dixonschwabl.com.

Adam Wingate
Digital Analyst
