In a world of digital media, we’re surrounded by data and infinite opportunities to test, learn and optimize. The digital advertising industry is guilty of overusing buzzwords, which often render the phrase “A/B test” meaningless.
Let’s dig into both the buzzword and the larger concept of A/B testing in advertising. To make a true impact that ladders up to meaningful business results, a test needs to be carefully designed with both a control and variables that are precisely managed. We’re truly testing Variable A against Variable B.
An A/B test through paid media can play out in multiple ways. We can test tactic vs tactic (like banner ads vs Facebook ads), ad creative vs ad creative (Version A vs Version B) or even audience vs audience by running identical creative to see which audience responds better to the on-page offer. A/B testing is truly where the art and science of advertising meet.
Before deciding to execute an A/B test, it’s critical to take a step back to decide what insight you’re looking to learn through testing. The key to a successful A/B test is zeroing in on what we hope to learn to drive business forward. Are we looking to see if one ad headline works better than another? Will the more successful copy version be rolled out across other tactics for the client? Are we looking to see if one target audience responds better to a specific offer? Are we looking to test landing pages to see which performs better?
The easiest way to end up with a pile of meaningless data is to try to “test” too many things at once. It’s important to think critically about the end result we’re hoping for, how we’ll drive action, what we’re hoping to learn and what we’ll do with that information once we have it. A/B testing means patience and planning.
Establish testing structure and control the variables.
Let’s say we’re testing creative versions—we want to see whether Version A or Version B drives more lead submissions. That’s our definition of success, and we’ll use that information to inform creative direction for other campaign pieces. We’ve decided to test banner creative, so we’ll want to control every variable that could affect the results, outside of the creative itself: ad targeting and landing pages for both versions are identical, and we’ll split the budget so each ad runs the same number of impressions during the same timeframe. If possible, we’ll also compare the A/B test results to any historical baseline for the ratio of landing page views to form fills from all tactics that drive traffic. We can apply several layers of testing when developing the creative: identical banner ads with different headlines, ads with different images or body copy, different CTAs or different value propositions.
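One way to keep ourselves honest about "controlling all variables" is a quick sanity check that the two versions differ in exactly one place. This is a minimal sketch, not part of any ad platform—the field names and values are purely illustrative:

```python
# Hypothetical sketch: confirm two ad variants differ in exactly one field,
# so the test isolates a single variable. All field names are made up.

def changed_fields(variant_a: dict, variant_b: dict) -> list:
    """Return the keys whose values differ between the two variants."""
    return [key for key in variant_a if variant_a[key] != variant_b[key]]

version_a = {
    "headline": "Save 20% Today",
    "image": "product.jpg",
    "cta": "Get Started",
    "audience": "retargeting",
    "landing_page": "/offer",
}
# Version B copies A and changes only the headline.
version_b = {**version_a, "headline": "Limited-Time Savings"}

diff = changed_fields(version_a, version_b)
assert diff == ["headline"], f"Test is not isolated: {diff}"
```

If the assertion fails, the test is measuring more than one variable at once—exactly the "pile of meaningless data" scenario described above.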
Ultimately, what we test should be aligned with what we want to learn. The goal is to keep the test simple so we can see a clear winner. But the truth is, even when controlling all the variables, there will always be outside elements we can’t account for, like economic factors, consumer buying trends or major global disruptions.
Once we’ve set our testing variable, we need to determine a timeline and plan for tracking and tagging to ensure we can measure performance.
With the testing plan in place, it’s time to execute. Then step back, watch and learn. We recommend running a test for at least four weeks, but that could vary based on client needs.
A well-designed A/B test means we can be sure our results will provide insights. We might find that Version A drove 50% more form fills—knowing that the only variable being tested was the headline, we can infer that Version A’s headline was stronger. From there, we can create similar headlines with the expectation that they’ll yield similarly strong results with a similar audience. Then we can test different images or CTAs. And that leads us to our next step.
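Before declaring a winner, it's worth checking that a gap like "50% more form fills" is large enough, relative to the traffic volume, to be more than chance. A standard way to do that is a two-proportion z-test; the sketch below uses only Python's standard library, and all the numbers are invented for illustration:

```python
# Hypothetical sketch: compare form-fill (conversion) rates between two ad
# versions with a two-proportion z-test. The impression and form-fill
# counts below are made-up examples, not real campaign data.
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for the rate difference."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Example: equal impression counts, Version A drove 50% more form fills.
z, p = two_proportion_z(conv_a=150, n_a=50_000, conv_b=100, n_b=50_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the headline difference, not random noise, drove the gap; with thin traffic, the same 50% lift might not clear that bar, which is one reason to let a test run its full four weeks.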
Once we have our primary measurement (in this case, form fills), it’s time to evaluate additional data points between our two versions. Every digital advertising campaign provides a wealth of performance and analytics data: Maybe we also see that the click-through rate was higher for the ad that drove fewer form fills or that one version led to more time on the website.
This testing methodology and structure can be applied across virtually every digital tactic, as long as the tactics are measurable and trackable.
- Test one thing at a time
- Be patient
- Don’t test without a plan
- Test, learn, optimize and repeat
As media supervisor, Erin understands the importance of science in marketing. Testing, learning and optimizing for our clients is how she defines success.