Modern Marketing Confessional: The Real Truth Behind the A/B Test

Which would you rather do – go into market with something you think will work, or go into market with something you know will work?

It doesn’t take a rocket scientist to figure out the answer. It’s obvious. Go into market with something you know will work. Great. I’m glad all of us modern marketers agree on this fundamental fact.

Only, we’re not…

Some of us are way more fixated on getting something into market, and less concerned with the most rudimentary question – Is “the thing” even right?

And here’s why that’s a problem…

We can gather all the information possible on personas, messaging, and content offers; but in the end, we’re only making educated guesses based on everything we already know. What about the things we don’t know?

There is a way to learn “all the things.”

And it’s called A/B testing. Running A/B tests on emails, ads, and landing pages is the only way to confirm which of our hypotheses will actually produce real results. And let’s be honest, without real results, there is no real revenue.

Yeah, but…

Go ahead. Hit me with all the “buts” you have.

  • but how do I know what to test?
  • but how much will it cost?
  • but how long will it take?
  • but what if we don’t learn anything?
  • but what if we do learn something?

Listen, the BUT stops here. Being the diligent modern marketer I am, I researched this topic to exhaustion. I scoured the web, talked to peers, made friends with subject matter experts, and reviewed our own internal data. And I learned a lot. So sit back, I’m about to write all about it.

Data doesn’t lie.

At ID, we’re driven by continuous improvement – it’s part of our methodology. You can’t improve a campaign without collecting and interpreting performance data on a regular basis. That’s why we build tracking and measurement into all our revenue success plans from the get-go. It’s also why we test and analyze results before, during, and after putting a campaign into market. And when data tells us something about a campaign, we listen, adjust, and set new benchmarks.

Here’s what NOT to do.

You can’t just pick and choose random things to test. Without a strategic reason, you won’t learn much, if anything at all. Here are some of the most common A/B testing pitfalls:

  • Testing too many variables at a time. “I know! Let’s test our landing page copy, hero image, and CTA button text – all at the same time!” No, really. Let’s not.
  • Forgetting to establish a benchmark. “Good news! Forty percent of users clicked on this button!” Great. How many clicked before we changed the color of it?
  • Failing to establish statistical significance. “Hey! 100% of users filled out the new form!” (…out of 1 user). (A quick way to sanity-check this is sketched right after this list.)
  • Testing things that don’t matter to your target audience. “Should the font-size of our footer boilerplate be 6 points or 8?” Really?
  • Rushing into market. “What we learned after going into market is our prospects aren’t interested at all in the content we created.” No comment.
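
On that statistical-significance pitfall: here’s a minimal sketch of how you might sanity-check a result before declaring a winner. It uses a standard two-proportion z-test; the function name and every number below are hypothetical, so treat it as a starting point, not a verdict:

```python
# Minimal sketch: a two-sided, two-proportion z-test for an A/B test.
# The function and all numbers below are hypothetical -- plug in your own counts.
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # normal CDF, two-sided
    return z, p_value

# A 40% vs. 32% "win" looks huge, but with 100 users per arm it isn't significant.
z, p = ab_significance(conv_a=32, n_a=100, conv_b=40, n_b=100)
print(f"z = {z:.2f}, p = {p:.3f}")  # z = 1.18, p = 0.239 -> keep collecting data
```

If the p-value lands above your threshold (0.05 is a common choice), the honest answer is “we don’t know yet,” not “version B won.”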

Many companies spend days, weeks, and months creating and launching an integrated marketing campaign. Everything goes into market with the hopes all the hard work will pay off. And then the data comes in. Sometimes, companies get it right, but sometimes, they don’t. And when it’s wrong, crucial parts of the business suffer – budgets, sales, and confidence in the marketing department.

What TO do.

I recently sat down with fellow modern marketer and A/B testing expert Brian Massey, managing partner of Conversion Sciences. We chatted about all things A/B testing, and here are the to-dos I took away from the conversation:

Challenge cultures. Commit to being a bona fide change agent. Make it your mission to push your colleagues past the habit of building things on opinion. Everything you present to your higher-ups should be based on data, not best guesses. Trust me, they’ll thank you for it.

Start early. In-market data is expensive, but pre-market data is cheap. Take advantage of today’s inexpensive tools to test on smaller audience sizes first. Measure and analyze, and then let the data direct where your lead generation resources and media spend should go next.
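
A related question is how small that first audience can be. As a rough illustration (this is the standard two-proportion sample-size calculation, not a formula from Brian, and the rates are made up), here’s a sketch:

```python
# Minimal sketch: visitors needed per variant before a lift is detectable.
# Standard sample-size formula for two proportions; all rates here are made up.
from math import ceil

def sample_size_per_arm(base_rate, lift, z_alpha=1.96, z_beta=0.84):
    """Visitors per arm for roughly 95% confidence and 80% power."""
    p1, p2 = base_rate, base_rate + lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / lift ** 2)

# Small lifts need big audiences; big lifts can be spotted cheaply.
print(sample_size_per_arm(0.05, 0.02))  # 5% -> 7%:  ~2,207 visitors per arm
print(sample_size_per_arm(0.05, 0.05))  # 5% -> 10%:  ~432 visitors per arm
```

In other words, the smaller the lift you care about, the more (and pricier) traffic you need – which is exactly why cheap pre-market tests are worth running first.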

Seek guidance. I shared five things you shouldn’t do when A/B testing, but there are countless others. Whether this is your first venture into A/B testing or not, working with a seasoned veteran will save you and your team time, energy, and dollars.

Rank hypotheses. No one has the time or money to test everything. Don’t go down an A/B testing rabbit hole. Whittle your list down by weighting each possible experiment by importance, effort, and impact, based on earlier experience.
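
To make that weighting concrete, here’s a minimal sketch of one way to score and rank candidate tests. The importance-times-impact-over-effort formula and every number below are invented for illustration; use whatever scale your team trusts:

```python
# Minimal sketch: rank candidate A/B tests by importance, impact, and effort.
# The scoring formula and all scores are hypothetical -- adapt them to your team.

experiments = [
    # (name, importance 1-5, impact 1-5, effort 1-5)
    ("New hero headline",        5, 4, 1),
    ("Shorter lead-gen form",    4, 5, 3),
    ("Footer font size",         1, 1, 1),   # low stakes: probably not worth a test
    ("Personalized sender name", 3, 3, 1),
]

def score(importance, impact, effort):
    # Reward importance and impact; penalize effort.
    return importance * impact / effort

for name, *factors in sorted(experiments, key=lambda e: score(*e[1:]), reverse=True):
    print(f"{score(*factors):5.1f}  {name}")
```

Run the top of the list first, and revisit the scores as each experiment teaches you something new.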


Educate everyone. Apply your findings to the larger campaign, but don’t forget to share them with the entire team – both in-house and outside. Individuals who didn’t work on the specific campaign can take what you’ve learned and harness the information for future projects.

PROOF.

A/B testing has been shown to generate up to 30-40% more leads for B2B sites and 20-25% more leads for ecommerce sites.1

Exhibit A: Viva Las Vegas

Vegas.com ran an A/B test comparing a pared-down mobile site with fewer words and less functionality against its full desktop version. When the results came in, the mobile version saw a 16% increase in traffic and a 22% reduction in bounce rate.2

Exhibit B: Real People, Real Results

HubSpot conducted an A/B test to see whether open rates would go up or down based on the sender name, pitting “HubSpot” as the sender against a real person’s name. The results? The emails sent using a real person’s name earned 0.53% more opens, 0.23% more clicks, and 131 more leads.3

Go from “I think” to “I know.”

Whether you’re trying to find new ways to increase conversions or ensuring that you’re already doing things the right way, A/B testing should never be an afterthought. It’s too important to the success of a campaign (and your career) to skip. Without it, you’re the one to blame for being wrong. With it, you’ll be praised for being right.

Ready to become the next A/B testing prodigy? We have a solid line-up of gurus to teach you the ways. Contact us today to start experimenting.

Nikki Flores

An avid learner, Nikki possesses an innate ability to transform complex concepts into easily digestible content. With more than a decade of copywriting, strategy, and digital marketing under her belt, Nikki’s developed a passion for connecting the dots between meaningful marketing messages and increased revenue.
