How Long Should I Run An A/B Test For?

January 30th, 2015

I'd like to think that I have a great deal of patience.

(I mean, I taught my mom how to use an iPhone, so that's got to count for something, right?)

However, when it comes to A/B testing, it seems as though my willingness to wait goes out the window. 

Eager to report on my findings, I find myself refreshing the page incessantly. But truth be told, the key to effective A/B testing is to give the test the time it needs to run its course.

In other words, capping the test prematurely could cost you the validity of the experiment. And conversely, waiting too long to report on the results can skew them just the same.

So when should you conclude an A/B test?

We've dug into our own experiences (and the advice of others) to help answer this question once and for all.

The dangers of concluding too early

Before we get into real numbers, it's important to define what's at stake. 

Oftentimes, marketers will begin to spot what they think is a trend in the data after just a couple of days and close the test.

Let's make one thing clear: "just a couple of days" is rarely enough time to draw any significant conclusions about which variation performed best.

Test results can change very drastically, very quickly. 

To help paint this picture, check out this example from ConversionXL. The following details the results of an A/B test just two days after launch:

[Screenshot via ConversionXL: A/B test results two days after launch]

As you can tell, the control was crushing the variation. There was a 0% chance that the variation would outperform the control.

Here's a look at the results after 10 days:

[Screenshot via ConversionXL: the same A/B test after 10 days]

Looks like that 0% chance turned into a 95% chance really fast. 

Point being, had they concluded the test early on, the results would have been totally invalid. 
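(If you're wondering where a "chance to outperform the control" figure like that comes from, one common approach is a Beta-Binomial model over each variation's visitors and conversions. The sketch below is a rough illustration of that idea, not necessarily the exact math the tool in the screenshots uses, and the numbers plugged in are hypothetical.)

```python
import numpy as np

def prob_variant_beats_control(control_visitors, control_conversions,
                               variant_visitors, variant_conversions,
                               samples=100_000, seed=0):
    """Estimate P(variant's true rate > control's true rate) by drawing from
    Beta(1 + conversions, 1 + non-conversions) posteriors for each version."""
    rng = np.random.default_rng(seed)
    control = rng.beta(1 + control_conversions,
                       1 + control_visitors - control_conversions, samples)
    variant = rng.beta(1 + variant_conversions,
                       1 + variant_visitors - variant_conversions, samples)
    return (variant > control).mean()

# Hypothetical early-days numbers: the variant looks like a sure loser...
print(prob_variant_beats_control(500, 40, 500, 22))      # roughly 0.01
# ...yet with more traffic the same comparison can swing the other way.
print(prob_variant_beats_control(5000, 300, 5000, 340))  # roughly 0.95
```

The exact model matters less than the takeaway: the probability is a function of sample size, so the figure computed after two days and the figure computed after ten days can tell opposite stories.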

How to make the right call

While it's tough to definitively say, "You should run your test for X days," there are a few ways to arrive at a sound ending point for your test.

In fact, The Definitive Guide to Conversion Optimization by Neil Patel and Joseph Putnam defines the following guidelines for determining when to end a test:

  • You should run a test for at least 7 days.
  • There should be a 95% (or higher) likelihood of finding a winning version. 
  • You should wait until there are at least 100 conversions. 
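Taken together, those three guidelines make a simple stopping check. The sketch below assumes a two-variation test and uses a standard two-proportion z-test for the "95% likelihood of a winner" criterion; the thresholds come from the guidelines above, while the function name and the example inputs are just illustrative.

```python
import math

def ready_to_conclude(days_running,
                      control_visitors, control_conversions,
                      variant_visitors, variant_conversions,
                      min_days=7, min_conversions=100, confidence=0.95):
    """Return True only when all three guidelines are satisfied."""
    # 1. Run for at least 7 days.
    if days_running < min_days:
        return False
    # 2. Wait for at least 100 conversions on each variation (a conservative reading).
    if min(control_conversions, variant_conversions) < min_conversions:
        return False
    # 3. Require a 95%+ likelihood of a winner, via a two-proportion z-test.
    p1 = control_conversions / control_visitors
    p2 = variant_conversions / variant_visitors
    pooled = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)
    se = math.sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variant_visitors))
    if se == 0:
        return False
    likelihood_of_winner = math.erf(abs(p2 - p1) / se / math.sqrt(2))
    return likelihood_of_winner >= confidence

# Ten days in, 100+ conversions on each side, and a clear gap between the rates:
print(ready_to_conclude(10, 5000, 300, 5000, 380))  # True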

However, as we mentioned, there's no "one size fits all" approach when it comes to defining a concluding point. 

In a recent "unwebinar" with our friends at Unbounce, our marketing director, John Bonini, revealed that we typically run our tests for anywhere between 30 and 90 days, depending on the amount of traffic we're driving to the variants. For example, if you're driving millions of people to your pages, the time between launch and conclusion will likely be far shorter than for a website that is only bringing in a couple thousand views a day.
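To put some rough math behind that traffic point: if you estimate the sample size a test needs up front, dividing by daily traffic gives a ballpark runtime. The sketch below uses a standard two-proportion sample-size approximation; the traffic and conversion numbers are made up for illustration.

```python
import math

def estimated_test_days(daily_visitors, baseline_rate, relative_lift_to_detect,
                        num_variants=2):
    """Ballpark days to reach a typical sample size (95% confidence, 80% power)
    for detecting a given relative lift over the baseline conversion rate."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift_to_detect)
    z_alpha, z_beta = 1.96, 0.84                      # 95% confidence, 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n_per_variant = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return math.ceil(n_per_variant * num_variants / daily_visitors)

# A page with a couple thousand visits a day, a 3% baseline conversion rate,
# trying to detect a 10% relative lift:
print(estimated_test_days(2_000, 0.03, 0.10))   # roughly two months
# The same test on a page with 50,000 daily visits:
print(estimated_test_days(50_000, 0.03, 0.10))  # a few days
```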

While there are "statistical significance" calculators out there to help you determine whether or not it's time to call it quits, we urge you to proceed with caution. 

I, too, was excited about these tools when I first discovered them. However, after reading into them a bit, I found sources reporting that they often call tests too early, putting the legitimacy of your results at risk.
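A quick way to see that risk is to simulate an A/A test, where both versions are identical, and compare checking a significance calculator once at the end versus peeking at it every day. The sketch below assumes the calculator runs a standard two-proportion z-test at 95% confidence; the traffic numbers are made up.

```python
import math
import numpy as np

def looks_significant(conv_a, n_a, conv_b, n_b, z_crit=1.96):
    """Two-proportion z-test at roughly 95% confidence."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return se > 0 and abs(conv_a / n_a - conv_b / n_b) / se > z_crit

def simulate_aa_tests(runs=2000, days=30, daily_visitors=500, rate=0.05, seed=1):
    """Both versions convert at the same true rate, so every 'winner' is a false alarm."""
    rng = np.random.default_rng(seed)
    peeking_alarms = final_alarms = 0
    for _ in range(runs):
        conv_a = conv_b = visitors = 0
        flagged = False
        for _ in range(days):
            visitors += daily_visitors
            conv_a += rng.binomial(daily_visitors, rate)
            conv_b += rng.binomial(daily_visitors, rate)
            if looks_significant(conv_a, visitors, conv_b, visitors):
                flagged = True  # a daily peek would have declared a winner here
        peeking_alarms += flagged
        final_alarms += looks_significant(conv_a, visitors, conv_b, visitors)
    print(f"False winners when peeking every day: {peeking_alarms / runs:.0%}")
    print(f"False winners when checking only at day {days}: {final_alarms / runs:.0%}")

simulate_aa_tests()
```

Checking once at the end keeps the false alarm rate near the expected 5%, while peeking daily and stopping at the first "significant" reading pushes it several times higher, which is exactly the "calling it too early" trap.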
