A/B Test Statistical Significance Calculator

Enter your visitor and conversion numbers below to find out whether your test result is statistically significant.

Number of visitors on this page
Number of overall conversions

Conversion rate
A: 8%
B: 12%

Your test result

Test "B" converted 50% better than Test "A" (12% vs. 8%).

You can be 95% confident that this result reflects a real improvement rather than random chance.

Your A/B test is statistically significant!
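The verdict above can be reproduced with a standard two-proportion z-test. A minimal sketch; the visitor counts are hypothetical (1,000 per variant), chosen so the 8% and 12% rates shown above correspond to 80 and 120 conversions:

```python
from math import erf, sqrt

def ab_test_significance(visitors_a, conv_a, visitors_b, conv_b):
    """Pooled two-proportion z-test with a two-sided p-value."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts matching the 8% vs. 12% rates shown above
z, p = ab_test_significance(1000, 80, 1000, 120)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at 95% if p < 0.05
```

With these numbers the p-value comes out well under 0.05, so the test clears the 95% confidence bar.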

Frequently Asked Questions

Why does statistical significance matter?

Without statistical significance, you cannot tell whether your result is a real effect or random variation. Declaring a winner too early — before reaching significance — is called "peeking" and leads to false positives that can hurt your business when you ship the losing variant.

What does 95% confidence mean?

Strictly speaking, 95% confidence means that if there were truly no difference between the variants, a gap at least this large would appear less than 5% of the time through random chance alone. It does not mean there is a 95% probability that "B" beats "A"; it means the observed data would be surprising if the two variants actually performed the same.
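One way to see what the 5% threshold buys you is to simulate many A/A tests, where both "variants" share the same true rate, and count how often the test falsely declares significance. A rough Monte Carlo sketch with all numbers chosen purely for illustration:

```python
import random
from math import erf, sqrt

def p_value(n_a, c_a, n_b, c_b):
    """Two-sided p-value from a pooled two-proportion z-test."""
    p_pool = (c_a + c_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (c_b / n_b - c_a / n_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

random.seed(0)
N, TRUE_RATE, TRIALS = 1000, 0.10, 1000  # illustrative numbers
false_positives = 0
for _ in range(TRIALS):
    # Both arms draw from the same true rate, so any "significant"
    # result here is a false positive.
    conv_a = sum(random.random() < TRUE_RATE for _ in range(N))
    conv_b = sum(random.random() < TRUE_RATE for _ in range(N))
    if p_value(N, conv_a, N, conv_b) < 0.05:
        false_positives += 1
print(false_positives / TRIALS)  # hovers near 0.05
```

The false-positive fraction lands near 5%, which is exactly the error rate the 95% confidence level promises to cap.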

Should I trust the result if my sample size is small?

Be careful. A small sample can still produce a "significant" result by chance. Always use the sample size calculator before starting your test to ensure you collect enough data. A result that reaches significance with only 200 visitors is suspect; one with 5,000 visitors is far more reliable.
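The required sample size can be estimated up front with the standard formula for a two-proportion test. A sketch, assuming 95% confidence and 80% power (the z-values 1.96 and 0.84 correspond to those conventional choices):

```python
from math import sqrt, ceil

def sample_size_per_variant(p_base, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant.

    p_base: baseline conversion rate, e.g. 0.08
    relative_lift: smallest relative lift worth detecting, e.g. 0.5 for +50%
    z_alpha: 1.96 for a two-sided test at 95% confidence
    z_beta: 0.84 for 80% power
    """
    p1 = p_base
    p2 = p_base * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Detecting a lift from 8% to 12% (a 50% relative improvement)
n = sample_size_per_variant(0.08, 0.5)
print(n)  # roughly 900 visitors per variant
```

Note that detecting smaller lifts requires dramatically more traffic: halving the detectable lift roughly quadruples the required sample size.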

My test is not significant — should I keep running it?

If you have already hit your target sample size and the result is not significant, the most likely explanation is that the change had little or no real effect. Extending the test hoping for significance is a form of p-hacking and leads to unreliable conclusions. Consider redesigning the test with a bolder change instead.
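The danger of extending a test until it turns significant can be demonstrated by simulation: run an A/A test (no real difference) but check the p-value after every batch of traffic, stopping as soon as any check dips below 0.05. A rough sketch with illustrative numbers:

```python
import random
from math import erf, sqrt

def p_value(n_a, c_a, n_b, c_b):
    """Two-sided p-value from a pooled two-proportion z-test."""
    p_pool = (c_a + c_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (c_b / n_b - c_a / n_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

random.seed(1)
PEEKS, BATCH, TRUE_RATE, TRIALS = 10, 200, 0.10, 500  # illustrative
ever_significant = 0
for _ in range(TRIALS):
    conv_a = conv_b = 0
    for peek in range(1, PEEKS + 1):
        # Both arms share the same true rate: no real effect exists.
        conv_a += sum(random.random() < TRUE_RATE for _ in range(BATCH))
        conv_b += sum(random.random() < TRUE_RATE for _ in range(BATCH))
        n = peek * BATCH
        if p_value(n, conv_a, n, conv_b) < 0.05:
            ever_significant += 1  # stopped early on a false positive
            break
print(ever_significant / TRIALS)  # well above the nominal 5%
```

Peeking ten times inflates the false-positive rate several-fold above the nominal 5%, which is why the stopping rule must be fixed before the test starts.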

More free tools

The A/B testing platform for people who care about website performance

Mida is 10X faster than anything you have ever considered. Try it yourself.
