I hate A/B Testing

Donald Ng
December 25, 2023

Quick answer

The common frustrations with A/B testing — slow cycles, traffic requirements, and frequent null results — are real, but they reflect poor test prioritization and underpowered setup rather than a fundamental flaw in the methodology. Testing high-impact hypotheses on high-traffic pages with pre-calculated sample sizes produces consistently actionable results; testing minor tweaks on low-traffic pages is what produces the frustration.

Key takeaways

  • Most negative A/B testing experiences come from testing too small (minor cosmetic tweaks) or with too little traffic — not from the experimental method being broken.
  • Build a prioritized hypothesis backlog before testing so you always have a high-quality, high-impact experiment queued rather than running whatever is convenient.
  • Accept that 60–80% of A/B tests return null results — this is statistically normal, and each null result narrows your hypothesis space and prevents future wasted effort.
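The "pre-calculated sample sizes" point deserves a concrete illustration. A minimal sketch, using only the Python standard library and the standard two-proportion power formula (the baseline rate, lift, alpha, and power values below are illustrative, not from the article):

```python
from statistics import NormalDist
from math import ceil, sqrt

def sample_size_per_variant(baseline, mde, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a relative lift of
    `mde` over the `baseline` conversion rate (two-sided test)."""
    p1 = baseline
    p2 = baseline * (1 + mde)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value
    z_beta = NormalDist().inv_cdf(power)           # power term
    pooled = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * pooled * (1 - pooled))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return ceil(n)

# A 3% baseline with a 20% relative lift needs roughly 14,000
# visitors per variant at alpha=0.05 and 80% power.
print(sample_size_per_variant(0.03, 0.20))
```

Running this number before launching a test is what separates an underpowered experiment on a low-traffic page from one that can actually reach a verdict.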

If we're talking about polarizing subjects in the web development world, A/B testing is surely near the top. Some love it for its scientific approach, guaranteeing data-driven decisions and helping convert more website visitors by tweaking the tiniest of details. Others, well… let's just say they're less enthusiastic about it. 

Some programmers have a preconceived disdain for A/B testing. They see it as tedious, uninspiring, and ineffective - a hammer looking for a nail. 

As an advocate for accessible, AI-driven A/B testing, I felt it was important to try to understand their views.

This sparked a deep internet dive that led me right into Reddit's dens of developer dialogue.

The Story: 

I stumbled upon a post that seemed to sum up all these gripes perfectly - a rant titled, simply, “I hate A/B testing”. The author was a developer who’d just been drafted into a huge A/B testing campaign at their company and was far from thrilled. The comment section was lit with a lively, educational debate.

So, I decided to note down their concerns, dissect them, and provide a fresh perspective. Here’s what I came up with:

1. A/B Testing is a Waste of Time...?

The developer argued that A/B tests are time-consuming and yield mostly meaningless results. Hours spent coding just to find minor or even negative results can feel like wasted effort. They went on to suggest that their time would be better spent addressing known bugs or issues.

While it's true that not all tests deliver a massive uplift in conversions, it hardly means they’re a waste of time. Even 'failed' tests can offer significant insights. Knowing what doesn't work with your audience is as valuable as knowing what does. It saves you from using ineffective strategies in the future and offers new paths for exploration.

Free A/B Testing Tool

Run your next A/B test the right way

Visual editor, 15 KB script, GA4-native — and free forever up to 100,000 monthly visitors. No developer required.

✓ Visual editor✓ 15 KB script✓ GA4 integration✓ Free up to 100k visitors
Try Mida free →

2. Work I Did Gets Deleted… Why?

The frustration here lies in the feeling that hard work - the hours spent strategizing, coding, and monitoring - can end up in the trash if the test doesn't make a significant difference.

However, it's essential to remember the real purpose of A/B testing. It’s not about creating the perfect, timeless piece of code. It serves as a sandbox for experimentation, a safe space to try new things without the fear of failure. Every line of code, even when deleted, contributes to building a better customer experience on your platform.

To keep those hours from going down the drain, learn to avoid the most common mistakes in A/B testing.

3. Who Decides What to Test?

The developer’s third criticism was about the decision-making process regarding what gets tested. They felt that test ideas were often born from theories coined by newly qualified marketers, not seasoned developers.

Everyone should have a say here: hypothesis generation works best as an inclusive, collaborative process that draws on multiple areas of expertise. This underscores the importance of communication and cooperation between departments. If everything is driven by a marketing hype train, you're selling your developers' valuable insights short.

4. Fear of Launching Without Testing?

Our disgruntled developer argued that any system which paralyzes decision-making is a flawed one. In their opinion, their company had become overly reliant on testing, without clear guidelines on what to test.

Here, they've hit upon a critical, yet often overlooked, point - the importance of testing strategy. Effective A/B testing isn’t about testing everything, but about strategically focusing on the areas most likely to impact your key performance indicators.
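One common way to make that strategic focus concrete is an ICE-scored hypothesis backlog (Impact, Confidence, Ease, each rated 1-10). A minimal sketch; the hypotheses and scores below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    name: str
    impact: int      # expected effect on the KPI (1-10)
    confidence: int  # evidence behind the idea (1-10)
    ease: int        # how cheap it is to build and run (1-10)

    @property
    def ice_score(self) -> float:
        # Geometric mean penalizes hypotheses weak on any one axis.
        return (self.impact * self.confidence * self.ease) ** (1 / 3)

backlog = [
    Hypothesis("Simplify checkout to one page", impact=9, confidence=6, ease=3),
    Hypothesis("Change CTA button color", impact=2, confidence=3, ease=10),
    Hypothesis("Add social proof near pricing", impact=7, confidence=7, ease=8),
]

# Always pull the highest-scoring hypothesis next,
# instead of whatever is easiest to build.
for h in sorted(backlog, key=lambda h: h.ice_score, reverse=True):
    print(f"{h.ice_score:4.1f}  {h.name}")
```

Note how the easy cosmetic tweak ranks last: a scored backlog is precisely the guardrail against "testing everything" that the developer's company was missing.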

5. Does A/B Testing Reflect Real User Motivations?

An interesting point was raised here: are we ascribing too much meaning to our test results? Can we confidently say why users behave the way they do on our platform?

No, A/B testing isn't a silver bullet. But it provides, at a minimum, a snapshot of our audience's behaviors. And in a digital world, where businesses need to make thousands of decisions every day, those snapshots can be key.


6. Evidence Cherry-Picking: How Do We Safeguard Against Bias?

This raises a valid concern about bias: the human tendency to celebrate successes while downplaying failures. In an A/B testing context, it could manifest as someone taking credit for positive test results while attributing negative results to external factors.

The best way to combat this bias is by fostering a culture that values data above theories and egos. At Mida, we champion this ideal. By using an automated, AI-driven A/B Testing tool, we let cold, hard numbers speak for themselves.
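"Letting the numbers speak" in practice means committing to a significance threshold before the test runs and reporting the p-value whichever way it falls. A minimal sketch using a pooled two-proportion z-test (the visitor and conversion counts are made up for illustration):

```python
from statistics import NormalDist
from math import sqrt

def two_proportion_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two observed
    conversion rates, via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Fix alpha (e.g. 0.05) before launch; a p-value above it is a
# null result, no matter how promising the raw lift looks.
p = two_proportion_pvalue(120, 4000, 151, 4000)
print(f"p = {p:.3f}")
```

In this example the variant's raw lift looks healthy, yet the p-value lands just above 0.05 - exactly the kind of result that tempts cherry-picking, and exactly why the threshold must be set in advance.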

7. Marketing Ruins Everything… Including A/B Testing?

While some of us appreciate pop-ups and push notifications, others find them irritating. The same goes for A/B testing—some view it as another intrusive marketing tactic that erodes user experience.

Marketing, like any tool, is as harmful or helpful as we make it. A/B testing doesn't ruin user experiences; it's poorly conceived tests that do. An AI-driven A/B testing platform like Mida can be instrumental in intelligently designing tests, minimizing the intrusiveness, and enhancing user experience.

Conclusion: 

While A/B testing might seem tedious or even unnecessary to some developers, the reality is far from it. When done right, A/B testing is a scientific, data-driven method of understanding users' preferences and enhancing their web experience.

That said, it’s important to remember that A/B testing is just a tool, and like all tools, it must be used responsibly. Careful design of tests, a focus on meaningful results, and an open dialogue among all stakeholders are key to success. 

With Mida, we're making A/B testing faster, smarter, and more user-friendly than ever before. Trust the data. Accept both victories and non-victories. And most importantly – keep experimenting!

Run Your First A/B Test in Minutes — 100,000 MTU Free

Visual editor, AI-powered variant creation with MidaGX, GA4 integration, and more. No credit card required, no time limit.
