Analyzing Results

Running the experiment is just the beginning. Understanding and acting on the results is where the real value lies. Here's how to analyze your A/B test data.

Key Metrics

For each variant, you'll see the following metrics (a short calculation sketch follows this list):

  • Impressions - Number of times the variant was shown
  • Clicks - Number of clicks on the variant
  • CTR - Click-through rate (clicks/impressions)
  • Confidence - Statistical confidence that the observed difference between variants is real rather than random chance
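
To make these definitions concrete, here is a minimal Python sketch of the CTR calculation. The variant figures below are invented purely for illustration; your results page reports the real numbers for each variant.

    def ctr(clicks: int, impressions: int) -> float:
        """Click-through rate: clicks divided by impressions."""
        return clicks / impressions if impressions else 0.0

    # Illustrative figures only - read the real values from your results page.
    variant_a = {"impressions": 1200, "clicks": 84}
    variant_b = {"impressions": 1180, "clicks": 101}

    print(f"A: {ctr(variant_a['clicks'], variant_a['impressions']):.2%}")  # 7.00%
    print(f"B: {ctr(variant_b['clicks'], variant_b['impressions']):.2%}")  # 8.56%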

Reading the Results

The results page shows a comparison between the variants (a sketch of how these figures can be computed follows the list). Look for:

  • Which variant has a higher CTR
  • The relative improvement percentage
  • The confidence level (aim for 95%+)
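
This page doesn't specify exactly how the confidence figure is computed; one common approach for comparing two CTRs is a two-proportion z-test. The sketch below assumes that approach and reuses the illustrative figures from the previous example, so treat it as a way to sanity-check the dashboard rather than a definition of it.

    import math

    def compare(clicks_a, imps_a, clicks_b, imps_b):
        """Relative CTR improvement of B over A, plus a confidence estimate
        from a two-sided two-proportion z-test (one common method)."""
        ctr_a, ctr_b = clicks_a / imps_a, clicks_b / imps_b
        lift = (ctr_b - ctr_a) / ctr_a  # relative improvement of B over A

        # Pooled standard error under the null hypothesis of equal CTRs.
        pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
        z = (ctr_b - ctr_a) / se

        # Two-sided p-value from the normal CDF; express confidence as 1 - p.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return lift, 1 - p_value

    lift, confidence = compare(84, 1200, 101, 1180)
    print(f"Relative improvement of B over A: {lift:+.1%}")
    print(f"Confidence: {confidence:.1%}")

With these illustrative numbers the relative improvement is about +22%, but the confidence comes out around 84-85%, below the 95% bar, so you would keep the test running.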

Declaring a Winner

When one variant clearly outperforms the other with high confidence, you can declare it the winner. This automatically updates your live link to use the winning variant.

Warning: Declaring a winner before reaching 95% confidence may lead to false conclusions. Be patient!
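
If you script any part of your analysis, the rule above can be encoded directly. The helper below is hypothetical and not part of the product's API; it simply combines the two conditions this section names: a higher CTR and at least 95% confidence.

    def should_declare_winner(ctr_a: float, ctr_b: float, confidence: float,
                              threshold: float = 0.95) -> str | None:
        """Name the winning variant only when its CTR is higher AND confidence
        meets the threshold; otherwise return None and keep the test running."""
        if confidence < threshold or ctr_a == ctr_b:
            return None
        return "B" if ctr_b > ctr_a else "A"

    # Using the illustrative figures from the earlier sketches:
    print(should_declare_winner(0.070, 0.086, confidence=0.845))  # None - be patient
    print(should_declare_winner(0.070, 0.086, confidence=0.97))   # B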

What If Results Are Inconclusive?

If neither variant shows a clear advantage after sufficient traffic, the test is inconclusive. This means both versions perform similarly - which is still valuable information! Keep the original or try a more dramatic change in your next test.

Learning from Results

Document what you learn from each test. Over time, you'll build a knowledge base of what works for your audience. Common learnings include:

  • Action-oriented titles often outperform descriptive ones
  • Specific numbers can increase credibility
  • Shorter titles may work better on mobile
  • Urgency words can boost or hurt CTR depending on audience