A/B Testing Overview

A/B Testing in Account Editor lets you test two versions of an upsell offer — comparing elements like discounts, layouts, or copy — so you can identify what actually improves conversion and revenue performance.


Instead of guessing what works best, A/B Testing gives you data-backed answers to refine your upsell strategy continuously.



🧠 Why A/B Testing Matters


A/B testing helps you:


  • Validate new offers before rolling them out widely.
  • Improve conversion rates by testing small layout or text changes.
  • Discover which surfaces (Checkout, Thank You, After Checkout, or Order Status) generate the highest lift.
  • Build a habit of data-driven optimization rather than assumptions.



⚙️ How It Works


When you create a new A/B test, Account Editor randomly splits your customers into two equal groups:


  • Control Group (A): Sees your current offer setup.
  • Variant Group (B): Sees the new version you’re testing.
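Account Editor handles the split automatically, but as an illustration of how a stable 50/50 assignment can work, here is a minimal sketch (the function name and IDs are hypothetical, not Account Editor's actual API): hashing the customer and test IDs together gives each customer a fixed group, so the same shopper always sees the same version of a given test.

```python
import hashlib

def assign_group(customer_id: str, test_id: str) -> str:
    """Deterministically assign a customer to group A or B.

    Hashing the customer and test IDs together yields a stable,
    roughly 50/50 split: the same customer always lands in the
    same group for a given test.
    """
    digest = hashlib.sha256(f"{test_id}:{customer_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same customer always sees the same version:
assert assign_group("cust_42", "test_1") == assign_group("cust_42", "test_1")
```

A deterministic hash (rather than a coin flip per page view) matters because a customer who saw version B yesterday should not see version A today; that would contaminate the results.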


The app tracks both versions automatically and compares performance metrics like:


  • Revenue
  • Conversion Rate
  • Click-Through Rate
  • Average Order Value (AOV)


After enough data is collected, the dashboard shows which version performs better — marking it as the Winner.
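The dashboard computes these metrics for you, but a short sketch makes their definitions concrete. All numbers below are made up for illustration, and the function is hypothetical, not part of Account Editor:

```python
def summarize(visitors: int, clicks: int, orders: int, revenue: float) -> dict:
    """Compute the comparison metrics for one variant from raw counters."""
    return {
        "conversion_rate": orders / visitors,        # orders per visitor
        "click_through_rate": clicks / visitors,     # clicks per visitor
        "aov": revenue / orders if orders else 0.0,  # average order value
        "revenue": revenue,
    }

# Hypothetical results after running a test:
a = summarize(visitors=1_000, clicks=180, orders=60, revenue=2_400.0)
b = summarize(visitors=1_000, clicks=220, orders=75, revenue=2_850.0)

# With conversion rate as the success metric, B wins here
# (7.5% vs. 6.0%), even though its AOV is slightly lower.
winner = "B" if b["conversion_rate"] > a["conversion_rate"] else "A"
```

Note that the "winner" depends on which metric you chose to optimize: in the example above, variant B converts better while variant A has the higher average order value.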



📍 Accessing A/B Testing


  1. Go to your Account Editor dashboard.
  2. Navigate to the Upsell Engine → A/B Testing tab.
  3. You’ll land on the A/B Testing Dashboard, which summarizes all active and past tests.



📊 A/B Testing Dashboard Overview


The dashboard helps you manage your tests at a glance. You’ll see:


  • Active Tests – Number of currently running tests.
  • Completed – Total number of finalized or stopped tests.
  • Upsell Added – How many upsell items were added through your tests.
  • Revenue – The total revenue influenced by your A/B tests.


If you haven’t created any tests yet, the page will show:


“No A/B tests yet – Start testing different variations of your offers to optimize performance.”


Click “Create your first test” to begin.



🧩 Best Practices Before Creating a Test


  1. Always test one variable at a time (e.g., discount %, layout, or button text).
  2. Let your test run for at least 2 weeks to gather valid results.
  3. Avoid editing the same offer while it’s being tested.
  4. Pick a clear metric to measure success — such as conversion rate or revenue.
  5. Choose a surface with enough traffic to collect meaningful data.
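To see why running a test long enough (points 2 and 5) matters, consider a standard two-proportion z-test, a common way to check whether a conversion-rate difference is real or just noise. This is general statistics, not Account Editor's internal method, and the numbers are hypothetical:

```python
import math

def z_test_two_proportions(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: is B's conversion rate reliably different from A's?

    conv_* are converted-customer counts, n_* are visitor counts.
    Returns the z score; |z| > 1.96 roughly corresponds to 95% confidence.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# A 6.0% vs. 7.5% split after 1,000 visitors per group looks like a win,
# but the z score is still below 1.96, so the lift could be noise.
z = z_test_two_proportions(60, 1000, 75, 1000)
```

With these sample numbers the apparent 1.5-point lift is not yet statistically significant, which is exactly why cutting a test short, or running it on a low-traffic surface, can crown the wrong winner.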



✅ Example Use Cases


  • Compare two discount levels – “15% off” vs. “20% off”
  • Test CTA buttons – “Add to Cart” vs. “Get This Deal”
  • Test layouts – Single product block vs. bundle layout
  • Test timing – Checkout vs. After Checkout surface



💡 Tip:


A/B Testing doesn’t just tell you which version wins — it helps you understand why.

Look at click-through and add-to-cart differences between variants to uncover behavioral trends.



🧾 Next Article → Creating a New A/B Test


In the next guide, we’ll go step-by-step through:


  • Setting up your test name and hypothesis
  • Choosing success metrics
  • Selecting surfaces and offers to test
  • Understanding how traffic is split


This will help you confidently create your first A/B Test in Account Editor.

Updated on: 27/11/2025
