How to A/B Test Product Bundle Discounts

Learn how to A/B test the performance of an Experience featuring a Discount campaign against another Experience without a Discount campaign.

Written by Will Wadman
Updated over a week ago

You can use the A/B testing feature to compare experiences and decide which delivers superior performance. The objective is to pinpoint modifications that enhance user engagement, conversions, or other desired outcomes.

One possible scenario is to use LimeSpot's A/B testing feature to compare the performance of an experience featuring a Discount campaign against an experience without a Discount campaign. This allows you to identify which approach yields superior results.

Once you know what you would like to test, you can follow the steps below.

1. Go to your LimeSpot Admin Panel > Conversion > Website Personalization. You will see a button at the top right called A/B TEST.


Make sure the new experiences you create for the A/B test share the same start date and time, so that traffic to both starts from scratch. Also set an end date and time on the currently running experience, matching the start date and time of the new experiences, so that the current experience stops at the exact moment the A/B test experiences begin.

2. Click the A/B TEST button to see your current default setup. Since the Default is the only experience at the moment, its weight and live odds are both 100%. Click CREATE to create a new experience, or DUPLICATE the current Default experience to copy all of the customizations and settings you have already configured.
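To build intuition for how weights translate into live odds, here is a minimal sketch of weighted traffic assignment. The function name and numbers are illustrative only, not LimeSpot's actual implementation: each experience's live odds are its weight divided by the sum of all weights.

```python
import random

def assign_experience(weights):
    """Pick an experience at random, proportionally to its weight.

    `weights` maps experience name -> weight. An experience's live
    odds are weight / sum(weights), so a lone experience gets 100%.
    """
    total = sum(weights.values())
    roll = random.uniform(0, total)
    cumulative = 0.0
    for name, weight in weights.items():
        cumulative += weight
        if roll <= cumulative:
            return name
    return name  # fallback for floating-point edge cases

# A single experience always wins: 100% live odds.
assign_experience({"Default": 100})  # always "Default"

# Two experiences with equal weights split traffic roughly 50/50.
weights = {"B: Related": 50, "C: Frequently Bought Together": 50}
counts = {name: 0 for name in weights}
for _ in range(10_000):
    counts[assign_experience(weights)] += 1
```

After the loop, each experience receives close to half of the 10,000 simulated visitors, which is why equal weights produce an even A/B split.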

3. In the Experience Settings, give the testing experience a descriptive title, then set its weight, start date, and start time. Click APPLY to save the changes.

4. Once you have created all the new experiences, you will see the experiences listed under A/B Testing Experiences. From now on, each experience has its own independent setup.

5. Click on each experience to see which boxes are enabled on which pages, then customize the setup by clicking CUSTOMIZE for each experience.


Customizations are disabled while an A/B test is running. If you need to make customization changes, click STOP NOW to stop the A/B test, make the changes, and then DUPLICATE it to run a new A/B test.

For example:

  • Experience B: Product Page: Related

  • Experience C: Product Page: Frequently Bought Together

6. Once you are ready to start the A/B test, click PUBLISH in each new A/B test experience. This schedules the experiences to start at the specified start date and time.

7. Go to the currently running experience (e.g., Experience A: Default) and click EDIT to set an end date and time. This should match the start date and time of the A/B test experiences created above, so that the current experience ends exactly when the A/B test experiences (e.g., Experiences B and C) start with fresh traffic.

We suggest that you let the A/B test run for at least 14 days to get enough traffic and data to compare each experience.
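Once the test has run, you will want to judge whether the difference in conversion rates between experiences is real or just noise. One common approach, shown here as an illustrative sketch with hypothetical numbers (not data from LimeSpot), is a two-proportion z-test:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_* = number of conversions, n_* = number of visitors.
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the complementary error function.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical results after a 14-day run:
# Experience B: 180 conversions out of 5,000 visitors (3.6%)
# Experience C: 230 conversions out of 5,000 visitors (4.6%)
z, p = two_proportion_z_test(conv_a=180, n_a=5000, conv_b=230, n_b=5000)
```

With these hypothetical numbers the p-value comes out below 0.05, so the difference would be considered statistically significant; with smaller samples or a smaller gap, the same test would tell you to keep the experiment running.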

If you need assistance setting this up, or if you are not currently a Premium subscriber but would like to use this feature, reach out to [email protected].
