How to A/B test bundle discounts

How to A/B test the performance of an experience featuring a discount campaign against an experience with a different discount campaign, or no discount campaign at all

Written by Will Wadman
Updated this week

You can use the A/B testing feature to compare experiences and decide which delivers superior performance. The objective is to identify which modifications can enhance user engagement, conversions, or other desired outcomes.

One possible scenario is to use LimeSpot's A/B testing feature to compare the performance of an experience featuring a discount campaign against an experience with a different discount campaign, or with no discount campaign at all.

Once you know what you would like to test, you can follow the steps below.

  1. Go to Conversion > Website Personalization and click A/B TEST.

  2. You will see your current default experience. Since this default experience is the only one available at the moment, its weight and live odds are both 100%.



    Weights and Live Odds

    • Weight indicates how much traffic an experience should receive relative to the weights of the other experiences.

    • Live Odds represent the percentage of traffic each experience will actually receive. Examples, assuming 3 experiences:

      • If all 3 experiences have a weight of 10 and all experiences are live, each will receive 1/3 of the traffic.

      • If all 3 experiences have a weight of 10 but only 2 are live, each will receive 50% of the traffic.

      • If experience 1 has a weight of 20, the 2 other experiences each have a weight of 10, and all experiences are live, experience 1 will receive 50% of the traffic while the other 2 will each get 25% (the sketch after this list reproduces these calculations).
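
    To make the weight-to-live-odds arithmetic concrete, here is a minimal Python sketch (illustrative only; the experience names and weights are made up, and this is not LimeSpot code). Each live experience's live odds are simply its weight divided by the total weight of all live experiences:

    # Minimal sketch of how live odds follow from weights (illustrative only).
    # Experiences that are not live are excluded before percentages are computed.
    def live_odds(experiences):
        """Map each live experience to its share of traffic, in percent."""
        total = sum(weight for _, weight, is_live in experiences if is_live)
        return {name: 100 * weight / total
                for name, weight, is_live in experiences if is_live}

    # Second example above: equal weights, but only 2 of 3 experiences live.
    print(live_odds([("A", 10, True), ("B", 10, True), ("C", 10, False)]))
    # {'A': 50.0, 'B': 50.0}

    # Third example above: weights 20, 10, 10, all live.
    print(live_odds([("A", 20, True), ("B", 10, True), ("C", 10, True)]))
    # {'A': 50.0, 'B': 25.0, 'C': 25.0}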


  3. Click CREATE to create a new experience, or DUPLICATE the current Default experience to copy all the customizations and settings you have already set up.

  4. In the Experience Settings, give the testing experience a descriptive Title and set its Weight, Start Date, and Start Time. Click APPLY to save the changes.





    Note: Create all new experiences for the A/B test with the same start date and time so that traffic on each experience starts from scratch. Also remember to set an end date and time on the currently running experience, identical to the start date and time of the new experiences, so that it stops exactly when the A/B test experiences start running.
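
    Because each experience's dates are set separately, it is easy to misalign them. The sketch below (a hypothetical helper, not part of LimeSpot; the dates are made up) expresses the rule from the note: all new experiences share one start datetime, and the current experience ends at exactly that datetime.

    from datetime import datetime

    # Hypothetical schedule check for the note above (illustrative only):
    # every new A/B test experience starts together, right as the current
    # experience ends.
    def schedule_is_valid(current_end, new_starts):
        return len(set(new_starts)) == 1 and new_starts[0] == current_end

    cutover = datetime(2024, 6, 1, 9, 0)  # made-up cutover time
    print(schedule_is_valid(cutover, [cutover, cutover]))                      # True
    print(schedule_is_valid(cutover, [cutover, datetime(2024, 6, 1, 10, 0)]))  # False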



  5. Once you have created all the new experiences, you will see the experiences listed under A/B Testing Experiences. From now on, each experience has its own independent setup.

  6. When you click on each experience, you will see which boxes are enabled on which pages. Go ahead and customize the setup by clicking CUSTOMIZE for each experience.




    Note: Customization is disabled while an A/B test is running. If you need to make customization changes, click STOP NOW to stop the A/B test, make your changes, and then DUPLICATE it to run a new A/B test.


  7. Once you are ready to start the A/B test, PUBLISH the changes in each new A/B test experience. This schedules the experiences to start on the start date and time specified.

  8. Go to the currently running experience (i.e., Experience A: Default) and click EDIT to set an end date and time for this experience.


    The end date and time should match the start date and time of the A/B test experiences created above. This ensures that this experience ends the moment the A/B test experiences (i.e., Experiences B and C) start, so they all begin with fresh traffic.

We suggest letting the A/B test run for at least 14 days to gather enough traffic and data to compare the experiences.
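
Once the test has run its course, you will want to judge whether the difference in conversions between experiences is real or just noise. One common way to do that is a two-proportion z-test on each pair of experiences. This is standard statistics, not a LimeSpot feature, and the visitor and conversion counts below are made up:

import math

# Illustrative two-proportion z-test for comparing the conversion rates
# of two experiences once the A/B test has finished.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference in rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up numbers: Experience B converted 260 of 5,000 visitors,
# Experience C converted 205 of 5,000.
z, p = two_proportion_z(260, 5000, 205, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p (e.g. < 0.05) suggests a real difference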

If you need any assistance setting this up, or if you are not currently a Premium subscriber but would like access to this feature, reach out to [email protected].

