How to use the A/B Testing feature

Learn how to use the A/B Testing feature in LimeSpot

Written by Will Wadman

LimeSpot offers an advanced A/B Testing feature (available to customers on the Pay As You Grow and Premium, aka Enterprise, plans only) that helps you determine which setup or combination of recommendation boxes and content works best for your store.

The A/B Testing feature gives you, the store owner, the ability to create and run multiple Experiences and monitor the results in LimeSpot Analytics to find out which experience performs better in your live store.

To start creating your A/B tests, think of two different LimeSpot setups (or "experiences") that you want to test on your store. For example, suppose you want to test whether the Related Items or the Frequently Bought Together recommendation box performs better on your store's Product Page. The experiences you would need are the following:

  • Experience A: Default with Related Items in the Product Page

  • Experience B: Frequently Bought Together in the Product Page

You also need to decide how much weight to allocate to each experience during the experimentation phase. The weight is an integer from 0 to 100, and together the weights determine the odds of a visitor getting each experience. A weight of 0 disables the experience. The higher the weight, the more likely visitors are to see that experience (e.g. weights of 10 and 10 give 50%-50% odds, weights of 5 and 15 give 25%-75% odds, and so on).

It is very important to note that the weight of each experience determines the chance that a shopper gets that experience. Here is an example:

  • If Experience A has a weight of 100 and Experience B has a weight of 200, Experience A has 33.3% odds of being shown to a visitor and Experience B has 66.7% odds.
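
To make the math concrete, here is a small illustrative sketch (in Python, not part of LimeSpot; the helper name is made up) that converts experience weights into the odds a visitor gets each experience:

    # Illustrative only: LimeSpot computes this for you; the helper name is hypothetical.
    def experience_odds(weights):
        """Convert experience weights into each experience's share of traffic (%)."""
        total = sum(weights.values())
        return {name: round(100 * weight / total, 1) for name, weight in weights.items()}

    print(experience_odds({"A": 10, "B": 10}))    # {'A': 50.0, 'B': 50.0}
    print(experience_odds({"A": 5, "B": 15}))     # {'A': 25.0, 'B': 75.0}
    print(experience_odds({"A": 100, "B": 200}))  # {'A': 33.3, 'B': 66.7}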

A/B Testing is valuable because it gives you data to back up what works best for your store with Personalizer.

Once you know what you would like to test and how much weight each experience should have, follow these steps:

1. Go to your LimeSpot Admin Panel > Conversion > Website Personalization. You will see a button at the top right called A/B TEST.

Note: Make sure the new experiences you create for the A/B test share the same start date and time so that traffic for both experiences starts from scratch. Also set an end date and time on the currently running experience (matching the start date and time of the new experiences) so that the current experience stops exactly when the A/B test experiences start running.

2. Clicking on the A/B TEST button shows your current default setup. Since the default is the only setup at the moment, its weight and live odds are both 100%. Click CREATE to create a new experience, or DUPLICATE the current Default experience to copy all the customizations and settings you have already set up.

3. In the Experience Settings, give the experience a descriptive title and set its weight, start date, and start time. Click APPLY to save the changes.

4. Once you have created all the new experiences, you will see the experiences listed under A/B Testing Experiences. From now on, each experience has its own independent setup.

5. When you click on each experience, you will see which boxes are enabled on which page. Customize the setup by clicking CUSTOMIZE for each experience.

Note: Customizations are disabled while A/B tests are running, so if you need to make any customization changes, you must click STOP NOW to stop the A/B test, then DUPLICATE it and re-run it as a new A/B test after you have made the changes. In this example, the two test experiences are:

  • Experience B: Product Page: Related

  • Experience C: Product Page: Frequently Bought Together

6. Once you are ready to start the A/B test, PUBLISH the changes in each new A/B test experience. This schedules the experiences to start on the specified start date and time.

7. Go to the currently running experience (i.e. Experience A: Default) and click EDIT to set an end date and time for it. The end date and time should match the start date and time of the A/B test experiences created above. This ensures that this experience ends when the A/B test experiences start, and that all the A/B test experiences (i.e. Experiences B and C) start with fresh traffic.
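
As a quick illustration of this scheduling rule (a sketch only; the date and variable names below are made up, not LimeSpot settings), the end time of the default experience should equal the start time of every new test experience:

    from datetime import datetime, timezone

    # Example moment the A/B test should begin (chosen arbitrarily for illustration).
    ab_test_start = datetime(2024, 6, 1, 0, 0, tzinfo=timezone.utc)

    experience_a_end   = ab_test_start  # end date/time set on the current Default experience
    experience_b_start = ab_test_start  # start date/time published for Experience B
    experience_c_start = ab_test_start  # start date/time published for Experience C

    # Experience A stops exactly when B and C begin, so the test experiences
    # start with fresh traffic and there is no overlap with the old setup.
    assert experience_a_end == experience_b_start == experience_c_start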

Let the A/B test run for at least 14 days to get enough traffic and data to compare each experience.
