
How is the winning version of an A/B split test campaign chosen?

If you are eager to increase the impact of your campaigns, Moosend's A/B split test campaigns are the way to go! A/B split testing lets you fine-tune every aspect of your newsletters based on what your mailing list is actually interested in and how your subscribers respond to each element of your email marketing.

A/B split testing is an invaluable asset! In essence, it's a game of interaction between you and your subscribers: you create two versions of a campaign that differ in one specific aspect, you test which variation your subscribers respond to best, and then you send the winning version to the rest of your mailing list.

It's as simple as that! Let's see how it's done:

A. What is the winning version?

When creating an A/B split test campaign, you are asked to set the criteria by which the winning version is chosen. Your recipients are divided into two groups: a "test" group and a "winner" group. The test group consists of the recipients who receive one of the two variations of your campaign; their reaction determines which version wins. This is the A/B split test!

The recipients in the "winner" group receive only the winning campaign, after the A/B split test has been completed. All you have to do is decide what percentage of your mailing list to use as the test group and which criterion defines the winning campaign's performance. There are two options for the criterion, described in the next section.
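If it helps to picture the split in concrete terms, here is a minimal sketch in Python of how a mailing list could be divided into a test group and a winner group. It is purely illustrative: the function and parameter names are made up for this example and are not part of Moosend's actual product or API.

```python
import random

def split_mailing_list(recipients, test_percentage):
    """Illustrative only: split a mailing list into a test group
    (half gets version A, half gets version B) and a winner group
    (receives only the winning version later).

    `recipients` and `test_percentage` are hypothetical names chosen
    for this sketch, not part of Moosend's interface.
    """
    shuffled = recipients[:]
    random.shuffle(shuffled)  # pick the test group at random

    test_size = round(len(shuffled) * test_percentage / 100)
    test_group = shuffled[:test_size]
    winner_group = shuffled[test_size:]

    # Split the test group in half: one half gets version A, the other version B.
    half = len(test_group) // 2
    version_a_recipients = test_group[:half]
    version_b_recipients = test_group[half:]

    return version_a_recipients, version_b_recipients, winner_group


# Example: 1,000 subscribers with a 10% test group
# -> roughly 50 get version A, 50 get version B, 900 wait for the winner.
a, b, rest = split_mailing_list([f"user{i}@example.com" for i in range(1000)], 10)
print(len(a), len(b), len(rest))  # 50 50 900
```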

B. Setting your criteria

Let's assume you are creating an A/B split test campaign and want to test two different subject lines for your new email marketing campaign.

1. Drag the white bar left or right to set the percentage of subscribers who will be used as the test group.

The yellow part of the bar represents the number of people who will receive only the winning campaign. If you set the test group to 10%, 5% of your list will receive the campaign version with subject line A and the other 5% will receive the version with subject line B. The rest of your mailing list (the yellow 90%) will receive only the version that performs better within the 10% test group.

2. Toggle the criterion according to which the winning version is decided.

You can choose between the Open Rate and the number of Unique Clicks as the basis for deciding which campaign version wins; the version with the higher value wins. If, for example, you're testing two different subject lines, it makes more sense to use the Open Rate as the winning factor, since the version your test group opens more often probably has the more interesting subject line.

3. Drag the square to set the time duration for which the A/B split test will run before the winning version is decided.  

The maximum duration you can set is 24 hours, while the minimum is 1 hour.

4. Click on the Next button when you're done.
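For readers who like to see the logic spelled out, here is a small sketch of the winner decision described in steps 2 and 3: once the test duration has elapsed, the version with the higher value for the chosen criterion is the one sent to the winner group. Again, the names and data structures here are hypothetical and only mirror the behaviour described above; they are not Moosend's real API.

```python
def pick_winner(results_a, results_b, criterion="open_rate"):
    """Illustrative only: decide the winning version once the test window has elapsed.

    `results_a` / `results_b` are hypothetical dicts of campaign metrics,
    e.g. {"open_rate": 0.42, "unique_clicks": 87}, and `criterion` is either
    "open_rate" or "unique_clicks". None of these names come from Moosend.
    """
    return "A" if results_a[criterion] >= results_b[criterion] else "B"


# Example: version B was opened more often during the test window, so with
# the Open Rate criterion it is the one sent to the rest of the mailing list.
print(pick_winner({"open_rate": 0.18, "unique_clicks": 40},
                  {"open_rate": 0.26, "unique_clicks": 35}))  # -> "B"
```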