A simple email marketing guide to get results: A/B Testing

Email is no different from other forms of marketing: more relevant content produces better results for your business. This is why segmenting email marketing lists can improve open and click rates; narrowing the focus means sending targeted campaigns to specific groups who will find the content more relevant.

We can do this through A/B (or split) testing: comparing two emails by sending the two variants to similar groups (the A group and the B group) drawn from the same email list. For example, group A receives an email with a 20% discount, whilst group B receives the exact same email but, instead of a discount, is offered a free gift upon purchase. By analysing the results of these tests, you can see what works best for your customers and which version provides the best return in terms of open and click rates.


Where to start

The first decision to make is answering the question: what do you want to test?

There is a fairly wide range of ideas you could test:

  • Subject line text variations
  • Images
  • Headline
  • The offer: a free gift vs free postage, etc.
  • The email’s layout: a grid dominated by images with limited text vs columns of text
  • The wording of the key phrases which act as calls to action, moving customers from the email to the website: ‘click here for more information’ vs ‘buy it now’
  • You could even change the main body content to see which style of content produces the highest engagement.

If you’re looking for A/B testing to be an ongoing process across various areas of your email marketing campaigns, it’s a good idea to begin with the subject line and work your way through the email from beginning to end. This ensures you’ve fully customised the emails, with particular focus on the most important areas, such as the phrasing of the call to action.

Segmenting your email list

The ratio of people receiving version A to those receiving version B is at your discretion; normally this is a straight 50:50 split in order to get fair results, but it can be altered depending on the size of the test groups. For example, one of our clients using our email marketing studio is a large food producer. They hold research which tells them which supermarket each individual tends to shop at, so each customer can be segmented into the list for that particular store. When an offer begins at one supermarket, e.g. Asda, they can send a campaign to that list, whilst another list might receive one for Tesco; the data from both can then be compared to see which produces the better response in terms of opens and clicks.
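As a rough illustration of the split itself, here’s a minimal sketch in Python (the function name and ratio parameter are purely illustrative, not part of any real email platform) of randomly dividing a recipient list into A and B groups:

```python
import random

def split_ab(recipients, ratio=0.5, seed=42):
    """Randomly split a recipient list into group A and group B.

    ratio is the fraction assigned to group A; a straight 50:50
    split (ratio=0.5) gives the fairest comparison, as noted above.
    """
    shuffled = recipients[:]               # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # seeded shuffle, so the split is reproducible
    cut = int(len(shuffled) * ratio)
    return shuffled[:cut], shuffled[cut:]  # (group_a, group_b)

group_a, group_b = split_ab(["ann@example.com", "bob@example.com",
                             "cat@example.com", "dan@example.com"])
```

Shuffling before cutting matters: slicing a list that is sorted by sign-up date or surname would bias one group, whereas a random split keeps the two groups comparable.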

One variant at a time? Or can we speed up the process?

Whilst in the long term you might want to test more than one variable, for the most accurate results you need to test one variable per email; otherwise you won’t be sure which difference contributed to the results. Splitting your recipients into two groups and analysing those is easier than splitting them into three, where a cross-analysis has to be run: A vs B, B vs C, then A vs C. A straight A/B split provides results which are easier to process, and from that point it’s more beneficial to go on to run further tests based on those results.

Analyse your results

After allowing your campaigns to run, it’s necessary to analyse the results to see which variant gave the highest open and click-through rates. These will be fairly easy to see, and from this it’s tempting to conclude that test A was the most successful because its open rate was 35% higher than test B’s. However, this isn’t the full story. Email marketing is a means of encouraging pre-existing customers who have signed up to click through to the website and make a purchase or enquiry. It’s therefore important to also know how many people in testing list A and testing list B made a purchase as a direct result of the campaign.
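To make that comparison concrete, here’s a hedged sketch in Python (the function and all figures are purely illustrative, not real campaign data) of computing the three rates worth comparing for each variant:

```python
def rates(sent, opens, clicks, purchases):
    """Open, click-through and conversion rates for one email variant."""
    return {
        "open_rate": opens / sent,          # opens per email sent
        "click_rate": clicks / sent,        # clicks per email sent
        "conversion_rate": purchases / clicks if clicks else 0.0,  # sales per click
    }

# Illustrative numbers only: variant A wins on opens and clicks,
# but variant B converts more of its clicks into purchases.
variant_a = rates(sent=1000, opens=350, clicks=90, purchases=9)
variant_b = rates(sent=1000, opens=260, clicks=70, purchases=14)
```

The point of the third figure is exactly the caveat above: a variant can lead on opens and clicks yet still lose on the metric that matters, purchases made as a result of the email.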


It’s important, therefore, that the email sent is consistent with the branding on the website; if the website isn’t a reflection of the email, the user might feel they’ve been sent to the wrong place and exit. Overall, one email might receive a higher percentage of click-throughs, but does it also have a higher number of conversions? If the answer is no, a further A/B test may be required until you arrive at a formula which not only receives a high number of click-throughs but also converts them into sales.

Tools for testing

As you might be aware, adigi have our very own email marketing suite, Launchpad. The fully managed email marketing studio is targeted towards clients and agencies alike; a bespoke system and cluster of mail servers allow us to provide our clients with a scalable and powerful platform. The suite presents the results of these emails as valuable and insightful feedback which can be analysed to inform the next campaign. Launchpad makes it easy for your business to send out A/B testing emails, so if you’d like to benefit from the platform, or to have an in-depth chat about the process, don’t hesitate to contact us!

Get in Touch.