How to develop an email test plan in 5 easy steps

Clearly defined testing programmes are the hallmark of accurate, finely tuned email campaigns. They help increase the revenue your email marketing generates, which is, after all, the ultimate goal. To make sure your testing keeps paying off in the long run, follow these five simple steps to develop an effective programme.

1. Define your hypothesis

Investing time up front to define clearly what you hope to achieve through testing focuses your efforts and keeps you on track throughout the process. At this stage, don't be constrained by practical issues; you can consider the restrictions of resources, data and technology later. For now, focus on your ideals. Once you have a clear idea of your objectives, brainstorm the influential factors you feel could have an impact on them. Your email objectives might include opens, clicks, purchases, revenue and unsubscribes. Your influential factors could include anything from the day of the week the email is sent, the week of the month, subject line length and keywords in the subject line, to types of content, message, frequency, email design and images. The combination of these two lists forms the basis of your hypothesis. If you want more of a steer, there's plenty of material available online, from research to case studies. By all means have a look, but it's important to try testing influential factors for yourself: no two brands have the same market, selling points or subscriber base, so don't rely too heavily on the findings of others.
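Pairing the two lists can be done mechanically. A minimal sketch, using hypothetical objective and factor lists (substitute your own), that enumerates every candidate hypothesis to prioritize:

```python
from itertools import product

# Hypothetical objectives and influential factors -- substitute your own.
objectives = ["opens", "clicks", "purchases", "revenue", "unsubscribes"]
factors = ["send day", "subject line length", "content type", "frequency"]

# Each (factor, objective) pair is a candidate hypothesis:
# "Does <factor> have a measurable impact on <objective>?"
hypotheses = [
    f"Does {factor} affect {objective}?"
    for factor, objective in product(factors, objectives)
]

print(len(hypotheses))   # 4 factors x 5 objectives = 20 candidates
print(hypotheses[0])
```

Not every pair will be worth testing, but listing them all makes it harder to overlook a combination before you start prioritizing.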

Tip: Involve as many stakeholders as you can at this early, defining stage.

2. Mine your historical data

Your historical data contains a wealth of information that will help you achieve your test objectives. For example, if you're trying to find out which days of the week have the biggest impact on the revenue your newsletter campaign generates, analyze your data over the past 12 months. Chances are you'll have sent mailings on different days throughout the year, and this data could help minimize the number of tests you need to carry out. You may discover that, say, Wednesdays are your best weekday, so you'll probably only need to test this against Saturday and Sunday to reveal your optimal day of the week overall.
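The day-of-week mining described above is a simple aggregation. A minimal sketch, using made-up send figures (in practice you would pull 12 months of campaign reports from your ESP):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical send history: (weekday, revenue per send).
sends = [
    ("Mon", 4200), ("Wed", 6100), ("Wed", 5800),
    ("Fri", 3900), ("Mon", 4500), ("Wed", 6400),
]

revenue_by_day = defaultdict(list)
for day, revenue in sends:
    revenue_by_day[day].append(revenue)

avg_by_day = {day: mean(vals) for day, vals in revenue_by_day.items()}
best_day = max(avg_by_day, key=avg_by_day.get)
print(best_day)  # the weekday to test against the weekend
```

With the best historical weekday identified, your live tests only need to pit it against the untried days, rather than testing all seven from scratch.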

Diving into your historical data can also be a quick way to identify potentially influential factors you haven't even considered. You may find, as we did here at Alchemy Worx, that there is a significant relationship between subject line length and your click-to-open rate, which presented us with another opportunity to fine-tune our hypothesis.
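One quick way to spot such a relationship is a correlation check. A sketch with invented per-campaign figures (swap in your own historical data), computing the Pearson correlation between subject line length and click-to-open rate:

```python
from statistics import mean

# Hypothetical per-campaign data: subject line length (characters)
# and click-to-open rate.
lengths = [25, 40, 55, 70, 90, 110]
cto_rates = [0.18, 0.16, 0.15, 0.13, 0.11, 0.10]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

r = pearson(lengths, cto_rates)
print(round(r, 2))  # strongly negative in this toy data
```

A strong correlation in historical data is not proof of causation, but it is exactly the kind of signal that earns a factor a place in your test plan.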

Tip: If you're able to analyze your historical data at an individual subscriber level, you may be able to identify segments of subscribers who behave in similar ways. Finding significant groups of subscribers with the same purchase and engagement patterns is likely to present strong opportunities – but only once you've validated them with a few targeted tests.

3. Design your test plan

The next stage is to set out your test methodology. Are you carrying out A/B split tests, multivariate tests, or test-and-send? And what will your test schedule look like? We have found the best approach is to take a long-term view: develop a test plan that makes small, regular changes to your campaign, split the campaign into test cells, and analyze the results only well after the activity has finished, since the first actions are typically not representative of the whole list. This gives you more reliable results. Throughout, weigh the potential gains against the resources required to implement your tests.

Testing often means doubling the work required to send each campaign, depending on the type of test: a creative test, for example, is much more time-consuming than a data or timing test, so it's essential to consider practicalities at this stage. Once you have prioritized your test factors, define your sample size; this dictates how many factors can be tested each week, month and year.
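Sample size follows from the lift you want to detect. A minimal sketch using the standard normal-approximation formula for comparing two proportions, with hypothetical open rates (95% confidence, 80% power assumed):

```python
from math import ceil

def sample_size_per_cell(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Approximate subscribers needed per test cell to detect a change
    from rate p1 to rate p2 at 95% confidence and 80% power
    (normal approximation for two proportions)."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2)

# Hypothetical: detect an open-rate lift from 20% to 22%.
n = sample_size_per_cell(0.20, 0.22)
print(n)  # subscribers required in EACH cell
```

Note how quickly the requirement grows as the detectable lift shrinks: halving the lift roughly quadruples the cells. This is why sample size, not enthusiasm, dictates how many factors you can test per week, month and year.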

Tip: You may be able to achieve efficiencies in the testing process by using dynamic content, personalization or triggered campaigns based on a date field in your data. But remember, this may require additional data work up front, and make reporting more time consuming.

4. Deploy your campaigns

Having taken any resource constraints into account during the planning stages, you can now deploy your campaigns using the schedule you have produced in your test plan.

Tip: Develop a naming structure for your test campaigns; this will make your post-campaign analysis as easy as possible. Keep in mind that your campaign code may be visible to recipients in your hosted version, image locations or file names.
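As an illustration only, one hypothetical convention is campaign_factor_variant_date, generated consistently so reports can be grouped by any component:

```python
from datetime import date

def campaign_code(campaign, factor, variant, send_date):
    """Build a consistent, lowercase campaign code.
    Keep the components opaque enough that a recipient who sees the code
    (hosted view, image paths) can't tell which test cell they're in."""
    return f"{campaign}_{factor}_{variant}_{send_date:%Y%m%d}".lower()

code = campaign_code("newsletter", "subjlen", "a", date(2016, 10, 18))
print(code)  # newsletter_subjlen_a_20161018
```

The exact scheme matters less than applying it without exception: post-campaign analysis becomes a string split rather than a detective exercise.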

5. Analyze your results

Wait as long as you can after the campaign has been deployed to analyze the results: two weeks is a good starting point, but a month is better, depending on your brand and your subscribers' purchasing patterns. Use your original hypothesis as the benchmark for analysis; conversely, the analysis may cause you to question some of the thinking behind your hypothesis. And so the process comes full circle. It's not over yet, though.
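When you do analyze, check that the difference between cells is larger than chance would produce. A sketch using a standard two-proportion z-test on invented open counts for two hypothetical cells:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-tailed z-test for the difference between two conversion
    rates (pooled normal approximation)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: cell A (Wednesday send) vs cell B (Saturday send),
# opens out of sends after a month of settling.
z, p = two_proportion_z(1300, 6500, 1170, 6500)
print(round(z, 2), round(p, 4))
```

A p-value below your chosen threshold (0.05 is conventional) supports acting on the result; anything above it means the hypothesis goes back into the cycle for another round.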

The results from each of your tests can now be used to fine-tune your hypothesis, give you fresh ideas for mining your historical data and possibly re-prioritize your test plan. So it's back to step one to start the cycle all over again.

Last updated: Oct 18, 2016
