Category Archives: Testing

An often missed part of a successful email split test strategy

A mega test for sure: Marketing Experiments took 300 subject lines suggested by their readers, picked the ten they felt were best and ran a split test.

The table of results is below. Read the original post for details of how it was run, but before you do, read on here for a quick lesson in email split test strategy that isn’t mentioned elsewhere.

[Image: MECSplit – results table]

Notice that the top three had the same level of performance, within the level of confidence. The next five subject lines, lines 4 to 8, also shared the same performance level.

Had only the top three subject lines been tested, there would have been no winner! Had the test been just the five subject lines in positions 4 to 8, there would have been no winner either – across five treatments.

The crucial lesson is that testing is a game of skill and quantity. A key ingredient of email split test strategy is simply doing enough tests and different treatments to discover a big gain. Occasional tests won’t do it.

If you are running A/B subject line tests then be prepared for several tests showing no performance improvement. Don’t get despondent but keep persevering.

As a rule of thumb, expect that out of ten test treatments one will make a big shift to the needle, three will provide some learning and improvement, three will produce no winner and three will underperform (but you’ll still learn).

When you’ve enough data to run multiple tests concurrently, it’s great to do so, as it accelerates learning and optimisation. However, remember your sample sizes need to be large enough for statistical significance. A 10,000-strong list split into ten test cells of 1,000 each won’t give any valid result.

Stay on the straight and narrow with this split test calculator to work out sample sizes and check the statistical significance of results.
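If you’d rather sanity-check the maths yourself, here’s a minimal sketch of the same two calculations (plain Python; the 2.5% and 3.0% click rates are made-up illustration figures, not results from any test discussed here):

```python
import math

def split_significant(clicks_a, n_a, clicks_b, n_b, alpha=0.05):
    """Two-proportion z-test: is the difference in click rates real?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)    # click rate if there is no true difference
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))      # two-tailed p-value
    return p_value < alpha, p_value

def sample_size_per_cell(p1, p2):
    """Approximate subscribers needed per test cell to detect a lift
    from rate p1 to rate p2 at 95% confidence with 80% power."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((1.96 + 0.84) ** 2 * variance / (p2 - p1) ** 2)

# Cells of 1,000 can't resolve a 2.5% vs 3.0% click rate...
print(split_significant(25, 1000, 30, 1000))   # (False, p ~ 0.49)
# ...you'd need roughly 17,000 subscribers in each cell for that.
print(sample_size_per_cell(0.025, 0.030))      # ~16,770
```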


‘Strictly’ tips to use dynamic content to improve conversion

Dynamic messaging has been with us for some time (as has Strictly Come Dancing), but very few people realise the benefits, preferring simply to batch and blast!

Dynamic messaging isn’t just another marketing buzzword. It’s a critical stage in the evolution from mass marketing to personalised, one-to-one marketing. But it goes beyond making sure that each customer and prospect receives a targeted message via his or her preferred channels.
While proper targeting techniques are critical to the process, utilising response data is equally important.  This will help you to determine how, when and with what content you will next contact them.

For example, let’s say your data suggests that Patrick, Sophie and Susanna (if it were a ‘Strictly Come Dancing’ list, that is) are most likely to respond to similar personalised email offers received at the beginning of the month. While the end result was a familiar outcome – none of them took advantage of your offer – the way they got there was vastly different. Here’s how each scenario (or dance, if you prefer) played out:

  • Patrick opens the email, clicks through to your website and spends 10 minutes browsing before leaving the site
  • Sophie, on the other hand, opens the email, and then discards it
  • Meanwhile, Susanna deletes the email without even opening it (that’s newsreaders for you!)

Now, in the past, your contact strategy might have entailed sending all three consumers variations of the same follow-up email.  However, a dynamic strategy has rules in place that suggest different follow-up tactics based on differing responses.   Here’s an example of how this could play out for you:

  • Given that Susanna didn’t even open the email, and history indicates she hasn’t opened any emails in the last six months, your follow-up with her might be via a different medium
  • Since Sophie showed she was receptive to email, a change in the offer or creative in the follow-up email might improve her response
  • Finally, given the time Patrick spent on your website, you will probably want to get in touch with him sooner than you do the others to keep his attention (given he had a short attention span during the foxtrot!)

Research indicates that leveraging segmentation and personalisation to create a dynamic contact strategy can improve email click-through rates and conversions – just read any of the DMA benchmark reports or national email surveys.

While personalising the content, creative and channel certainly plays a large role in boosting response, the dynamic aspect provides significant lift as well.

Since monitoring response and reacting to it are distinguishing elements of dynamic marketing, it’s easy to reevaluate and tweak individual messages or entire messaging funnels as the campaign continues.

This makes dynamic marketing critical when reaching customers and prospects that are especially difficult to contact and historically challenging to make convert – just like Tony Jacklin’s dancing in fact!

Font colour change causes 74% click difference

A split test with identical results doesn’t mean there is nothing to learn. A case in point is the split test covered here. It shows how valuable lessons can still be found by looking deeper, even though the split test gave identical overall unique click results.

By drilling into individual calls to action we found a 74% difference between the control and treatment.

This split test is the third in a series, following up two previous tests. The first test delivered a 61% increase in unique clicks. Tests two and three were designed to tell us more about what drove that difference so the principles can be carried forward and re-applied.

All tests were made on the DMA Infobox monthly email marketing advice newsletter – if you don’t get the newsletter, sign up here; it’s free.

This test was to understand the value of the Editorial section. In a previous test the Editorial section was in the control and not the treatment. Could adding the Editorial to the treatment increase response or would it just change reader behaviour?

Here are the emails tested, control on the left and treatment on the right.

[Image: Infobox split test 3]

The summary result was identical click through performance. What was there to learn from this?

I analysed which links were clicked. Each main article is promoted both in the Editorial section and in the main body following the Editorial.

A big difference between the control (left) and treatment (right) became apparent. The table below shows the uplift of the treatment over the control for links in the Editorial area vs the main body. Uplifts have above 95% statistical significance (check your test results with this A/B split test calculator).

          Editorial links   Body links
Uplift    74%               -70%

In short, whilst the overall number of clicks was the same, where people clicked was changed by the design. The treatment received the majority of its clicks in the Editorial area and performed weakly in the body. In fact, body clicks fell by 70% compared to the control.
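As a rough illustration of this kind of drill-down, a sketch along these lines (Python; the per-link click counts are hypothetical, chosen only to reproduce the published uplifts, since the raw data isn’t shown here) groups link clicks by email area and computes the uplift:

```python
# Hypothetical click counts per email area, control vs treatment.
control   = {"editorial": 120, "body": 400}
treatment = {"editorial": 209, "body": 120}

for area in control:
    uplift = (treatment[area] - control[area]) / control[area] * 100
    print(f"{area:>9} links uplift: {uplift:+.0f}%")

# editorial links uplift: +74%
#      body links uplift: -70%
```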

Why does the Editorial perform badly in the control?

There are two obvious reasons why the difference occurred:

  1. People find scan reading of white font on a blue background harder than the higher contrast of black on white. So they tend to skip over the editorial and scan the body and large sub-heads to pick out content of interest.
  2. Readers are using the ‘In this issue’ bookmark links in the control’s right column to skip to the body content.

Bookmark links, which just jump the reader to another place in the email rather than to a landing page, are not trackable. So we don’t have data to confirm or eliminate the second possibility.

The test results imply that the treatment can be reduced to just the black-on-white Editorial, and that the remaining simple email will still perform well.

Three lessons from this test

  • Look at individual link clicks for more insight, not just top-line metrics, when doing split tests on designs.
  • Focus design on simplicity. Get to the point and don’t make your email over-complex simply because it looks more designed.
  • Avoid white font on coloured backgrounds. Black on white is just fine.

Remember the real world isn’t perfect.

Device tracking was included in the test with the intention of drilling into behaviour differences between desktop and mobile users.

When I looked at the device results they were quite remarkable. In fact so surprising I had to question the data. After digging deeper it became clear that a technical gremlin had meant the device tracking information was incomplete and the data could not be used.

The lesson here is to be prepared to question everything, especially if something looks too good or too bad to be true. Working with fundamentally wrong data is at best a waste of time and at worst takes you in the wrong direction, leading to worse results.

What next?

We’ve another test planned in this series that will drill into a different element of the control and treatment. I’ll be sure to share the results so come back next month and find out.

Email design change driving a 61% click increase

You can’t have missed the many reports and stats about the growing use of mobile devices to read email. Brands I work with are seeing anywhere from 15% to 70% of their emails being read on mobile devices. The average is 45%, and the most popular email client, ranked by number of emails opened, is now the iPhone (source: http://emailclientmarketshare.com/).

The DMA have been testing a new mobile-first email design. It’s optimised for mobile with a skinny design approach of 400 pixels wide, as opposed to using another technique such as responsive design. If you’re not sure about the different mobile design methods, read this short primer that explains skinny, scalable, fluid and responsive email design.

The DMA use the same template across many of their newsletters. The design format change was put to the test by the DMA production team with an A/B split of the DMA’s ‘Infobox’ newsletter, a newsletter dedicated to email marketing.

The skinny design treatment delivered a 61% increase in clicks over the control, with statistical significance of 99%.

However, the conclusion is not the obvious one. Can you see why? Here are the two designs used in the split test:

[Image: InfoboxJuneSplit]

We can’t conclude skinny rules just yet. In making a design change of this significance there are many factors at play. Potentially important changes in this design treatment include:

  • Mobile friendly width change to skinny 400 pixels
  • Single column layout
  • Removal of thumbnail images
  • Increase in font size
  • Large call to action buttons
  • Removal of editorial introduction
  • Reduced copy
  • Changed call to action copy
  • Removal of the ‘in this issue’ contents list

So the question arises: what really drove the 61% click increase? Drilling down into clicks on individual links gives more insight.

I would expect some of these factors to change click response evenly across all links, such as font size change.

However, I found one particular link in the treatment was responsible for the majority of the overall click increase in the split test. This implies the copy and call to action changes are driving the click increase and not the change to skinny design.

To put to the test the theory that copy changes were driving the difference, a further test was run, this time further synchronising the copy between the control and treatment, in particular ensuring the same calls to action. These are the two emails in the split test:

[Image: InfoboxJulySplit]

Note how the control, the design on the left above, now has the same calls to action and lead in copy to the CTA button as the treatment.

In this test there was no difference in click response between the two designs, and in fact both performed well. The skinny mobile-friendly design no longer looks to be the cause of the improvement.

It would have been easy to conclude after the first split test that the mobile-friendly approach was driving the results. However, a failure to truly understand the key causes of change means you’ll make future decisions based on a wrong perception of what’s important.

There are always unanswered questions. In this test, what about:

  • The impact of the introductory editorial copy in the email – does it increase clicks or just redistribute them?
  • Is there a difference in response between desktop readers and mobile readers with the different designs?

The DMA has further tests planned to provide more answers and gain clarity as to how design can improve click response. Watch out for an update on this blog – I’ll be sure to share what we find next.


Testing? Forget it at your peril!

Have you noticed just how overwhelming it is? The noise is loud, really loud!  The volume is set to ‘rock’ and the music is booming from the e-mail marketing tent. Where’s my invite? Where’s the party? I’m a big music fan, but today I’m not talking about Glastonbury or a cool party in the park. No. In this instance the tickets are free and the music is streaming from the industry. Our industry.

I use the word music as an analogy for e-marketing articles, white papers, blogs, RSS feeds and social-media content. From LinkedIn to Facebook, our broadcasts are everywhere. Our content is everywhere and growing at a phenomenal pace. I don’t know about you, but as a marketer with more than a decade of hands-on experience, I believe we marketers need to get our house in order before we run off to the next cool festival.

My first DMA blog isn’t about music or anything new for that matter. My first blog is an opportunity to remind marketers to invest (or re-invest) in a basic discipline that pays big dividends. Test, test and test again.

I know the issue only too well. Testing takes time, something we marketers don’t have a great deal of! If you can find time to incorporate testing techniques into your daily routine, through manual or automated processes, you will discover testing makes a big difference to the outcome of your campaigns. A small uplift in open or click-through volumes can produce a big impact on conversions, sales and revenue. This is definitely music to my ears.

Over the past eight years I’ve executed many testing combinations for a variety of customers, all of which helped me fine-tune my campaign management for greater returns. Five common A/B tests I run regularly with my clients are shown below. Don’t forget, the best way to achieve successful results is to test, test and test again!

1) A/B creative test – With A/B creative tests you will need two versions of your creative. Each version could differ in layout, images, copy or the position of its calls to action. How you make the two versions unique is up to you. The differences could be minimal or huge, but you should pick one element to test at a time. There is no right or wrong methodology here.

2) A/B male/female test – With the A/B male/female test you could have two versions of your creative, one for men and one for women, with relevant content in each. Or you could simply opt to use two different subject lines. Ideally, with this kind of test, you should make the male version relevant to the male market and the female version relevant to the female market.

3) A/B geographic test – First off, you need to select two or more geographic regions. Let’s say north and south for example. Target your message accordingly for each region by using relevant images, subject titles and calls to action.

4) A/B active vs. inactive test – Inactive subscribers exist in every marketing database. The key to converting them is to segment your database into active and inactive, then target the inactive segment with tempting offers. Everyone wants to convert inactive subscribers into revenue-generating customers. Instead of sending a generic email to both segments, create a version specifically for inactive users, using eye-catching subject titles and special offers to tempt customers back.

5) A/B Generic vs. targeted message – Broadcasting a generic message to your entire database is easy. One creative, one subject title, one broadcast and one set of reports. The problem with generic messages is they simply don’t hit the spot with many recipients. When you start to test targeted messages against generic messages you will see a clear difference in opens and conversions. Targeted messages should be highly relevant to your recipient’s interests. Message relevance is key to achieving higher open rates and conversions.

Follow my simple rule: make sure the basics are covered before you try to conquer the world. There are endless combinations of testing techniques – try as many as you can and invest as much time as you can to generate a valuable return on investment. In the meantime, enjoy the industry music. It’s everywhere, it’s free and it’s a great way to pick up hints and tips to help you succeed.

Don’t just aim for an open with your subject line

Ever heard someone say the purpose of the subject line is to get the open? That view is short-sighted: the purpose and impact of the subject line go much deeper. The thinking behind a subject line should be more than “what will make someone read this email?”

A case in point is some work I’ve recently completed for Coinks Deals. I’d like to share with you what was learnt about subject lines and how to best communicate with a large dormant database.

Coinks Points introduced a new deals service for their members and wished to send these deal emails to members who responded to an introduction email about the service. Coinks have millions of members, including hundreds of thousands who had been suppressed from contact for over 12 months. The challenge was how to message their entire member database, including the dormant members.

The messaging strategy I developed was a four-email sequence using a high degree of personalisation to make a connection and re-establish trust with members. Several tests were developed to optimise each step of the sequence, testing a variety of elements, including of course subject lines.

As always with testing, the results were insightful and I’m going to focus on one of the subject line tests and what you can learn from it.

For the third email in the sequence to the dormant segment, one of the tests was of these two subject lines:

  • Subject A: Are we still welcome in your inbox?
  • Subject B: Was it something we said?

The email itself was short and mostly plain, with a few links and a couple of central buttons, shown below.

[Image: coinksbuttons]

Subject line B gave a 67% higher open rate. However, what was interesting, and what shows the impact of the subject line beyond the open, was the ratio of clicks on the two buttons above.

For subject line ‘A’ the ratio of clicks on the first button to the second was 6.5, whereas for subject line ‘B’ it was 2.8. Customers receiving subject line ‘A’ were more inclined to click the first button. The test cell sample size was 12,000 and the difference in clicks was statistically significant.
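To check a difference in click ratios like this yourself, a minimal approach is a chi-square test on the 2×2 table of button clicks (plain Python; the counts below are hypothetical, chosen only to match the 6.5 and 2.8 ratios, not Coinks’ actual numbers):

```python
import math

def chi2_2x2(a, b, c, d):
    """Chi-square test (df = 1) on a 2x2 table:
    does the button-click split depend on the subject line?"""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p_value = math.erfc(math.sqrt(chi2 / 2))    # chi-square survival function for df = 1
    return chi2, p_value

# Hypothetical (first button, second button) clicks per subject line.
subject_a = (650, 100)   # ratio 6.5
subject_b = (560, 200)   # ratio 2.8
chi2, p = chi2_2x2(*subject_a, *subject_b)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")        # tiny p: the ratios genuinely differ
```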

The difference in ratio was down to the different subject lines: they changed how customers read the message and what they did as a result. In this case “Are we still welcome in your inbox?” prompted the customer to consider that very question and whether their answer was yes or no, whereas “Was it something we said?” does not prompt the direct question, and its more conciliatory tone creates more interest in the deals.

In the many tests I’ve run across many clients, I’ve seen time and time again that what happens inside the email is skewed and changed by the subject line. The subject line should be designed to get the right people to open, not the most people – the right people being those most likely to take the action you want. The subject line should also frame their thoughts correctly.

Customers use the subject line to self-qualify. If the subject line does not accurately qualify the right people, then customers who might have taken action do not open, and conversely some open only to find it’s not the right message for them. In that case the risk is that customers become less inclined to open again, having found they wasted their time previously.

Summarising, two key learnings:

  • When testing subject lines, don’t stop evaluation at the open rate; get more insight by looking deeper at which individual links were clicked and the call to action of each, to learn why the subject line created a particular result.
  • Create subject lines with the call to action in mind. The power and impact of the subject line go further than getting the read; it’s about getting the action.

This was just one test out of many over a series of four emails. The compounded gain across the whole email sequence was an impressive 190%.

Next time you think about subject lines, don’t focus on just getting the open but on setting up the right thought sequence for the call to action.

Acknowledgements: My thanks to Coinks Deals and Emailvision for permission to publish the results from this work.