Author Archives: Tim Watson

About Tim Watson

Tim Watson has over 8 years’ experience in B2B and B2C digital marketing, helping blue-chip brands with successful email marketing.

He is an elected member of the UK DMA Email Council, supporting the email marketing industry. Tim chairs the Legal and Best Practice hub of the Email Council, authoring and reviewing DMA whitepapers and best practice documentation. He is also a frequent speaker and blogger on emerging email marketing trends.

Tim works as an independent email marketing consultant providing strategic support to email marketing teams.

A Privacy Policy that Wins Business

Business is built on trust and trust is built on transparency. Both the DMA and ICO have long urged companies to be clear with their customers as to what data is collected and why.

As soon as you act in a way that a customer doesn’t expect, or that makes them feel abused, any hard work previously done building trust immediately evaporates.

Simply put, nobody will do business with a brand they don’t trust.

According to the Customer Acquisition Barometer 2014, 85% of consumers will only share their information if it’s made clear that it will be used only by the company that collects it, and 32% say they expect a clearly worded privacy policy before they share information.

And there is such concern about data and privacy that the EU Parliament is busy voting for much tighter rules on data use and protection.

Whilst the privacy policy is the cornerstone of ensuring compliance, it’s no secret that few people actually read it. Do you?

So it was refreshing to see a totally different approach to the privacy policy from Lookout: a visual approach that gives consumers the big picture on the key issues at a glance.

It’s even a responsive design, so it looks beautiful on mobile as well as desktop; view it online here. To top it all, it’s built on open source and brands can pinch the code to create their own consumer-friendly privacy policy.

[Image: Lookout’s privacy policy on desktop]

It’s responsive too; here’s how it looks on a mobile device:

[Image: Lookout’s privacy policy on mobile]

This must be the most consumer-friendly privacy policy – ever.

 

An often-missed part of successful email split test strategy

A mega test for sure: Marketing Experiments took 300 subject lines suggested by their readers, picked the ten they felt were best and ran a split test.

The table of results is below. Read the original post for details of how the test was run, but before you do, read on here for a quick lesson in email split test strategy not mentioned elsewhere.

[Image: Marketing Experiments split test results table]

Notice that the top three subject lines performed at the same level, within the confidence interval. The next five subject lines, lines 4 to 8, also shared the same performance level.

Had only the top three subject lines been tested, there would have been no winner! Had the test been just the five subject lines 4 to 8, there would have been no winner either – across five treatments.

The crucial lesson is that testing is a game of skill and quantity. A key ingredient of email split test strategy is simply running enough tests and different treatments to discover a big gain. Occasional tests won’t do it.

If you are running A/B subject line tests, be prepared for several tests showing no performance improvement. Don’t get despondent; keep persevering.

As a rule of thumb, out of ten test treatments expect one to really move the needle, three to provide some learning and improvement, three to produce no winner, and three to underperform (though you’ll still learn from those).

When you’ve enough data to run multiple tests concurrently it’s great to do so, as it accelerates learning and optimisation. However, remember that your sample sizes need to be large enough for statistical significance: a 10,000 list split into ten test cells of 1,000 each won’t give any valid result.

Stay on the straight and narrow with this split test calculator to work out sample sizes and check the statistical significance of results.
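To make the arithmetic behind such a calculator concrete, here is a minimal sketch in Python of a two-proportion z-test, the standard check behind most split test calculators. The click and send counts are hypothetical, purely for illustration:

```python
# Minimal sketch of the maths behind a split test calculator:
# a two-proportion z-test. All click/send counts are hypothetical.
from math import sqrt, erf

def split_test_confidence(clicks_a, sent_a, clicks_b, sent_b):
    """Confidence level (0-1) that variants A and B truly differ."""
    p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
    # Pooled click rate under the null hypothesis of no real difference
    p = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    z = abs(p_a - p_b) / se
    return erf(z / sqrt(2))  # two-sided confidence = 1 - p-value

# Ten cells of 1,000 rarely reach significance...
print(split_test_confidence(30, 1000, 38, 1000))      # ~0.68, inconclusive
# ...whereas the same click rates on 10,000 per cell do.
print(split_test_confidence(300, 10000, 380, 10000))  # ~0.998, significant
```

The same 3.0% vs 3.8% click rates are inconclusive on small cells but clearly significant on large ones, which is exactly why cell size matters more than the number of cells.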

 

Font colour change causes 74% click difference

A split test with identical results doesn’t mean there is nothing to learn. A case in point is the split test covered here: it shows how valuable lessons can still be found by looking deeper, even though the split test gave identical overall unique click results.

By drilling into individual calls to action we found a 74% difference between the control and treatment.

This split test is the third in the series, following up two previous tests. The first test delivered a 61% increase in unique clicks. Tests two and three were designed to tell us more about what drove that difference, so the principles can be carried forward and re-applied.

All tests were made on the DMA Infobox monthly email marketing advice newsletter – if you don’t get the newsletter, sign up here; it’s free.

This test was to understand the value of the Editorial section. In a previous test the Editorial section was in the control and not the treatment. Could adding the Editorial to the treatment increase response or would it just change reader behaviour?

Here are the emails tested, control on the left and treatment on the right.

[Image: Infobox split test 3 – control and treatment]

The summary result was identical click through performance. What was there to learn from this?

I analysed which links were clicked. Each main article is promoted both in the Editorial section and in the main body following the Editorial.

A big difference between the control (left) and treatment (right) became apparent. The table below shows the uplift of the treatment over the control for links in the Editorial area versus the main body. Uplifts have above 95% statistical significance (check your own test results with this A/B split test calculator).

          Editorial links   Body links
Uplift         74%             -70%

In short, whilst the overall number of clicks was the same, where people clicked changed with the design. The treatment received the majority of its clicks in the Editorial area and performed weakly in the body; in fact, body clicks reduced by 70% compared with the control.
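As an illustration of this kind of drill-down, here is a minimal sketch in Python that aggregates unique clicks by variant and link area. The click counts are hypothetical, chosen to reproduce the reported uplifts while keeping total clicks equal across variants:

```python
# Drilling into individual link clicks rather than stopping at the
# top-line total. All click counts below are hypothetical.
from collections import Counter

# Each record is one unique click: (variant, link area).
clicks = (
    [("control", "editorial")] * 100 + [("control", "body")] * 106
    + [("treatment", "editorial")] * 174 + [("treatment", "body")] * 32
)

totals = Counter(clicks)

for area in ("editorial", "body"):
    control = totals[("control", area)]
    treatment = totals[("treatment", area)]
    uplift = (treatment - control) / control
    print(f"{area:<9} control={control:3d} treatment={treatment:3d} uplift={uplift:+.0%}")
```

Both variants total 206 clicks, yet the per-area view shows a +74% shift into the Editorial and a -70% fall in the body – invisible if you only look at overall unique clicks.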

Why does the Editorial perform badly in the control?

There are two obvious reasons why the difference occurred:

  1. People find scan-reading white font on a blue background harder than the higher contrast of black on white, so they tend to skip over the Editorial and scan the body and large sub-heads to pick out content of interest.
  2. Readers are using the ‘In this issue’ bookmark links in the control right column to skip to the body content.

Bookmark links, which just jump the reader to another place in the email rather than to a landing page, are not trackable. So we don’t have data to confirm or eliminate the second possibility.

The test results imply that the treatment could be reduced to just the black-on-white Editorial, and the remaining simple email would still give good performance.

Three lessons from this test

  • Look at individual link clicks for more insight, not just top-line metrics, when running split tests on designs.
  • Focus design on simplicity. Get to the point and don’t make your email overly complex simply because it looks more designed.
  • Avoid white font on coloured backgrounds. Black on white is just fine.

Remember the real world isn’t perfect.

Device tracking was included in the test with the intention of drilling into behaviour differences between desktop and mobile users.

When I looked at the device results they were quite remarkable – in fact so surprising that I had to question the data. After digging deeper it became clear that a technical gremlin had left the device tracking information incomplete, so the data could not be used.

The lesson here is to be prepared to question everything, especially if something looks too good or too bad to be true. Working with fundamentally wrong data is at best a waste of time and at worst takes you in the wrong direction, leading to worse results.

What next?

We’ve another test planned in this series that will drill into a different element of the control and treatment. I’ll be sure to share the results, so come back next month to find out.

Email design change driving a 61% click increase

You can’t have missed the many reports and stats about the growing use of mobile devices to read email. Brands I work with are seeing anywhere from 15% to 70% of their emails being read on mobile devices. The average is 45%, and the most popular email client, ranked by number of emails opened, is now the iPhone (source: http://emailclientmarketshare.com/).

The DMA have been testing a new mobile-first email design. It’s optimised for mobile with a skinny design approach of 400 pixels wide, as opposed to another technique such as responsive design. If you’re not sure about the different mobile design methods, read this short primer that explains skinny, scalable, fluid and responsive email design.

The DMA use the same template across many of their newsletters. The design format change was put to the test by the DMA production team with an A/B split of the DMA’s ‘Infobox’ newsletter, a newsletter dedicated to email marketing.

The skinny design treatment delivered a 61% increase in clicks over the control, with statistical significance of 99%.

However, the conclusion is not the obvious one. Can you see why? Here are the two designs used in the split test:

[Image: Infobox June split test – control and treatment]

We can’t conclude skinny rules just yet. In making a design change of this significance there are many factors at play. Potentially important changes in this design treatment include:

  • Mobile friendly width change to skinny 400 pixels
  • Single column layout
  • Removal of thumbnail images
  • Increase in font size
  • Large call to action buttons
  • Removal of editorial introduction
  • Reduced copy
  • Changed call to action copy
  • Removal of the ‘In this issue’ contents list

So the question arises: what really drove the 61% click increase? Drilling down into clicks on individual links gives more insight.

I would expect some of these factors, such as the font size change, to alter click response evenly across all links.

However, I found that one particular link in the treatment was responsible for the majority of the overall click increase in the split test. This implies that the copy and call-to-action changes drove the click increase, not the change to skinny design.

To put to the test the theory that copy changes were driving the difference, a further test was run, this time further synchronising the copy between the control and treatment, in particular ensuring the same calls to action. These are the two emails in the split test:

[Image: Infobox July split test – control and treatment]

Note how the control, the design on the left above, now has the same calls to action and lead-in copy to the CTA button as the treatment.

In this test there was no difference in click rate response between the two designs, and in fact both performed well. The skinny mobile-friendly design no longer looks to be the cause of the improvement.

It would have been easy to conclude after the first split test that the mobile-friendly approach was driving the results. However, failing to truly understand the key causes of a change means you’ll make future decisions based on a wrong perception of what’s important.

There are always unanswered questions, in this test what about:

  • The impact of the introductory editorial copy in the email – does it increase clicks or just re-distribute them?
  • Is there a difference in response between desktop readers and mobile readers with the different designs?

The DMA have further tests planned to provide more answers and gain clarity on how design can improve click response. Watch out for an update on this blog – I’ll be sure to share what we find next.

 

Email welcome campaign lessons from Sainsbury’s DM

[Image: Sainsbury’s welcome DM – front]

A few weeks back I got a new Nectar card and used it for the first time last Saturday, clocking up my very first Nectar points.

Impressively, on the following Tuesday the postman delivered me a piece of DM from Sainsbury’s.

This was no coincidence: the opening line of the DM is “As you’ve recently picked up your first Nectar points at Sainsbury’s…” – a great lead-in line that immediately makes the content relevant to the reader.

A slick piece of marketing, triggering a DM shot from the first use of a Nectar card; the design is a great example of how to welcome new customers.

[Image: Sainsbury’s welcome DM – inside]

The DM’s design and content share many best practices with good email design:

  • A distinctive circle attracts the eye as the starting point
  • The headline “We’re here to help you get started” has clarity and tells the reader immediately what this is about – no need for hype here
  • The first paragraph speaks to the individual
  • The brown signs give visual clues to direct your eyes across the pages
  • Copy is chunked and easily scanned
  • Icons and images are functional and support the message – no pointless stock pictures of happy shoppers

The DM format is a horizontal foldout but the same concepts for eye-path flow can be adapted to vertical email design.

The messaging of the DM has a clear purpose, addressing in logical order:

  • The different ways to collect Nectar points
  • What you can do with the points you’ve collected
  • A next step to finding out more

[Image: Sainsbury’s welcome DM – foldout]

Welcome campaigns are about education, providing customers with key useful information so the relationship can continue to develop.

Finally, on turning the last foldout of the DM, there are vouchers that incentivise the Nectar point collecting habit.

So there it is; great design, brand education and incentives to continue the habit that’s just started.

The key ingredients of every Welcome campaign, whether in DM or Email.

 

Don’t just aim for an open with your subject line

Ever heard someone say the purpose of the subject line is to get the open? This is short-sighted: the purpose and impact of the subject line go much deeper. The thinking behind a subject line should be more than “what will make someone read this email?”

A case in point is some work I’ve recently completed for Coinks Deals. I’d like to share with you what was learnt about subject lines and how to best communicate with a large dormant database.

Coinks Points introduced a new deals service for their members and wished to send these deal emails to members who responded to an introductory email about it. Coinks have millions of members, including hundreds of thousands who had been suppressed from contact for over 12 months. The challenge was how to message their entire member database, including the dormant members.

The messaging strategy I developed was a four-email sequence using a high degree of personalisation to make a connection and re-establish trust with members. Several tests were developed to optimise each step of the sequence, testing a variety of elements, including of course subject lines.

As always with testing, the results were insightful and I’m going to focus on one of the subject line tests and what you can learn from it.

For the third email in the sequence to the dormant segment, one of the tests was of these two subject lines:

  • Subject A: Are we still welcome in your inbox?
  • Subject B: Was it something we said?

The email itself was short and mostly plain, with a few links and a couple of central buttons, shown below:

[Image: the two central buttons in the Coinks email]

Subject line B gave a 67% higher open rate. However, what was interesting in showing the impact of the subject line beyond the open was the ratio of clicks on the two buttons above.

For subject line ‘A’ the ratio of clicks on the first button to the second was 6.5, whereas for subject line ‘B’ it was 2.8: customers receiving subject line ‘A’ were more inclined to click the first button. The test cell sample size was 12,000 and the difference in clicks was statistically significant.
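The same two-proportion test used for click rates can be applied to the share of clicks landing on the first button. A minimal sketch, with hypothetical click counts chosen to match the 6.5 and 2.8 ratios above:

```python
# Checking whether the button-click split truly differs by subject line.
# All click counts are hypothetical, chosen to match the reported ratios.
from math import sqrt, erf

def confidence(x1, n1, x2, n2):
    """Two-proportion z-test: confidence (0-1) that the two shares differ."""
    p1, p2, p = x1 / n1, x2 / n2, (x1 + x2) / (n1 + n2)
    z = abs(p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return erf(z / sqrt(2))  # two-sided confidence level

# Subject A: 390 clicks on button 1, 60 on button 2 (ratio 6.5)
# Subject B: 560 clicks on button 1, 200 on button 2 (ratio 2.8)
print(confidence(390, 450, 560, 760))  # ~1.0: a genuine behavioural shift
```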

The difference in ratio was down to the different subject lines: they changed how customers read the message and what they did as a result. In this case “Are we still welcome in your inbox?” prompted the customer to consider that very question and whether their answer was yes or no, whereas “Was it something we said?” does not prompt the direct question, and its more conciliatory tone creates more interest in the deals.

In the many tests I’ve run across many clients, I’ve seen time and time again that what happens inside the email is skewed and changed by the subject line. The subject line should be designed to get the right people to open, not the most people – the right people being those most likely to take the action you want. The subject line should also frame their thoughts correctly.

Customers use the subject line to self-qualify. If it does not accurately qualify the right people, then customers who might have taken action don’t open, and conversely some open only to find it’s not the right message for them. The risk then is that customers become less inclined to open again, having wasted their time previously.

Summarising, two key learnings:

  • When testing subject lines, don’t stop evaluation at the open rate; get more insight by looking deeper at which individual links were clicked and the call to action of each, to learn why the subject line created a particular result.
  • Create subject lines with the call to action in mind. The power and impact of the subject line go further than the open; it’s about getting the action, not just the read.

This was just one test of many across a series of four emails. The compounded gain across the whole email sequence was an impressive 190%.
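Per-step gains in a sequence multiply rather than add, which is how a compounded figure this large emerges. In the sketch below only the ~190% total comes from the results above; the four per-email uplifts are hypothetical numbers chosen to illustrate the arithmetic:

```python
# How modest per-email gains compound across a four-email sequence.
# Individual uplifts are hypothetical; only the ~190% total is from
# the results reported above.
uplifts = [0.67, 0.30, 0.25, 0.07]  # +67%, +30%, +25%, +7% per email

compounded = 1.0
for u in uplifts:
    compounded *= 1 + u  # gains multiply through the sequence

print(f"compounded gain: {compounded - 1:.0%}")  # 190%
```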

Next time you think about subject lines, don’t focus on just getting the open but on setting up the right thought sequence for the call to action.

Acknowledgements: My thanks to Coinks Deals and Emailvision for permission to publish the results from this work.

Email engagement: often discussed, now defined

Everyone wants it, but there is no industry consensus on the best way to measure it. I’m talking about engagement in the email channel.

Take an example: a fashion brand might send two or three emails per week. It’s not realistic to expect that most people will be interested in buying a new fashion item every week, or even in reviewing offers each week.

Just because someone is not in the mood to buy or browse current offers, does that make them no longer engaged with the brand? Of course not. They gave permission to receive the emails and in doing so showed engagement; ignoring a few emails does not mean a lack of engagement.

Classically, campaign open and click rates are used to judge engagement. This was fine when brands sent one campaign per month, but email volumes have increased considerably in the last five years and the metrics have not moved on.

A re-think is needed as the classic metrics measure campaigns not customers and as a result promote the wrong behaviour in email marketing.

It’s customers that need to be engaged, so measuring campaigns makes no sense; it’s customers that should be measured.
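A minimal sketch of the distinction, using hypothetical open data: the campaign view averages opens per send, while the customer view asks how many subscribers engaged at all over the period:

```python
# Campaign metrics vs customer metrics on the same hypothetical data:
# 8 campaigns sent to 5 subscribers, 1 = opened, 0 = ignored.
opens = {
    "ann":  [1, 0, 0, 1, 0, 0, 1, 0],
    "bob":  [0, 1, 0, 0, 0, 1, 0, 0],
    "cleo": [0, 0, 1, 0, 0, 0, 1, 1],
    "dan":  [1, 0, 0, 0, 1, 0, 0, 0],
    "erin": [0, 0, 0, 0, 0, 0, 0, 0],
}

n_campaigns = 8
# Campaign view: average open rate per send looks poor...
campaign_rate = sum(map(sum, opens.values())) / (len(opens) * n_campaigns)
# ...while the customer view shows most subscribers are engaged.
engaged_share = sum(1 for o in opens.values() if any(o)) / len(opens)

print(f"average campaign open rate: {campaign_rate:.0%}")    # 25%
print(f"subscribers engaged in period: {engaged_share:.0%}")  # 80%
```

The same inbox activity reads as a 25% open rate campaign by campaign, yet 80% of subscribers engaged at some point in the period – the gap the paper addresses.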

I’ve been working on a paper, along with my fellow DMA Email Council hub members, Dela Quist, Skip Fidura and Kath Pay. The paper goes to the core of how to measure customer engagement in the email channel and delivers a verdict, based on analysis of brand data.

The paper has been put together to kick-start the discussion in the email industry about just what should be measured, and the debate starts at the Email Evolution 2013 conference in Miami this week. For DMA blog readers, we’re releasing the paper ahead of the event.

I hope you download the paper and find it thought provoking. If you leave a comment one of the paper authors will reply.