Jeffrey Rice

Research Update: The state of email marketing testing and optimization

In July, I wrote the blog post, Email Marketing Research: 7 steps for successful email marketing testing and optimization. In it, I discussed how continuous experimentation is the quickest path to peak performance. It enables marketers to go beyond best practices to learn what works for their organization and, more importantly, their customers.

I’m preaching to the choir, right? Well, I also encouraged readers to take the annual email benchmark survey conducted by MarketingExperiments’ sister company, MarketingSherpa.

Thankfully, this blog’s readers, along with more than 2,700 other email marketers, participated in the study. In appreciation, I would like to share with you the current state of email marketing testing practices.

 

Email testing on the rise

The number of marketers who routinely test email campaigns rose three percentage points from 2010, to 42%. This is good news, as the industry inches closer to making testing a prevailing practice.

Unfortunately, nearly six in 10 email marketing budgets have no money earmarked for testing and optimization. The majority of tests (63%) are conducted by employees for whom the practice is a part-time, secondary job responsibility, though still a formal part of their job description.

The remainder includes the 23% of email researchers who report testing as their primary, full-time duty and the 19% of marketers who perform experiments on the side without it being listed in their job descriptions.

 

Testing practices most routinely performed

This information may help benchmark your programs and processes against the industry. But for this year’s survey, we wanted to delve deeper into which formal processes and guidelines organizations routinely use to test and optimize email campaigns. Here is a look into what we found.

 

Chart: More time needed for brainstorming and defining the testing objective

How routinely does your organization implement the following testing practices?


The above chart displays common testing practices in chronological order from top to bottom. We asked marketers to share with us which tasks their organizations routinely execute. The survey uncovered that organizations are spending the most time segmenting their lists, understanding the impact of the test on the entire funnel, and documenting their findings.

 

Segmenting is an effective and common practice

Taking the time to segment a list to target a specific audience is a requirement. Testing across the board will muddy the results: respondents will inevitably give you data that pulls you in different directions.

MarketingExperiments’ Research Partners have seen tremendous results when focusing on the “highly motivated and loyal subscriber” segment. For example, in a 60-day experiment, the research team discovered that increasing frequency from four email messages per month to 15 tripled monthly revenue without any significant negative impact on unsubscribe or open rates.

 

Marketers often document effect of tests on the funnel

In addition, 43% of marketers in our study routinely document the impact of the test on the Marketing-Sales funnel. This is also a critical step in ensuring you understand the cause and effect of the experiment. A lack of complete information can quickly turn a success into a failure and jeopardize your brand’s success.

 

More time needed for brainstorming and defining the test objective

Marketers could devote more time to brainstorming optimization opportunities, identifying the key metric and reviewing tests to decide on follow-up actions. Defining the question or key metric and reviewing the test is where the essential learning happens.

Remember, the goal of the test is not to get a lift but to make a discovery. After all, if you just “luck into a lift” without knowing why you got it, you can’t replicate that tactic across your campaigns.

To create a solid foundation for a test, marketers must identify the research question, key performance indicator and test objective with clarity. To accomplish this step, try adapting George T. Doran’s business mnemonic device for setting goals – S.M.A.R.T. – to email experiments. Keep in mind your email’s business objectives and how they translate into a subscriber’s action (i.e., conversion).

 

Specific – Succinctly state the goal of the test for the entire team to understand. For A/B split tests, most brainstorming session questions start with “what” or “why,” and the finished question begins with the word “which.” Select only one variable or general element to test. Examples include subject line, headline or call-to-action.

Measurable – The element chosen must be one that can be measured, ideally throughout the entire sales process. This may go beyond your email metrics and be extended to website analytics or financial databases. Defining the primary metric that tracks a reader’s action will allow your team to decipher which treatment performed best.

Achievable – The question must be able to be answered based on the segment of the audience you will be testing. This will be determined by the firm’s ability to distinguish customer personas, behavior and sources (e.g., newsletter sign up, current customer, frequent buyer) within its email database.

Relevant – The goal of an email is to earn a click, not a sale. Understanding the placement of the testing objective in the subscriber’s thought sequence will allow you to choose the correct testing element that earns a micro-commitment from the reader, enabling her to move on to the next element of the email. For example, you may choose to test the headline to improve the transition from the “from:” field and subject line to the email body copy.

Timely – When formulating the key question, objective or metric, begin to think about external factors that could impact the testing results. Part of the testing process is to measure the email’s effectiveness at certain time intervals. A holiday or long weekend could influence open rates to the point where a research team may elect to extend the time it collects data.

 

A well-crafted research question may read, “Which of these three email copy samples will result in the highest shopping basket recovery rate?”
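Answering a question like this means confirming that the winning variant’s recovery rate beats the others by more than chance alone would produce. Here is a minimal sketch of that comparison using a standard two-proportion z-test; the send counts and recovery numbers are hypothetical, and only the Python standard library is used:

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is variant B's conversion rate different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # p-value from the standard normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: (recovered baskets, emails delivered) per copy sample
results = {"copy_a": (120, 5000), "copy_b": (150, 5000), "copy_c": (135, 5000)}

best = max(results, key=lambda k: results[k][0] / results[k][1])
z, p = z_test_two_proportions(*results["copy_a"], *results[best])
print(f"Leader: {best}, z = {z:.2f}, p = {p:.4f}")
```

If the p-value is above your significance threshold (0.05 is conventional), the honest conclusion is that the test has not yet answered the research question, and the team may need to extend the data-collection window, as noted under “Timely” above.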

To learn more about email testing and optimization benchmarks, click here to instantly download MarketingSherpa’s free 2012 Email Marketing Benchmark Report excerpt.

 

Related Resources:

MarketingSherpa 2012 Email Marketing Benchmark Report – Launch Special: Save $100 (Offer ends Nov. 30)

Email Marketing Optimization: How you can create a testing environment to improve your email results

Email Marketing: Testing subject lines


Categories: Email Marketing



  1. November 16th, 2011 at 17:01 | #1

    Does Email Marketing work for any site? For instance, I have a site that dedicated to bringing the best priced Shirley Temple Movies Collection to other peoples homes. Should I look into Email Marketing with this kind of site?

    You can find my site here.

  2. November 16th, 2011 at 17:02 | #2

    Thanks Jeff.

  3. Jeffrey Rice
     November 16th, 2011 at 17:25 | #3

    @Christopher
    Thanks for the question Christopher.

    Email marketing is a proven method to communicate effectively with a brand’s core audience. For those fans that love Shirley Temple movies, receiving messages that share movie memories, trivia or Shirley’s career highlights would be welcome.

    Before you begin, I recommend discovering if email marketing is the preferred channel to communicate with the majority of your core audience. (You might choose to do a poll on your site.) In addition, you must determine if you have the content her fans crave and the perseverance to deliver quality email communications on a consistent basis, whether it is quarterly, monthly or weekly.

  4. November 17th, 2011 at 13:34 | #4

    Do you suggest sending out different styles of the same email to see which has the best response rate?

    • Jeffrey Rice
      November 17th, 2011 at 13:52 | #5

      Hi Susan,

      Yes, our research found that 50% of email marketers test layout and images, and 26% found the process very effective. A reminder: if you are conducting an A/B split test, the only element to change is the layout; the copy, call-to-action and subject line need to stay the same. Also, 41% of marketers told us an even more effective strategy is to test the landing pages, as the goal of the email is to get a click, not a sale. The landing page does the heavy lifting for a conversion.
