Posts Tagged ‘a/b testing’

Email Marketing: 7 (more) testing opportunities to generate big wins on your next email test [Part 2]

May 2nd, 2016

Does your email audience prefer short or long emails? How about images versus GIFs?

If you don’t know the answers to these questions, that’s OK. All you need is an A/B email test.

Testing allows us to better understand our customers, and determine ways we can better engage them.

Last week, we detailed nine experiment ideas for you to try on your next campaign. If those weren’t your style, we have seven more for you — for a total of 16 testing opportunities.

Today, we’ll be reviewing opportunities in your body messaging, calls-to-action and design.

Email Body Messaging Testing

Testing Opportunity #10. Messaging tone

In this test, from the Web clinic, “Email Copywriting Clinic: Live, on-the-spot analysis of how to improve real-world email campaigns,” researchers used two treatments to increase total lead inquiries from visitors who abandoned the free trial sign-up process.

The first treatment was designed based on the hypothesis that visitors did not convert because the copy didn’t engage them enough, so it took a direct response tone. The second treatment was based on the hypothesis that visitors experience high levels of anxiety over potential high-pressure salespeople or spam phone calls. This treatment took a more “customer service”-oriented tone.

By serving the needs of prospects, Treatment #2 increased the lead inquiry rate by 349%. When emailing people who abandon your trial or purchase process, remember they could be dropping off for a particular reason that a simple email could address. By testing different tones and messaging, you can reduce their anxiety and close the deal.


Testing Opportunity #11. Email length

When testing the length of your emails, particularly ones linking to pages where conversion happens, keep this in mind: The goal of an email is a click — the goal of the landing page is a sale. 

If you give all your value and information away in the email, where is the incentive for readers to click? This can even apply to newsletters.

The team at our sister research company, MarketingSherpa, ran a test on its Best of the Week newsletter at the end of last year. The newsletter features the three most-shared articles and blog posts from the previous week. Each post had a summary previewing what readers would find. However, we wondered if the added length created friction, leading readers to click only on the one or two posts they could see above the fold.

The treatment eliminated the summary, allowing all three posts to be seen quickly when opening the email.

The result? Both total and unique clickthrough went up. When we ran the test, I fully expected total clickthrough to increase. However, the 24% increase in unique clickthrough was a great bonus.
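When judging a lift like this, it helps to check that the difference is bigger than chance. A minimal sketch of a standard two-proportion z-test, using hypothetical send and click counts (not the actual MarketingSherpa numbers):

```python
import math

def two_proportion_z(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test comparing two clickthrough rates."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pool the rates under the null hypothesis of "no difference"
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_b - p_a) / se

# Hypothetical: 10,000 sends per arm; control gets 500 unique clicks,
# treatment gets 620 -- a 24% relative lift, like the result above.
z = two_proportion_z(500, 10_000, 620, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 95% level
```

If your testing tool reports significance for you, this is roughly what it is computing under the hood for clickthrough comparisons.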


Call-to-action Testing

Testing Opportunity #12. The right CTA ask

The objective of an email is to get the click; let your landing page do the selling.

You want to ensure you’re asking only for the next step in the path to conversion.

Here are a few alternatives to test:

· “Learn more,” not “Buy now”

· “Browse [product],” not “Shop now”

· “See event agenda,” not “Purchase ticket”

· “Explore plans,” not “Subscribe now”

· “Get pricing,” not “Buy online”

Using your email CTA to ask for a macro-yes could turn off subscribers. While they might not be ready to “shop now,” they could be willing to “browse the latest collection.”


Testing Opportunity #13. A single ask

Some marketers like to add a number of calls-to-action throughout their emails. At MECLABS, we refer to this as “conflated objectives.” You throw out as many options as you can, hoping subscribers like at least one enough to click.

However, when you start asking for too many things, customers get confused and don’t know what they should be doing. By deciding on a single goal for your email, subscribers know exactly what’s being asked of them.

If you have multiple calls-to-action in your email, try dialing it back to the one main CTA you want them to take.

If you absolutely need more than one call-to-action, use design to make clear which action is the main one. If you offer both a trial download and a quote request, decide which one is the objective of the email. From there, use a button to highlight the main CTA and a simple hyperlink for the secondary CTA.

Design Testing

Testing Opportunity #14. Responsive email design

Some might assume that responsive design will hands down beat non-responsive every time. However, best practices don’t work for everyone.

While CareerBuilder saw a 24% increase in CTR with its responsive design, a test on MECLABS Institute’s managing director and CEO Flint McGlaughlin’s email newsletter, FlintsNotes, ended with interesting results. While total clickthrough decreased with the responsive design, unique clickthrough resulted in no significant difference. Additionally, the responsive design saw a higher read rate.

How will your audience respond? You’ll only know by testing.


Testing Opportunity #15. Letter-style design versus promotional-style design

Designing emails can be fun — from finding the right graphics or images, to choosing different layouts. However, does your audience want those design extras? Or do they just get in their way?

A MECLABS Institute Research Partner tested a typical promotional-styled email against a simpler letter-styled email to see which design resulted in more engagement.

The letter-style design generated a 181% increase in conversion over the standard, promotional-style email.

Testing Opportunity #16. Imagery and graphics

Once you know your audience more actively engages with emails that use graphics or imagery, the next step is to find the most effective imagery or graphics.

For example, if you sell software, does your audience respond better to an image of someone using a computer or of detailed screenshots of your product in action?

Or is a static image even the best choice?

When Dell set out to market a new laptop that could convert into a tablet, an image just didn’t seem to do the transformation justice. The Dell team came up with the idea of utilizing a GIF to illustrate the morphing computer.

“I think [GIFs] are a good way for people to communicate what their main story is very quickly,” said David Sierk, Email Strategy and Analytics, Dell. “People are visual learners.”

When compared to quarterly benchmarks, Dell’s first GIF-centric email resulted in these lifts:

· 6% increase in open rate

· 42% increase in clickthrough rate

· 103% increase in conversion rate

· 109% increase in revenue

And the list could go on

The list could go on for a good long while, but hopefully these 16 opportunities can get you started. And that’s the important part: getting started. You don’t have to start big, and you don’t have to redesign every aspect of your email in one go. You can design a series of experiments to gradually build your customer theory and knowledge.

I’d also love to know what email tests have worked well and provided you with insight about your audiences. Leave your testing ideas in the comments below. And if you find success with any of these testing opportunities, let us know at

You might also like

MECLABS Email Messaging Online Course [From MECLABS Institute, parent company of MarketingExperiments]

Email Marketing: Template test drives double-digit increases for Dell [MarketingSherpa case study]

Marketing Research Chart: How do customers want to communicate? [MarketingSherpa chart]

Tips for Incorporating GIFs in Email [From MarketingSherpa Blog]

Email Marketing: 9 testing opportunities to generate big wins on your next email test [Part 1]

April 28th, 2016

Email is a great medium for testing. It’s low cost and typically requires fewer resources than website testing. It’s also near the beginning of your funnel, where you can impact a large portion of your customer base.

Sometimes it can be hard to think of new testing strategies, so we’ve pulled from 20 years of research and testing to provide you with a launching pad of ideas to help create your next test.

In this post and next Monday’s, we’re going to review 16 testing opportunities across seven email campaign elements.

To start you out, let’s look at nine opportunities that don’t even require you to change the copy in your next email.


Subject Line Testing

Testing Opportunity #1. The sequence of your message

Recipients of your email might give your subject line just a few words to draw them in, so the order of your message plays an important role.

In the MarketingExperiments Web clinic “The Power of the Properly Sequenced Subject Line: Improve email performance by using the right words, in the right order,” the team reviewed several tests that demonstrate the importance of thought sequence in your subject lines.

Try testing point-first messaging: start with what the recipient will get from opening the email.

Read more…

A/B Testing: Cut through your KPIs by knowing your ultimate goal

February 4th, 2016

Marketers often struggle to know what metrics to use when trying to decide on the positioning of their marketing collateral. This can lead to many problems. At MECLABS Institute, the parent company of MarketingExperiments, we have run experiments and tests for over 20 years to help answer this question.

Customers take many actions when moving through the funnel, but what is the ultimate goal the company is trying to achieve with their marketing collateral? By answering this question, companies can best determine what the most important KPI is to measure.

To illustrate this point, let’s walk through an experiment that was run on metrics. Reviewing it shows how important a clearly defined ultimate goal is for your marketing collateral.


The Experiment:

Background: A large newspaper company offering various subscription options.

Goal: To determine the optimal regular price point after the introductory discounted offer rate.

Research Question: Which price point will generate the greatest financial return?

Test Design: A/B split test


Subscription services often offer a discounted introductory rate for new subscribers. This gives potential subscribers a low-risk opportunity to try out the service for a period of time before the cost defaults to the regular full price. In this test, The Boston Globe team hoped to determine the optimal price point for a monthly subscription after the introductory offer rate expired. 
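For a test like this, the winning price point is not necessarily the one with the higher conversion rate; it is the one that generates the most revenue. A simple sketch of that comparison, with entirely made-up traffic, subscription, and price numbers (not The Boston Globe’s actual data):

```python
def revenue_per_visitor(visitors, subscriptions, monthly_price, avg_months=12):
    """Expected revenue per visitor at a given regular price point,
    assuming subscribers stay for avg_months on average."""
    conversion_rate = subscriptions / visitors
    return conversion_rate * monthly_price * avg_months

# Hypothetical: the lower price converts better (2.0% vs. 1.6%),
# but the higher price can still win on total financial return.
low = revenue_per_visitor(50_000, 1_000, 3.99)
high = revenue_per_visitor(50_000, 800, 5.99)
print(f"low price: ${low:.2f}/visitor, high price: ${high:.2f}/visitor")
```

This is why the research question is framed as “greatest financial return” rather than “highest conversion rate” — the two can point to different winners.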

Read more…


This 1960s Statistician Can Teach You Everything You Need to Know About the Fundamentals of A/B Testing

January 21st, 2016

I did a training on selling training for the sales team today. It was what Millennials call “meta.”

I was talking about how our training uses scientifically valid experiments to back everything we say in our training rather than best practices, anecdotal case studies or just “expert advice.”

The question naturally arose: “What do we mean when we say ‘scientifically valid experiments’?”

When I answered the question in the meeting, I immediately thought it would be a good idea for a blog post. So, with that said, here’s the answer:

In short, it means that we use the scientific method to validate every piece of knowledge we transfer in the training (and also in our Web clinics and on this blog).

I found myself trying to explain what I learned in high school about the scientific method, and while I was able (I think) to get the basic gist across, I don’t think I did it justice.

Fortunately, after doing a little searching online, I found this guy.

His name is J. Stuart Hunter and he is one of the most influential statisticians of the last half of the twentieth century.

Fortunately, back in the 60s, he recorded some rad videos around experimental designs in a business context. If you can extrapolate a little bit from the industrial context and apply this to a marketing context, it should be everything you need to know about the scientific method, or “what we mean when we say ‘scientifically valid.’”



Read more…

The Importance of Testing: How one test applied to two email sends resulted in different audience responses

November 23rd, 2015

At MarketingExperiments, sister company of MarketingSherpa and part of parent company MECLABS, we believe in testing. Our Web clinics are a testament to this belief — every month we share research that is designed to help marketers do their jobs better.

This culture of testing encouraged me to run my own test.

First, I ran the test on a MarketingSherpa send. After the results of that test came back, the same test was then applied to a MarketingExperiments newsletter.

By running virtually the same test twice for two different audiences, not only did I learn more about the preferences of our readers, but I also learned how incredibly important it is to test, even when you are sure you know what the results will be.


The MarketingSherpa test

As the copy editor at MECLABS, I get to see all of the copy produced by both MarketingExperiments and MarketingSherpa. One of my responsibilities is overseeing the email newsletter sends. Every Monday, MarketingSherpa sends out a Best of the Week newsletter which features the most popular articles of the previous week and introduces the new book for the MarketingSherpa Book Giveaway.

The copy in these newsletters was extensive. Every article listed had a full summary which, as a reader, I imagined would seem overwhelming when opening this email on a Monday morning.

Read more…


How Variance Testing Increased Conversion 45% for Extra Space Storage

November 12th, 2015

When it comes to testing, A/B testing typically steals the spotlight, casting its sister procedure, variance testing, in the shadows. However, according to Emily Emmer, Senior Interactive Marketing Manager, Extra Space Storage, that’s a mistake.

At MarketingSherpa MarketingExperiments Web Optimization Summit 2014, Emily presented on how her team was able to utilize variance testing to transform Extra Space Storage’s Wild West testing culture into a wildly successful testing environment.

Before the team conducted variance testing, the company’s testing environment was structured like a free-for-all. There were few, if any, set rules in place, and, according to Emily, the person with the highest title and the loudest voice typically had their test implemented. All of this changed after the Extra Space Storage team ran some variance tests.

Variance testing measures two identical Web experiences to determine a site or page’s natural variability. This procedure generally constructs the rules for subsequent A/B tests to follow.
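One way to picture variance testing is as an A/A test: split traffic between two identical experiences and watch how much the measured “lift” wanders on its own. A minimal simulation with made-up traffic and conversion numbers (not Extra Space Storage’s data):

```python
import random

def aa_test_variability(visitors_per_arm=5_000, true_rate=0.05,
                        runs=500, seed=42):
    """Simulate repeated A/A tests. Both arms are identical, so any
    observed 'lift' is pure noise; the spread of those lifts shows the
    smallest real difference later A/B tests can reliably detect."""
    rng = random.Random(seed)
    lifts = []
    for _ in range(runs):
        conv_a = sum(rng.random() < true_rate for _ in range(visitors_per_arm))
        conv_b = sum(rng.random() < true_rate for _ in range(visitors_per_arm))
        lifts.append((conv_b - conv_a) / conv_a)
    lifts.sort()
    # 95% of noise-only lifts fall inside this band; treat smaller
    # A/B deltas as indistinguishable from natural variability.
    return lifts[int(runs * 0.025)], lifts[int(runs * 0.975)]

low, high = aa_test_variability()
print(f"95% of noise-only lifts fall between {low:.1%} and {high:.1%}")
```

If an A/B test’s lift sits inside the band an A/A test produces, it is probably noise, not a winner — which is exactly the kind of rule variance testing hands to subsequent A/B tests.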

By focusing on variance testing and translating the results from this procedure into rules for A/B testing, Extra Space Storage achieved a 45% increase in conversion rate from the previous year. Watch the below excerpt to learn the results of the team’s test, the rules they developed and Emily’s advice on when to start variance testing and how to implement it.

Read more…