Archive

Posts Tagged ‘email testing’

Email Marketing: 9 testing opportunities to generate big wins on your next email test [Part 1]

April 28th, 2016

Email is a great medium for testing. It’s low cost and typically requires fewer resources than website testing. It’s also near the beginning of your funnel, where you can impact a large portion of your customer base.

Sometimes it can be hard to think of new testing strategies, so we’ve pulled from 20 years of research and testing to give you a launching pad of ideas for your next test.

In this post and next Monday’s, we’re going to review 16 testing opportunities across seven email campaign elements.

To start you out, let’s look at nine opportunities that don’t even require you to change the body copy in your next email.

 

Subject Line Testing

Testing Opportunity #1. The sequence of your message

Recipients of your email might give your subject line just a few words to draw them in, so the order of your message plays an important role.

In the MarketingExperiments Web clinic “The Power of the Properly Sequenced Subject Line: Improve email performance by using the right words, in the right order,” the team reviewed several tests that demonstrate the importance of thought sequence in your subject lines.

Try testing point-first messaging. Start with what the recipient will get out of your message and the email.

By reordering the thought sequence and adding a few more tangible details, the subject line test below saw a 10% relative increase in opens, and clickthrough increased by 15%.
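If it helps to see how a “relative increase” like that is calculated, here is a minimal sketch in Python. The rates are made-up placeholders, not the actual results from this test; the point is that relative lift is measured against the control’s own baseline rather than in raw percentage points.

```python
# A minimal sketch of how a relative lift is calculated.
# The rates below are hypothetical, not the actual results from this test.

def relative_lift(control_rate: float, treatment_rate: float) -> float:
    """Return the treatment's change relative to the control's baseline."""
    return (treatment_rate - control_rate) / control_rate

# Example: a 20% open rate rising to 22% is a 10% relative lift,
# even though the absolute gain is only two percentage points.
print(f"{relative_lift(0.20, 0.22):.0%}")  # -> 10%
```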

 

Testing Opportunities #2 and #3. Internal issues and external events to build relevance

In the Web clinic, “Subject Lines that Convert: A review of 100+ successful subject lines reveals what motivates people to open (or delete) an email,” the team identified two ways to immediately connect with subscribers through the subject line: an internal issue or an external event.

First, let’s look at an example of internal issues.

From This — Subject Line: [Company Name]: A New Way to Order

To This — Subject Line: [Company Name]: Now only 2-meal minimum order

The first subject line is vague and doesn’t clearly connect to why the subscriber should care. The second subject line connects to an issue that some customers might have felt internally in the past. This built relevance to their wants and needs, enticed them to open the email and resulted in a 25.3% lift.

However, the greater impact is seen in clickthrough. Because the subject line clearly communicated a new solution to a known internal problem, subscribers opened the email with more interest and motivation, resulting in a 196% increase in clicks.

Check out this list of potential internal issues you can explore in your subject line testing:

  • Limited resources (time, money and help)
  • Unmet expectations (work and family)
  • Deficient skillsets (inability or inadequacy)
  • Operational difficulties (routine usability)
  • Fragmented perspectives (ignorance or misunderstanding)

Next, let’s review an external event subject line test.

From This — Subject Line: It’s easy to access your [Bank Name] Accounts Online. Sign On Now

To This — Subject Line: [Name], Your Account Information Is Ready To View

The first subject line is a general statement. It’s easy to access. Okay. You want me to sign on. Okay. But why? Why should I sign on now? It hasn’t connected an event with why I should take that step.

However, the second subject line states that my information is now ready to view. Something has occurred. It gives me a reason to sign on, and that reason increased opens by 92.2% over the first subject line.

Here are a few other types of external events to consider when trying to build relevance:

  • An action or behavior
  • A conversation
  • A single exchange (completed or abandoned)
  • A cancellation (membership, contract or recurring transactions)
  • A service interaction

 

Preheader Copy Testing

Testing Opportunity #4. Value copy in preheader

One area of the email that is often overlooked is the preheader text. However, many inboxes, both mobile and desktop, allow subscribers to see these extra 35 or so characters before opening your email. And what message are you sending with the text, “If you have trouble displaying this email, view it as a webpage”? Should I expect frequent problems with your email? Should I bother opening it if I do?

This space is an opportunity to add more value to your email and entice subscribers to open.

According to Justine Jordan, Marketing Director, Litmus, at a past MarketingSherpa Email Summit, you want the preheader copy to “tie into the subject line, bringing [readers] in and encouraging the click.”

Justine provided a few good examples she has come across.

 

“From” Field Testing

Testing Opportunity #5. Company versus person’s name

If you’ve been a long-time reader of MarketingExperiments, you’ve probably heard us say, “People don’t buy from companies; people buy from people.” This could be a great test for your next email send. If your emails normally come from your company name, you might try humanizing your email by using the name of a prominent figure in your organization.

 

Testing Opportunity #6. Executive versus customer-involved employee

Once you determine that sending emails from a person works better, it could be worth a test to find the right person. While subscribers might recognize your CEO, they also know the chances of the CEO being directly involved in the email are slim. A lower-level employee with a title that connects to the subject of your email could be viewed more favorably, because subscribers might see that person as more real and involved. The email won’t feel fake.

 

Email Send Time Testing

In the Web clinic, “When Should You Send An Email? How one of the largest banks in the world discovered when to send its emails,” the MECLABS team revealed research about email send time based on multiple experiments in the MECLABS research library. The clinic detailed three testing opportunities around email send time.

 

Testing Opportunity #7. The time of day

Timing can greatly impact not only whether your emails are opened, but also the engagement level you achieve beyond the open.

Early morning sends could get subscribers to open on their commutes, but will they take action on a mobile device? Or would an afternoon send get lost in a crowded inbox?

A large financial institution wanted to increase the number of completed applications it received from an email. To do so, it tested two times of day: 3 a.m. versus 3 p.m. The 3 p.m. send time saw a 13.5% increase in clickthrough.
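If you want to run a similar send-time test, the key is that each cohort should be a random slice of the same list, so the only meaningful difference between them is when the email arrives. Here is a minimal sketch of that split in Python; the cohort labels mirror the test above, but the function and sample addresses are hypothetical.

```python
# A minimal sketch of randomly splitting a list into two send-time cohorts.
# The 3 a.m. / 3 p.m. labels mirror the test above; everything else is hypothetical.
import random

def split_for_send_time_test(subscribers, seed=42):
    """Randomly assign each subscriber to one of two send-time cohorts."""
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    shuffled = list(subscribers)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {"3 a.m. send": shuffled[:midpoint], "3 p.m. send": shuffled[midpoint:]}

cohorts = split_for_send_time_test(
    ["a@example.com", "b@example.com", "c@example.com", "d@example.com"])
for name, group in cohorts.items():
    print(name, len(group))
```

Random assignment is what keeps the comparison fair: if one cohort skewed toward your most engaged subscribers, send time would no longer be the only variable.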

 

Testing Opportunity #8. The day of the week

The above time-of-day experiment also tested all seven days of the week. While Tuesday has often been cited as a good day to send emails, it performed the worst. The best-performing day was Sunday, with a 23.2% lift in clickthrough rate over Tuesday.

Remember, there is not a magical best day or time. Test and let your audience tell you which day (and time) works best for them. Even when testing the same group of people, different products or services could change the day or time the group is likely to respond. What works for B2C might not work for B2B. And what works for grocery stores might not work for media streaming brands.

 

Testing Opportunity #9. Frequency

The clinic identified a third opportunity in the email timing area: frequency. A large ecommerce company wanted to find the optimal send frequency for a portion of its list. For the company, this meant the frequency that would generate the most revenue without increasing the unsubscribe rate.

The team segmented the group into seven email frequencies:

  • 21 days
  • 14 days
  • 10 days
  • 7 days
  • 5 days
  • 3 days
  • 2 days

The team determined that, at a once-a-week cadence, the company was leaving significant revenue on the table: it could be making roughly three times as much by sending every other day, without negatively affecting unsubscribes or the open rate.

That’s a huge potential lift in projected monthly revenue, and definitely worth a test for your list.
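If you want to sketch the decision rule behind a frequency test like this, it boils down to: of the cadences whose unsubscribe rate stays at or below your current baseline, pick the one that generates the most revenue. Here is a rough illustration in Python with entirely hypothetical figures, not the company’s actual results.

```python
# A rough sketch of the decision rule: choose the cadence that maximizes
# revenue without pushing unsubscribes above the current (weekly) baseline.
# All figures below are hypothetical placeholders.

cohort_results = {
    # days between sends: (revenue per subscriber, unsubscribe rate)
    21: (0.80, 0.002),
    14: (1.10, 0.002),
    10: (1.60, 0.003),
    7:  (2.00, 0.003),   # current weekly cadence, used as the baseline
    5:  (3.90, 0.003),
    3:  (5.20, 0.004),
    2:  (6.10, 0.003),
}

baseline_unsub_rate = cohort_results[7][1]

eligible = {days: revenue for days, (revenue, unsub) in cohort_results.items()
            if unsub <= baseline_unsub_rate}
best_cadence = max(eligible, key=eligible.get)
print(f"Send every {best_cadence} days "
      f"(revenue per subscriber: {eligible[best_cadence]:.2f})")
```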

 

Stay tuned

Check back on Monday for the second portion of our email testing opportunities compilation, when we review experiment ideas around your design, body copy and calls-to-action.

 

You can follow Selena Blue, Manager of Editorial Content, MECLABS Institute on Twitter at @SelenaLBlue.

 

 You may also like

Email Marketing Chart: How send frequency impacts read rate [MarketingSherpa Chart]

Collaborative A/B Testing: Consumer Reports increases revenue per donation 32% [MarketingSherpa Case Study]

Email Marketing: Preheader testing generates 30% higher newsletter open rate for trade journal

 

Email Marketing: 5 test ideas for personalizing your email campaigns

September 3rd, 2015

Personalization is not new to email marketing, but has it lost some of its appeal with marketers?

Only 36% of marketers said they dynamically personalize email content using first names in subject lines and geo-location, according to the MarketingSherpa 2013 Email Marketing Benchmark Report. The report also revealed that only 37% of marketers segment email campaigns based on behavior.

However, marketers from various industries have seen incredible success with personalization. I dove into the library of MarketingSherpa, MarketingExperiments’ sister company, to find out how marketers have used both tried-and-true personalization tactics and innovative, tech-savvy strategies to better engage their customers and email audience.

No tactic or strategy is foolproof, so we suggest using these campaign tactics as testing ideas to see what works with your audience when it comes to email personalization.

 

Idea #1. Turn your email into a personal note, not a promotional email

As Flint McGlaughlin, Managing Director, MECLABS Institute, says, “People don’t buy from websites, people buy from people.”

The same applies to emails. As we saw in a recent MarketingExperiments’ Web clinic, “Personalized Messaging Tested: How little changes to an email send led to a 380% change in response rate,” when inviting your customers to take an action or attend an event, sending the email from a real person on your team can have a huge impact on the results of your campaign.

Read more…

Email Marketing: What assets should marketers be using to design better emails?

June 1st, 2015

Data is officially everywhere. It’s even infiltrating the design of emails — and for good reason.

“The more you know about your audience, obviously the better you can tailor an email design to someone,” Justine Jordan, Marketing Director, Litmus, said.

Justine sat down with Courtney Eckerle, Manager of Editorial Content, MarketingSherpa (sister company of MarketingExperiments), at MarketingSherpa Email Summit 2015, to discuss what tools marketers can access to better their email creatives.

When asked about the biggest asset email marketers have when designing their next email, Justine answered: data.

“Data can be a really powerful tool for helping a designer decide how to layout their campaigns,” she said.

Watch the whole interview here:

 

How can data make design better?

In the interview, Justine shared a few types of data that can benefit email designers:

  • What people have looked at in the past
  • What kind of email services people are opening up
  • What type of content has resonated with clients in the past

When asked how one of these could be applied to campaigns, Justine talked about technical compatibility. For instance, animated GIFs don’t work properly in Outlook 2007. By using past data, you can know beforehand whether a portion of your readers use that email client. If they do, and you use a GIF, your email campaign won’t be as effective as it would have been if you had segmented that audience and sent them a more Outlook 2007-friendly design.
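As a rough illustration of that idea, here is a minimal sketch in Python of segmenting a send by a subscriber’s most recent email client, so anyone who opens in a problem client gets a static fallback instead of the animated version. The field names and sample data are hypothetical, not a real ESP’s API.

```python
# A minimal sketch of segmenting by past email-client data. Field names and
# sample data are hypothetical; the point is simply to route subscribers who
# open in a problem client to a static-image fallback version.

subscribers = [
    {"email": "a@example.com", "last_open_client": "Outlook 2007"},
    {"email": "b@example.com", "last_open_client": "Gmail"},
    {"email": "c@example.com", "last_open_client": "Apple Mail"},
    {"email": "d@example.com", "last_open_client": "Outlook 2007"},
]

PROBLEM_CLIENTS = {"Outlook 2007"}  # clients known not to animate GIFs

static_version = [s["email"] for s in subscribers
                  if s["last_open_client"] in PROBLEM_CLIENTS]
gif_version = [s["email"] for s in subscribers
               if s["last_open_client"] not in PROBLEM_CLIENTS]

print("Static fallback:", static_version)
print("Animated GIF:", gif_version)
```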

Read more…

A/B Testing: How to improve already effective marketing (and win a ticket to Email Summit in Vegas)

January 5th, 2015

Editor’s Note: This subject line contest is no longer accepting entries. Check out “The Writer’s Dilemma: How to know which marketing copy will really be most effective” to see which entry won, why it won and what you can learn from that to further improve your own marketing.

This blog post ends with an opportunity for you to win a stay at the ARIA Resort & Casino in Las Vegas and a ticket to Email Summit, but it begins with an essential question for marketers:

How can you improve already successful marketing, advertising, websites and copywriting?

Today’s MarketingExperiments Blog post is going to be unique. Not only are we going to teach you how to address this challenge, we’re also going to offer an example to help drive home the lesson. We’re going to cover a lot of ground today, so let’s dive in.

 

Give the people what they want …

Some copy and design is so bad, the fixes are obvious. Maybe you shouldn’t insult the customer in the headline. Maybe you should update the website that still uses a dot matrix font.

But when you’re already doing well, how can you continue to improve?

I don’t have the answer for you, but I’ll tell you who does — your customers.

There are many tricks, gimmicks and types of technology you can use in marketing, but when you strip away all the hype and rhetoric, successful marketing is pretty straightforward — clearly communicate the value your offer provides to people who will pay you for that value.

Easier said than done, of course.

How do you determine what customers want and the best way to deliver it to them?

Well, there are many ways to learn from customers, such as focus groups, surveys and social listening.

While there is value in asking people what they want, there is also a major challenge in it.

According to research from Dr. Noah J. Goldstein, Associate Professor of Management and Organizations, UCLA Anderson School of Management, “People’s ability to understand the factors that affect their behavior is surprisingly poor.”

Or, as Malcolm Gladwell more glibly puts it when referring to coffee choices, “The mind knows not what the tongue wants.”

This is not to say that opinion-based customer preference research is bad. It can be helpful. However, it should be the beginning of your quest, not the end.

 

… by seeing what they actually do

You can use what you learn from opinion-based research to create a hypothesis about what customers want, and then run an experiment to see how they actually behave in real-world customer interactions with your product, marketing messages and website.

The technique that powers this kind of research is often known as A/B testing, split testing, landing page optimization or website optimization. If you are testing more than one thing at a time, it may also be referred to as multivariate testing.

To offer a simple example, you might assume that customers buy your product because it tastes great and because it’s less filling. Keeping these two assumptions in mind, you could create two landing pages: one with a headline that promotes the taste (treatment A) and another that mentions the low carbs (treatment B). You then send half the traffic that visits that URL to each version and see which performs better.
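If it helps to picture the mechanics, here is a minimal sketch in Python of that 50/50 assignment: each visitor is bucketed into treatment A or B, and conversions are tallied per treatment. The visitor IDs and outcomes are made up; this illustrates the concept, not any particular testing tool.

```python
# A minimal sketch of a 50/50 split test. Visitor IDs and outcomes are
# hypothetical; hashing keeps a returning visitor in the same treatment.
import hashlib

def assign_treatment(visitor_id: str) -> str:
    """Deterministically bucket a visitor into treatment A or B."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

visits = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

def record_visit(visitor_id: str, converted: bool) -> None:
    arm = assign_treatment(visitor_id)
    visits[arm] += 1
    conversions[arm] += int(converted)

# Toy traffic with made-up outcomes
for visitor_id, converted in [("u1", True), ("u2", False), ("u3", True), ("u4", False)]:
    record_visit(visitor_id, converted)

for arm in ("A", "B"):
    rate = conversions[arm] / visits[arm] if visits[arm] else 0.0
    print(f"Treatment {arm}: {visits[arm]} visits, conversion rate {rate:.0%}")
```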

Here is a simple visual that Joey Taravella, Content Writer, MECLABS, created to illustrate this concept:

 

That’s just one test. To really learn about your customers, you must continue the process and create a testing-optimization cycle in your organization — continue to run A/B tests, record the findings, learn from them, create more hypotheses and test again based on these hypotheses.

This is true marketing experimentation, and it helps you build your theory of the customer.

 

Try your hand at A/B testing for a chance to win

Now that you have a basic understanding of marketing experimentation (there is also more information in the “You might also like” section of this blog post that you may find helpful), let’s engage in a real example to help drive home these lessons in a way you can apply to your own marketing challenges.

To help you take your marketing to the next level, The Moz Blog and MarketingExperiments Blog have joined forces to run a unique marketing experimentation contest.

In this blog post, we’re presenting you with a real challenge from a real organization and asking you to write a subject line that we’ll test with real customers. It’s simple; just leave your subject line as a comment in this blog post.

We’re going to pick three subject lines from The Moz Blog and three from the MarketingExperiments Blog and run a test with this organization’s customers.

Whoever writes the best performing subject line will win a stay at the ARIA Resort in Las Vegas as well as a two-day ticket to MarketingSherpa Email Summit 2015 to help them gain lessons to further improve their marketing.

Sound good? OK, let’s dive in and tell you about your client:

Read more…

4 Threats that Make Email Testing Dangerous and How a Major Retailer Overcame Them

October 2nd, 2014

To test emails, you just send out two versions of the same email. The one with the most opens is the best one, right?

Wrong.

“There are way too many validity threats that can affect outcomes,” explained Matthew Hertzman, Senior Research Manager, MECLABS.

A validity threat is anything that can cause researchers to draw a wrong conclusion. Conducting marketing tests without taking them into account can easily result in costly marketing mistakes.

In fact, it’s far more dangerous than not testing at all.

“Those who neglect to test know the risk they’re taking and market their changes cautiously and with healthy trepidation,” explains Flint McGlaughlin, Managing Director and CEO, MECLABS, in his Online Testing Course. “Those who conduct invalid tests are blind to the risk they take and make their changes boldly and with an unhealthy sense of confidence.”

These are the validity threats that are most likely to impact marketing tests:

  • Instrumentation effects — The effect on a test variable caused by an external variable, which is associated with a change in the measurement instrument. In essence, how your software platform can skew results.
    • An example: 10,000 emails don’t get delivered because of a server malfunction.
  • History effects — The effect on a test variable made by an extraneous variable associated with the passing of time. In essence, how an event can affect test outcomes.
    • An example: There’s unexpected publicity around the product at the exact time you’re running the test.
  • Selection effects — An effect on a test variable by extraneous variables associated with the different types of subjects not being evenly distributed between treatments. In essence, there’s a fresh source of traffic that skews results.
    • An example: Another division runs a pay-per-click ad that directs traffic to your email’s landing page at the same time you’re running your test.
  • Sampling distortion effects — Failure to collect a sufficient sample size. Not enough people have participated in the test to provide a valid result. In essence, the more data you collect, the better.
    • An example: Determining that a test is valid based on 100 responses when you have a list with 100,000 contacts.
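On that last point, a back-of-the-envelope estimate can tell you whether 100 responses is anywhere close to enough. Here is a rough sketch in Python using a standard two-proportion sample-size approximation at 95% confidence and 80% power; the baseline rate and minimum detectable lift are hypothetical, and this is an illustration rather than MECLABS’s actual validity methodology.

```python
# A rough sketch of estimating how many recipients each treatment needs,
# using a standard two-proportion sample-size approximation.
# z-values assume 95% confidence (1.96) and 80% power (0.84).
import math

def sample_size_per_arm(baseline_rate, min_detectable_lift,
                        z_alpha=1.96, z_beta=0.84):
    """Approximate recipients needed per treatment arm."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)
    return math.ceil(numerator / (p2 - p1) ** 2)

# Example: a 3% clickthrough baseline and a 20% relative lift you hope to detect
print(sample_size_per_arm(0.03, 0.20))  # roughly 14,000 per arm, far more than 100
```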

Ecommerce: 3 landing page elements to help increase product emphasis

July 14th, 2014

The elements on a product page are some of the most underutilized tools a marketer has at their disposal. I say this because, let’s be honest, I’d wager few folks think of the design elements on a product page with a “tool mindset.”

But in some respects, that’s exactly what they are, and ultimately, they shape the kind of customer experience you build in ecommerce.

In this MarketingExperiments Blog post, I wanted to share three elements you can tweak to help emphasize important products and maybe even increase your revenue along the way.

 

Element #1. Size 


Here’s an excellent example of how resizing a product image can help you place emphasis on it.

In the control, there were three products on the right sidebar and they were all equally weighted – that is a problem.

Nothing really stood out, which made drawing a clear conclusion for customers a little difficult.

In the treatment, instead of having three separate products on the page, the marketers hypothesized that a single product with a dropdown selection for a computer operating system would increase conversion.

Their hypothesis was right – the results from the tests included a 24% increase in revenue.

 

Element #2. Color 


Here is another example of using these elements, and this one is in an email. Pay close attention to it, because it’s easy to assume that products are trapped on pages in storefronts.

That perception is far from reality.

According to the MarketingSherpa Ecommerce Benchmark Study (download a complimentary copy at that link), email is one of the biggest drivers of ecommerce traffic.

In the treatment, the number of products was reduced, and bright red copy was used for supporting emphasis. I’m not fluent in Italian, but in any language, that is a good thing.

As you can see, color emphasis and copy now drive this email. From the changes in the treatment, I can intuitively understand the desired outcome:

  • I can order something at a great price
  • I get something free (gratis) as a thank-you gift
  • It only takes three easy steps to order

The treatment delivered a 24% increase in revenue, showing that the right changes can have a powerful impact.

Read more…