
Posts Tagged ‘email testing’

Email Marketing: What assets should marketers be using to design better emails?

June 1st, 2015

Data is officially everywhere. It’s even infiltrating the design of emails — and for good reason.

“The more you know about your audience, obviously the better you can tailor an email design to someone,” Justine Jordan, Marketing Director, Litmus, said.

Justine sat down with Courtney Eckerle, Manager of Editorial Content, MarketingSherpa (sister company of MarketingExperiments), at MarketingSherpa Email Summit 2015, to discuss what tools marketers can access to better their email creatives.

When asked what is the biggest asset email marketers have when designing their next email, Justine answered data.

“Data can be a really powerful tool for helping a designer decide how to lay out their campaigns,” she said.

Watch the whole interview here:

 

How can data make design better?

In the interview, Justine shared a few types of data that can benefit email designers:

  • What people have looked at in the past
  • Which email clients people use to open their messages
  • What type of content has resonated with clients in the past

When asked how one of these could be applied to campaigns, Justine talked about technical compatibility. For instance, animated GIFs don’t play properly in Outlook 2007, which displays only the first frame. By using past data, you can know beforehand whether a portion of your readers use that email client. If they do, and you send a GIF anyway, your campaign won’t be as effective as it would have been had you segmented that audience and sent them an Outlook 2007-friendly design.
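As a rough illustration of that kind of segmentation, the sketch below splits a recipient list by email client before choosing a design. The field names, example addresses, and the set of static-only clients are assumptions for illustration, not data from the interview:

```python
# Hypothetical sketch: split a recipient list by email client before
# sending a GIF-heavy design. Field names and client strings are
# illustrative assumptions, not any real ESP's API.

recipients = [
    {"email": "a@example.com", "client": "Outlook 2007"},
    {"email": "b@example.com", "client": "Apple Mail"},
]

# Clients known to show only the first frame of an animated GIF
STATIC_ONLY_CLIENTS = {"Outlook 2007", "Outlook 2010", "Outlook 2013"}

gif_segment = [r for r in recipients if r["client"] not in STATIC_ONLY_CLIENTS]
static_segment = [r for r in recipients if r["client"] in STATIC_ONLY_CLIENTS]
```

Each segment then gets a design it can actually render — the static segment would receive a flat image in place of the animation.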

Read more…

A/B Testing: How to improve already effective marketing (and win a ticket to Email Summit in Vegas)

January 5th, 2015 183 comments

Editor’s Note: This subject line contest is no longer accepting entries. Check out “The Writer’s Dilemma: How to know which marketing copy will really be most effective” to see which entry won, why it won and what you can learn from that to further improve your own marketing.

This blog post ends with an opportunity for you to win a stay at the ARIA Resort & Casino in Las Vegas and a ticket to Email Summit, but it begins with an essential question for marketers:

How can you improve already successful marketing, advertising, websites and copywriting?

Today’s MarketingExperiments blog post is going to be unique. Not only are we going to teach you how to address this challenge, we’re also going to offer an example to help drive home the lesson. We’re going to cover a lot of ground today, so let’s dive in.

 

Give the people what they want …

Some copy and design is so bad, the fixes are obvious. Maybe you shouldn’t insult the customer in the headline. Maybe you should update the website that still uses a dot matrix font.

But when you’re already doing well, how can you continue to improve?

I don’t have the answer for you, but I’ll tell you who does — your customers.

There are many tricks, gimmicks and types of technology you can use in marketing, but when you strip away all the hype and rhetoric, successful marketing is pretty straightforward — clearly communicate the value your offer provides to people who will pay you for that value.

Easier said than done, of course.

How do you determine what customers want and the best way to deliver it to them?

Well, there are many ways to learn from customers, such as focus groups, surveys and social listening.

While there is value in asking people what they want, there is also a major challenge in it.

According to research from Dr. Noah J. Goldstein, Associate Professor of Management and Organizations, UCLA Anderson School of Management, “People’s ability to understand the factors that affect their behavior is surprisingly poor.”

Or, as Malcolm Gladwell more glibly puts it when referring to coffee choices, “The mind knows not what the tongue wants.”

This is not to say that opinion-based customer preference research is bad. It can be helpful. However, it should be the beginning of your quest, not the end.

 

… by seeing what they actually do

You can use what you learn from opinion-based research to create a hypothesis about what customers want, and then run an experiment to see how they actually behave in real-world customer interactions with your product, marketing messages and website.

The technique that powers this kind of research is often known as A/B testing, split testing, landing page optimization or website optimization. If you are testing more than one thing at a time, it may also be referred to as multivariate testing.

To offer a simple example, you might assume that customers buy your product because it tastes great and because it’s less filling. Keeping these two assumptions in mind, you could create two landing pages: one with a headline that promotes the taste (treatment A) and another with a headline that promotes how filling it isn’t (treatment B). You then send half the traffic that visits that URL to each version and see which performs better.
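The 50/50 split itself can be done deterministically, for example by hashing a visitor ID, so a returning visitor always sees the same treatment. This is a minimal sketch of the idea, not any particular testing tool’s implementation:

```python
import hashlib

def assign_treatment(visitor_id: str) -> str:
    """Deterministically split traffic 50/50 between treatments A and B.

    Hashing the visitor ID keeps the assignment stable, so the same
    person sees the same version on every visit.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Hashing rather than random assignment keeps the experience consistent across visits and makes the split reproducible when you analyze the results later.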

Here is a simple visual that Joey Taravella, Content Writer, MECLABS, created to illustrate this concept:

 

That’s just one test. To really learn about your customers, you must continue the process and create a testing-optimization cycle in your organization — continue to run A/B tests, record the findings, learn from them, create more hypotheses and test again based on these hypotheses.

This is true marketing experimentation, and it helps you build your theory of the customer.

 

Try your hand at A/B testing for a chance to win

Now that you have a basic understanding of marketing experimentation (there is also more information in the “You might also like” section of this blog post that you may find helpful), let’s engage in a real example to help drive home these lessons in a way you can apply to your own marketing challenges.

To help you take your marketing to the next level, The Moz Blog and MarketingExperiments Blog have joined forces to run a unique marketing experimentation contest.

In this blog post, we’re presenting you with a real challenge from a real organization and asking you to write a subject line that we’ll test with real customers. It’s simple; just leave your subject line as a comment in this blog post.

We’re going to pick three subject lines from The Moz Blog and three from the MarketingExperiments Blog and run a test with this organization’s customers.

Whoever writes the best performing subject line will win a stay at the ARIA Resort in Las Vegas as well as a two-day ticket to MarketingSherpa Email Summit 2015 to help them gain lessons to further improve their marketing.

Sound good? OK, let’s dive in and tell you about your client:

Read more…

4 Threats that Make Email Testing Dangerous and How a Major Retailer Overcame Them

October 2nd, 2014

To test emails, you just send out two versions of the same email. The one with the most opens is the best one, right?

Wrong.

“There are way too many validity threats that can affect outcomes,” explained Matthew Hertzman, Senior Research Manager, MECLABS.

A validity threat is anything that can cause researchers to draw a wrong conclusion. Conducting marketing tests without taking them into account can easily result in costly marketing mistakes.

In fact, it’s far more dangerous than not testing at all.

“Those who neglect to test know the risk they’re taking and market their changes cautiously and with healthy trepidation,” explains Flint McGlaughlin, Managing Director and CEO, MECLABS, in his Online Testing Course. “Those who conduct invalid tests are blind to the risk they take and make their changes boldly and with an unhealthy sense of confidence.”

These are the validity threats that are most likely to impact marketing tests:

  • Instrumentation effects — The effect on a test variable caused by an external variable, which is associated with a change in the measurement instrument. In essence, how your software platform can skew results.
    • An example: 10,000 emails don’t get delivered because of a server malfunction.
  • History effects — The effect on a test variable made by an extraneous variable associated with the passing of time. In essence, how an event can affect test outcomes.
    • An example: There’s unexpected publicity around the product at the exact time you’re running the test.
  • Selection effects — An effect on a test variable by extraneous variables associated with the different types of subjects not being evenly distributed between treatments. In essence, there’s a fresh source of traffic that skews results.
    • An example: Another division runs a pay-per-click ad that directs traffic to your email’s landing page at the same time you’re running your test.
  • Sampling distortion effects — Failure to collect a sufficient sample size. Not enough people have participated in the test to provide a valid result. In essence, the more data you collect, the better.
    • An example: Determining that a test is valid based on 100 responses when you have a list with 100,000 contacts.
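The sampling distortion threat in particular lends itself to a quick arithmetic check. The sketch below uses a standard two-proportion z-test (a statistical tool, not something from the article itself) to show why the 100-response example above is rarely decisive, while the same conversion rates at a larger sample are; the numbers are made up for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference in conversion rates
    between treatments A and B larger than chance would explain?

    |z| > 1.96 roughly corresponds to 95% confidence (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 12% vs. 9% conversion on 100 responses each: looks like a winner,
# but |z| < 1.96, so the difference could easily be noise.
z_small = two_proportion_z(12, 100, 9, 100)

# The same rates on 10,000 responses each: |z| > 1.96, a real difference.
z_large = two_proportion_z(1200, 10000, 900, 10000)
```

The same observed lift can be meaningless at one sample size and conclusive at another, which is why declaring a winner from 100 responses on a 100,000-contact list is a sampling distortion.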

Ecommerce: 3 landing page elements to help increase product emphasis

July 14th, 2014

The elements on a product page are often among the most underutilized tools a marketer has at their disposal. I say this because, let’s be honest, I’d wager few folks think of the design elements on a product page with a “tool mindset.”

But in some respects, that’s exactly what they are, and ultimately, that’s how you will determine the kind of customer experience you build in ecommerce.

In this MarketingExperiments Blog post, I wanted to share three elements you can tweak to help emphasize important products and maybe even increase your revenue along the way.

 

Element #1. Size 


 

Here’s an excellent example of how resizing a product image can help you place emphasis on it.

In the control, there were three products on the right sidebar and they were all equally weighted – that is a problem.

Nothing really stood out, which made drawing a clear conclusion for customers a little difficult.

In the treatment, instead of having three separate products on the page, the marketers hypothesized that a single product with a dropdown selection for a computer operating system would increase conversion.

Their hypothesis was right – the results from the tests included a 24% increase in revenue.

 

Element #2. Color 


 

Here is another example of using design elements, this time in an email, and it is worth paying close attention to because products are not trapped on storefront pages.

That perception is far from reality.

According to the MarketingSherpa Ecommerce Benchmark Study (download a complimentary copy at that link), email is one of the biggest drivers of ecommerce traffic.

In the treatment, the number of products was reduced, and bright red copy was used for supporting emphasis. I’m not fluent in Italian, but in any language, that is a good thing.

As you can see, color emphasis and copy now drive this email. From the changes in the treatment, I can intuitively understand the desired outcome:

  • I can order something at a great price
  • I get something free (gratis) as a thank-you gift
  • It only takes three easy steps to order

The treatment delivered a 24% increase in revenue, showing that the right changes can have a powerful impact.

Read more…

Lead Generation: Capturing more leads with clear value prop communication

October 3rd, 2013

According to the MarketingSherpa 2012 Lead Generation Benchmark Report, 51% of marketers surveyed indicated the most effective platform for testing their value proposition was through email marketing campaigns.

This is no secret to savvy marketers. Austin McCraw, Senior Editorial Analyst, MECLABS, also discussed how to discover the essence of your value prop through email at Email Summit 2013.

Jon Ciampi, Vice President of Marketing, CRC Health, did just that and revealed his strategy at Lead Gen Summit 2013, happening right now in San Francisco.

In his session, “Lead Capture: How a healthcare company increased demand for services 300%,” Jon shared with the Summit audience how understanding customer motivations, driving traffic, and clearly communicating the value proposition all helped his company capture a higher quality of leads.

At CRC Health, Jon developed nine value propositions, and broke that list down into problem- and solution-focused messages. He combined the company’s in-house list with a purchased list consisting of psychiatrists and therapists who refer their patients to CRC Health. Then, the team crafted email subject lines reflecting the different value propositions to test where the customer was in regard to researching the problem, or looking for a solution.

Through testing, Jon discovered that the top-performing subject line achieved a 14.49% clickthrough rate, and it used problem-focused rather than solution-focused messaging. For CRC Health’s customers, searching for a rehabilitation center is most likely a first-time experience. Therefore, understanding that these prospects are researching options related to their problem, rather than looking for an immediate solution, was extremely important to targeting their needs.

 

“What we found is with rehab, everyone is focused on the problem. With our in-house list, patient-focused messages were more motivating and increased clickthrough rates,” Jon said.

Even though he made a breakthrough testing value propositions through email, he also found that one size does not fit all, particularly with his audience, and even more specifically with a purchased list.

For psychiatrists opening CRC Health’s emails, the top-performing messages for opens and CTR were science-based. The subject lines and topics that most resonated with this segment were “improving addiction treatment with science and research,” “outdated addiction treatments fail patients,” and “CRC Health as the strongest clinical supervision in the nation.”

However, the audience that preferred more relationship-based messages was therapists. Messages like “Treatment fails when therapists & clients aren’t aligned,” and “Most rehabs can’t provide effective clinical supervision” were the top performers for this segment of CRC Health’s audience.

“Overall, self-serving messages performed far worse than patient-focused messages. Patient-oriented problem statements motivated them as well,” Jon said.

Through value prop testing with his audience via email messaging, Jon learned much more about his audience and their motivations.

As an exciting result of value proposition testing, he discovered a 3x to 4x increase in demand for services. According to Jon, when testing began, both inquiries and admissions increased.

“One of the top things I learned is to look at the funnel. What are the motivations of your customers? … Also, understand their language. Different buyers with different perspectives will affect how your messages are interpreted,” Jon concluded.

Read more…

Email Marketing: What you can learn from an 80% decrease in clickthrough rate

February 13th, 2013

On the MarketingExperiments blog, we often share tests we conduct with Research Partners. Today’s post was run on our own marketing campaign.

The team tested a promotional email for the MarketingSherpa 2012 Mobile Marketing Benchmark Report.

 

CONTROL

Subject Line: [Just Released] New Mobile Marketing Benchmark Report 


 

The control featured general copy about using mobile in your 2013 marketing strategy and what tactics are working for mobile.

After evaluating the control, the team hypothesized that the email lacked information about the insights prospective customers would gain from reading the benchmark report.

From that analysis, the team crafted …

Read more…