Archive

Posts Tagged ‘email optimization’

Email Optimization: Testing best time of day and day of week for email interaction

September 22nd, 2014 6 comments

When do you check your personal email? Do you let it build up throughout the work week and go through it during the weekends? Do you check it on Monday when you’re also sorting through your work email? Or do you check it while you’re at lunch or on a quick, but much-needed, break from work?

In today’s MarketingExperiments Blog post, we’re going to explore which times of the day and days of the week people are most likely to interact with their emails — two questions of key interest for any email campaign.

 

Testing the time of day when people interact with email

In email testing, we focus so much on the content and landing page of the email, but that hard work won’t pay off if email recipients don’t open or click through the email. We wanted to get a better understanding of when people interact with emails to determine the best time of day and day of the week to send promotional emails.

First, we began testing what time of day people are most likely to open and interact with emails.

Emails were being sent out on Mondays and Wednesdays at 7 a.m. EST. We hypothesized that by sending emails at various times throughout the day, we would learn when recipients are most likely to open and click through their emails.

In an A/B split test, we sent a promotional email at 7 a.m., 3 a.m., 3 p.m. and 7 p.m. EST on a Monday. We wanted to isolate the general times of day people may be interacting with their email.

3 a.m. was tested to determine if people were more likely to interact with their emails as soon as they woke up and before they started their day, while 3 p.m. would tell us if people were checking their emails in the afternoons.

Lastly, the 7 p.m. results would show whether recipients were more likely to check and interact with their email in the evenings or later at night.

By sending emails at 7 p.m. EST instead of 7 a.m. EST, we saw a 12% lift in open rate.
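If you want to run the same comparison on your own sends, the relative lift is just the difference between the two open rates divided by the control’s open rate. Here’s a minimal sketch in Python; the send sizes and open counts are placeholders chosen only to illustrate a 12% relative lift, not the actual figures from this experiment.

  # Minimal sketch: computing the relative open-rate lift between two send-time cells.
  # The counts below are placeholders, not the data from this experiment.

  def open_rate(opens, delivered):
      """Unique opens divided by delivered emails."""
      return opens / delivered if delivered else 0.0

  control_rate = open_rate(opens=1800, delivered=20000)    # 7 a.m. EST send
  treatment_rate = open_rate(opens=2016, delivered=20000)  # 7 p.m. EST send

  relative_lift = (treatment_rate - control_rate) / control_rate
  print("Relative lift in open rate: {:.0%}".format(relative_lift))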

  

Read more…

Online Testing: 5 steps to launching tests and being your own teacher

April 10th, 2014 No comments

Testing is the marketer’s ultimate tool. It allows us to not just guess what coulda, woulda, shoulda worked, but to know what actually works. But more than that, it gives us the power to choose what we want to know about our customers.

“As a tester, you get to be your own teacher, if you will, and pick tests that make you want to learn. And structure tests that give you the knowledge you’re trying to gain,” said Benjamin Filip, Senior Manager of Data Sciences, MECLABS.

So what steps do we take if we want to be our own teacher?

While conducting interviews about the live test run at MarketingSherpa Email Summit 2014, I recently had the chance to discuss testing processes with Ben, as well as with Lauren Pitchford, Optimization Manager, and Steve Beger, Senior Development Manager, both also of MECLABS. The three of them worked together with live test sponsor BlueHornet to plan, design and execute the A/B split test they validated in less than 24 hours.

Read on to learn what they shared about the testing process and what marketers can take away from this email live test. We’ll break down each step of the live test and help you apply it to your own testing efforts.

 

Step #1. Uncover gaps in customer insights and behavior

As Austin McCraw, Senior Director of Content Production, MECLABS, said at Email Summit, “We all have gaps in our customer theory. Which gap do we want to fill? What do we want to learn about our customer?”

What do you wish you knew about your customers? Do they prefer letter-style emails or design-heavy promotional emails? Do they prefer a certain day of the week to receive emails? Or time of day? Does one valuable incentive incite more engagement than three smaller incentives of the same combined value?

Think about what you know about your customers, and then think about what knowledge could help you better market to them and their needs and wants.

 

Step #2. Craft possible research questions and hypotheses

When forming research questions and hypotheses, Ben said, “You have to have some background info. A hypothesis is an educated guess, it’s not just completely out of the blue.”

Take a look at your past data to interpret what customers are doing in your emails or on your webpages.

Lauren wrote a great post on what makes a good hypothesis, so I won’t dive too deeply here. Basically, your hypothesis needs three parts:

  • Presumed problem
  • Proposed solution
  • Anticipated result
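As an illustration only (the structure below is our own shorthand, not a MECLABS template), here is one simple way to keep those three parts explicit when you write up a test plan:

  # Illustrative sketch: recording the three parts of a test hypothesis.
  # The field names and example wording are hypothetical, not a MECLABS framework.

  from dataclasses import dataclass

  @dataclass
  class TestHypothesis:
      presumed_problem: str
      proposed_solution: str
      anticipated_result: str

  hypothesis = TestHypothesis(
      presumed_problem="Design-heavy promotional emails bury the value proposition",
      proposed_solution="Send a letter-style email with a single call-to-action",
      anticipated_result="Higher clickthrough rate for the letter-style treatment",
  )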

 

Step #3. Brainstorm ways to answer those questions

While brainstorming will start with you and your group, don’t stop there. At MECLABS, we use peer review sessions (PRS) to receive feedback on anything from test ideas and wireframes to value proposition development and results analysis.

“As a scientist or a tester, you have a tendency to put blinders on and you test similar things or the same things over and over. You don’t see problems,” Ben said.

Having potential problems pointed out is certainly not what any marketer wants to hear, but it’s not a reason to skip this part of the process.

“That’s why some people don’t like to do PRS, but it’s better to find out earlier than to present it to [decision-makers] who stare at you blinking, thinking, ‘What?’” Lauren explained.

However, peer review is about more than discovering problems; it’s also about discovering great ideas you might otherwise miss.

“It’s very easy for us to fall into our own ideas. One thing for testers, there is the risk of thinking that something that is so important to you is the most important thing. It might bother you that this font is hard to read, but I don’t read anyway because I’m a math guy, so I just want to see the pretty pictures. So I’m going to sit there and optimize pictures all day long. That’s going to be my great idea. So unless you listen to other people, you’re not going to get all the great ideas,” Ben said.

Read more…

Email Marketing: 6 bad habits to avoid when testing emails

November 11th, 2013 No comments

In my experiences helping our Research Partners with email campaigns, I’ve discovered that when it comes to testing, it’s not a one-size-fits-all activity.

Your email campaigns are fundamentally different from landing pages or any other element of your marketing mix and sales funnel.

Consequently, your approach to testing them will also be different.

They have different goals, elements, best practices and bad habits to avoid.

In today’s MarketingExperiments Blog post, I wanted to share six common bad habits to avoid when testing email campaigns.

 

Bad Habit #1. Not knowing how your email list is being split

This is a common mistake I see often, and it’s one of the most avoidable: marketing teams will test with only a limited understanding of how their email list is being split into test cells.

Or worse, they don’t know how their email platform splits the list at all. This is troublesome because it can easily cause sampling errors that will skew your results.

So first, check the platform you’re using to learn how the list splitting algorithm works.

If there isn’t any specific information about how the email platform allocates test cells, consider testing a dual control send – the same email delivered to two identical cells – to gain a better understanding of how much your data may vary.

Also, try to make sure recipients are allocated fairly across each test cell, especially if your database contains information indicating that recipients have varying degrees of motivation.

The reason for this is that, unlike landing page testing, where traffic comes from multiple sources and is split at random, an email list is a finite traffic source.

What if I’m splitting lists myself, you ask?

If that’s the case, try to do so as randomly as possible.
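Here’s a minimal sketch of one way to do that split yourself. Shuffling before dealing recipients into cells helps avoid sampling errors from lists that are ordered by signup date or engagement; the addresses and function are hypothetical, not tied to any particular email platform.

  # Minimal sketch: randomly splitting an email list into equal test cells.
  # Hypothetical example; adapt to however your platform exports its list.

  import random

  def split_into_cells(recipients, n_cells, seed=42):
      """Shuffle the list, then deal recipients round-robin into n_cells cells."""
      shuffled = list(recipients)
      random.Random(seed).shuffle(shuffled)
      return [shuffled[i::n_cells] for i in range(n_cells)]

  recipients = ["a@example.com", "b@example.com", "c@example.com",
                "d@example.com", "e@example.com", "f@example.com"]
  control, treatment = split_into_cells(recipients, n_cells=2)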

 

Bad Habit #2. Drawing conclusions after only one test

Judging a test by a single email drop is a mistake, even if your testing tool says your results have reached statistical significance.

I recommend testing your treatments over multiple email drops to ensure you are seeing some form of consistency in your results before making a business decision.

Also, one common question I get is which method of analysis should be used to interpret your results.

In this case, I recommend recording the data as separate points in time instead of lumping all of the data together.

The reason for this is that the separate points will give you a better picture of behavior across sends, and it’s likely more accurate because this approach also takes variability over time into account.
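As a quick sketch of what that looks like in practice, you might keep each drop as its own record and check that the treatment wins consistently before calling the test. The drop dates and rates below are hypothetical.

  # Sketch: treating each email drop as a separate data point instead of
  # pooling everything together. All numbers here are placeholders.

  drops = [
      {"date": "2013-10-07", "control_ctr": 0.021, "treatment_ctr": 0.026},
      {"date": "2013-10-14", "control_ctr": 0.019, "treatment_ctr": 0.024},
      {"date": "2013-10-21", "control_ctr": 0.023, "treatment_ctr": 0.025},
  ]

  # Look for consistency across drops, not just a pooled total.
  treatment_wins = sum(d["treatment_ctr"] > d["control_ctr"] for d in drops)
  print("Treatment won {} of {} drops".format(treatment_wins, len(drops)))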

 

Bad Habit #3. Random send times

The results of an email drop represent a single point in time, unlike landing page testing, which has a continuous stream of traffic to its pages.

Consequently, if you are not consistent in the delivery of your email drops – time of day, day of week, etc. – this inconsistency will impact your ability to interpret results accurately.

Here’s why …

If you think about when you go through the emails in your own inbox, it’s likely you do so at random. So, the only way to account for that randomness is by sending emails on a consistent schedule.

Of course, you can adjust that send schedule to test your way into discovering the ideal time to send your customers an email, but keeping the frequency constant is key.

 

Bad Habit #4. Not having a clear-cut goal in your testing

This is another common mistake I see that’s avoidable – lacking a clear test hypothesis.

Email is one of the most rigid channels. The general conversion path for an email looks something like this:

  1. You send an email to your list
  2. The customer receives your email in their inbox (unless it gets caught in a spam filter)
  3. They identify the sender, skim the subject line and choose to open or delete the email
  4. If they choose to open the email, hopefully they engage the content
  5. If all goes to plan after engaging the content, they convert

But even with the path clearly laid out, you still can’t go anywhere without a sense of direction.

That’s why you want to make sure you have a good hypothesis that is clear and testable right from the start to help keep your testing efforts strategically focused.

 

Bad Habit #5. Inconsistent key performance indicators

Ultimately, the conversion rate (or revenue) of each test cell should be used to determine the winner. Whatever your goals, the point here is to make sure you are consistent as you evaluate the results.

Also, I would caution against judging test results solely on clickthrough or open rates, which tend to be the primary drivers in email tests. Secondary metrics can tell a very interesting story about customer behavior if you’re willing to look at the data from all angles.
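One simple way to stay consistent is to compute the same set of metrics for every cell in every test, with the primary KPI agreed on up front. The sketch below is illustrative, and the counts are placeholders.

  # Sketch: evaluating every test cell on one consistent set of metrics,
  # with conversion rate as the pre-agreed primary KPI. Placeholder numbers.

  def cell_metrics(delivered, opens, clicks, conversions, revenue):
      return {
          "open_rate": opens / delivered,
          "clickthrough_rate": clicks / delivered,
          "conversion_rate": conversions / delivered,  # primary KPI
          "revenue_per_email": revenue / delivered,
      }

  control = cell_metrics(delivered=10000, opens=900, clicks=210, conversions=42, revenue=4200)
  treatment = cell_metrics(delivered=10000, opens=870, clicks=260, conversions=55, revenue=5500)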

 

Bad Habit #6. Not setting a standard decay time

So, what is decay time exactly?

To keep things simple, decay time is really just a set window of time for an activity to take place after an email drop – an open, a click, etc.

If you are judging multiple drops, the data for each drop should follow a standard decay guideline that everyone on your team understands and agrees with. We generally suggest a week (seven days) as enough time to call the performance of a typical email drop.

One caveat worth mentioning is that there is no magic bullet with email decay time.

The goals and objectives for campaigns vary by industry, so there are no universal standards in place.

Your organization should come to a consensus about a standard decay time to judge campaign performance before the campaign gets underway.
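In practice, enforcing a standard decay time just means cutting off each drop’s activity at the agreed window before you compare drops. Here’s a minimal sketch, assuming a seven-day window and hypothetical timestamps:

  # Sketch: counting only the opens/clicks that fall inside a standard
  # decay window, so every drop is judged on the same amount of data.

  from datetime import datetime, timedelta

  DECAY_WINDOW = timedelta(days=7)  # whatever window your team agrees on

  def events_within_window(send_time, event_times, window=DECAY_WINDOW):
      cutoff = send_time + window
      return [t for t in event_times if send_time <= t <= cutoff]

  send = datetime(2013, 11, 11, 7, 0)
  opens = [send + timedelta(hours=3), send + timedelta(days=2), send + timedelta(days=9)]
  counted = events_within_window(send, opens)  # the day-9 open is excluded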

Read more…

Email Marketing: Promotional vs. letter-style test increases conversion 181%

October 14th, 2013 4 comments

At the heart of email marketing campaigns, it often seems as if a tug-of-war is being waged.

On one side, you have gaining attention as a tactic and on the other, you have using conversation.

But, which of these is truly effective?

Let’s take a look at how the MECLABS research team tested a promotional-style email design against a letter-style and what we can learn from the results.

Before we get started, here’s a quick review of the research notes for a little background on the experiment.

Background: A large international media company focused on increasing subscription rates.

Goal: To increase the number of conversions based on the value proposition conveyed through the email.

Primary Research Question: Which email will generate the highest conversion rate?

Approach: A/B multifactor split test

 

Control 

 

The research team noted that the control featured popular design principles intended to create balance and hierarchy on the page.

The promotional-style email also featured heavy use of images and graphics to catch the readers’ attention and multiple call-to-action buttons for increased points of entry.

 

Treatment

 

In the treatment, a letter-style email was designed to look and feel more like a personal letter. The design limited the use of graphics and images and featured a single call-to-action button.

 

Results

 

What you need to know

By limiting the amount of graphics and focusing on engaging the customer in a conversation, the treatment outperformed the control by 181%. To learn more about why the letter-style email beat the promotional-style design, you can watch the free on-demand MarketingExperiments Web clinic replay of “Are Letter-Style Emails Still Effective?”

Read more…

Email Marketing: Subject line test increases open rate by 10%

August 12th, 2013 7 comments

Every year, MarketingExperiments’ sister brand MarketingSherpa holds the MarketingSherpa Email Awards to recognize marketers who designed email campaigns that exceeded expectations.

In today’s MarketingExperiments Blog post, I wanted to share a simple subject line test from a previous gold medal winner you can use to aid your email marketing efforts.

 

Winning back hearts and minds one email at a time

Travelocity identified a segment of existing email subscribers who had not booked for over a year and wanted to win back that segment’s business.

The team worked with StrongMail to develop an email campaign strategy to generate engagement and drive conversions from the lapsed set of subscribers.

The StrongMail team started evaluating previous campaigns and testing offers.

One of the elements StrongMail used to test those offers was a subject line treatment offering a 10% discount incentive to the lapsed segment.

Here were the two subject lines:

Subject Line A: “Save an additional 10% for a limited time only.” (Shorter subject line with generic offer.)

Subject Line B: “As our valued customer, get an extra 10% off for a limited time only.” (Longer subject line with the “valued customer” message.)

 

Results

Subject line B outperformed subject line A by a solid 10%.

 

What’s also interesting is that when the 10% incentive was tested against a 15% discount in a second round of testing, the increased incentive did not yield a significant difference in open rates.
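If you want to check whether a difference in open rates like this is statistically meaningful on your own list, a two-proportion z-test is one common approach. The sketch below is illustrative; the counts are placeholders, not Travelocity’s data.

  # Sketch: a two-proportion z-test comparing open rates for two subject lines.
  # Placeholder counts; plug in your own sends and unique opens.

  from math import sqrt, erf

  def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
      p_a, p_b = opens_a / sent_a, opens_b / sent_b
      p_pool = (opens_a + opens_b) / (sent_a + sent_b)
      se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
      z = (p_b - p_a) / se
      p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
      return z, p_value

  z, p = two_proportion_z_test(opens_a=1100, sent_a=10000, opens_b=1210, sent_b=10000)
  print("z = {:.2f}, p = {:.3f}".format(z, p))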

 

What you need to know

A successful email marketing campaign requires more than identifying an unresponsive list. It also involves careful research into what has and hasn’t worked in the past, and testing new approaches to engage a slumbering list.

The Travelocity and StrongMail teams were able to re-engage a significant percentage of those lapsed customers to generate incremental revenue that would likely have been lost to competitors.

Read more…

Email Copywriting: Simplification, specificity, focus on customer generates 400% increase in CTR

March 4th, 2013 3 comments

At MarketingSherpa Email Summit 2013, Donna Krizik, Director of Client Communications, Crestwood Associates LLC, presented some impressive email copywriting tests. Let’s take a closer look at what she learned about email body copy today on the MarketingExperiments blog …

 

Background:  Crestwood Associates is a Microsoft Dynamics ERP and CRM reseller. The new release of Dynamics CRM touted 45 new features. Crestwood wanted both existing ERP and CRM customers, along with prospects, to sign up for an informative webinar, which would then get customers and prospects to schedule an upgrade or new deployment.

Audience: Marketing and Sales staff who either have CRM or are in the market for CRM

Objective:  Get customers to click on a link to register or learn more

Primary Research Question:  What email copy will maximize my clickthroughs?

Test Design:  Complete copy change. Donna’s team changed everything (just about).

 

CONTROL

 

Hi Jamie,

Microsoft Dynamics CRM helps you make informed
decisions. Here’s how:

  • Increase your sales opportunities by identifying your top customers
  • Reduce operational costs by optimizing
  • Improve automation and efficiency across your organization

Click here to learn more about how Microsoft Dynamics CRM can drive your sales by improving efficiency.

To learn the Top 10 Reasons for Choosing Dynamics CRM, read this document.

 

 
 
 
Register now for our next lunch & learn on Dynamics CRM. Join us Tuesday, November 20th at 11am.

Fact Sheet: How to Improve Marketing, Boost Sales, and Bolster your Customer Experience with Dynamics CRM. Download now.

Whitepaper: Top 10 Reasons for Choosing Microsoft Dynamics CRM
Download now.

 

After reviewing the above email copy, Donna identified a few elements that might be hindering conversion:

  • Overall, there was just too much going on.
  • The first line focused on the product, not the customer: “Microsoft Dynamics CRM helps you …”
  • There were five calls-to-action (two in the main copy, plus three in the sidebar).
  • The call-to-action language was generic and did not offer any value (“Click here” and “read this document”).
  • The language in the bullet points was “fuzzy.”

 

TREATMENT #1

 

Hi Jamie,

Are you struggling each month, trying to get a handle on your sales pipeline? Microsoft Dynamics CRM helps you see key information at a glance. With CRM, you can:

  • Quickly identify your top customers
  • Reduce costs by eliminating double entry
  • Improve automation and efficiency across your organization

Click here to learn more about how Microsoft Dynamics CRM can drive your sales by improving efficiency.

 

 
Register now for our next lunch & learn on Dynamics CRM. Join us Tuesday, November 20th at 11am.

Fact Sheet: How to Improve Marketing, Boost Sales, and Bolster your Customer Experience with Dynamics CRM. Download now.

Whitepaper: Top 10 Reasons for Choosing Microsoft Dynamics CRM
Download now.

 

In this treatment, Donna reworded the main body copy to start with a customer pain point instead of a product. (“Are you struggling each month, trying to get a handle on your sales pipeline?”)

She also removed two of the five calls-to-action (CTAs), removing the “Top 10 Reasons to Choose CRM” whitepaper CTA from both the main body copy and the right-hand sidebar copy.

 

TREATMENT #2
 

Hi Jamie,

Are you struggling each month, trying to get a handle on your sales pipeline? Microsoft Dynamics CRM helps you see key information at a glance. With CRM, you can:

  • Quickly identify your top customers
  • Reduce costs by eliminating double entry
  • Improve automation and efficiency across your organization
 

 
Register now for our next lunch & learn on Dynamics CRM. Join us Tuesday, November 20th at 11am.

Fact Sheet: How to Improve Marketing, Boost Sales, and Bolster your Customer Experience with Dynamics CRM. Download now.

 

For this treatment, the copy stayed the same as Treatment #1; however, Donna removed one more call-to-action (removing “click here to learn more about …” from the main body copy). This email had two calls-to-action, both in the right-hand sidebar.

She also changed the right-hand sidebar background color from dark purple to light green.

 

TREATMENT #3
 

Hi Jamie,

Are you struggling each month, trying to get a handle on your sales pipeline? Wish you could get a quick snapshot any time you want? You can. Here’s how:
Microsoft Dynamics CRM.

  • Set up a sales funnel dashboard in 3 minutes or less. [Really.]
  • Drill into any sales or marketing data for more info
  • Add as many dashboard sections as you want
  • Save to your CRM HomePage or share your dashboards with 1 click.
  • Get a look at Dynamics CRM in all its glory – attend our Expo – details in sidebar!
 

Get the LIVE CRM Experience

Register now for our Free Dynamics CRM Expo.

–See the product in action
–Take a test drive,
–Create your own dashboard,

…and get all of your questions answered at this free expo.

Tuesday, November 20th
9am – Noon
3025 Highland Pkwy
Downers Grove, IL
Preview the Agenda here. Don’t miss out, this event fills up quickly! Refreshments provided.

 
Again, Donna started with a pain point question her audience of salespeople might relate to, along with a solution. This time, however, Donna “picked ONE feature of CRM that would solve their problem and focused on it (a very sexy feature) and quantified some ease of use and functionality they can take advantage of,” instead of talking about CRM in general, high-level terms as in the previous two treatments, which used more general language like “with CRM you can quickly identify your top customers.”

Here are some other changes Donna made in Treatment #3:

  • Added a transition at the bottom of the main copy calling attention to the CTA on the sidebar
  • Had only one CTA in the sidebar, which focused on the event
  • Had a clear call-to-action that did not require much commitment from the audience for taking the next step – “Preview the Agenda here”
  • Added urgency

 

RESULTS

 

 

The simpler email (one call-to-action as opposed to five), which focused on the customer’s pain points instead of the product and suggested a specific solution, generated a 400% higher clickthrough rate than the control.

 

What you need to understand

Here are Donna’s key email body copy takeaways, based on this test:

  • Focus on your audience – speak directly to them and their pain points.
  • Focus on one key benefit, map it to the pain and solve it.
  • Eliminate multiple equally weighted CTAs. They are too confusing and muddy the ‘What do you want me to do?’ question.
  • Soften your ask (learn, preview, sample, see, view, tour).

Justin Bridegan, Senior Marketing Manager, MECLABS, moderated this session at Email Summit, so I asked him for his key takeaways from this test, as well.

“It’s very important in the first couple of sentences to really engage the audience. If you don’t engage them in the beginning, it doesn’t matter what the rest of the copy says. Pulling them in with pain points and benefits really makes a difference,” Justin said.

“Focus on your customers’ challenges and pain points first, before you get into your product. You have to pull them in, just like a good movie. On Netflix, I watch the first five minutes. If I’m bored, I pull it,” he added.

Aside from the opening copy, Justin also alluded to simplifying the email itself.

“In a movie, if too much is going on too soon, people will say, ‘Whoa, forget it.’ Every time she removed something she got a better result.”

“Marketers are under pressure to do too much with their emails right now. They keep trying to put more into it to get more out of it. That’s not the place, or the right time, to do that,” Justin advised marketers. “I’ve found that in my own copy. One call-to-action is better. This email is primarily for this one, specific goal. Then, let the landing page sell them.”

“It should all make sense together, and if it doesn’t, it doesn’t belong in the email,” Justin concluded.

Aside from the email being simpler and more focused on the customer’s pain point, the specificity of the solution likely also played a role in the higher clickthrough rate.

“Specificity converts,” said Flint McGlaughlin, Managing Director, MECLABS. “In marketing, there should be no such thing as a general message. The marketer communicates with an aim. This aim should dictate everything else we say. This aim should influence, even constrain, every word we say.”

 

Related Resources:

Optimization Summit 2013 in Boston, May 20-23, 2013

Email Copywriting: How a change in tone increased lead inquiry by 349%

Email Optimization: 4 optimization suggestions to test in your next send

Email Copywriting: Tips from 3 of your peers