Archive for the ‘Analytics & Testing’ Category

Finding Your Ideal Email Send Time to Maximize Relevancy

July 2nd, 2015

We get this question probably more than any other: when is the best time to send an email to customers?

The answer is that it is unique to your organization. Many factors can influence your perfect send time, including your customers' motivations.

Testing for the correct send time for your customers can increase the relevancy of your email campaigns and, in turn, your overall clickthrough rate. To discover your unique send time, there are two simple tests you can perform that will focus your email campaigns on the best day, at the best time.

The first test you should run is a day of the week test.


Newsletter Day of the Week Test

Earlier this year we helped a tourism organization find out when to send its monthly promotional newsletter. To do this, we used Monday as the control and split the newsletter sends evenly across each day of the week.

Based on the primary KPI, Opened/Delivered (open rates), the test had favorable results, and at least one treatment outperformed the control significantly.

However, the secondary KPI, Clicked/Delivered (clickthrough rates), had inconclusive results.


Primary metric — open rate: Monday was used as the control, and Saturday was the highest-performing day, with a 5.5% relative difference at a 99% Level of Confidence (LOC).

Tuesday and Wednesday also outperformed the control, with a 3.8% relative difference at a 96% LOC. Tuesday, Wednesday and Saturday had statistically significantly higher open rates than the other days of the week.

The remaining days — Thursday, Friday and Sunday — underperformed, though at low levels of confidence. This indicates there was little difference between those days' open rates and Monday's.


Secondary metric — click rate: Tuesday and Saturday had the highest click rates compared to Monday, though none of the treatments reached a statistically significant level of confidence.

Tuesday reached a 93% LOC compared to Monday. We cannot say conclusively that this day outperforms, though we may infer that it has the potential to do so. Saturday performed similarly, at only an 88% LOC.
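For readers curious where these "level of confidence" figures come from, they are the result of comparing two proportions (opens divided by delivered). Below is a minimal sketch of one common way to compute such a figure, using a pooled two-proportion z-test. The function name and all counts are hypothetical, chosen only for illustration — none of these numbers are from the study.

```python
from math import sqrt, erf

def level_of_confidence(opens_a, sent_a, opens_b, sent_b):
    """Two-tailed level of confidence (in %) that two open rates differ,
    using a pooled two-proportion z-test."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = abs(p_a - p_b) / se
    # Standard normal CDF via erf; confidence = 1 - two-tailed p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return (1 - p_value) * 100

# Hypothetical counts for illustration only (not from the study)
print(round(level_of_confidence(1100, 22000, 1180, 22000), 1))
```

The closer the confidence is to 100%, the less likely the observed difference is due to chance; most testing programs require 95% or higher before declaring a winner.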

Testing for open rate, the main metric the team measured, validated that Saturday outperformed Monday by a 5.5% relative difference, with Tuesday and Wednesday also outperforming at a 96% LOC. Test results showed higher open rates on those particular days.

The main discovery for the marketers here was that emails that were sent on Tuesday, Wednesday and Saturday performed best. We decided to get more granular and test when, exactly, during those days customers would be the most engaged.


Time of Day Test

Our hypothesis was that the monthly newsletter’s open rate and clickthrough rate may be suffering due to the time of day at which it is being sent.

By sending the monthly newsletter at a time that presents less friction (due to work or personal distractions) for the ideal visitor, we expected an increase in the newsletter’s total clickthrough rate.

The second test sent a total of 177,916 emails at four different times of day — 8:00 a.m., 11:00 a.m., 3:00 p.m. and 7:00 p.m. — but only on Tuesday and Wednesday.

The testing team decided not to test on Saturday due to organizational constraints. The first email was sent on Tuesday at 8:00 a.m. and was designed to be the control; of the 21,746 emails sent, it had an open rate of 4.83%. Traffic was split roughly evenly across the control and the seven treatments during the test period, each receiving an average of 12.75% of the sends.



We discovered that Wednesday evenings at 7 p.m. had the highest clickthrough and open rates — a 16% relative difference compared to the control.

Tuesday and Wednesday at 11 a.m. and 3 p.m. had high click rates. Although Tuesday and Wednesday at 7 p.m. had the next-highest open rates, and Wednesday at 7 p.m. had the single highest click rate, the 11 a.m. and 3 p.m. sends on Tuesday and Wednesday delivered the most consistently high click and open rates.
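The "relative difference" figures quoted throughout are computed against the control's rate. Here is a minimal sketch; the 4.83% control open rate is from the article, while the 5.60% treatment rate is an assumed value used only for illustration.

```python
def relative_difference(treatment_rate, control_rate):
    """Relative lift (or drop) of a treatment vs. the control, in percent."""
    return (treatment_rate - control_rate) / control_rate * 100

# 4.83% is the Tuesday 8 a.m. control open rate from the article;
# 5.60% is a hypothetical treatment open rate for illustration only.
print(round(relative_difference(5.60, 4.83), 1))  # ~16% relative lift
```

Note that a relative difference describes the lift over the control, not the change in percentage points — a move from 4.83% to 5.60% is less than one point absolute but roughly a 16% relative lift.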


As we can see from what this company learned about its best send times, testing is a critical factor in helping your emails achieve the highest relevancy for your customers.

Email marketing has become a battle for time. Even the 20 seconds it takes to scan an email is precious.

Be conscious of the time your customers are choosing to spend with you by delivering emails right when they want to read them. Every organization’s send time will be different. With these two simple tests, you can find out your ideal send time.


You might also like

Email Marketing Timing: When is the optimal time to send your next marketing email? [More from the blogs]

Timing and Email Marketing: Sunday generated 23% higher clickthrough than Tuesday in test [More from the blogs]

Discover the Best Time to Send Email: 4 Test Ideas [From MarketingSherpa]

Infographic: Email open rates by time of day [From the MarketingSherpa blog]


Website Analytics: How to use data to determine where to test

May 28th, 2015

At MarketingExperiments, we use patented heuristics to evaluate websites, emails and other digital mediums. 

Often people think that a heuristic evaluation is a purely qualitative approach to a problem. This can be true, but when you combine quantitative analytics with qualitative knowledge, you increase your power to make meaningful change.

This post will show relevant metrics for three of these elements that any marketer — from beginner to advanced — can use to discover opportunities for improvement.


Step #1. Look at the qualitative elements of your website

Often people just ask for data dumps. To make matters worse, they want them in a very short time. On top of that, most data scientists stick to what they are comfortable with: numbers.

To add context and save time, you must evaluate the site to see where you should focus your data analysis.

Put yourself in the customer’s mindset and go to your site. If you own creative or design, try to remove those biases as best as possible. What makes sense to you or feels like the right amount of information may be completely overwhelming to a customer who isn’t familiar with your product or industry.


Look for things that are broken, functions that are clunky, images that don’t make sense or add value, and friction in completing the conversion. You must look objectively at all the pages in the conversion path and be familiar enough with them to make sense of the data you pull.

Then pull data to illuminate the issues you spotted and give validity to your theory. Key heuristic elements and data together help prove the problem.

Read more…


Understanding Your Customer’s Story: How one company increased conversion 104% by identifying motivation

May 21st, 2015

Every time someone wants to buy something from your brand, there’s a story that explains why they want what you’re selling. Identifying that story is key to making the sale.

How do we know this is true? Because when we know someone’s story, we know their motivation. If someone is highly motivated to get a solution, they’ll put up with almost anything — a poorly written email, a slow website or even a convoluted sales flow — to get it.

Consider this patented heuristic:


This isn’t a math formula. It’s a guide that MarketingExperiments and its parent company, MECLABS Institute, derived from analyzing tens of thousands of sales flows. This heuristic reflects what it takes to convert (C) a prospect into a customer and shows how the five variables — motivation (m), value (v), incentive (i), friction (f) and anxiety (a) — relate to each other. The numbers next to the variables identify how powerfully they affect conversion. Note that motivation is the most heavily weighted variable.
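For readers curious about the formula itself (the original post showed it as an image), the heuristic as MECLABS publishes it is commonly written as:

C = 4m + 3v + 2(i − f) − 2a

The coefficients express each variable’s relative weight: motivation (4) weighs most heavily, while friction and anxiety subtract from the probability of conversion.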

If formulas make your eyes cross, all you need to know is this: if a customer is highly motivated, none of the other elements (such as friction, anxiety or a poorly communicated value proposition) can stop them from moving forward in the sales process.

The most recent Web clinic looked at clues that revealed customers’ stories and, consequently, their motivation. Watch it and, within 30 minutes, you’ll get critical information that you can use immediately to drive an impressive lift in conversions.

Consider the experience of the second company outlined during the Web clinic: a Canadian window manufacturer whose owner was a student of MarketingExperiments. He called on MECLABS to help him increase conversions on his online site.


The Control

  Read more…


The Power of a Specific Offer to a Specific Prospect

May 7th, 2015

Specificity converts. In marketing, there should be no such thing as a general message. The marketer communicates with an aim. This aim should dictate everything else we say. This aim should influence, even constrain, every word we say.

— Flint McGlaughlin, Managing Director and CEO, MECLABS Institute

Specificity converts. A specific offer to a specific person will outperform a general offer to a general person.

This concept relates to a recent email test we ran with our MarketingSherpa audience and ran again (with a slight twist) with our MarketingExperiments audience.

First, in case you’re not familiar with MarketingSherpa, allow me to briefly explain our sister company.

MarketingSherpa’s content is geared toward exploring general marketing principles. This is also where companies and marketers can share specific marketing stories, such as Mellow Mushroom’s social media strategy and Red Bull’s content marketing advice.

MarketingExperiments content, by contrast, delves more deeply into the tactics of marketing strategy. MarketingExperiments is about specific tests that readers can apply to their own marketing.

Now that you understand the difference in content related to the tested audiences, let’s get into the test itself.


The test

We tested an email invitation for a recent follow-up Web clinic MarketingSherpa hosted. The clinic’s objective was to examine the results of a live optimized email test, which the MarketingSherpa audience ran at Email Summit 2015 alongside Flint McGlaughlin.

The test consisted of two treatments:

  1. Treatment A focused on the Email Summit follow-up test, only mentioning live optimization from the MECLABS team.
  2. Treatment B switched the emphasis by focusing on the live optimization from the MECLABS team, only mentioning the Email Summit follow-up test.

In essence, both emails were invitations to the same Web clinic and presented the same two offers, just with different emphases.


Treatment A: Email Summit follow-up

Subject line: Does peer review work? See the results of the audience optimized email from the Email Summit 2015

Preheader: Plus live email optimization from the MECLABS research team. 

Read more…


Here’s Why Most A/B Testing is Boring and Not Really Worth the Trouble

April 6th, 2015

Do a quick Google search on “things to a/b test on a website,” scan the results for a moment, then come back and read the rest of this article.

Most of you reading this are marketers, so you know I’m taking a big risk by telling you to go do something else before you read my article.

In fact if you’re reading this now, you’re probably one of the very few who made it back from that incredibly distracting activity I had you do. Thank you. You are exactly the person I want to be reading this. The others can go on their merry way. They are not the ones who need to hear this.

I had you do that search because the Internet is full of people telling you to test things on your website such as color, button size, layouts, forms, etc. I wanted you to get an idea of what’s out there.

Now, I want you to understand why almost everyone writing those articles is wrong

… or at the very least, missing the point.

Please don’t view this as me putting down the people who wrote those articles. I know a few of them personally, and I highly respect the work they are doing. This is not about whether their work is good or bad.

I’ve personally written many articles exactly like the ones they’re writing. In fact, they have one up on me because at least their articles are ranking in Google for popular search terms.

The reason they are missing the point is that most of those articles are focused on the elements of a page rather than the serving of a customer.

I get why they do it.

Webpages are far easier to understand than people. Webpages are a collection of 0s and 1s. People are a collection of who knows what.

And most of you, readers, are looking for webpage fixes — not a deeper, fuller way to serve your customer.

There is nothing necessarily wrong with that; we naturally focus on our own self-interest, and that isn’t wrong in itself.

What is wrong is the method we use to achieve those goals. I don’t mean morally wrong. I mean practically wrong.


Our objective should always be: make as much money as possible.

MECLABS Institute has found, after more than 15 years of research, that the best method for achieving this objective is to spend as much money as possible on serving your customer.

Until we can view every A/B test we run as an opportunity to better serve our customers, we will just be running (ultimately) ineffective tests on page elements.

It doesn’t really matter in the long run which color, layout or page element is going to perform well.

The Internet is constantly changing. Design trends are always going to influence how we look at webpages and their elements. What matters for marketers in the long run is how well we understand and, consequently, how well we can serve our customers.

Flint McGlaughlin, Managing Director and CEO, MECLABS, calls this understanding of our customers “customer wisdom.”

This is also why he often says, “The goal of a test is not to get a lift, but rather to get a learning.”

However, it’s one thing to hear this, another to really understand what it means.

It really means we want to conduct research, not run a test.

We want to learn a tangible lesson about our customer so that we can apply it to other areas of our marketing and achieve a maximum return on the amount of time and energy we spend on testing.

Let me show you what I mean with a real-world example. Here’s what happens when you just run an A/B test that is focused on a page element. Let’s take color for instance.

You have two treatments. The only thing changed is the background color. 


You also have a result. In this case, the result was a 19.5% increase in clickthrough at a 92% level of confidence. But here’s where things get tricky.
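A 92% level of confidence falls short of the 95% bar most testing programs require before declaring a winner, which is part of why the result is tricky to act on. A minimal sketch of that decision rule (the function name and threshold default are my own choices):

```python
def is_significant(level_of_confidence_pct, threshold_pct=95.0):
    """True when a test's level of confidence clears the chosen threshold."""
    return level_of_confidence_pct >= threshold_pct

# The color test's 92% LOC misses the common 95% bar, so its 19.5% lift
# should be read as suggestive rather than validated.
print(is_significant(92.0))  # prints False
```

Even when a result does clear the threshold, the point of this article stands: a validated lift on a page element still teaches you nothing about the customer unless the test was designed to.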

Read more…


Digital Analytics: How to use data to tell your marketing story

March 12th, 2015

When it comes to being a data-driven marketing team, there is not as much opposition between content and data as once thought.

Two central themes that highlight this idea came out of the Opening Session of The Adobe Summit — The Digital Marketing Conference. They are:

  • Use data correctly to support a story
  • Ensure the story you’re telling can be relayed to a wider audience

Marketers need to quit treating their data analysts as number-crunching minions and start seeing them as contributors with a vital perspective of the greater customer story.

Nate Silver, Founder and Editor in Chief of FiveThirtyEight, spoke about how useless data can be if you can’t communicate it to a wider audience. The practice of collecting, analyzing and interpreting data can be very costly, and marketers need to maximize ROI by making sure they tell the correct story and that it can be spread across their organization.

  Read more…
