Archive for the ‘Analytics & Testing’ Category

Website Optimization: How a B2B publishing company increased free trial sign-ups by 36.4%

July 6th, 2015

A popular method of acquiring new customers for online subscription models is the free trial. The hope is that by using a service or product, prospective customers can fully experience and appreciate the value you have to offer.

However, before the experience begins, you must first get them to see the value of the trial. After all, it can be a friction- and anxiety-filled process since many trials require credit card information at the time of sign-up.

Increasing the completion rate for the trial sign-up process was one of three steps the Euromoney Institutional Investor team took to revamp its strategy and increase conversions. The team accomplished this through online testing after seeing that 60% of traffic in the funnel did not complete the process.

At the MarketingSherpa and MarketingExperiments Web Optimization Summit 2014, Ben Eva, Global Head of Conversion Management, Euromoney Institutional Investor, shared three online tests his team used to learn about their customers and to increase that completion rate.

What is the best layout/design of the offer page?

Euromoney publishes over 200 online information services, which means a lot of potential for testing. Because each publication’s audience varies, the same type of test sometimes must be repeated to find the best result for each one.

Ben shared two tests that revolve around page design and how to best communicate the value of the publications and their free trials.

Read more…


Finding Your Ideal Email Send Time to Maximize Relevancy

July 2nd, 2015

We get this question probably more than any other: when is the best time to send an email to customers?

The answer is unique to your organization. Many factors, including the motivations of your customers, could influence your perfect send time.

Testing for the correct send time for your customers can increase the relevancy of your email campaigns and, in turn, your overall clickthrough rate. To discover your unique send time, there are two simple tests you can perform that will focus your email campaigns on the best day and the best time to send.

The first test you should run is a day-of-the-week test.

 

Newsletter Day of the Week Test

Earlier this year, we helped a tourism organization find out when to send its monthly promotional newsletter. To do this, we used Monday as the control and split the newsletter sends evenly across the days of the week.

Based on the primary KPI, Opened/Delivered (open rates), the test had favorable results, and at least one treatment outperformed the control significantly.

However, the secondary KPI, Clicked/Delivered (clickthrough rates), had inconclusive results.

 

Primary metric (open rate): Monday was used as the control, and Saturday was the highest-performing day, with a 5.5% relative difference at a 99% Level of Confidence (LOC).

Tuesday and Wednesday also outperformed the control, each with a 3.8% relative difference at a 96% LOC. Tuesday, Wednesday and Saturday had significantly higher open rates than the other days of the week.

The remaining days (Thursday, Friday and Sunday) underperformed, though at low Levels of Confidence, indicating that there was little difference between their open rates and Monday’s.
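
The post doesn’t share the underlying send volumes, but if you want to run the same numbers on your own newsletter, here is a minimal sketch of the math behind a day-of-the-week test: a two-sided, two-proportion z-test comparing each treatment day against the control. The counts below are invented, chosen only to echo the 5.5% relative difference reported above.

```python
# Two-proportion z-test for open rates: treatment day vs. control day.
from math import erf, sqrt

def open_rate_loc(opens_ctrl, sent_ctrl, opens_test, sent_test):
    p1 = opens_ctrl / sent_ctrl      # control open rate
    p2 = opens_test / sent_test      # treatment open rate
    # Pooled proportion under the null hypothesis of no difference
    pooled = (opens_ctrl + opens_test) / (sent_ctrl + sent_test)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_ctrl + 1 / sent_test))
    z = abs(p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # two-sided
    return (p2 - p1) / p1, 1 - p_value                # rel. diff, LOC

# Hypothetical counts: Monday (control) vs. Saturday, 30,000 delivered each
rel_diff, loc = open_rate_loc(6000, 30000, 6330, 30000)
print(f"Relative difference: {rel_diff:+.1%}, LOC: {loc:.1%}")
# Relative difference: +5.5%, LOC: 99.9%
```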

Read more…


Website Analytics: How to use data to determine where to test

May 28th, 2015

At MarketingExperiments, we use patented heuristics to evaluate websites, emails and other digital media.

Often, people think that a heuristic evaluation is a purely qualitative approach to a problem. It can be, but when you combine quantitative analytics with that qualitative knowledge, you increase your power to make meaningful change.

This post will show relevant metrics for three of these elements that any marketer — from beginner to advanced — can use to discover opportunities for improvement.

 

Step #1. Look at the qualitative elements of your website

Often, people just ask for data dumps. To make matters worse, they want them in a very short time. On top of that, most data scientists rely purely on what they are comfortable with: numbers.

To add context and save time, you must evaluate the site to see where you should focus your data analysis.

Put yourself in the customer’s mindset and go to your site. If you own creative or design, try to set those biases aside as much as possible. What makes sense to you or feels like the right amount of information may be completely overwhelming to a customer who isn’t familiar with your product or industry.

 

Look for things that are broken, functions that are clunky, images that don’t make sense or add value, and anything that makes completing the conversion difficult. You must look objectively at all the pages in the conversion path and be familiar enough with them to make sense of the data that you pull.

Then pull data to illuminate the points you observed and give validity to your theory. Key heuristic elements, backed by data, help prove the problem.
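
To make that concrete, here is a minimal sketch (not from the original post) of one way to pull such data: computing step-to-step drop-off across a conversion path from page-view counts exported out of your analytics tool. The page names and view counts are hypothetical.

```python
# Step-to-step drop-off across a hypothetical conversion path.
import pandas as pd

funnel = pd.DataFrame({
    "page":  ["product", "cart", "checkout", "confirmation"],
    "views": [50_000, 12_000, 7_500, 4_200],
})

# Share of visitors kept from the previous step, and share kept overall
funnel["step_conversion"] = funnel["views"] / funnel["views"].shift(1)
funnel["drop_off"] = 1 - funnel["step_conversion"]
funnel["cumulative"] = funnel["views"] / funnel["views"].iloc[0]

print(funnel.to_string(index=False))
```

A spike in drop_off flags the page where the clunky function or confusing image you noticed during your qualitative review deserves a closer look in the data.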

Read more…


Understanding Your Customer’s Story: How one company increased conversion 104% by identifying motivation

May 21st, 2015

Every time someone wants to buy something from your brand, there’s a story that explains why they want what you’re selling. Identifying that story is key to making the sale.

How do we know this is true? Because when we know someone’s story, we know their motivation. If someone is highly motivated to get a solution, they’ll put up with almost anything — a poorly written email, a slow website or even a convoluted sales flow — to get it.

Consider this patented heuristic:

C = 4m + 3v + 2(i - f) - 2a

This isn’t a math formula. It’s a guide that MarketingExperiments and its parent company, MECLABS Institute, derived from analyzing tens of thousands of sales flows. This heuristic reflects what it takes to convert (C) a prospect into a customer and shows how the five variables — motivation (m), value (v), incentive (i), friction (f) and anxiety (a) — relate to each other. The numbers next to the variables identify how powerfully they affect conversion. Note that motivation is the most heavily weighted variable.

If formulas make your eyes cross, all you need to know is this: if a customer is highly motivated, none of the other elements (such as friction, anxiety or a poorly communicated value proposition) can stop them from moving forward in the sales process.
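
To see why motivation dominates, here is a purely illustrative sketch. The heuristic is a thought tool, and MECLABS doesn’t prescribe numeric scales, so the 0–5 scores below are invented; the point is only that the stated weights make a one-point gain in motivation worth twice a one-point cut in friction or anxiety.

```python
# Illustrative scoring of the heuristic C = 4m + 3v + 2(i - f) - 2a.
# Scores are invented (a 0-5 scale is assumed) purely to show the weighting.
def conversion_index(m, v, i, f, a):
    return 4 * m + 3 * v + 2 * (i - f) - 2 * a

base = conversion_index(m=2, v=3, i=1, f=3, a=2)
more_motivation = conversion_index(m=3, v=3, i=1, f=3, a=2)  # m: 2 -> 3
less_friction = conversion_index(m=2, v=3, i=1, f=2, a=2)    # f: 3 -> 2
print(base, more_motivation, less_friction)  # 9, 13, 11
```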

The most recent Web clinic looked at clues that revealed customers’ stories and, consequently, their motivation. Watch it and, within 30 minutes, you’ll get critical information that you can use immediately to drive an impressive lift in conversions.

Consider the experience of the second company outlined during the Web clinic: a Canadian window manufacturer whose owner was a student of MarketingExperiments. He called on MECLABS to help him increase conversions on his site.

 

The Control

Read more…


The Power of a Specific Offer to a Specific Prospect

May 7th, 2015

Specificity converts. In marketing, there should be no such thing as a general message. The marketer communicates with an aim. This aim should dictate everything else we say. This aim should influence, even constrain, every word we say.

— Flint McGlaughlin, Managing Director and CEO, MECLABS Institute

Specificity converts. A specific offer to a specific person will outperform a general offer to a general person.

This concept relates to a recent email test we ran with our MarketingSherpa audience and ran again (with a slight twist) with our MarketingExperiments audience.

First, in case you’re not familiar with MarketingSherpa, allow me to briefly explain our sister company.

MarketingSherpa’s content is geared toward exploring general marketing principles. This is also where companies and marketers can share specific marketing stories, such as Mellow Mushroom’s social media strategy and Red Bull’s content marketing advice.

The MarketingExperiments audience, by contrast, delves more specifically into the tactics of marketing strategy: MarketingExperiments is about specific tests that readers can apply to their own marketing.

Now that you understand how the content differs for the two audiences we tested, let’s get into the test itself.

 

The test

We tested an email invitation for a recent follow-up Web clinic MarketingSherpa hosted. The clinic’s objective was to examine the results of a live-optimized email test, which the MarketingSherpa audience ran alongside Flint McGlaughlin at Email Summit 2015.

The test consisted of two treatments:

  1. Treatment A focused on the Email Summit follow-up test, only mentioning live optimization from the MECLABS team.
  2. Treatment B switched the emphasis by focusing on the live optimization from the MECLABS team, only mentioning the Email Summit follow-up test.

In essence, both emails were invitations to the same Web clinic and presented the same two offers, just with different emphases.
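
The post doesn’t describe the mechanics of the split, but a two-treatment email test like this is typically set up by randomly assigning the list to the two cells. A minimal, hypothetical sketch:

```python
# Random 50/50 assignment of a subscriber list to two email treatments.
import random

random.seed(42)  # fix the seed so the assignment is reproducible

subscribers = [f"user{n}@example.com" for n in range(10_000)]  # hypothetical
random.shuffle(subscribers)

half = len(subscribers) // 2
treatment_a = subscribers[:half]  # emphasis: Email Summit follow-up test
treatment_b = subscribers[half:]  # emphasis: live optimization

print(len(treatment_a), len(treatment_b))  # 5000 5000
```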

 

Treatment A: Email Summit follow-up

Subject line: Does peer review work? See the results of the audience optimized email from the Email Summit 2015

Preheader: Plus live email optimization from the MECLABS research team. 

Read more…


Here’s Why Most A/B Testing is Boring and Not Really Worth the Trouble

April 6th, 2015

Do a quick Google search on “things to a/b test on a website,” scan the results for a moment, then come back and read the rest of this article.

Most of you reading this are marketers, so you know I’m taking a big risk by telling you to go do something else before you read my article.

In fact, if you’re reading this now, you’re probably one of the very few who made it back from that incredibly distracting activity I had you do. Thank you. You are exactly the person I want to be reading this. The others can go on their merry way. They are not the ones who need to hear this.

I had you do that search because the Internet is full of people telling you to test things on your website such as color, button size, layouts, forms, etc. I wanted you to get an idea for what’s out there.

Now, I want you to understand why almost everyone writing those articles is wrong

… or at the very least, missing the point.

Please don’t view this as me putting down the people who wrote those articles. I know a few of them personally, and I highly respect the work they are doing. This is not about whether their work is good or bad.

I’ve personally written many articles exactly like the ones they’re writing. In fact, they have one up on me because at least their articles are ranking in Google for popular search terms.

The reason they are missing the point is that most of those articles are focused on the elements of a page rather than the serving of a customer.

I get why they do it.

Webpages are far easier to understand than people. Webpages are a collection of 0s and 1s. People are a collection of who knows what.

And most of you, readers, are looking for webpage fixes — not a deeper, fuller way to serve your customer.

There is nothing necessarily wrong with you; we all naturally focus on our own self-interest, and that isn’t wrong in itself.

What is wrong is the methods we use to achieve our own goals. I don’t mean morally wrong. I mean practically wrong.

 

Our objective should always be: Make as much money as possible.

MECLABS Institute has found, after more than 15 years of research, that the best method for achieving this objective is to spend as much money as possible on serving your customer.

Until we can view every A/B test we run as an opportunity to better serve our customers, we will just be running (ultimately) ineffective tests on page elements.

It doesn’t really matter in the long run which color, layout or page element is going to perform well.

The Internet is constantly changing. Design trends are always going to influence how we look at webpages and their elements. What matters for marketers in the long run is how well we understand and, consequently, how well we can serve our customers.

Flint McGlaughlin, Managing Director and CEO, MECLABS, calls this understanding of our customers “customer wisdom.”

This is also why he often says, “The goal of a test is not to get a lift, but rather to get a learning.”

However, it’s one thing to hear this, another to really understand what it means.

It really means we want to conduct research, not run a test.

We want to learn a tangible lesson about our customer so that we can apply it to other areas of our marketing and achieve a maximum return on the amount of time and energy we spend on testing.

Let me show you what I mean with a real-world example. Here’s what happens when you just run an A/B test that is focused on a page element. Let’s take color, for instance.

You have two treatments. The only thing changed is the background color. 

 

You also have a result. In this case, the result was a 19.5% increase in clickthrough at a 92% level of confidence. But here’s where things get tricky.
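
Before getting to that, one practical wrinkle worth flagging: a 92% Level of Confidence falls short of the 95% bar many testing programs require before acting on a result. As a purely illustrative sketch, here is the standard sample-size estimate for validating a lift of this size at a 95% LOC with 80% power; the 4% baseline clickthrough rate is an assumption.

```python
# Recipients needed per arm to detect a 19.5% relative lift in
# clickthrough at a 95% Level of Confidence with 80% power.
from math import ceil, sqrt

def n_per_arm(p_base, rel_lift, z_alpha=1.96, z_beta=0.84):
    p_test = p_base * (1 + rel_lift)
    p_bar = (p_base + p_test) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p_base * (1 - p_base)
                                 + p_test * (1 - p_test))) ** 2
    return ceil(numerator / (p_test - p_base) ** 2)

print(n_per_arm(p_base=0.04, rel_lift=0.195))  # ~10,800 per arm
```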

Read more…
