
Archive for the ‘Analytics & Testing’ Category

Three Steps to Boosting Conversions by Building Customer Relationships

August 20th, 2015

MarketingExperiments’ Marketer’s Creed, Article 1, states:

“People don’t buy from companies, from stores or from websites; people buy from people. Marketing is not about programs; it is about relationships.”

NextAfter, an organization dedicated to helping nonprofits discover what truly makes donors give, put this concept to the test during an end-of-year fundraising push. Tim Kachuriak, the company's Chief Innovation and Optimization Officer, is so passionate about the cause that he cut his family vacation short and flew cross-country to join us in the studio for the most recent MarketingExperiments Web clinic, where he shared his results and showed how what NextAfter learned could benefit any organization.

Watch it here. 

Background: The Heritage Foundation, a think tank, was soliciting end-of-year donations.

Goal: To increase donations.

Research question: Which email treatment will generate the most revenue?

Test: A/B split test

 

Version A

This email drips with formality and reads like a direct-mail piece. It begins with “Dear Fellow Conservative” and goes on to exhort the reader to “make a bold statement by standing with the Heritage Foundation.” It is signed by Jim DeMint, the foundation’s president.

  Read more…


Website Optimization: How a B2B publishing company increased free trial sign-ups by 36.4%

July 6th, 2015

A popular method to acquire new customers for online subscription models is through free trials. The hope is that by using a service or product, prospective customers can fully experience and appreciate the value you have to offer.

However, before that experience begins, you must first get prospects to see the value of the trial itself. After all, signing up can be a friction- and anxiety-filled process, since many trials require credit card information at the time of sign-up.

Increasing the completion rate for the trial sign-up process was one of three steps the Euromoney Institutional Investor team took to revamp its strategy and increase conversions. The team accomplished this through online testing after seeing that 60% of traffic in the funnel did not complete the process.
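To make that 60% figure concrete, completion rate is simply the share of visitors who enter the sign-up funnel and finish it. Here is a minimal sketch of the arithmetic, using hypothetical step counts rather than Euromoney's actual data:

```python
# Hypothetical step counts for a free-trial sign-up funnel (illustrative only,
# not Euromoney's real numbers).
funnel = {
    "Offer page": 10000,
    "Registration form": 7000,
    "Payment details": 5200,
    "Trial started": 4000,
}

entered = funnel["Offer page"]
completed = funnel["Trial started"]

print(f"Completion rate: {completed / entered:.0%}")       # 40%
print(f"Abandonment rate: {1 - completed / entered:.0%}")  # 60%, the drop-off cited above

# Step-by-step drop-off shows where the funnel is leaking.
steps = list(funnel.items())
for (step_a, count_a), (step_b, count_b) in zip(steps, steps[1:]):
    print(f"{step_a} -> {step_b}: {1 - count_b / count_a:.0%} drop-off")
```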

At MarketingSherpa MarketingExperiments Web Optimization Summit 2014, Ben Eva, Global Head of Conversion Management, Euromoney Institutional Investor, shared three online tests his team used to learn about their customers and to increase that completion rate.

What is the best layout/design of the offer page?                            

Euromoney publishes over 200 online information services, which means a lot of potential for testing. Because each audience is different, the same type of test sometimes has to be repeated to find the best result for each one.

Ben shared two tests that revolve around page design and how to best communicate the value of the publications and their free trials.

Read more…


Finding Your Ideal Email Send Time to Maximize Relevancy

July 2nd, 2015

We get this question probably more than any other: when is the best time to send an email to customers?

The answer is unique to your organization. Many factors, including your customers' motivations, can influence what your ideal send time is.

Testing for the right send time can increase the relevancy of your email campaigns and, in turn, your overall clickthrough rate. To discover your unique send time, there are two simple tests you can perform that will focus your email campaigns on the best day and the best time to send.

The first test you should run is a day of the week test.

 

Newsletter Day of the Week Test

Earlier this year, we helped a tourism organization find out when to send its monthly promotional newsletter. To do this, we used Monday as the control and evenly split the newsletter sends across the seven days of the week.
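The post doesn't specify the tooling behind that split, but as a rough sketch, the assignment amounts to randomly dividing the subscriber list into seven equally sized send-day cohorts, something like this:

```python
import random

# Hypothetical subscriber list; in practice this would come from your ESP or CRM.
subscribers = [f"user{i}@example.com" for i in range(7000)]

days = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

# Shuffle first so each cohort is a random, roughly equal slice of the list.
random.seed(42)
random.shuffle(subscribers)

# Deal subscribers out to the seven days, round-robin style.
cohorts = {day: subscribers[i::len(days)] for i, day in enumerate(days)}

for day, members in cohorts.items():
    print(day, len(members))
```

Monday's cohort serves as the control; the other six are the treatments.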

Based on the primary KPI, Opened/Delivered (open rates), the test had favorable results, and at least one treatment outperformed the control significantly.

However, the secondary KPI, Clicked/Delivered (clickthrough rates), had inconclusive results.

 

Primary metric (open rate): Monday was used as the control, and Saturday was the highest-performing day, with a 5.5% relative difference at a 99% Level of Confidence (LOC).

Tuesday and Wednesday also outperformed the control, with a 3.8% relative difference at a 96% Level of Confidence. Tuesday, Wednesday and Saturday all had statistically significantly higher open rates than the other days of the week.

The remaining days (Thursday, Friday and Sunday) underperformed, though at low Levels of Confidence, which indicates there was little difference between their open rates and Monday's.
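The Level of Confidence figures above come from testing whether each day's open rate is statistically different from the Monday control. The post doesn't show the underlying math, but a standard way to check this is a two-proportion z-test on Opened/Delivered counts; here is a minimal sketch with made-up numbers (the real delivered and opened counts aren't published):

```python
from math import sqrt
from statistics import NormalDist

def level_of_confidence(opened_ctrl, delivered_ctrl, opened_treat, delivered_treat):
    """Two-sided two-proportion z-test, reported as a confidence level (1 - p-value)."""
    p_ctrl = opened_ctrl / delivered_ctrl
    p_treat = opened_treat / delivered_treat
    pooled = (opened_ctrl + opened_treat) / (delivered_ctrl + delivered_treat)
    se = sqrt(pooled * (1 - pooled) * (1 / delivered_ctrl + 1 / delivered_treat))
    z = (p_treat - p_ctrl) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return 1 - p_value

# Illustrative counts only -- roughly a 7.5% relative lift in open rate.
loc = level_of_confidence(opened_ctrl=2000, delivered_ctrl=10000,
                          opened_treat=2150, delivered_treat=10000)
print(f"Level of Confidence: {loc:.1%}")  # about 99%
```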

Read more…


Website Analytics: How to use data to determine where to test

May 28th, 2015

At MarketingExperiments, we use patented heuristics to evaluate websites, emails and other digital media.

People often think that a heuristic evaluation is a purely qualitative approach to a problem. It can be, but when you combine quantitative analytics with that qualitative knowledge, you increase your power to make meaningful change.

This post will show relevant metrics for three of these elements that any marketer — from beginner to advanced — can use to discover opportunities for improvement.

 

Step #1. Look at the qualitative elements of your website

Too often, people simply ask for data dumps. To make matters worse, they want them on a very short timeline. On top of that, most data scientists rely purely on what they are comfortable with: numbers.

To add context and save time, you must evaluate the site to see where you should focus your data analysis.

Put yourself in the customer's mindset and go to your site. If you own the creative or design, try to set those biases aside as best you can. What makes sense to you, or feels like the right amount of information, may be completely overwhelming to a customer who isn't familiar with your product or industry.

 

Look for things that are broken, functions that are clunky, images that don't make sense or add value, and anything that makes completing the conversion difficult. You must look objectively at all the pages in the conversion path and be familiar enough with them to make sense of the data you pull.

Then pull data that illuminates the issues you observed and lends validity to your theory. Key heuristic elements, backed by data, help prove the problem.
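The post doesn't prescribe a particular tool for pulling that data, but as one hedged sketch, here is how you might line up page-level metrics exported from your analytics platform against the notes from your qualitative walkthrough (the column names and pages below are assumptions, not any specific platform's schema):

```python
import pandas as pd

# Hypothetical export of page-level metrics for a conversion path.
data = pd.DataFrame({
    "page": ["/product", "/cart", "/checkout", "/confirmation"],
    "pageviews": [50000, 18000, 9000, 5400],
    "exit_rate": [0.55, 0.40, 0.35, 0.05],
})

# Notes from the qualitative walkthrough of the same pages.
data["qualitative_note"] = [
    "value copy unclear",
    "clunky quantity selector",
    "long form, asks for phone number",
    "ok",
]

# Step-to-step continuation rate shows where prospects leave the path.
data["continuation_rate"] = data["pageviews"].shift(-1) / data["pageviews"]

# Flag pages where a qualitative concern coincides with a weak continuation rate;
# these are the strongest candidates for testing.
flagged = data[(data["continuation_rate"] < 0.5) & (data["qualitative_note"] != "ok")]
print(flagged[["page", "continuation_rate", "exit_rate", "qualitative_note"]])
```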

Read more…


Understanding Your Customer’s Story: How one company increased conversion 104% by identifying motivation

May 21st, 2015

Every time someone wants to buy something from your brand, there’s a story that explains why they want what you’re selling. Identifying that story is key to making the sale.

How do we know this is true? Because when we know someone’s story, we know their motivation. If someone is highly motivated to get a solution, they’ll put up with almost anything — a poorly written email, a slow website or even a convoluted sales flow — to get it.

Consider this patented heuristic:

 

This isn’t a math formula. It’s a guide that MarketingExperiments and its parent company, MECLABS Institute, derived from analyzing tens of thousands of sales flows. This heuristic reflects what it takes to convert (C) a prospect into a customer and shows how the five variables — motivation (m), value (v), incentive (i), friction (f) and anxiety (a) — relate to each other. The numbers next to the variables identify how powerfully they affect conversion. Note that motivation is the most heavily weighted variable.
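For readers who want the formula spelled out, the heuristic described above is usually written as C = 4m + 3v + 2(i − f) − 2a, with the coefficients indicating how powerfully each variable affects conversion and motivation (m) carrying the largest weight.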

If formulas make your eyes cross, all you need to know is this: if a customer is highly motivated, none of the other elements (such as friction, anxiety or a poorly communicated value proposition) can stop them from moving forward in the sales process.

The most recent Web clinic looked at clues that revealed customers’ stories and, consequently, their motivation. Watch it and, within 30 minutes, you’ll get critical information that you can use immediately to drive an impressive lift in conversions.

Consider the experience of the second company outlined during the Web clinic: a Canadian window manufacturer who was a student of MarketingExperiments. He called on MECLABS to help him increase conversions from his website.

 

The Control

  Read more…


The Power of a Specific Offer to a Specific Prospect

May 7th, 2015

Specificity converts. In marketing, there should be no such thing as a general message. The marketer communicates with an aim. This aim should dictate everything else we say. This aim should influence, even constrain, every word we say.

— Flint McGlaughlin, Managing Director and CEO, MECLABS Institute

Specificity converts. A specific offer to a specific person will outperform a general offer to a general person.

This concept relates to a recent email test we ran with our MarketingSherpa audience and ran again (with a slight twist) with our MarketingExperiments audience.

First, in case you’re not familiar with MarketingSherpa, allow me to briefly explain our sister company.

MarketingSherpa’s content is geared toward exploring general marketing principles. This is also where companies and marketers can share specific marketing stories, such as Mellow Mushroom’s social media strategy and Red Bull’s content marketing advice.

The MarketingExperiments audience, by contrast, delves more specifically into the tactics of marketing strategy. MarketingExperiments is about specific tests that readers can apply to their own marketing.

Now that you understand the difference in content related to the tested audiences, let’s get into the test itself.

 

The test

We tested an email invitation for a recent follow-up Web clinic MarketingSherpa hosted. The clinic's objective was to examine the results of a live optimized email test that the MarketingSherpa audience ran alongside Flint McGlaughlin at Email Summit 2015.

The test consisted of two treatments:

  1. Treatment A focused on the Email Summit follow-up test, only mentioning live optimization from the MECLABS team.
  2. Treatment B switched the emphasis by focusing on the live optimization from the MECLABS team, only mentioning the Email Summit follow-up test.

In essence, both emails invited recipients to the same Web clinic and presented the same two offers, just with a different emphasis.

 

Treatment A: Email Summit follow-up

Subject line: Does peer review work? See the results of the audience optimized email from the Email Summit 2015

Preheader: Plus live email optimization from the MECLABS research team. 

Read more…
