Archive for the ‘Analytics & Testing’ Category

Copywriting: Brevity is the soul of marketing

November 20th, 2014

I’ve always loved this quote:

“Brevity is the soul of wit.” – William Shakespeare

To me, its beauty rests in the powerful meaning packed in six simple words. Brevity can also be used as a tool to aid your marketing, as I discovered from a recent email experiment.

But first, a little more detail about the experiment.

Background: A global producer of high-quality audio equipment and accessories.

Goal: To increase clickthrough rates in an email.

Research Question: Which email will generate the highest clickthrough rate?

Test Design: A/B multifactor, radical redesign split test

 

Control

 

In a preliminary review of the control, the MECLABS research team hypothesized the control was at risk of underperforming and could use some strategic tweaks.

 

Treatment

 

For the treatment, the team removed the lifestyle-focused stock image and showcased the product itself with a larger product image.

The headline was also changed to “An entirely new way to stream music wirelessly in any room easier than ever before” to place greater emphasis on the product’s utility.

 

Results

 

The treatment saw a 27% increase in clickthrough rate, which validated at a 99% level of statistical confidence.
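For readers curious how a result like this is validated, below is a minimal sketch of a two-proportion z-test, the kind of calculation used to check whether an observed difference in clickthrough rate clears a 99% level of confidence. The figures in it are illustrative assumptions, not the actual data from this experiment.

    from math import sqrt
    from statistics import NormalDist

    def lift_and_confidence(clicks_a, sends_a, clicks_b, sends_b):
        """Two-proportion z-test: returns the relative lift of B over A
        and the two-tailed confidence level of the observed difference."""
        p_a = clicks_a / sends_a          # control clickthrough rate
        p_b = clicks_b / sends_b          # treatment clickthrough rate
        p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
        std_err = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
        z = (p_b - p_a) / std_err
        confidence = 2 * NormalDist().cdf(abs(z)) - 1
        return (p_b - p_a) / p_a, confidence

    # Illustrative numbers only -- not the data from the audio equipment test
    lift, confidence = lift_and_confidence(clicks_a=400, sends_a=20_000,
                                           clicks_b=508, sends_b=20_000)
    print(f"Lift: {lift:.0%}, confidence: {confidence:.2%}")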

 

What you need to know

If there is one simple takeaway from this test, from my point of view, it’s that brevity is the heart of relevance and the soul of marketing.

In this case, the customers seemed to agree.

The optimized headline was more concise, leading with the product’s ability to “stream music wirelessly” to “any room.”

Clearly communicating what you can do with a product is likely to generate more relevance and appeal for email recipients over the long run.

But what about the images, you may ask? Clarity was also a driver of meaning, and here’s why:

The optimized email removed the stock image of people and focused solely on the product itself.

By telling you what the product can do versus where it fits in your kitchen, the new imagery immediately and clearly connected recipients with the essence of the offer.

If you’re interested in learning more about how this experiment helped convert attention into clicks, you can check out the newly released Web clinic, “Converting Opens to Clicks.”

 

You may also like

Marketing Automation: Precor achieves 74% lift in new leads via segmented database overhaul [Case study]

Email Marketing: The Kentucky Derby’s customer-centric newsletter reduces opt-out rate 64% [Case study]

Newsletter Engagement: 3 tactics Calendars.com used to improve its monthly sends [More from the blogs]


Testing and Optimization: 4 inspirational examples of experimentation and success

November 6th, 2014

At our sister publication, MarketingSherpa, we publish four case study beats – B2B, B2C, Email and Inbound – with stories covering actual marketing efforts from your peers each week. Not every case study features a testing and optimization element, but many do.

For this MarketingExperiments Blog post, I wanted to share a quick summary of several of these case studies, along with links to each full article (including creative samples) in case any pique your interest and you want to dig into the entire campaign.

So, without further ado, read on for four MarketingSherpa case studies that feature testing and optimization of various digital marketing channels, strategies and tactics.

 

Case Study #1. 91% conversion lift from new copy and layout

This case study features AwayFind, a company that provides mobile email alerts, and covers an effort to test, and hopefully improve, its homepage performance.

Brian Smith, Director of Marketing, AwayFind, said, “Our primary driver of traffic is our PR efforts. Our homepage is effectively our primary landing page, and we need to convert that traffic into premium users.”

The testing changed both copy and layout elements. The main copy change was a shift from focusing on features to focusing on benefits. The layout tweaks included a shortened headline, the remaining copy split between a subhead and a smaller block of text, and a modified color for the subhead text.

In this test, the treatment achieved:

  • 42% increase in clicks to the sign-up page
  • 91% increase in registrations for the trial

  Read more…


4 Threats that Make Email Testing Dangerous and How a Major Retailer Overcame Them

October 2nd, 2014

To test emails, you just send out two versions of the same email. The one with the most opens is the best one, right?

Wrong.

“There are way too many validity threats that can affect outcomes,” explained Matthew Hertzman, Senior Research Manager, MECLABS.

A validity threat is anything that can cause researchers to draw a wrong conclusion. Conducting marketing tests without taking them into account can easily result in costly marketing mistakes.

In fact, it’s far more dangerous than not testing at all.

“Those who neglect to test know the risk they’re taking and market their changes cautiously and with healthy trepidation,” explains Flint McGlaughlin, Managing Director and CEO, MECLABS, in his Online Testing Course. “Those who conduct invalid tests are blind to the risk they take and make their changes boldly and with an unhealthy sense of confidence.”

These are the validity threats that are most likely to impact marketing tests:

  • Instrumentation effects — The effect on a test variable caused by an external variable, which is associated with a change in the measurement instrument. In essence, how your software platform can skew results.
    • An example: 10,000 emails don’t get delivered because of a server malfunction.
  • History effects — The effect on a test variable made by an extraneous variable associated with the passing of time. In essence, how an event can affect test outcomes.
    • An example: There’s unexpected publicity around the product at the exact time you’re running the test.
  • Selection effects — An effect on a test variable by extraneous variables associated with the different types of subjects not being evenly distributed between treatments. In essence, there’s a fresh source of traffic that skews results.
    • An example: Another division runs a pay-per-click ad that directs traffic to your email’s landing page at the same time you’re running your test.
  • Sampling distortion effects — Failure to collect a sufficient sample size. Not enough people have participated in the test to provide a valid result. In essence, the more data you collect, the better (a rough sample-size sketch follows this list).
    • An example: Determining that a test is valid based on 100 responses when you have a list with 100,000 contacts.
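One common guard against that last threat is to estimate a required sample size before the test rather than declaring a winner early. Here is a rough sketch using the standard formula for comparing two proportions; the baseline rate, minimum detectable lift, confidence and power figures are assumptions for illustration, not numbers from the retailer’s test.

    from statistics import NormalDist

    def required_sample_size(baseline_rate, min_relative_lift,
                             confidence=0.95, power=0.80):
        """Approximate subjects needed per treatment to detect a relative
        lift in a conversion rate (standard two-proportion formula)."""
        p1 = baseline_rate
        p2 = baseline_rate * (1 + min_relative_lift)
        z_alpha = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-tailed
        z_beta = NormalDist().inv_cdf(power)
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return int(variance * (z_alpha + z_beta) ** 2 / (p2 - p1) ** 2) + 1

    # Assumed figures: a 2% baseline clickthrough rate and a desire to
    # detect at least a 10% relative lift -- roughly 80,000 per treatment,
    # which is why 100 responses from a 100,000-contact list proves little.
    print(required_sample_size(baseline_rate=0.02, min_relative_lift=0.10))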

Subscription Checkouts Optimized: How experimentation led to compounding gains at the revenue level

August 25th, 2014

Subscriptions have been the lifeblood of almost every media publication since the industry’s inception.

But imagine for a moment that you were trying to subscribe to your favorite newspaper and you were presented with something that looked like the page below.

 

Experiment #1. Reworking disconnected, confusing pages

[Control checkout page]

 

This was the first step in the checkout process for subscribing to a large media publication.

Editor’s Note: To protect their competitive advantage, we have blurred their identity.

Once a customer entered their ZIP code to determine whether this publication could be delivered to their area, they were taken to this page. Put yourself in the mind of the customer and think about how this page would have been received.

That is precisely what the marketing team did. What they saw was a very disconnected page that gave the customer almost no reassurance that they were still buying from the well-known media publication.

  • The publication logo was almost entirely missing from the page.
  • The colors on the page did not match the brand of the company.
  • The two levels of navigation at the top of the page provided multiple opportunities to click away.
  • The entire process seemed complicated to the customer.

Though there were a number of things the team wanted to change on this page, they needed a new page that changed only a few elements. Every day this page was live on the site, the publication was losing potential revenue from customers finding the process too difficult to complete. A long, arduous Web redesign was not an option. They needed to recover some of that revenue as fast as possible.

So the team ran an experimental treatment in an online test that they thought would require the least amount of time and resources and still achieve a high return on investment. The treatment is displayed below.

[Treatment checkout page]

Read more…


Marketing Analytics: Show your work

August 14th, 2014

Data handling and analytics can sometimes offer shocking results, as global B2B company National Instruments discovered after a surprising decrease in an email campaign’s conversion rate.

 

Key Obstacle: Concern about the new numbers

“When I first saw the number change, I was a bit freaked out,” said Stephanie Logerot, Database Marketing Specialist, National Instruments.

Stephanie, as a strategist, felt her greatest challenge was communicating the new way of looking at the data to National Instruments’ stakeholders outside of the database marketing team. This meant making certain everyone understood why the numbers dropped after implementing the new, more stringent data criteria.

 

A little background

A recent MarketingSherpa Email Marketing case study – “Marketing Analytics: How a drip email campaign transformed National Instruments’ data management” – detailed this marketing analytics challenge at National Instruments.

The data challenge arose from a drip email campaign set around its signature product.

The campaign was beta tested in some of National Instruments’ key markets: United States, United Kingdom and India. After the beta test was completed, the program rolled out globally.

The data issue came up when the team looked into the conversion metrics.

The beta test converted at 8%, the global rollout at 5%, and when a new analyst came in to parse the same data sets without any documentation on how the 5% figure was determined, the conversion rate dropped to 2%.
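That kind of spread usually comes down to counting rules: the numerator and denominator of a conversion rate both depend on which contacts and which actions are included. Here is a toy sketch of how stricter qualification criteria applied to the same records produce a different rate; the field names and records are invented for illustration, not National Instruments’ data.

    # Invented records, purely illustrative
    contacts = [
        {"delivered": True,  "qualified_lead": True,  "converted": True},
        {"delivered": True,  "qualified_lead": False, "converted": True},
        {"delivered": False, "qualified_lead": False, "converted": False},
        {"delivered": True,  "qualified_lead": True,  "converted": False},
    ]

    def conversion_rate(records, require_qualified=False):
        """Conversion rate under looser or stricter counting criteria."""
        audience = [r for r in records if r["delivered"]]
        if require_qualified:
            audience = [r for r in audience if r["qualified_lead"]]
        converted = [r for r in audience if r["converted"]]
        return len(converted) / len(audience) if audience else 0.0

    # The same data, two different answers -- which is why documenting the
    # criteria behind a reported number matters as much as the number itself.
    print(conversion_rate(contacts))                          # looser criteria
    print(conversion_rate(contacts, require_qualified=True))  # stricter criteria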

While interviewing the team for the case study, as often happens in these detailed discussions, I ended up with some great material that didn’t make it into the case study, and I wanted to share that material with you.

 

The team

For the case study, I interviewed Ellen Watkins, Manager, Global Database Marketing Programs, Stephanie, the database marketing specialist, and Jordan Hefton, Global Database Marketing Analyst, all of National Instruments at the time. Jordan was the new analyst who calculated the 2% conversion rate.

In this MarketingExperiments Blog post, you’ll learn how the team dealt with the surprising drop in conversion and how they communicated why data management and analytics would be held to a new standard going forward.

The team overcame this obstacle with a little internal marketing.

Read more…


Why Responsive Design Does Not Care About Your Customers

July 31st, 2014

Responsive design, like any new technology or technique, does not necessarily increase conversion.

This is because when practicing Web optimization, you are not simply optimizing a design; you are optimizing a customer’s thought sequence. In this experiment, we discovered the impact responsive design has on friction experienced by the customer.

Background: A large news media organization trying to determine whether it should invest in responsive mobile design.

Goal: To increase free trial signups.

Research Question: Which design will generate the highest rate of free trial sign-ups across desktop, tablet and mobile platforms: responsive or unresponsive?

Test Design: A/B multifactorial split test

 

The Control: Unresponsive design

[Control page: unresponsive design]

 

During an initial analysis of the control page, the MECLABS research team hypothesized that by testing a static page against an overlay for the free trial, they would learn whether visitors were more motivated by a static page, since there is no background clutter to cause distraction.

From this, the team also theorized that utilizing a responsive design would increase conversion as the continuity of a user-friendly experience would improve the customer experience across multiple devices.

The design for the control included a background image.

 

The Treatment: Responsive design

[Treatment page: responsive design]

 

In the treatment, the team removed the background image to reduce distraction and implemented a responsive design to enhance user experience across all devices.

Read more…
