Archive for the ‘Analytics & Testing’ Category

Marketing Analytics: Show your work

August 14th, 2014 1 comment

Data handling and analytics can sometimes offer shocking results, as global B2B company National Instruments discovered after a surprising decrease in an email campaign’s conversion rate.

 

Key Obstacle: Concern about the new numbers

“When I first saw the number change, I was a bit freaked out,” said Stephanie Logerot, Database Marketing Specialist, National Instruments.

Stephanie, as a strategist, felt her greatest challenge was communicating the new way of looking at the data to National Instruments’ stakeholders outside of the database marketing team. This meant making certain everyone understood why the numbers dropped after implementing the new, more stringent data criteria.

 

A little background

A recent MarketingSherpa Email Marketing case study – “Marketing Analytics: How a drip email campaign transformed National Instruments’ data management” – detailed this marketing analytics challenge at National Instruments.

The data challenge arose from a drip email campaign set around its signature product.

The campaign was beta tested in some of National Instruments’ key markets: United States, United Kingdom and India. After the beta test was completed, the program rolled out globally.

The data issue came up when the team looked into the conversion metrics.

The beta test converted at 8%, the global rollout at 5%, and when a new analyst came in to parse the same data sets without any documentation on how the 5% figure was determined, the conversion rate dropped to 2%.
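To make concrete how the same campaign can produce different conversion rates under different criteria, here is a minimal sketch. The figures below are hypothetical illustrations, not National Instruments’ actual data: they only show that tightening the attribution criteria changes the denominator-to-numerator relationship, not the underlying campaign performance.

```python
# Hypothetical illustration (NOT National Instruments' actual numbers):
# the same campaign data set can yield very different conversion rates
# depending on how strictly conversions are attributed.

total_contacts = 10_000   # recipients in the campaign data set
all_conversions = 500     # any conversion during the campaign window
attributed = 200          # only conversions traceable to a campaign email

loose_rate = all_conversions / total_contacts   # looser criteria
strict_rate = attributed / total_contacts       # more stringent criteria

print(f"loose: {loose_rate:.0%}, strict: {strict_rate:.0%}")
# → loose: 5%, strict: 2%
# Same campaign, same data set: the "drop" is definitional, not real.
```

This is why documenting how a metric was calculated matters: without it, a new analyst applying stricter (and arguably more correct) criteria will report a lower number that looks like a performance drop.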

While interviewing the team for the case study, as often happens in these detailed discussions, I ended up with some great material that didn’t make it into the case study, and I wanted to share it with you.

 

The team

For the case study, I interviewed Ellen Watkins, Manager, Global Database Marketing Programs, Stephanie, the database marketing specialist, and Jordan Hefton, Global Database Marketing Analyst, all of National Instruments at the time. Jordan was the new analyst who calculated the 2% conversion rate.

In this MarketingExperiments Blog post, you’ll learn how the team dealt with the surprising drop in conversion and how they communicated why data management and analytics would be held to a new standard going forward.

The team overcame this obstacle with a little internal marketing.


Why Responsive Design Does Not Care About Your Customers

July 31st, 2014 4 comments

Responsive design, like any new technology or technique, does not necessarily increase conversion.

This is because when practicing Web optimization, you are not simply optimizing a design; you are optimizing a customer’s thought sequence. In this experiment, we discovered the impact responsive design has on friction experienced by the customer.

Background: A large news media organization trying to determine whether it should invest in responsive mobile design.

Goal: To increase free trial signups.

Research Question: Which design will generate the highest rate of free trial sign-ups across desktop, tablet and mobile platforms: responsive or unresponsive?

Test Design: A/B multifactorial split test

 

The Control: Unresponsive design

[Image: the control, an unresponsive design]

 

During an initial analysis of the control page, the MECLABS research team hypothesized that by testing a static page versus an overlay for the free trial, they would learn if visitors were more motivated with a static page as there is no clutter in the background that might cause distraction.

From this, the team also theorized that utilizing a responsive design would increase conversion as the continuity of a user-friendly experience would improve the customer experience across multiple devices.

The design for the control included a background image.

 

The Treatment: Responsive design

[Image: the treatment, a responsive design]

 

In the treatment, the team removed the background image to reduce distraction and implemented a responsive design to enhance user experience across all devices.


Online Testing: 3 resources to inspire your ecommerce optimization

July 3rd, 2014 No comments

Optimizing to improve a customer experience can be a little overwhelming when you consider all the nuts and bolts that make up an entire ecommerce property.

In this MarketingExperiments Blog post, we’ll take a look at three ecommerce resources from our testing library that will hopefully spark a few ideas you can add to your testing queue.

 

Read: A/B Testing: Product page testing increases conversion 78%

[Image: e-book retailer test versions]

 

How it can help

This experiment with a MECLABS Research Partner is a great example of how testing the product page elements most likely to cause customer concern can help you alleviate anxiety.

 

Watch: Marketing Multiple Products: How radical thinking about a multi-product offer led to a 70% increase in conversion

 

In this Web clinic replay, Austin McCraw, Senior Director of Content Production, MECLABS, shared how radical thinking about a multi-product offer led one company to a 70% increase in conversion.

 

How it can help

One big takeaway from this clinic: strategically eliminating competing offers on pages with multiple products can help drive customers’ focus to the right product choices for their needs.

 

Learn: Category Pages that Work: Recent research reveals design changes that led to a 61.2% increase in product purchases

 

These slides are from a Web clinic on category pages in which Flint McGlaughlin, Managing Director, MECLABS, revealed the results of category page design changes that increased clicks and conversions across multiple industries.


Online Testing: How to use A/A testing to break through the noise

June 30th, 2014 2 comments

Getting a lift from your testing efforts can be satisfying and rewarding.

Not to mention, increases in conversion have changed the fortunes of entire enterprises and the careers of the marketers who advocated testing.

But is a lift truly a lift, or is it simply a false positive resulting from natural variation?

In this MarketingExperiments Blog post, I wanted to share an excellent example of using A/A testing (and yes, you are reading that correctly) from Emily Emmer, Senior Interactive Marketing Manager, Extra Space Storage, presented at Web Optimization Summit 2014.

 

What does variance in testing look like?

[Image: Extra Space Storage homepage variance test]

 

Here’s the example Emily shared with the audience to help put variance in context using a control and treatment of Extra Space Storage’s homepage.

There is absolutely no difference between these pages except for the 15% difference in conversion.

According to Emily, that’s when you need to start investigating how variance is potentially impacting your testing efforts, because identical pages should show little to no difference in performance.

“A 15% lift is more concerning,” Emily explained, “because there should be no difference with the same experience.”

 

A/A testing is not A/B testing

[Image: variance testing explanation]

 

Emily also noted a key distinction between A/A and A/B testing that is really important to grasp:

  • A/A testing – Can help you measure the natural variability (noise) of a website by testing an identical experience.
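Emily’s point can be made concrete with a quick simulation. The sketch below assumes an illustrative 5% true conversion rate and 1,000 visitors per arm (these are assumptions for the example, not Extra Space Storage’s figures): both “variants” are identical, yet random assignment alone regularly produces apparent lifts of 15% or more at this sample size.

```python
import random

random.seed(42)
TRUE_RATE = 0.05          # assumed identical conversion rate for both arms
VISITORS_PER_ARM = 1_000  # assumed traffic per arm

def simulate_arm(n, rate):
    """Count conversions for n visitors, each converting independently at `rate`."""
    return sum(random.random() < rate for _ in range(n))

lifts = []
for _ in range(1_000):  # run many A/A "tests"
    a = simulate_arm(VISITORS_PER_ARM, TRUE_RATE)
    b = simulate_arm(VISITORS_PER_ARM, TRUE_RATE)
    if a:  # guard against division by zero
        lifts.append((b - a) / a)  # apparent relative "lift" of B over A

big_swings = sum(abs(lift) >= 0.15 for lift in lifts)
print(f"{big_swings / len(lifts):.0%} of A/A runs showed a 15%+ swing")
```

At these sample sizes, a substantial share of identical-vs-identical runs show double-digit swings by chance alone, which is exactly why an observed “lift” should be checked against the site’s natural noise before it is celebrated.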


A/B Testing: Product page testing increases conversion 78%

June 26th, 2014 3 comments

Product pages are arguably the heart of an ecommerce website.

They’re where potential customers learn about your products in a guided conversation that should deliver value and an overall top-notch customer experience.

Consequently, the elements on those pages are also potentially where you’re losing conversions due to anxiety.

At MarketingExperiments, we define customer anxiety as “a psychological concern stimulated by a given element in the sales or sign-up process.”

So how do you identify and mitigate anxiety on product pages?

In this MarketingExperiments Blog post, I wanted to share a recent experiment where an e-book retailer asked that same question and started testing to discover a way to answer it.

But, before we dive in, let’s view the background notes on the test to put the experiment into context.

Background: A large e-book retailer.

Goal: To increase the overall number of e-book sales.

Research Question: Which attempt to reduce anxiety will result in the highest conversion rate?

Test Design: A/B variable cluster split test

 

Side by side

[Image: e-book retailer test versions]

 

The team hypothesized that testing key product page elements could help them determine the true impact of anxiety on a product page.

Here is a quick breakdown of the elements the team chose to test in each treatment:

  • Version A – Attempted to reduce anxiety by using security seals
  • Version B – Highlighted compatibility by illustrating the product is multi-device friendly
  • Version C – Provided a synopsis of the content to help customers determine if the e-book would suit their interests at the top of the page
  • Version D – Emphasized quick accessibility to the product upon purchase

 

Results

[Image: product page test results]

 

Moving the product description up on the page resulted in a 78% relative increase in conversion.
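For readers new to test reporting, a relative increase compares the treatment’s rate against the control’s rate, not against the whole audience. The rates below are hypothetical stand-ins (the excerpt does not publish the experiment’s actual conversion rates); they only show the arithmetic behind a figure like “78% relative increase.”

```python
# Illustrative rates only; the experiment's actual conversion
# rates are not published in this excerpt.
control_rate = 0.045      # hypothetical control conversion rate
treatment_rate = 0.0801   # hypothetical treatment conversion rate

# Relative lift = (treatment - control) / control
relative_lift = (treatment_rate - control_rate) / control_rate
print(f"Relative lift: {relative_lift:.0%}")  # → Relative lift: 78%
```

Note the distinction: here the absolute gain is only 3.5 percentage points, yet the relative lift is 78% because the baseline rate is small.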


Online Optimization: Testing value prop to grow your tribe

June 23rd, 2014 1 comment

I have a deep respect for the marketers at nonprofits.

In some circumstances, how they deliver appeal and exclusivity to donors can make or break solvency.

Consequently, I would argue that testing and optimizing value proposition is vital for nonprofits.

In this MarketingExperiments Blog post, we’ll take a look at an experiment from a Web Optimization Summit 2014 presentation from featured speaker Tim Kachuriak, Chief Innovation and Optimization Officer, Next After, on “selling the intangible.”

Before we begin, here are some background notes on the test.

Background: The Heritage Foundation, a think tank located in Washington, D.C.

Objective: To increase the donation conversion rate.

Primary Research Question: How does value proposition affect conversion rate?

Test Design: Radical redesign A/B split test

 

Side by side

[Image: donation page control and treatment side by side]

 

Here are the control and treatment versions of the donation pages side by side.

According to Tim, the primary focus for his team was gaining a deeper understanding of how value proposition impacts donor behavior.

 

Treatment

[Image: treatment elements on the donation page]

 

In the treatment, Tim and the team identified elements on the landing page that would likely have the greatest impact on value proposition:

  • Headline – Deliver value right up front
  • Bullets – Quickly highlight reasons to donate
  • Testimonials – Share third-party sources who are fans
  • Call-to-action – Make intentions for donors clear and easy

 

Results

[Image: donation page test results]

 

The treatment outperformed the control by 189%.

Fellow optimization fanatics should also take note that the winner was a long-copy page with the CTA below the fold.
