Archive

Posts Tagged ‘a/b testing’

A/B Testing: Cut through your KPIs by knowing your ultimate goal

February 4th, 2016

Marketers often struggle to know which metrics to use when deciding how to position their marketing collateral, and that uncertainty can lead to many problems. At MECLABS Institute, the parent company of MarketingExperiments, we have run experiments and tests for over 20 years to help answer this question.

Customers take many actions as they move through the funnel, but what is the ultimate goal the company is trying to achieve with its marketing collateral? By answering this question, companies can best determine which KPI is the most important one to measure.

To illustrate this point, let’s walk through an experiment about metrics. Reviewing it will show how important it is to have a clearly defined idea of the ultimate goal of your marketing collateral.

 

The Experiment:

Background: A large newspaper company offering various subscription options.

Goal: To determine the optimal regular price point after the introductory discounted offer rate.

Research Question: Which price point will generate the greatest financial return?

Test Design: A/B split test

 

Subscription services often offer a discounted introductory rate for new subscribers. This gives potential subscribers a low-risk opportunity to try out the service for a period of time before the cost defaults to the regular full price. In this test, The Boston Globe team hoped to determine the optimal price point for a monthly subscription after the introductory offer rate expired. 
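The Boston Globe’s actual data isn’t shown in this excerpt, but the scoring logic behind a price-point test can be sketched. In the hypothetical Python example below (all numbers are invented for illustration), the higher price point wins on revenue per subscriber even though it retains fewer subscribers after the introductory rate expires, which is exactly why the ultimate financial goal, not retention alone, should pick the winner.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-tailed p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: subscribers who kept paying after the intro rate expired
price_a, kept_a, n_a = 3.99, 620, 1000   # $3.99/week price point, 62% retained
price_b, kept_b, n_b = 4.99, 540, 1000   # $4.99/week price point, 54% retained

z, p = two_proportion_ztest(kept_a, n_a, kept_b, n_b)

# Revenue per offered subscriber = retention rate x regular price
rev_a = (kept_a / n_a) * price_a
rev_b = (kept_b / n_b) * price_b
print(f"A: ${rev_a:.2f}/subscriber  B: ${rev_b:.2f}/subscriber  (p = {p:.4f})")
```

With these made-up numbers, version A retains significantly more subscribers, yet version B produces more revenue per offered subscriber. Judging the test by retention alone would crown the wrong winner for the research question above.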



This 1960s Statistician Can Teach You Everything You Need to Know About the Fundamentals of A/B Testing

January 21st, 2016

I did a training on selling training for the sales team today. It was what Millennials call “meta.”

I was talking about how we back everything we say in our training with scientifically valid experiments, rather than best practices, anecdotal case studies or just “expert advice.”

The question naturally arose: “What do we mean when we say ‘scientifically valid experiments’?”

When I answered the question in the meeting, I immediately thought it would be a good idea for a blog post. So, with that said, here’s the answer:

In short, it means that we use the scientific method to validate every piece of knowledge we transfer in the training (and also in our Web clinics and on this blog).

I found myself trying to explain what I learned in high school about the scientific method, and while I was able (I think) to get the basic gist across, I don’t think I did it justice.

Fortunately, after doing a little searching online, I found this guy.

His name is J. Stuart Hunter, and he is one of the most influential statisticians of the latter half of the twentieth century.

Fortunately, back in the 1960s, he recorded some rad videos on experimental design in a business context. If you can extrapolate a little from the industrial setting and apply it to marketing, they cover everything you need to know about the scientific method, or “what we mean when we say ‘scientifically valid.’”

 

 


The Importance of Testing: How one test applied to two email sends resulted in different audience responses

November 23rd, 2015

At MarketingExperiments, sister company of MarketingSherpa under parent company MECLABS, we believe in testing. Our Web clinics are a testament to this belief – every month we share research that is designed to help marketers do their jobs better.

This culture of testing encouraged me to run my own test.

First, I ran the test on a MarketingSherpa send. After the results of that test came back, the same test was then applied to a MarketingExperiments newsletter.

By running virtually the same test twice for two different audiences, I not only learned more about the preferences of our readers, but also how incredibly important it is to test, even when you are sure you know what the results will be.

 

The MarketingSherpa test

As the copy editor at MECLABS, I get to see all of the copy produced by both MarketingExperiments and MarketingSherpa. One of my responsibilities is overseeing the email newsletter sends. Every Monday, MarketingSherpa sends out a Best of the Week newsletter which features the most popular articles of the previous week and introduces the new book for the MarketingSherpa Book Giveaway.

The copy in these newsletters was extensive: every article listed had a full summary, which, as a reader, I imagined would feel overwhelming when opening the email on a Monday morning.



How Variance Testing Increased Conversion 45% for Extra Space Storage

November 12th, 2015

When it comes to testing, A/B testing typically steals the spotlight, leaving its sister procedure, variance testing, in the shadows. However, according to Emily Emmer, Senior Interactive Marketing Manager, Extra Space Storage, that’s a mistake.

At MarketingSherpa MarketingExperiments Web Optimization Summit 2014, Emily presented on how her team used variance testing to transform Extra Space Storage’s Wild West testing culture into a wildly successful testing environment.

Before the team conducted variance testing, the company’s testing environment was structured like a free-for-all. There were few, if any, set rules in place, and, according to Emily, the person with the highest title and the loudest voice typically had their test implemented. All of this changed after the Extra Space Storage team ran some variance tests.

Variance testing (often called A/A testing) compares two identical Web experiences to measure a site or page’s natural variability. The results generally set the rules that subsequent A/B tests must follow.
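The excerpt doesn’t show the mechanics of the team’s variance tests, but the core idea can be sketched in a few lines of Python. The simulation below (conversion rate, traffic volume and repetition count are all invented for illustration) splits traffic between two identical pages many times over, to see how large a “lift” pure chance can produce:

```python
import random

random.seed(42)

def simulate_aa_test(true_rate, visitors_per_arm):
    """Split traffic between two IDENTICAL experiences and return the observed relative 'lift'."""
    conv_a = sum(random.random() < true_rate for _ in range(visitors_per_arm))
    conv_b = sum(random.random() < true_rate for _ in range(visitors_per_arm))
    rate_a, rate_b = conv_a / visitors_per_arm, conv_b / visitors_per_arm
    return abs(rate_a - rate_b) / rate_a  # a nonzero "difference" despite identical pages

# Repeat the A/A test many times to see how big a lift noise alone can produce
lifts = [simulate_aa_test(true_rate=0.05, visitors_per_arm=2000) for _ in range(500)]
noise_ceiling = sorted(lifts)[int(0.95 * len(lifts))]
print(f"95% of identical-page splits differed by less than {noise_ceiling:.1%}")
```

The resulting noise ceiling becomes a rule for later A/B tests: a measured lift smaller than what two identical pages routinely produce isn’t evidence of anything, no matter whose title or voice is behind the idea.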

By focusing on variance testing and translating the results from this procedure into rules for A/B testing, Extra Space Storage achieved a 45% increase in conversion rate over the previous year. Watch the excerpt below to learn the results of the team’s test, the rules they developed and Emily’s advice on when to start variance testing and how to implement it.


Nonprofit Testing: How one small change led to 190% increase in clickthrough

September 28th, 2015

As marketers, we all dream of expensive radical redesigns of our websites that check off every item on our wish lists. Over the years here at MarketingExperiments, though, we have routinely discovered that with a proper understanding of customers, the smallest changes can often yield the biggest results.

A recent test from our friends over at NextAfter demonstrates this fact.

 

Background

As Tim Kachuriak, Chief Innovation and Optimization Officer, NextAfter, noted when he joined us for our August Web clinic, Personalized Messaging Tested, NextAfter works exclusively with nonprofit organizations to discover what truly makes donors give.

Dallas Theological Seminary (DTS) is one such organization that NextAfter has partnered with to help answer this question.

In an earlier experiment with DTS, Kevin Peters, Vice President, NextAfter, had found that visitors arriving at the organization’s primary donation page were highly motivated to give. He discovered this by testing two forms of the page.

The first version of the donation page cut immediately to the chase, asking donors to “make a gift online.” The second version of the page posited that perhaps DTS was asking too much, too soon, and prefaced the “ask” with copy highlighting the unique value proposition of DTS. Quotes from well-known figures in the Christian community were also leveraged to build additional credibility. 


The A-Z of A/B Testing: Handy guide to nearly 200 conversion optimization definitions

September 14th, 2015

Here at MarketingExperiments, applying the scientific method to digital marketing campaigns is at the heart of what we do.

Over the last 15 years, digital marketing, A/B testing and optimization have become so important that we’ve literally built the MECLABS Institute around them. Along the way, a whole new language has sprung up around these ideas, complete with its own unique buzzwords, acronyms and industry shorthand.

For marketers, particularly those just starting to test, it can be exhausting just figuring out how to talk the talk.

To help you cut through the jargon and get straight to the heart of these terms, we at MarketingExperiments have created the Marketing and Online Testing Dictionary. This tool not only provides definitions for nearly two hundred marketing terms related to A/B and multivariate testing, but also offers additional resources to help you learn more about many of the ideas and concepts presented herein.

Download your copy of the Marketing and Online Testing Dictionary now, or risk being that guy or girl in the next marketing meeting, clumsily Googling optimization terms on your smartphone under the conference table while praying that your CMO doesn’t notice.

 
