Internet Marketing Journal - MarketingExperiments.com

A/B Split Testing
Tuesday, 16 August 2005
Synopsis

Topic: A/B Split Testing — How to use A/B Split Testing to Increase Conversion Rates, Challenge Assumptions and Solve Problems

We recently released the audio recording of our clinic on this topic. You can listen to a recording of this clinic here:

A/B Split Testing

Most of us are familiar with the concept of using A/B split testing to determine which elements on a page are helping the performance of a web page, and which are not.

For instance, you might test two different headlines on a landing page. One would outperform the other, and you would know which headline performs better.

However, there is more you can do with A/B Split testing.

  1. You can use A/B split testing to better understand visitor behaviors and priorities when visiting your site.
  2. You can use A/B split testing to solve specific problems you have with your site pages. In other words, use it as a diagnostic tool to find out what is going wrong and how to fix it.
  3. You can use A/B split testing to dramatically challenge assumptions you may have about the "best" way to design or write a page. (Test not only changes in minor elements, but also complete and dramatic redesigns of an entire page.)

This brings up another important point. Testing yields the most valuable results only when you test repeatedly. A one-shot test will tell you very little. But when you make a consistent habit of testing, cumulative tests over time can have a dramatic impact on the success of your site.

Our own test results, outlined in this brief, reveal just how much can be achieved and learned through a "simple" A/B split test.

Findings

Let's look at three tests and their results, each showing how we can not only improve results, but also learn more about what works, what doesn't... and what readers are looking for.

TEST 1: Testing the Impact of Current Events on Email Response Rates.

We set up a simple test with two emails, both of which were written to drive click-throughs to a site and convert leads into sales.

  • In email A we wrote the sales text within the context of an emotionally charged news story that was making headlines at the time.
  • In email B we wrote the email without specific mention of the event, but still alluded to "recent events in the news."
  • The essential difference between the two emails is that one mentioned the event by name, and the other didn't.
  • We tested our messaging to more than 337,466 opt-in email addresses.
  • We compiled the results after 12 days, although some clicks continued to trickle in.

Here are the results of the test after the first 12 days:

Email Copy Test Results

                               Email A    Email B
  Emails Sent                  168,733    168,733
  Clicks                         5,119      4,395
  Click-Through Rate (CTR)       3.03%      2.60%
  Sales                            175        122
  Conversion (Click to Sale)     3.42%      2.78%
  Conversion (Email to Sale)     0.104%     0.072%

What You Need To UNDERSTAND: Email A (specifically mentioning the news story and events surrounding it) significantly outperformed Email B. CTR increased by 16.5% and overall conversion (email to sale) increased by 43.4%.

The email that disclosed the specifics of this news piece generated 53 more orders (an increase of 43.4%) than the email that only alluded to the events surrounding the story without mentioning specifics.
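Lifts like these can be recomputed directly from the counts in the table above. The short sketch below does so, and also adds a two-proportion z-test on click-throughs as a quick significance check; the z-test is our addition here, not part of the original report.

```python
import math

# Raw counts from the Test 1 table above
sent_a, clicks_a, sales_a = 168_733, 5_119, 175
sent_b, clicks_b, sales_b = 168_733, 4_395, 122

ctr_a, ctr_b = clicks_a / sent_a, clicks_b / sent_b
conv_a, conv_b = sales_a / sent_a, sales_b / sent_b

ctr_lift = (ctr_a - ctr_b) / ctr_b      # relative lift in click-through rate
conv_lift = (conv_a - conv_b) / conv_b  # relative lift in email-to-sale conversion

# Two-proportion z-test on click-through rates (our addition, for context):
# with a pooled rate and standard error, |z| > 1.96 means the difference
# is significant at the 95% confidence level.
p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
z = (ctr_a - ctr_b) / se

print(f"CTR lift: {ctr_lift:.1%}")          # 16.5%
print(f"Conversion lift: {conv_lift:.1%}")  # 43.4%
```

With sends this large, the CTR difference comes out many standard errors above zero, which is why a 16.5% relative lift here can be trusted rather than dismissed as noise.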

TEST 2: Testing a Specific Problem

In our second test, we believed that customers visiting our test site with an 800x600 or 1024x768 resolution monitor were not finding the relevant sales language for the primary site product unless they scrolled down that page. 

We set up an A/B/C split to test this hypothesis:

  • Page A was the original page.
  • Page B featured slightly shortened copy and a "click here" anchor link that took visitors down the page. On a 1024x768 resolution monitor it showed the order process; on an 800x600 resolution monitor it displayed the offer copy for the primary product.
  • Page C was a radical redesign in which the order process was partially viewable on 800x600 and higher. It used two columns to make more information available "above the fold."

Here are the results of our testing:

A/B/C Split Test

                         Page A     Page B     Page C
  Percent of Traffic        34%        33%        33%
  New Sales                 244        282        114
  Change                    N/A    +15.57%    -53.28%

What You Need To UNDERSTAND: Page B outperformed the original page by 15.57%. Page C was a dismal failure.

Screen shots of the pages above are available below:

In a recent web clinic, we surveyed our audience as to which of these pages would perform the best. They overwhelmingly chose Page C, showing that what seems "intuitive" to most marketers is not always revealed as the best page after testing.

The survey results and clinic notes are available here.

In this test, our hypothesis about important sales language appearing higher on the page proved correct. However, the two-column approach of Page C was ineffective.

TEST 3: Challenging Assumptions by Testing the "Obvious" and Learning from the Results

In this test we created two versions of a simple sales page online. Each page was approximately two screens in length and asked the reader to complete a short form in order to receive a free informational product.

  • In version A we added some personal elements to the page, including a photo of the writer, a personal introduction and a signature. In other words, we created a personal "sales letter" on the web page.
  • In version B the sales copy was largely the same, but without the personal elements: no photo, no salutation, and no signature.

Which version won? Conventional wisdom online suggested to us that the personal version would be the winner. Here are the results of our testing:

A/B Split Test

                     Version A (Personalized)    Version B (Institutional)
  Conversion Rate            34.6%                        39.9%

What You Need To UNDERSTAND: Version B outperformed Version A by 15.3%.

In this case, our expectation that the personalized copy would perform better was simply not met.

There are two points to consider here:

First, if we had never tested these pages, we would have left money on the table by assuming the personal version would do better.

Second, this was only one test. What if we had taken the personal version and doubled the length of the copy as well, adding more of an individual sales pitch? What if we had changed the photo in some way, or repositioned it on the page?

KEY POINT: Each test provides answers. But each test should also stimulate further thought and additional rounds of testing to learn more.

A/B Split Testing Protocol:

Whether or not you have conducted A/B split testing before, the following steps may help you formalize a regular testing program and improve your results.

Test Protocol for A/B Split Testing for Landing Pages:

  1. Develop Your Capabilities and Select the Right Tools

    A/B split testing tools vary from simple CGI scripts to sophisticated software applications. You will find a list of services in the Literature Review at the end of this report.

    Even without sophisticated A/B testing capability, sequential testing offers you an opportunity to learn many insights about your pages. For more on sequential testing, see the Concluding Comments, below.

  2. Identify Your Established Control Page

    Your control page will be the page against which you test all subsequent optimization efforts. If you are just getting started with A/B testing, your control page will be your current landing page before any optimization. When a new page performs better than the existing control page, it then becomes your control page in subsequent testing.

  3. Establish Your Testing Goals and Parameters

    What are you trying to accomplish with A/B split testing? Are you after more subscribers, a higher conversion rate, or a greater return on investment on your PPC campaigns? Your goals will determine your testing parameters, which will determine the potential success of your testing efforts.

  4. Determine Your Sufficient Test Interval

    This time period should allow you to gather enough data to gain real insight from your A/B tests. Identify the number of unique visitors and/or conversions needed to establish good data, and then determine how long it will take to generate this traffic. This number will vary from business to business, but should give you enough data to confidently declare a "winner."

  5. Create 1-3 Radical Redesigns

    KEY POINT: These pages are not subtle optimizations changing only one or two elements on the page, but are wholly different pages representing a radically different approach.

  6. Evaluate These Redesigns in A/B Split Tests

    Test these alternate landing pages against the control page. Ideally, each page will be tested against every other page, but if that is impractical, test two pages at a time and keep the best as your control for subsequent testing.

  7. Based on Results, Determine Your True Control Page

    The radical redesign method will be more likely to generate a quantum leap in improved conversion rate than optimizing a mediocre page with little potential. Once you have identified the best general approach, you are now ready to optimize individual elements on the page.

  8. Optimize with Traditional Variable-Specific A/B Testing

    Variables to test:

    Headline
    Call to Action
    Page Copy
    Graphics
    Color
    Configuration of Page Elements
    Etc.
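Step 4's "sufficient test interval" ultimately comes down to how many visitors you need per variant. A standard two-proportion sample-size formula gives a rough floor. This is a minimal sketch: the baseline rate and minimum detectable effect used below are hypothetical inputs, and the z constants correspond to a two-sided 5% significance level and 80% power.

```python
import math

def sample_size_per_variant(baseline, mde, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect an absolute
    improvement of `mde` over a `baseline` conversion rate.
    z_alpha=1.96 -> two-sided 5% significance; z_beta=0.84 -> 80% power."""
    p1, p2 = baseline, baseline + mde
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Hypothetical example: detecting a lift from a 3.0% to a 3.5% conversion rate
n = sample_size_per_variant(0.03, 0.005)
```

Divide the resulting per-variant number by your daily traffic to each version and you have a defensible test interval; note that smaller effects require disproportionately more traffic to detect.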

Concluding Comments:

There are a few important points to remember about A/B split testing:

  1. Even if you can't set up a true A/B split test (where two versions of a page are served to alternating visitors during the same period), it is very easy to create a sequential A/B split test.

    KEY POINT: A sequential test is when you show one version of a page for a certain period, like two days or a week, and then show another version for the following two days or a week. The results may be a little less reliable, but can still yield valuable information and trends.

  2. Testing gives you the opportunity to maximize conversion rates, solve problems, and challenge assumptions. And keep in mind that you have opportunities beyond testing small changes to a page.

    You can also challenge an existing page by designing and writing a radically different version, where almost everything is different. In fact, it is through these dramatically changed approaches that you are most likely to achieve breakthrough improvements.

  3. Testing provides companies with an invaluable means to demonstrate to company heads and management the hard figures behind any suggested changes or improvements.

    Persuading management on the basis of subjective expertise alone is a tough road to follow. But if you have hard test results on your side, the persuasion process becomes much easier.

  4. Use of consistent testing will increase the knowledge base of your web group or company significantly. You will learn more, and soon be able to determine a set of optimized practices that work best for your particular business.

    The absence of rigorous testing leaves you in the dark, depending on guesswork alone when creating your pages.

  5. Establish a testing protocol for your web sites and emails. In the section above we have outlined a protocol to get you started. In addition, the following document will provide a template for A/B testing that will help guide the experimentation process.

    ABTestingTemplate.doc
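For a true split as described in point 1, the alternation can be done deterministically, so that a returning visitor always sees the same version. A minimal sketch, assuming a stable visitor identifier such as a cookie value (the identifier and function names are hypothetical):

```python
import hashlib

def assign_version(visitor_id: str, versions=("A", "B")) -> str:
    """Deterministically bucket a visitor into one of the test versions
    by hashing a stable identifier (e.g. a cookie value)."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
    return versions[digest[0] % len(versions)]

# The same identifier always maps to the same version,
# while different identifiers spread roughly evenly across versions.
version = assign_version("visitor-123")
```

Hash-based assignment avoids the main weakness of sequential testing noted above: because both versions run during the same period, day-of-week and news-cycle effects hit both pages equally.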

If you have suggestions for topics you think we should study, please let us know.

As we glean practical, accurate data, we will share the results. We promise to do our best to help you discover what really works.


Literature Review

As part of our research, we have prepared a review of the best Internet resources on this topic.

Rating System

These sites were rated for usefulness and clarity, but alas, the rating is purely subjective.

* = Decent | ** = Good | *** = Excellent | **** = Indispensable

Articles:

Software/Services:

About This Brief

Credits:

  1. Editor — Flint McGlaughlin
  2. Writers — Brian Alt
    Nick Usborne
  3. Contributors — Jalali Hartman
    Aaron Rosenthal
    Jimmy Ellis
  4. HTML Designer — Cliff Rainer
