
Archive for the ‘Analytics & Testing’ Category

Online Testing: 3 resources to inspire your ecommerce optimization

July 3rd, 2014

Optimizing a customer experience can be a little overwhelming when you consider all the nuts and bolts that make up an entire ecommerce property.

In this MarketingExperiments Blog post, we’ll take a look at three ecommerce resources from our testing library that will hopefully spark a few ideas you can add to your testing queue.

 

Read: A/B Testing: Product page testing increases conversion 78%

[Image: E-book retailer product page test versions]

 

How it can help

This experiment with a MECLABS Research Partner is a great example of how testing the product page elements most likely to cause customer concern can help alleviate anxiety.

 

Watch: Marketing Multiple Products: How radical thinking about a multi-product offer led to a 70% increase in conversion

 

In this Web clinic replay, Austin McCraw, Senior Director of Content Production, MECLABS, shared how radical thinking about a multi-product offer led one company to a 70% increase in conversion.

 

How it can help

One big takeaway from this clinic is that strategically eliminating competing offers on pages with multiple products can help drive customers’ focus to the right product choices for their needs.

 

Learn: Category Pages that Work: Recent research reveals design changes that led to a 61.2% increase in product purchases

 

These slides are from a Web clinic on category pages in which Flint McGlaughlin, Managing Director, MECLABS, revealed the results of category page design changes that increased clicks and conversions across multiple industries.


Online Testing: How to use A/A testing to break through the noise

June 30th, 2014

Getting a lift from your testing efforts can be satisfying and rewarding.

Not to mention, increases in conversion have changed the fortunes of entire enterprises and the careers of the marketers who advocated testing.

But is a lift truly a lift, or is it simply a false positive resulting from natural variation?

In this MarketingExperiments Blog post, I wanted to share an excellent example of using A/A testing (and yes, you are reading that correctly) from Emily Emmer, Senior Interactive Marketing Manager, Extra Space Storage, presented at Web Optimization Summit 2014.

 

What does variance in testing look like?

[Image: Variance testing example using the Extra Space Storage homepage]

 

Here’s the example Emily shared with the audience to help put variance in context using a control and treatment of Extra Space Storage’s homepage.

There is absolutely no difference between these pages except for the 15% difference in conversion.

According to Emily, that’s when you need to start investigating how variance is potentially impacting your testing efforts, because there should be little to no difference in performance between identical pages.

“A 15% lift is more concerning,” Emily explained, “because there should be no difference with the same experience.”
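To make that concrete, here is a minimal simulation sketch. It is not Emily’s actual data or tooling; the 3% baseline conversion rate and traffic volumes are illustrative assumptions. It estimates how often pure chance makes two identical pages appear 15% or more apart.

```python
# Simulate repeated A/A tests: both arms serve the identical page, so any
# measured "lift" is nothing but natural variation (noise).
import random

def share_of_false_lifts(true_rate=0.03, visitors_per_arm=2000, runs=2000):
    """Return the fraction of A/A runs where noise alone looks like a 15%+ lift."""
    false_lifts = 0
    for _ in range(runs):
        conv_a = sum(random.random() < true_rate for _ in range(visitors_per_arm))
        conv_b = sum(random.random() < true_rate for _ in range(visitors_per_arm))
        rate_a = conv_a / visitors_per_arm
        rate_b = conv_b / visitors_per_arm
        if rate_a > 0 and abs(rate_b - rate_a) / rate_a >= 0.15:
            false_lifts += 1
    return false_lifts / runs

print(f"A/A runs showing a 15%+ apparent lift: {share_of_false_lifts():.1%}")
```

Running it shows how readily small samples produce apparent lifts of that size, which is exactly the noise an A/A test is meant to surface.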

 

A/A testing is not A/B testing

[Image: Variance testing explanation]

 

Emily also noted a key distinction between A/A and A/B testing that is really important to grasp:

  • A/A testing – Can help you measure the natural variability (noise) of a website by testing an identical experience.


A/B Testing: Product page testing increases conversion 78%

June 26th, 2014

Product pages are arguably the heart of an ecommerce website.

They’re where potential customers learn about your products in a guided conversation that should deliver value and an overall top-notch customer experience.

Consequently, the elements on those pages are also potentially where you’re losing conversions due to anxiety.

At MarketingExperiments, we define customer anxiety as “a psychological concern stimulated by a given element in the sales or sign-up process.”

So how do you identify and mitigate anxiety on product pages?

In this MarketingExperiments Blog post, I wanted to share a recent experiment where an e-book retailer asked that same question and started testing to discover a way to answer it.

But, before we dive in, let’s view the background notes on the test to put the experiment into context.

Background: A large e-book retailer.

Goal: To increase the overall number of e-book sales.

Research Question: Which attempt to reduce anxiety will result in the highest conversion rate?

Test Design: A/B variable cluster split test

 

Side by side

[Image: E-book retailer product page test versions, side by side]

 

The team hypothesized that testing key product page elements could help them determine the true impact of anxiety on a product page.

Here is a quick breakdown of the elements the team chose to test in each treatment:

  • Version A – Attempted to reduce anxiety by using security seals
  • Version B – Highlighted compatibility by illustrating the product is multi-device friendly
  • Version C – Provided a synopsis of the content at the top of the page to help customers determine if the e-book would suit their interests
  • Version D – Emphasized quick accessibility to the product upon purchase

 

Results

[Image: Product page test results]

 

Moving the product description up on the page resulted in a 78% relative increase in conversion.
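For anyone newer to testing, a relative increase is measured against the control’s own conversion rate. The numbers below are purely illustrative, not the experiment’s actual rates, which aren’t published in this excerpt:

```python
# Illustrative only: how a 78% relative lift is calculated.
control_rate = 0.020      # hypothetical control conversion rate (2.0%)
treatment_rate = 0.0356   # hypothetical treatment conversion rate (3.56%)

relative_lift = (treatment_rate - control_rate) / control_rate
print(f"Relative lift: {relative_lift:.0%}")  # prints "Relative lift: 78%"
```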


Online Optimization: Testing value prop to grow your tribe

June 23rd, 2014

I have a deep respect for the marketers at nonprofits.

How they deliver appeal and exclusivity to donors can, in some circumstances, make or break an organization’s solvency.

Consequently, I would argue that testing and optimizing value proposition is vital for nonprofits.

In this MarketingExperiments Blog post, we’ll take a look at an experiment from a Web Optimization Summit 2014 presentation from featured speaker Tim Kachuriak, Chief Innovation and Optimization Officer, Next After, on “selling the intangible.”

Before we begin, here are some background notes on the test.

Background: The Heritage Foundation, a think tank located in Washington, D.C.

Objective: To increase the donation conversion rate.

Primary Research Question: How does value proposition affect conversion rate?

Test Design: Radical redesign A/B split test

 

Side by side

[Image: Donation page experiment, control and treatment]

 

Here are the control and treatment versions of the donation pages side by side.

According to Tim, the primary focus for his team was gaining a deeper understanding of how value proposition impacts donor behavior.

 

Treatment

[Image: Treatment elements on the donation page]

 

In the treatment, Tim and the team identified elements on the landing page that would likely have the greatest impact on value proposition:

  • Headline – Deliver value right up front
  • Bullets – Quickly highlight reasons to donate
  • Testimonials – Share third-party sources who are fans
  • Call-to-action – Make intentions for donors clear and easy

 

Results

[Image: Donation page test results]

 

The treatment outperformed the control by 189%.

Fellow optimization fanatics should also take note that the winner was a long-copy page with the CTA below the fold.


Web Optimization: 5 steps to create a small testing program

June 16th, 2014

At Web Optimization Summit 2014, Ryan Hutchings, Director of Marketing, VacationRoost, shared the nuts and bolts behind putting together a foundational testing process.

In today’s MarketingExperiments Blog post, I wanted to walk through Ryan’s five steps you can use to create a small testing program in your organization.

 

Step #1. Decide what to test 

[Image: Test ideas spreadsheet]

 

When deciding what to test, the trick, according to Ryan, is prioritization.

There are lots of things to test in a conversion funnel, but limits of time and resources are important to factor in when putting together a test plan.

One of the tools Ryan uses to help his team prioritize smaller testing efforts is a spreadsheet of test ideas from across the organization.

The items highlighted in the screenshot above are columns that list test ideas and how confident the team is that each idea will produce a lift.

“This helps us prioritize,” Ryan explained. “It gives us a starting point.”
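As a rough sketch of what that kind of prioritization can look like in code, here is one illustrative approach; the field names, scoring scale and example ideas are assumptions, not VacationRoost’s actual spreadsheet:

```python
# Hypothetical test-idea backlog: rank ideas by confidence in a lift versus build effort.
test_ideas = [
    {"idea": "Shorten the lead form",           "confidence": 8, "effort": 3},
    {"idea": "Add trust seals to the PPC page", "confidence": 6, "effort": 2},
    {"idea": "Rewrite the homepage headline",   "confidence": 7, "effort": 5},
]

# Simple priority score: favor high confidence, penalize high effort.
for idea in sorted(test_ideas, key=lambda i: i["confidence"] / i["effort"], reverse=True):
    print(f'{idea["idea"]}: priority {idea["confidence"] / idea["effort"]:.1f}')
```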

 

Step #2. Identify a target conversion goal 

[Image: Bounce rate goals]

 

Ryan explained that the next step is to identify target conversion goals. To help do that, the VacationRoost team sets ideal ranges for their KPIs.

During his session, he used bounce rates as one example of where KPIs can help you set some target conversion goals and identify some testing opportunities.

“Bounce rate is a good example and a good starting point for a lot of people when talking about individual landing page optimization,” Ryan said.

One small disclaimer: the illustration above is only an example. When it comes to bounce rates, the 37% shown in the image is simply meant to visualize the importance of setting standards; it is not an industry benchmark.
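A minimal sketch of that kind of KPI check follows; the page names, bounce rates and the 37% ceiling are illustrative (echoing the example figure above), not real targets:

```python
# Flag landing pages whose bounce rate exceeds the target ceiling the team has set.
BOUNCE_RATE_TARGET = 0.37  # illustrative ceiling, mirroring the example above

landing_pages = {            # hypothetical analytics export: page -> bounce rate
    "/ski-packages": 0.52,
    "/beach-rentals": 0.31,
    "/last-minute-deals": 0.44,
}

test_candidates = [page for page, rate in landing_pages.items() if rate > BOUNCE_RATE_TARGET]
print("Pages to consider testing first:", test_candidates)
```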

 

Step #3. Create a hypothesis

[Image: Small test on a PPC landing page]

 

Ryan explained that his team uses the Conversion Heuristic from MECLABS (parent company of MarketingExperiments) to help them turn test ideas into testable hypotheses. Using a repeatable methodology helps the team vet testing ideas and keeps testing focused.

“Everything is based on the heuristic, and that’s all we use,” Ryan said.
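For context, the Conversion Sequence heuristic MECLABS teaches is commonly summarized as C = 4m + 3v + 2(i − f) − 2a, where C is the probability of conversion, m is visitor motivation, v is clarity of the value proposition, i is incentive, f is friction and a is anxiety. The sketch below shows one illustrative way to use it as a rough scoring aid when vetting hypotheses; the 1–10 scale and the example scores are assumptions, not part of the methodology:

```python
# Illustrative scoring aid built on the Conversion Sequence weights.
# The factor scores (1-10) below are made up for demonstration purposes.
def conversion_heuristic(m, v, i, f, a):
    """C = 4m + 3v + 2(i - f) - 2a: higher totals suggest stronger hypotheses."""
    return 4 * m + 3 * v + 2 * (i - f) - 2 * a

hypotheses = {
    "Add quality seals near the CTA": conversion_heuristic(m=6, v=7, i=3, f=4, a=3),
    "Clarify value prop in the headline": conversion_heuristic(m=6, v=9, i=3, f=4, a=4),
}
for name, score in sorted(hypotheses.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score}")
```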

 

Step #4. Build wireframes, develop the treatments and launch the test

[Image: Landing page test]

 

If you’re going to use a methodology to help identify testing opportunities, you should also consider how that methodology can help you build a treatment to test against your control.

Ryan explained how the Conversion Heuristic is also used in developing treatment designs to help keep testing centered on the specific variables they want to explore.

One example he shared in his session was a PPC landing page in which VacationRoost wanted to test the impact of quality seals on delivering the value proposition.

“As you can see, these are two totally different pages as you’re looking at it,” Ryan explained, “and when we look at it, we say, ‘OK, what do we want to impact?’”


Online Testing: 3 steps for finding a testable hypothesis

June 9th, 2014

Oftentimes in our Research Partnerships, each party is excited and eager to jump in and begin testing. Right from the start, most Partners have a good idea of where their site or pages are lacking and bring lots of great ideas to the table.

While having a suboptimal webpage can often be thought of as “losing money as we speak,” it is important to take the time to complete what we call the “discovery phase.”

This discovery phase can be summed up in three simple analyses that you can perform to develop a great test hypothesis to help you learn more about your customers.

 

Step #1. Evaluate your data and identify conversion gaps in the funnel

This will help you identify the page or area of your site to focus on first.

Evaluating your data can help you understand how users are behaving on your site. You can start by looking at basic metrics like new versus returning visitors, traffic sources, bounce rates and exit rates to help you identify where your conversion process has the greatest leaks.

The other side of the coin is that identifying those gaps also gives you insights into where your biggest testing and optimization opportunities exist to help you plug those leaks.

For instance, a high bounce rate may indicate users are not finding what they are expecting on a given page. Regardless of which metrics you are evaluating, think of your data as a window into the mind of your customer.
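As a minimal sketch of that first pass over the data, the funnel steps and visitor counts below are hypothetical; the idea is simply to compute step-to-step conversion and flag the biggest drop-off:

```python
# Hypothetical funnel export: (step name, visitors reaching that step).
funnel = [
    ("Category page",  50_000),
    ("Product page",   22_000),
    ("Cart",            6_500),
    ("Checkout",        3_900),
    ("Order complete",  2_700),
]

# Conversion rate between consecutive steps; the lowest rate marks the biggest leak.
drop_offs = []
for (step, visitors), (next_step, next_visitors) in zip(funnel, funnel[1:]):
    rate = next_visitors / visitors
    drop_offs.append((f"{step} -> {next_step}", rate))
    print(f"{step} -> {next_step}: {rate:.1%} continue")

biggest_leak = min(drop_offs, key=lambda pair: pair[1])
print("Biggest leak (best place to start testing):", biggest_leak[0])
```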

 

Step #2. Assess your competitors to gain valuable insights on what to test

There’s no need to reinvent the wheel.

Looking at competitors’ sites can give you an idea of what visitors are accustomed to seeing on similar webpages to the one you are testing.

Here are a few examples of elements to look for and test:

  • Should the button be on the left or right side of the page?
  • Where is the best place on the page for product images?
  • Are any companies utilizing dropdowns or sliders for price ranges?

You are trying to figure out what works best for your pages and users. After all, imitation is the sincerest form of flattery, right?
