Posts Tagged ‘a/b testing’

Testing and Optimization: 4 inspirational examples of experimentation and success

November 6th, 2014

At our sister publication, MarketingSherpa, we publish four case study beats – B2B, B2C, Email and Inbound – with stories covering actual marketing efforts from your peers each week. Not every case study features a testing and optimization element, but many do.

For this MarketingExperiments Blog post, I wanted to share a quick summary of several of these case studies, along with links to the entire article (including creative samples) in case any pique your interest and you want to dig into the entire campaign.

So, without further ado, read on for four MarketingSherpa case studies that feature testing and optimization of various digital marketing channels, strategies and tactics.


Case Study #1. 91% conversion lift from new copy and layout

This case study features AwayFind, a company that provides mobile email alerts, and covers an effort to test, and hopefully improve, its homepage performance.

Brian Smith, Director of Marketing, AwayFind, said, “Our primary driver of traffic is our PR efforts. Our homepage is effectively our primary landing page, and we need to convert that traffic into premium users.”

The test included both copy and layout changes. Instead of focusing on features, the treatment copy focused on benefits. Layout tweaks included a shortened headline, splitting the remaining copy between a subhead and a smaller block of text, and a modified color for the subhead text.

In this test, the treatment achieved:

  • 42% increase in clicks to the sign-up page
  • 91% increase in registrations for the trial
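A reported lift like the 91% above is simply the relative change in conversion rate from control to treatment. A minimal sketch of the arithmetic (the visitor and conversion counts below are hypothetical, not from the case study):

```python
def relative_lift(control_conversions, control_visitors,
                  treatment_conversions, treatment_visitors):
    """Relative lift of the treatment conversion rate over control, in percent."""
    control_rate = control_conversions / control_visitors
    treatment_rate = treatment_conversions / treatment_visitors
    return (treatment_rate - control_rate) / control_rate * 100

# Hypothetical counts for illustration only
lift = relative_lift(100, 10000, 191, 10000)
print(f"{lift:.0f}% lift")  # 91% lift
```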

  Read more…

Email Optimization: Testing best time of day and day of week for email interaction

September 22nd, 2014

When do you check your personal email? Do you let it build up throughout the work week and go through it during the weekends? Do you check it on Monday when you’re also sorting through your work email? Or do you check it while you’re at lunch or on a quick, but much-needed, break from work?

In today’s MarketingExperiments Blog post, we’re going to explore which times of the day and days of the week people are most likely to interact with their emails, two questions of prime interest for any email campaign.


Testing the time of day when people interact with email

In email testing, we focus so much on the content and landing page of the email, but that hard work won’t pay off if recipients don’t open the email or click through. We wanted to get a better understanding of when people interact with emails to determine the best time of day and day of the week to send promotional emails.

First, we began testing what time of day people are most likely to open and interact with emails.

At the time, emails were being sent out on Mondays and Wednesdays at 7 a.m. EST. We hypothesized that by sending emails at various times throughout the day, we would learn the times recipients are most likely to open and click through their emails.

In an A/B split test, we sent a promotional email at 3 a.m., 7 a.m., 3 p.m. and 7 p.m. EST on a Monday. We wanted to isolate the general times of day people may be interacting with their email.

The 3 a.m. send was tested to determine whether people were more likely to interact with their emails as soon as they woke up and before they started their day, while the 3 p.m. send would tell us whether people were checking their emails in the afternoon.

Lastly, the 7 p.m. results would show whether recipients were more likely to check and interact with their email in the evening or later at night.

By sending emails at 7 p.m. EST instead of 7 a.m. EST, we saw a 12% lift in open rate:
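A lift like this is the relative difference in open rates between the two send times. As a minimal sketch of how such a result could be sanity-checked for statistical significance, here is a standard two-proportion z-test; the send and open counts below are hypothetical, not the actual test data:

```python
import math

def open_rate_z_test(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test comparing the open rates of two send times.

    Returns (relative_lift_pct, z, two_sided_p) using the pooled-variance
    formula; the normal CDF is computed from math.erf.
    """
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    lift = (p_b - p_a) / p_a * 100
    return lift, z, p_value

# Hypothetical: 7 a.m. send (A) vs. 7 p.m. send (B)
lift, z, p = open_rate_z_test(2000, 10000, 2240, 10000)
print(f"lift={lift:.1f}%  z={z:.2f}  p={p:.4f}")
```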


Read more…

Online Testing: 3 resources to inspire your ecommerce optimization

July 3rd, 2014

Optimizing to improve a customer experience can be a little overwhelming when you consider all the nuts and bolts that make up an entire ecommerce property.

In this MarketingExperiments Blog post, we’ll take a look at three ecommerce resources from our testing library that will hopefully spark a few ideas you can add to your testing queue.


Read: A/B Testing: Product page testing increases conversion 78%



How it can help

This experiment with a MECLABS Research Partner is a great example of how testing the product page elements most likely to cause customer concern can help alleviate anxiety.


Watch: Marketing Multiple Products: How radical thinking about a multi-product offer led to a 70% increase in conversion


In this Web clinic replay, Austin McCraw, Senior Director of Content Production, MECLABS, shared how radical thinking about a multi-product offer led one company to a 70% increase in conversion.


How it can help

One big takeaway from this clinic is that strategic elimination of competing offers on pages with multiple products can help drive customers’ focus to the right product choices for their needs.


Learn: Category Pages that Work: Recent research reveals design changes that led to a 61.2% increase in product purchases


These slides are from a Web clinic on category pages in which Flint McGlaughlin, Managing Director, MECLABS, revealed the results of category page design changes that increased clicks and conversions across multiple industries.

Read more…

Web Optimization: 5 steps to create a small testing program

June 16th, 2014

At Web Optimization Summit 2014, Ryan Hutchings, Director of Marketing, VacationRoost, shared the nuts and bolts behind putting together a foundational testing process.

In today’s MarketingExperiments Blog post, I wanted to walk through Ryan’s five steps you can use to create a small testing program in your organization.


Step #1. Decide what to test 



When deciding what to test, the trick, according to Ryan, is prioritization.

There are lots of things to test in a conversion funnel, but limits of time and resources are important to factor in when putting together a test plan.

One of the tools Ryan uses to help his team prioritize smaller testing efforts is a spreadsheet of test ideas from across the organization.

The items highlighted in the screenshot above are columns that list test ideas along with the team’s confidence that each idea will produce a lift.

“This helps us prioritize,” Ryan explained. “It gives us a starting point.”
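The actual columns in Ryan’s spreadsheet aren’t reproduced here, but a common way to prioritize a backlog like this is to score each idea by confidence, expected impact and effort, then sort. A minimal sketch, with hypothetical ideas and weights:

```python
# Hypothetical test-idea backlog; the real spreadsheet's columns may differ.
ideas = [
    {"idea": "Shorter checkout form", "confidence": 0.8, "impact": 3, "effort": 1},
    {"idea": "New hero headline",     "confidence": 0.6, "impact": 2, "effort": 1},
    {"idea": "Trust seals on PDP",    "confidence": 0.4, "impact": 3, "effort": 3},
]

def priority(idea):
    # Higher confidence and impact raise priority; higher effort lowers it.
    return idea["confidence"] * idea["impact"] / idea["effort"]

for idea in sorted(ideas, key=priority, reverse=True):
    print(f'{priority(idea):.2f}  {idea["idea"]}')
```

Sorting by a score like this gives the team the “starting point” Ryan mentions, rather than a definitive ranking.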


Step #2. Identify a target conversion goal 



Ryan explained that the next step is to identify target conversion goals. To help do that, the VacationRoost team sets ideal ranges for their KPIs.

During his session, he used bounce rates as one example of where KPIs can help you set some target conversion goals and identify some testing opportunities.

“Bounce rate is a good example and a good starting point for a lot of people when talking about individual landing page optimization,” Ryan said.

One small disclaimer: the illustration above is only an example. The 37% bounce rate shown in the image is meant to visualize the importance of setting standards; it is not an industry goal.
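A KPI target range like this can be turned into a simple screen for testing candidates. A minimal sketch, with hypothetical pages and the 37% figure used only to mirror the example above:

```python
# Hypothetical per-page bounce rates; 0.37 mirrors the example threshold
# above and is not an industry benchmark.
TARGET_BOUNCE_RATE = 0.37

pages = {"/landing-a": 0.52, "/landing-b": 0.31, "/pricing": 0.44}

# Pages exceeding the target bounce rate become testing candidates.
testing_candidates = {
    page: rate for page, rate in pages.items() if rate > TARGET_BOUNCE_RATE
}
print(testing_candidates)
```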


Step #3. Create a hypothesis



Ryan explained that his team uses the Conversion Heuristic from MECLABS, parent company of MarketingExperiments, to help them turn test ideas into testable hypotheses. Using a repeatable methodology helps the team vet testing ideas and keeps testing focused.

“Everything is based on the heuristic, and that’s all we use,” Ryan said.


Step #4. Build wireframes, develop the treatments and launch the test



If you’re going to use a methodology to help identify testing opportunities, you should also consider how that methodology can help you build a treatment to test against your control.

Ryan explained how the Conversion Heuristic is also used in developing treatment designs to help keep testing centered on the specific variables they want to explore.

One example he shared in his session was a PPC landing page in which VacationRoost wanted to test the impact of quality seals on delivering the value proposition.

“As you can see, these are two totally different pages as you’re looking at it,” Ryan explained, “and when we look at it, we say, ‘OK, what do we want to impact?’”

Read more…

Online Testing: 3 steps for finding a testable hypothesis

June 9th, 2014

Oftentimes in our Research Partnerships, each party is eager to jump in and begin testing. Right from the start, most Partners have a good idea of where their site or pages are lacking and bring lots of great ideas to the table.

While having a suboptimal webpage can often be thought of as “losing money as we speak,” it is important to take the time to complete what we call the “discovery phase.”

This discovery phase can be summed up in three simple analyses that you can perform to develop a great test hypothesis to help you learn more about your customers.


Step #1. Evaluate your data and identify conversion gaps in the funnel

This will help you identify the page or area of your site to focus on first.

Evaluating your data can help you understand how users are behaving on your site. You can start by looking at basic metrics like new versus returning visitors, traffic sources, bounce rates and exit rates to help you identify where your conversion process has the greatest leaks.

The other side of the coin is that identifying those gaps also gives you insights into where your biggest testing and optimization opportunities exist to help you plug those leaks.

For instance, a high bounce rate may indicate users are not finding what they are expecting on a given page. Regardless of which metrics you are evaluating, think of your data as a window into the mind of your customer.
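A funnel gap analysis like this boils down to computing the step-to-step conversion rate and looking for the largest drop-off. A minimal sketch, with hypothetical funnel counts in place of real analytics data:

```python
# Hypothetical funnel counts for one week of traffic.
funnel = [
    ("homepage",      10000),
    ("product page",   4000),
    ("cart",           1200),
    ("checkout",        480),
    ("order placed",    240),
]

# Step-to-step conversion reveals where the funnel leaks most.
for (step, n), (next_step, next_n) in zip(funnel, funnel[1:]):
    rate = next_n / n
    print(f"{step} -> {next_step}: {rate:.0%} continue, {1 - rate:.0%} drop off")
```

In this made-up data, the product page to cart step loses the most visitors, so that is where testing effort would be focused first.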


Step #2. Assess your competitors to gain valuable insights on what to test

There’s no need to reinvent the wheel.

Looking at competitors’ sites can give you an idea of what visitors are accustomed to seeing on similar webpages to the one you are testing.

Here are a few examples of elements to look for and test:

  • Should the button be on the left or right side of the page?
  • Where is the best place on the page for product images?
  • Are any companies utilizing dropdowns or sliders for price ranges?

You are trying to figure out what works best for your pages and users. After all, imitation is the sincerest form of flattery, right?

  Read more…

Email Marketing: Using A/B tests to challenge your assumptions

April 21st, 2014

Dan Ariely has a Ph.D. in business administration.

He also has a Ph.D. in cognitive psychology.

I can think of no better description of a high-performing marketer. Someone who understands management and organizations, yes, but who also can provide unique insights into mental processes (i.e., the mind of the customer).

We were honored to have Dan as a keynote speaker at MarketingSherpa Email Summit 2014. While there, he stopped by the Media Center to discuss email marketing, human intuition and rationality (or the lack thereof) with MarketingSherpa Reporter Allison Banko.


As Dan discussed, if we approach marketing as a three-step process …

  1. Doubting ourselves
  2. Having a bit more humility
  3. Testing

… we can use email marketing as a quick feedback loop to gain a deeper understanding of the cognitive psychology of what headlines, subject lines and offers will get a customer to act (and which ones will not). By doing this, ultimately, we can improve business results.

“You have to try things that you think won’t work out,” Dan said. “If you try only the things you think will work out, you will never learn.”

This is a perfect explanation of Web optimization, which is the focus of our next Summit.

At Web Optimization Summit 2014 in New York City, one of our featured speakers will be Michael Norton of Harvard Business School, a colleague of Dan. They conducted research together on The “IKEA Effect”: When Labor Leads to Love. They discovered a cognitive bias in consumers – people tend to place a disproportionate value on products they help to create.

That bias prevents you from seeing the marketing campaigns and landing pages you create the way customers experience them. You helped create it, so you place disproportionate value on it.

But as we’ll explore at Web Optimization Summit, A/B testing is helping marketers see with new eyes – the eyes of the customer.

Read more…