Archive for the ‘Analytics & Testing’ Category

Testing and Optimization: A/B tests on landing pages, email and paid search from case studies

March 5th, 2015 No comments

No matter what type of digital marketing campaigns you are executing, there are elements in every channel that can be tested and optimized to improve campaign performance.

For example, email subject lines, copy, design and even the “from” field can be tested. Webpage elements ripe for testing include design, layout, copy, color, call-to-action button elements and more. With paid search, you should be testing keywords on an ongoing basis to continually improve the return on your PPC spend, but you can also test ad copy and calls-to-action.

At MarketingSherpa (sister company of MarketingExperiments), we publish case studies in our newsletters every week, and very often those case studies include a testing and optimization element. For today’s MarketingExperiments Blog post, I wanted to share three of those examples taken from previously published newsletter case studies.

I hope these tests give you some ideas on testing your own digital marketing channels.

 

Test #1. Webpage: Increasing lead generation on a landing page

This first test was actually a collaboration between researchers at MECLABS (the parent company of MarketingExperiments) and HubSpot and was conducted during Optimization Summit 2012. The full test was covered in the article, “A/B Testing: How a landing page test yielded a 6% increase in leads.”

The test used a lead generation landing page for HubSpot’s software that offered a free special report as an incentive for completing the registration form, with the Summit attendees providing input on what to test.

Before the Summit, the testing team came up with four hypothesis options:

Hypothesis 1 — Visitors arriving to the page are highly motivated to download the e-book based on brand recognition. Removing friction from the page will result in a higher conversion rate.

Hypothesis 2 — Communicating the urgency of the offer — that the free e-book download is a limited-time offer — will result in a higher conversion rate.

Hypothesis 3 — Adding more visual value to the page, such as charts and graphs from the e-book, will result in a higher conversion rate.

Hypothesis 4 — Incorporating pricing information to increase the perceived value of the e-book will result in a higher conversion rate.

The audience was allowed to choose which one to test and decided on Hypothesis 2.

 

Control

 

Treatment (Hypothesis 2)

 

The only difference between the two versions was an emblem on the page, stating, “Limited Time Offer,” to add urgency to the incentive.

The test was executed during the two days of Summit. At a 97% confidence level, the Treatment outperformed the Control by 6.8%.
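Confidence levels like the 97% reported here typically come from a two-tailed test of two proportions. Here is a minimal Python sketch of that calculation; the visitor and conversion counts are made up for illustration, since the case study doesn’t publish the raw numbers:

```python
from statistics import NormalDist

def ab_confidence(conv_a, n_a, conv_b, n_b):
    """Relative lift and confidence from a two-tailed two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    confidence = 2 * NormalDist().cdf(abs(z)) - 1  # 1 minus the two-tailed p-value
    lift = (p_b - p_a) / p_a
    return lift, confidence

# Hypothetical traffic: 24,000 visitors per version
lift, conf = ab_confidence(conv_a=1920, n_a=24000, conv_b=2050, n_b=24000)
print(f"lift: {lift:.1%}, confidence: {conf:.1%}")
```

A test is usually only “called” once the confidence clears a threshold the team set in advance; 95% is a common bar, and this test cleared 97%.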

 

Test #2. Email: Testing every element in an email send

International SOS is a B2B company providing medical and travel security risk services to international corporations, governments and NGOs. The MarketingSherpa Email Newsletter case study, “Email Marketing: 400% webinar attendance increase for B2B company through relevance and A/B testing,” covered how International SOS regularly tested multiple elements in its email sends to continually optimize its campaigns.

Nadia Karasawa, Assistant Marketing Manager, International SOS, explained that emailed webinar invites were sent every month to the same audience.  That consistency made it easy for the team to A/B test email elements and benchmark against previous results.

Here are some of the elements tested and what International SOS learned about its email audience:

  • Call-to-action — the team found that repeating the call-to-action three times offered the best results
  • “Register Now” outperformed “Register” with 4% more conversions
  • An orange call-to-action button outperformed blue or gray by 5%
  • Describing the webinar as “express,” even though the 30-minute length was not changed, increased registration and attendance
  • Other discoveries include: four bullet points outperformed three, social sharing did not improve results and both six and seven form field registrations performed equally well

What was the value of constantly testing and optimizing so many email elements?

The results the team was able to achieve were:

 

The 2011 send vs. 2012 send:

  • A 72% increase in registration
  • A 23% increase in attendees

The 2012 send vs. 2013 send:

  • A 47% increase in registration
  • A 23% increase in attendees

 

Test #3. Paid Search: Testing PPC ad copy

The case study, “Testing and Optimization: Effort across entire PPC funnel leads to 83,000% boost in membership application performance,” features a test on paid search ad copy, pitting four different versions of the ad against each other. The test was run by GS1 US, a nonprofit that issues the prefixes used to create U.P.C. and supply chain barcodes.

The entire optimization program covered every element of its paid search, from ad copy to landing pages and finally the application form that served as the final conversion of the PPC campaign. For this post we are focusing on the ad copy test.

The team created four versions of the PPC ad and compared results:

 

Ad #1:

Need to Get UPC Barcodes?
GS1 US, the only authorized source
in the U.S. for your U.P.C. barcodes
Barcodes.GS1US.org

Ad #2:

Need to Get U.P.C. Barcodes?
Buy Authentic U.P.C. Barcodes
only from GS1 US Apply Online
Barcodes.GS1US.org

Ad #3:

UPC Barcodes
GS1 US, the only authorized source
in the U.S. for your U.P.C. barcodes
Barcodes.GS1US.org

Ad #4:

UPC Barcodes
Buy authentic U.P.C. Barcodes
only from GS1 US Apply Online
Barcodes.GS1US.org

 

Results

Ad #1 won the overall test with the highest clickthrough rate, besting Ad #2 by 110%, Ad #3 by 150.7% and Ad #4 by 252%, all at a 99.7% confidence level.

One reason GS1 US tested two sets of ad titles and body text was to find the optimal combination. However, the tests were also designed to discover which language best resonated with its target audience to use for messaging in other venues, such as landing pages.

 

David is a Reporter for MECLABS Institute. You can follow him on Twitter at @DavidOnline.

 

You might also like

Online Testing: Why are you really testing? [More from the blogs]

Email Testing: How the Obama campaign generated approximately $500 million in donations from email marketing [MarketingSherpa case study]

Email Marketing: Education group utilizes A/B testing to increase open rates by 39% [MarketingSherpa case study]

Email Preheaders Tested: The surprising sensitivity of a single line of text [More from the blogs]

 


The Writer’s Dilemma: How to know which marketing copy will really be most effective

February 5th, 2015 1 comment

I’m staring at a blank page on my screen. There are several directions I could go with this piece of writing, and I’m not sure which will be most helpful to you:

  • How to improve the conversion rate of your email marketing
  • How to best understand and serve your customers
  • How to split test your email marketing

I’m sure you face this dilemma as a copywriter or marketing manager as well:

  • Which subject line will be most effective?
  • How should you craft the headline?
  • What body copy would be most helpful (and generate the most response) from customers?

So that’s what today’s MarketingExperiments Blog post will be about. Essentially, your product and offers likely have many elements of value, and there are many ways you can message that value, but what will work best with your potential customers?

To give you a process to follow, I’ll use an example:

We recently ran a public experiment to help answer the above questions for VolunteerMatch, a nonprofit organization with a unique funding model: it sells a Software as a Service (SaaS) product to companies to help fund its work with nonprofits and volunteers, which generates close to $1 billion in social value each year.

Let’s take a look at the process we used for this public experiment and how you can repurpose it for your own marketing efforts.

 

Step #1: Get some new ideas

You think, breathe, eat, sleep and dream about the products and services you advertise and market. So sometimes it helps to step out of your box and get a new perspective.

For example, MarketingExperiments’ parent company, MECLABS Institute, uses Peer Review Sessions to foster idea collection and collaboration from new and unique viewpoints.

To get some new ideas for VolunteerMatch, we launched the public experiment with a contest on the MarketingExperiments Blog as well as on The Moz Blog where we asked marketers to comment on the blog post with their ideas for effective subject lines with a chance to win tickets to Email Summit and a stay at the event’s host hotel, the ARIA Resort & Casino. We received subject line ideas from 224 marketers.

However, this is only one way to step outside the box and get a fresh perspective on your products and services. You could also:

  • Talk to people in departments you don’t normally engage with (e.g., customer service, sales, product development, IT, accounting, legal … keep your options open)
  • Conduct surveys or focus groups with potential customers
  • Read reviews, feedback forms, forum conversations and social media to learn the language the customers use when talking about your products
  • Get on the phone and interview customers (and even people who chose not to be customers)
  • Read websites, magazines and newspapers aimed at your buyer and see what language they use and values they emphasize
  • Go to a museum, national park, art fair, farmer’s market, the symphony or some other creative endeavor to help spark some new thinking

My point is cast a wide net. Get a lot of ideas at this point.

 

Step #2: Coalesce these ideas around key points of value

Once you have all of these ideas, they will likely naturally fall into a few main categories of value around your products or services.

When conducting this public experiment with VolunteerMatch, we started with three elements of value (listed below) to help focus marketers who were entering the contest. When they entered, they would leave a comment on the blog post with their suggested subject line and which category of value that subject line was intended to communicate.

Defining the value upfront will help you know what elements of value you already consider important to your product or service when conducting Step #1.

However, it is important to stay open minded. When you assign the feedback you’ve received into different categories of value, you may find that all of the feedback doesn’t necessarily fit into the categories you’re using. You can find gold in these outliers — new value categories for your product that you had not considered before.

The three categories of value we focused on for VolunteerMatch were:

  • Category #1: Proof, recognition, credibility
  • Category #2: Better, more opportunities to choose from
  • Category #3: Ease of use

We also gave marketers an opportunity to come up with a category of value we may have overlooked.

From the suggestions we received on the blog post, I picked a new category to test along with the categories of value we had already identified. One commenter, Suzanne, suggested: “I would argue that true volunteers are motivated by something more profound from within: dedicated volunteers are passionate about a particular cause.”

Based on this response, we added one more category of value:

  • Category #4: Passion

 

Step #3: Identify the best expressions of these categories of value

Now that you’ve identified a few areas of value to focus on, look through all of the messaging for the value from the suggestions you received and identify a few examples of wording that you think is the most effective.

I read through each and every subject line suggested in the comments on the MarketingExperiments Blog, and Cyrus Shepard, Head of SEO and Content, Moz, read through all the subject lines proposed by marketers through The Moz Blog.

We settled on these seven subject lines:

Category #1: Proof

  • Attention Business Leaders: How to Increase your ROI through Employee Volunteer Initiatives
  • Volunteering matters. We have the proof.

Category #2: Network size

  • CC Your Boss: 1,000+ Ways To Make A Difference (Inside)
  • Does your company care? Thousands of ways to prove it.

Category #3: Ease of use (app)

  • The volunteer app your coworkers will talk about
  • The One App That Can Change The Way Your Company Gives Back

Category #4: Passion (no feature)

  • Spread the Only “Good” Office Virus
  • Spread the Only “Good” Office Virus (I’ll tell you why this subject line is listed twice in the next step)

 

Step #4: Test with your audience to see which value and messaging combination is the most effective

In this case, my colleague, Jon Powell, Senior Manager, Executive Research and Development, MECLABS Institute, ran a split test with VolunteerMatch’s email list to see which subject lines would be most effective and which value is most appealing to potential customers.

Testing with your potential customers is another way to break down that fourth wall with customers and discover what is really most valuable about your product to inform and improve your copywriting.

Here was the email that was sent. (Note: The last, bolded line was changed for different treatments to correspond to the value expressed in the subject line that was tested.)

 

I listed the “passion” subject line twice because Jon used it as a double treatment. Essentially, this is a way to make sure the results that you see from an experiment are valid.

There should not be a significant difference between those two treatments since the subject line was the same. If there is a significant difference, it could be an indication of a validity threat, and you must question your data even further before trusting it (an issue we fortunately did not have with this test).
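A duplicated treatment is effectively an A/A test: you run the same significance calculation you would for a real A/B test and hope it finds nothing. Here is a rough Python sketch of that check, with hypothetical counts since the case study doesn’t publish the raw numbers:

```python
from statistics import NormalDist

def aa_check(conv_1, n_1, conv_2, n_2, alpha=0.05):
    """Return True if two identical treatments differ significantly --
    a sign of a validity threat worth investigating."""
    p_1, p_2 = conv_1 / n_1, conv_2 / n_2
    p_pool = (conv_1 + conv_2) / (n_1 + n_2)
    se = (p_pool * (1 - p_pool) * (1 / n_1 + 1 / n_2)) ** 0.5
    z = abs(p_1 - p_2) / se
    p_value = 2 * (1 - NormalDist().cdf(z))
    return p_value < alpha

# Hypothetical counts for the duplicated "passion" subject line
print(aa_check(310, 5000, 298, 5000))  # False: no validity threat flagged
```

If the check comes back True, look for instrumentation problems (uneven traffic routing, tracking bugs, history effects) before trusting any of the treatment results.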

Read more…


Testing and Optimization: How to get that “ultimate lift”

January 19th, 2015 2 comments

What would you rather have: a 32-inch flat screen TV for $100 or a 72-inch flat screen TV for $150? After considering that the first 32 inches cost $100, you would probably pay the additional $50 for another 40 inches.

This same principle can be applied to testing and optimization, with one caveat — you have to buy the 32-inch TV first.

 

A discovery, not a lift

Many marketers attempting to optimize and test webpages want big lifts; however, here at MECLABS Institute, we always say the goal of a test is not to get a lift but to gain discoveries about customer behavior. This makes sense at face value, but to be honest, when I first heard the expression, I thought to myself, “Well, sure, that sounds like a good backstop in case you don’t get a lift.” I soon learned that it is more than a backstop or, worse, an excuse.

As the curator for Dr. Flint McGlaughlin’s personal website, I often come across insightful observations. This next excerpt speaks particularly well to this topic of optimization and testing to obtain more than just a lift:

Too often, marketers are focused on results instead of reasons. We need to move deeper than ‘how much,’ into ‘why so,’ to answer an even more important question: What does this tell me about my customer or prospect? And so the goal of an optimization test transcends the notion of a lift and asks for learning. With sufficient insights we can obtain the ultimate lift. The more you know about the customer, the easier it is to predict their behavior. The easier it is to predict their behavior, the more you know about your value proposition. — Flint McGlaughlin

I have bolded what I think is the most important part of that quote for the sake of our discussion today. I am going to repeat it because it is so significant: “The goal of an optimization test transcends the notion of a lift and asks for learning. With sufficient insights we can obtain the ultimate lift.” — Flint McGlaughlin

Now we may ask ourselves, “What is the ultimate lift?” Some may think it is the biggest lift, or the most important by some arbitrary criterion. In my opinion, the “ultimate” lift is gaining insight about your customer and your value proposition that can be leveraged across all marketing channels.

 

Value Proposition 101

Before we go any further, if you are reading this article and do not know what I mean when I say “value proposition,” I urge you to investigate our research specifically around value proposition. However, for the sake of brevity (and this blog post), here is the oversimplified crash course:

A company’s value proposition essentially answers the question: “If I am your ideal prospect, why should I buy from you rather than your competitors?”

The answer should be a “because” statement that stresses the appeal and exclusivity of the offer in a clear and credible way, supported by factual claims that add to its credibility.

 

Testing for the “ultimate lift”

Now that we have a basic understanding of a value proposition, here is an example from a past MECLABS research partner. In this experiment, we achieved the “ultimate lift” because of customer discoveries gained through value proposition testing.

 

Experiment ID: TP1306
Background: Provides end-to-end market solutions for small and medium-sized businesses.
Primary Research Question: Which page will obtain the most form submissions?

First, here is the control:

 

CONTROL

 

After analyzing the offer on the page, MECLABS analysts identified the following value proposition for the offer.

Read more…


A/B Testing: How to improve already effective marketing (and win a ticket to Email Summit in Vegas)

January 5th, 2015 183 comments

Editor’s Note: This subject line contest is no longer accepting entries. Check out “The Writer’s Dilemma: How to know which marketing copy will really be most effective” to see which entry won, why it won and what you can learn from that to further improve your own marketing.

This blog post ends with an opportunity for you to win a stay at the ARIA Resort & Casino in Las Vegas and a ticket to Email Summit, but it begins with an essential question for marketers:

How can you improve already successful marketing, advertising, websites and copywriting?

Today’s MarketingExperiments blog post is going to be unique. Not only are we going to teach you how to address this challenge, we’re going to also offer an example to help drive home the lesson. We’re going to cover a lot of ground today, so let’s dive in.

 

Give the people what they want …

Some copy and design is so bad, the fixes are obvious. Maybe you shouldn’t insult the customer in the headline. Maybe you should update the website that still uses a dot matrix font.

But when you’re already doing well, how can you continue to improve?

I don’t have the answer for you, but I’ll tell you who does — your customers.

There are many tricks, gimmicks and types of technology you can use in marketing, but when you strip away all the hype and rhetoric, successful marketing is pretty straightforward — clearly communicate the value your offer provides to people who will pay you for that value.

Easier said than done, of course.

How do you determine what customers want and the best way to deliver it to them?

Well, there are many ways to learn from customers, such as focus groups, surveys and social listening.

While there is value in asking people what they want, there is also a major challenge in it.

According to research from Dr. Noah J. Goldstein, Associate Professor of Management and Organizations, UCLA Anderson School of Management, “People’s ability to understand the factors that affect their behavior is surprisingly poor.”

Or, as Malcolm Gladwell more glibly puts it when referring to coffee choices, “The mind knows not what the tongue wants.”

This is not to say that opinion-based customer preference research is bad. It can be helpful. However, it should be the beginning of your quest, not the end.

 

… by seeing what they actually do

You can use what you learn from opinion-based research to create a hypothesis about what customers want, and then run an experiment to see how they actually behave in real-world customer interactions with your product, marketing messages and website.

The technique that powers this kind of research is often known as A/B testing, split testing, landing page optimization or website optimization. If you are testing more than one thing at a time, it may also be referred to as multivariate testing.

To offer a simple example, you might assume that customers buy your product because it tastes great and because it’s less filling. Keeping these two assumptions in mind, you could create two landing pages — one with a headline that promotes that taste (treatment A) and another that mentions the low carbs (treatment B). You then send half the traffic that visits that URL to each version and see which performs better.
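The 50/50 traffic split described above is usually implemented by bucketing each visitor deterministically, so a returning visitor always sees the same version. Here is a minimal Python sketch; the hashing scheme is illustrative, not any particular testing tool’s implementation:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Hash the visitor ID so the same visitor always gets the same variant."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Over many visitors, the hash spreads traffic roughly evenly
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts)  # roughly 5,000 per variant
```

Hash-based assignment also means you don’t need to store each visitor’s bucket; the ID alone determines which page they see.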

Here is a simple visual that Joey Taravella, Content Writer, MECLABS, created to illustrate this concept:

 

That’s just one test. To really learn about your customers, you must continue the process and create a testing-optimization cycle in your organization — continue to run A/B tests, record the findings, learn from them, create more hypotheses and test again based on these hypotheses.

This is true marketing experimentation, and it helps you build your theory of the customer.

 

Try your hand at A/B testing for a chance to win

Now that you have a basic understanding of marketing experimentation (there is also more information in the “You might also like” section of this blog post that you may find helpful), let’s engage in a real example to help drive home these lessons in a way you can apply to your own marketing challenges.

To help you take your marketing to the next level, The Moz Blog and MarketingExperiments Blog have joined forces to run a unique marketing experimentation contest.

In this blog post, we’re presenting you with a real challenge from a real organization and asking you to write a subject line that we’ll test with real customers. It’s simple; just leave your subject line as a comment in this blog post.

We’re going to pick three subject lines from The Moz Blog and three from the MarketingExperiments Blog and run a test with this organization’s customers.

Whoever writes the best performing subject line will win a stay at the ARIA Resort in Las Vegas as well as a two-day ticket to MarketingSherpa Email Summit 2015 to help them gain lessons to further improve their marketing.

Sound good? OK, let’s dive in and tell you about your client:

Read more…


Website Optimization: Not testing can cost you money

December 15th, 2014 2 comments

I’ve had some pretty terrible online shopping experiences. I’ve dealt with impossible product pages, awkwardly laid-out shopping carts and some sketchy checkout processes.

It seemed as if companies were simply allowing customers to shop online, not encouraging it — especially smaller, specialized stores.

Then came the rise of sites like Amazon and Zappos.

Today, there is no excuse not to optimize and improve the customer experience.

At IRCE 2014, MarketingSherpa Reporter, Allison Banko, sat down with Lisa Foreman, Marketing Conversion Manager, Nations Photo Lab, to discuss the necessity of testing.

“If your website is not user friendly, then you’re just not going to convert the customers,” Lisa said. “And it’s easy.”

Lisa explained that the testing technology available rules out any excuse that marketers may have had before when it came to not testing.

“As a marketer without technical experience, I can set up tests on my own without the help from my developer … and I can declare statistical significance as soon as they are ready and get them rolled out,” she said.

The barriers to starting a testing program without knowing how to code are falling, Lisa added, which is great news for marketers in a world where customers demand instant, seamless experiences across devices and pages.

Developing savvy-looking sites might get your internal marketing department excited, but Lisa warned her peers, “You should be testing it first.”

She suggested that money spent on developing a new template or designing new pages and experiences is wasted if those changes don’t actually improve the customer’s experience.

Read more…


Web Optimization: 3 strategies to improve testing operations at your company

December 11th, 2014 1 comment

In a previous blog post, we detailed how Felix + Iris, a newly launched eyewear ecommerce site, made simple tweaks to its hero unit to improve home try-on conversion by 72%.

In this blog post, read about how the Felix + Iris marketing team has embraced testing and how it shares results throughout the company. Read on to hear from Jon Corwin, User Experience Lead, One Click Ventures (parent company of Felix + Iris), about how his strategies achieved testing and optimization success.

 

Step #1. Integrate testing into company culture

At One Click Ventures, the testing function exists in the marketing department.

“There is very much an iterative approach or kind of a lean methodology that One Click has taken,” Jon said.

Jon explained that, as far as buy-in goes, the team has not had to convince anyone outside of Marketing of testing’s value.

“It’s more of a conversation of what we should test – not whether,” he said.

Marketing team members seek approval from the content team on copy changes, or from the design team for anything creative, typography- or image-related. Jon also explained that the team’s director of marketing will, from a strategic standpoint, help make those decisions.

However, Jon explained the testing function for marketing is autonomous.

“Our testing started off as a skunkworks operation. It was almost like scratching our own itch, and launching small tests and sharing the wins after the fact,” he said.

From there, he explained it has grown and the team has embraced it as another feedback tool to help keep the company a lean operation.

With the newly launched Felix + Iris brand, the team realized testing can be used as a tool to help manage risk.

Instead of committing fully to a new feature on one of the One Click Ventures sites, the team can build a small prototype, launch it and use A/B testing to validate whether the feature is helpful.

Once the team has that knowledge, Marketing can send that feature to the tech team and have similar features built out, or use lessons learned from tests to better inform how they should craft future campaigns.

“Right now, it is very much a small operation, but one that has been key in helping make some of these decisions, be it design, messaging, new feature build-out, so on, so forth,” he said.

 

Step #2. Share results constantly

Jon explained there are many different ways the marketing team shares testing results within the organization.

Once tests are completed and the results have been analyzed, Jon will email those results to the stakeholders for that specific test. In addition, weekly conversion meetings, held by Jon, are used to discuss lessons learned from tests.

Jon and the team keep a master ledger of all testing efforts, called the Test Tracker, which is in the form of an easy-to-read spreadsheet.

“That’s where we’ll document all of the testing activity and final test results, with the goal being that that’s our testing bible filled with Felix + Iris best practices based on testing we’ve done in the past,” Jon explained.

Read more…
