Archive

Posts Tagged ‘a/b testing’

Here’s Why Most A/B Testing is Boring and Not Really Worth the Trouble

April 6th, 2015 1 comment

Do a quick Google search on “things to a/b test on a website,” scan the results for a moment, then come back and read the rest of this article.

Most of you reading this are marketers, so you know I’m taking a big risk by telling you to go do something else before you read my article.

In fact if you’re reading this now, you’re probably one of the very few who made it back from that incredibly distracting activity I had you do. Thank you. You are exactly the person I want to be reading this. The others can go on their merry way. They are not the ones who need to hear this.

I had you do that search because the Internet is full of people telling you to test things on your website such as color, button size, layouts, forms, etc. I wanted you to get an idea for what’s out there.

Now, I want you to understand why almost everyone writing those articles is wrong

… or at the very least, missing the point.

Please don’t view this as me putting down the people who wrote those articles. I know a few of them personally, and I highly respect the work they are doing. This is not about whether their work is good or bad.

I’ve personally written many articles exactly like the ones they’re writing. In fact, they have one up on me because at least their articles are ranking in Google for popular search terms.

The reason they are missing the point is that most of those articles are focused on the elements of a page rather than the serving of a customer.

I get why they do it.

Webpages are far easier to understand than people. Webpages are a collection of 0s and 1s. People are a collection of who knows what.

And most of you, readers, are looking for webpage fixes — not a deeper, fuller way to serve your customer.

There is nothing necessarily wrong with that. We naturally focus on our own self-interest, and that focus isn’t wrong in itself.

What is wrong is the methods we use to achieve our own goals. I don’t mean morally wrong. I mean practically wrong.

 

Our objective should always be: Make as much money as possible.

MECLABS Institute has found after more than 15 years of research that the best method for achieving this objective is to spend as much money as possible on serving your customer.

Until we can view every A/B test we run as an opportunity to better serve our customers, we will just be running (ultimately) ineffective tests on page elements.

It doesn’t really matter in the long run which color, layout or page element is going to perform well.

The Internet is constantly changing. Design trends are always going to influence how we look at webpages and their elements. What matters for marketers in the long run is how well we understand and, consequently, how well we can serve our customers.

Flint McGlaughlin, Managing Director and CEO, MECLABS, calls this understanding of our customers “customer wisdom.”

This is also why he often says, “The goal of a test is not to get a lift, but rather to get a learning.”

However, it’s one thing to hear this, another to really understand what it means.

It really means we want to conduct research, not run a test.

We want to learn a tangible lesson about our customer so that we can apply it to other areas of our marketing and achieve a maximum return on the amount of time and energy we spend on testing.

Let me show you what I mean with a real-world example. Here’s what happens when you just run an A/B test that is focused on a page element. Let’s take color for instance.

You have two treatments. The only thing changed is the background color. 

 

You also have a result. In this case, the result was a 19.5% increase in clickthrough at a 92% level of confidence. But here’s where things get tricky.
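To make a statement like “a 92% level of confidence” concrete: confidence for a clickthrough difference is commonly computed with a two-proportion z-test. The sketch below is a minimal illustration of that calculation; the click and visit counts are hypothetical, chosen only to produce a lift in this ballpark, and are not the actual numbers behind the test described here.

```python
from math import erf, sqrt

def ab_test_confidence(clicks_a, visits_a, clicks_b, visits_b):
    """Two-sided two-proportion z-test: returns (relative lift, confidence)."""
    p_a = clicks_a / visits_a
    p_b = clicks_b / visits_b
    # Pooled proportion under the null hypothesis of no real difference
    p = (clicks_a + clicks_b) / (visits_a + visits_b)
    se = sqrt(p * (1 - p) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Confidence = 1 - two-sided p-value, via the normal CDF
    confidence = erf(abs(z) / sqrt(2))
    lift = (p_b - p_a) / p_a
    return lift, confidence

# Hypothetical counts: 200 vs. 239 clicks on 5,000 visits per treatment
lift, conf = ab_test_confidence(200, 5_000, 239, 5_000)
print(f"{lift:.1%} lift at {conf:.0%} confidence")
```

The same arithmetic shows why “tricky” is the right word: the identical 19.5% lift observed on a smaller sample would fall well short of a conventional confidence threshold.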

Read more…

Permission Pass Email Send: A proven method for cleaning your mailing list

April 2nd, 2015 No comments

If you are reading this, you are likely in one of two positions:

  1. You have decided it is time to cleanse your email list of the inactive subscribers that no longer engage with your email sends, or …
  2. You need to stay compliant with your email management software (EMS), and you are being required to send your subscribers a permission pass to keep emailing them. A permission pass is a one-time send to an email list to reconfirm permission to email.

If you are in the latter position, don’t panic. This is actually a good opportunity to clean up your list and increase engagement with your current list.

At MarketingExperiments, our team recently did just that. We sent out a permission pass email to clean our list of inactive subscribers (which only drag down our rates).

We decided to run a test on the permission pass email based on a previous blog post that Daniel Burstein, Director of Editorial Content, MarketingSherpa, wrote back in September about a re-engagement campaign MarketingExperiments implemented after the Canadian Anti-Spam Legislation took effect. While that campaign was not a permission pass, it was similar, and we were able to build on its findings to formulate the test discussed in this blog post.

The main objective of the test was to see if subscribers would be more willing to opt back in with us if we offered them an incentive. While we discovered that incentives were not valuable to inactive subscribers, our team also uncovered some takeaways that should prove quite insightful for any future permission pass sends.

 

Treatment #1. General Value

Treatment #1 focused on reminding subscribers of the value they would continue to receive with MarketingExperiments. 

 

Treatment #2. General Value and Incentive Offering

Treatment #2 also communicated a reminder of the value subscribers would continue to receive with MarketingExperiments. Additionally, it alerted them that by opting back in with MarketingExperiments, they would be entered to win a free MECLABS online training course.

Read more…

Email Preheaders Tested: The surprising sensitivity of a single line of text

January 8th, 2015 3 comments

Earlier this year I reached out to a friend of mine who manages training with the Salesforce Marketing Cloud (previously known as ExactTarget) to get a sense of what questions everyday marketers were having concerning email.

“Preheaders,” was her quick response. Specifically, on “using a preheader, not using a preheader — what should be in the preheader.”

Just in case you’re not familiar with a preheader, it is the line of preview text you find below the subject line on mobile device email apps and even in the Outlook preview pane.

 

Focusing on that piece of information, I took to the database and decided to do some looking around.

Surprisingly, I didn’t find as many tests as I usually do. This is an item that has only recently started to get marketers’ attention. Additionally, when I searched the Internet, I could not find a single published experiment on the subject with statistical significance.

I decided to oversee some tests myself, hoping to solidify some of the patterns I had noticed in my initial view of the database.

This is what I discovered: Preheaders can indeed have a significant effect on your email performance metrics. However, I still had some questions:

  • With what metrics?
  • In what way?
  • By how much?

To help answer those questions, I’d like to reference two recent examples for the same type of email:

Read more…

A/B Testing: How to improve already effective marketing (and win a ticket to Email Summit in Vegas)

January 5th, 2015 183 comments

Editor’s Note: This subject line contest is no longer accepting entries. Check out “The Writer’s Dilemma: How to know which marketing copy will really be most effective” to see which entry won, why it won and what you can learn from that to further improve your own marketing.

This blog post ends with an opportunity for you to win a stay at the ARIA Resort & Casino in Las Vegas and a ticket to Email Summit, but it begins with an essential question for marketers:

How can you improve already successful marketing, advertising, websites and copywriting?

Today’s MarketingExperiments blog post is going to be unique. Not only are we going to teach you how to address this challenge, we’re going to also offer an example to help drive home the lesson. We’re going to cover a lot of ground today, so let’s dive in.

 

Give the people what they want …

Some copy and design is so bad, the fixes are obvious. Maybe you shouldn’t insult the customer in the headline. Maybe you should update the website that still uses a dot matrix font.

But when you’re already doing well, how can you continue to improve?

I don’t have the answer for you, but I’ll tell you who does — your customers.

There are many tricks, gimmicks and types of technology you can use in marketing, but when you strip away all the hype and rhetoric, successful marketing is pretty straightforward — clearly communicate the value your offer provides to people who will pay you for that value.

Easier said than done, of course.

How do you determine what customers want and the best way to deliver it to them?

Well, there are many ways to learn from customers, such as focus groups, surveys and social listening.

While there is value in asking people what they want, there is also a major challenge in it.

According to research from Dr. Noah J. Goldstein, Associate Professor of Management and Organizations, UCLA Anderson School of Management, “People’s ability to understand the factors that affect their behavior is surprisingly poor.”

Or, as Malcolm Gladwell more glibly puts it when referring to coffee choices, “The mind knows not what the tongue wants.”

This is not to say that opinion-based customer preference research is bad. It can be helpful. However, it should be the beginning of your quest, not the end.

 

… by seeing what they actually do

You can use what you learn from opinion-based research to create a hypothesis about what customers want, and then run an experiment to see how they actually behave in real-world customer interactions with your product, marketing messages and website.

The technique that powers this kind of research is often known as A/B testing, split testing, landing page optimization or website optimization. If you are testing more than one thing at a time, it may also be referred to as multivariate testing.

To offer a simple example, you might assume that customers buy your product because it tastes great and because it’s less filling. Keeping these two assumptions in mind, you could create two landing pages — one with a headline that promotes the taste (treatment A) and another that mentions the low carbs (treatment B). You then send half the traffic that visits that URL to each version and see which performs better.
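The traffic split described above is typically done deterministically rather than by a coin flip on every request, so a returning visitor always sees the same treatment. The sketch below is one common way to do this; the function and ID names are illustrative, not part of any particular testing tool.

```python
import hashlib

def assign_treatment(visitor_id: str) -> str:
    """Deterministically split visitors 50/50 between two treatments.

    Hashing the visitor ID (rather than randomizing per request) ensures
    that the same visitor is always shown the same page version.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
    return "A" if digest[0] % 2 == 0 else "B"

# Over a large sample of IDs, the assignments land close to a 50/50 split
sample = [assign_treatment(f"visitor-{n}") for n in range(10_000)]
print(sample.count("A"), sample.count("B"))
```

Because the assignment depends only on the ID, you can also recompute which treatment any visitor saw after the fact when analyzing results.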

Here is a simple visual that Joey Taravella, Content Writer, MECLABS created to illustrate this concept: 

 

That’s just one test. To really learn about your customers, you must continue the process and create a testing-optimization cycle in your organization — continue to run A/B tests, record the findings, learn from them, create more hypotheses and test again based on these hypotheses.

This is true marketing experimentation, and it helps you build your theory of the customer.

 

Try your hand at A/B testing for a chance to win

Now that you have a basic understanding of marketing experimentation (there is also more information in the “You might also like” section of this blog post that you may find helpful), let’s engage in a real example to help drive home these lessons in a way you can apply to your own marketing challenges.

To help you take your marketing to the next level, The Moz Blog and MarketingExperiments Blog have joined forces to run a unique marketing experimentation contest.

In this blog post, we’re presenting you with a real challenge from a real organization and asking you to write a subject line that we’ll test with real customers. It’s simple; just leave your subject line as a comment in this blog post.

We’re going to pick three subject lines from The Moz Blog and three from the MarketingExperiments Blog and run a test with this organization’s customers.

Whoever writes the best performing subject line will win a stay at the ARIA Resort in Las Vegas as well as a two-day ticket to MarketingSherpa Email Summit 2015 to help them gain lessons to further improve their marketing.

Sound good? OK, let’s dive in and tell you about your client:

Read more…

Testing and Optimization: 4 inspirational examples of experimentation and success

November 6th, 2014 1 comment

At our sister publication, MarketingSherpa, we publish four case study beats – B2B, B2C, Email and Inbound – with stories covering actual marketing efforts from your peers each week. Not every case study features a testing and optimization element, but many do.

For this MarketingExperiments Blog post, I wanted to share a quick summary of several of these case studies, along with links to the entire article (including creative samples) in case any pique your interest and you want to dig into the entire campaign.

So, without further ado, read on for four MarketingSherpa case studies that feature testing and optimization of various digital marketing channels, strategies and tactics.

 

Case Study #1. 91% conversion lift from new copy and layout

This case study features AwayFind, a company that provides mobile email alerts, and covers an effort to test, and hopefully improve, its homepage performance.

Brian Smith, Director of Marketing, AwayFind, said, “Our primary driver of traffic is our PR efforts. Our homepage is effectively our primary landing page, and we need to convert that traffic into premium users.”

The testing changed both copy and layout elements. The main copy change shifted the focus from features to benefits, while the layout tweaks included a shortened headline, the remaining copy split between a subhead and a smaller block of text, and a modified subhead text color.

In this test, the treatment achieved:

  • 42% increase in clicks to the sign-up page
  • 91% increase in registrations for the trial

Read more…

Online Testing: 3 resources to inspire your ecommerce optimization

July 3rd, 2014 No comments

Optimizing to improve a customer experience can be a little overwhelming when you consider all the nuts and bolts that make up an entire ecommerce property.

In this MarketingExperiments Blog post, we’ll take a look at three ecommerce resources from our testing library that will hopefully spark a few ideas you can add to your testing queue.

 

Read: A/B Testing: Product page testing increases conversion 78%


 

How it can help

This experiment with a MECLABS Research Partner is a great illustration of how testing the product page elements that are likely causes of customer concern is the best way to alleviate anxiety.

 

Watch: Marketing Multiple Products: How radical thinking about a multi-product offer led to a 70% increase in conversion

 

In this Web clinic replay, Austin McCraw, Senior Director of Content Production, MECLABS, shared how radical thinking about a multi-product offer led one company to a 70% increase in conversion.

 

How it can help

One big takeaway from this clinic is that strategic elimination of competing offers on pages with multiple products can help drive customers’ focus to the right product choices for their needs.

 

Learn: Category Pages that Work: Recent research reveals design changes that led to a 61.2% increase in product purchases

 

These slides are from a Web clinic on category pages in which Flint McGlaughlin, Managing Director, MECLABS, revealed the results of category page design changes that increased clicks and conversions across multiple industries.

Read more…