Posts Tagged ‘a/b testing’

The Importance of Testing: How one test applied to two email sends resulted in different audience responses

November 23rd, 2015

At MarketingExperiments, sister company of MarketingSherpa and, like it, part of the parent company MECLABS, we believe in testing. Our Web clinics are a testament to this belief: every month we share research designed to help marketers do their jobs better.

This culture of testing encouraged me to run my own test.

First, I ran the test on a MarketingSherpa send. Once the results of that test came back, I applied the same test to a MarketingExperiments newsletter.

By running virtually the same test twice for two different audiences, I not only learned more about the preferences of our readers, but also how incredibly important it is to test, even when you are sure you know what the results will be.


The MarketingSherpa test

As the copy editor at MECLABS, I get to see all of the copy produced by both MarketingExperiments and MarketingSherpa. One of my responsibilities is overseeing the email newsletter sends. Every Monday, MarketingSherpa sends out a Best of the Week newsletter, which features the most popular articles of the previous week and introduces the new book for the MarketingSherpa Book Giveaway.

The copy in these newsletters was extensive. Every article listed had a full summary, which I imagined would feel overwhelming to a reader opening the email on a Monday morning.

With the help of Selena Blue, Manager of Editorial Content, and Daniel Burstein, Director of Editorial Content, I decided to reduce the length of the Best of the Week newsletter and use an A/B split test to determine which format our readers preferred.
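For readers curious about the mechanics, here is a minimal sketch of how a 50/50 email split might be implemented. It is an illustration in Python, not MarketingSherpa's actual tooling; the recipient list and function names are hypothetical.

```python
import random

def ab_split(recipients, seed=42):
    """Randomly assign each recipient to the Control or the Treatment (50/50)."""
    rng = random.Random(seed)   # fixed seed makes the split reproducible
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {
        "control": shuffled[:midpoint],    # long-copy version
        "treatment": shuffled[midpoint:],  # shortened version
    }

groups = ab_split(["a@example.com", "b@example.com",
                   "c@example.com", "d@example.com"])
print(len(groups["control"]), len(groups["treatment"]))  # 2 2
```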

Below is a side-by-side of the Control and the Treatment. 


After running this test for six weeks, there was a definite trend: higher total and unique clickthrough for the shorter Treatment versions. My hypothesis proved true. Less copy in the email meant higher reader engagement.
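As an aside, "total" and "unique" clickthrough measure slightly different things: total counts every click, while unique counts each recipient at most once. Here is a quick sketch of the distinction; the click log below is made up for illustration.

```python
def clickthrough_rates(click_log, delivered):
    """Total CTR counts every click; unique CTR counts each recipient once."""
    total_clicks = len(click_log)          # every click event
    unique_clickers = len(set(click_log))  # distinct recipients who clicked
    return total_clicks / delivered, unique_clickers / delivered

# Made-up click log: one recipient ID per click event
log = ["u1", "u2", "u2", "u3", "u1", "u4"]
total_ctr, unique_ctr = clickthrough_rates(log, delivered=100)
print(f"total CTR: {total_ctr:.1%}, unique CTR: {unique_ctr:.1%}")  # 6.0%, 4.0%
```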


The MarketingExperiments test

Based on the success of the MarketingSherpa test, I thought it would be interesting to run the same test on the MarketingExperiments Best of the Month newsletter, which is sent to readers on the first of every month. Similar to the Best of the Week send, the Best of the Month features the most popular articles of the month, recaps the latest Web clinic and describes the upcoming Web clinic. This test was run with the help of Ken Bowen, Managing Editor, MarketingExperiments.

The Best of the Month send contains more information, so it is naturally longer than the Best of the Week email previously tested. It therefore seemed natural to assume that the shorter Treatment would once again earn higher clickthrough rates, since it appeared more reader-friendly.

As you can see from the August Best of the Month Control below, there was a lot of copy in these sends.


For the Treatment, we once again cut the article summaries. We also cut the recap of the previous month's research, keeping only the upcoming month's Web clinic description. We ran this A/B split test for three months.


Because of the results of the Best of the Week testing, I was sure I knew what the results of this test would be. If one audience responded positively to the shorter copy, certainly another audience would as well. Re-running virtually the same test seemed to me to be just a formality.

Spoiler Alert: This is why testing is so imperative.

If I had applied the results from one test to all newsletter sends, I would have been doing our readers a disservice. Below are the results broken down by month.


Of the three months of testing, only one month validated. Although the Treatment had slightly higher clickthrough rates, the difference wasn't statistically significant on a consistent basis, as it had been with the Best of the Week send. It appears that readers of the Best of the Month newsletter may find value in the summaries and extended copy of the Control version.
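"Validating" here refers to statistical significance: the observed difference must be large enough that chance alone is unlikely to explain it. Below is a minimal sketch of one common approach, a two-proportion z-test; the click and send counts are invented for illustration and are not the actual results.

```python
from statistics import NormalDist

def two_proportion_z(clicks_a, sends_a, clicks_b, sends_b):
    """Two-tailed z-test for a difference between two clickthrough rates."""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = (pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Invented numbers for illustration only (not the actual send results):
z, p = two_proportion_z(clicks_a=420, sends_a=20000,
                        clicks_b=450, sends_b=20000)
print(f"z = {z:.2f}, p = {p:.3f}")  # the test "validates" only if p < 0.05
```

With numbers like these, a small lift in clickthrough still yields a p-value well above 0.05, which is exactly the pattern described above: a Treatment that looks slightly better but does not validate.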

Testing allowed us to change the MarketingSherpa Best of the Week send to better serve our readers, and it also taught us that the same changes to the Best of the Month send did not make a statistically significant difference for MarketingExperiments readers.



Testing is so important. I work for a company that values testing and is continually striving to best suit the needs of our readers. The editors and directors overseeing both MarketingSherpa and MarketingExperiments were very open to my experimenting with the format of a send that has been with the company much longer than I have.

And while the format change itself was important, so was the testing behind it. It allowed me to really see how our readers were digesting the information I was sending them on a weekly or monthly basis. Better understanding the patterns and behaviors of our audience only makes me better at my job as a copy editor.

One audience preferred the shorter copy, while the other may find value in more text. Without re-running the test, I would have blindly assumed that the MarketingExperiments audience had the same reading preferences as our MarketingSherpa audience.

Testing allows us to validate even what we are sure is true, and it humbles us when that turns out to be false.


You might also like

MarketingSherpa Summit 2016 — At the Bellagio in Las Vegas, February 22-24

A/B Testing: Ecommerce site’s 3,000 positive comments show why you can’t trust just one test

Testing and Optimization: Welcome send test results in 46% open rate for CNET


How Variance Testing Increased Conversion 45% for Extra Space Storage

November 12th, 2015

When it comes to testing, A/B testing typically steals the spotlight, leaving its sister procedure, variance testing, in the shadows. However, according to Emily Emmer, Senior Interactive Marketing Manager, Extra Space Storage, that's a mistake.

At MarketingSherpa MarketingExperiments Web Optimization Summit 2014, Emily presented on how her team was able to utilize variance testing to transform Extra Space Storage’s Wild West testing culture into a wildly successful testing environment.

Before the team conducted variance testing, the company’s testing environment was structured like a free-for-all. There were few, if any, set rules in place, and, according to Emily, the person with the highest title and the loudest voice typically had their test implemented. All of this changed after the Extra Space Storage team ran some variance tests.

Variance testing measures two identical Web experiences to determine a site or page's natural variability. The results generally set the rules for subsequent A/B tests to follow.
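In other words, a variance test is essentially an A/A test. As a rough sketch (my own illustration, not Extra Space Storage's methodology), you can simulate one to see how far apart the conversion rates of two identical experiences drift by chance alone:

```python
import random

def simulate_aa_test(visitors=2000, true_rate=0.05, runs=500, seed=1):
    """Repeatedly split identical traffic in two and record how far apart
    the two conversion rates land purely by chance."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(runs):
        a = sum(rng.random() < true_rate for _ in range(visitors))
        b = sum(rng.random() < true_rate for _ in range(visitors))
        diffs.append(abs(a - b) / visitors)
    diffs.sort()
    return diffs[int(0.95 * runs)]  # 95th percentile of chance-only gaps

# Any future A/B "lift" smaller than this threshold could easily be noise:
print(f"Natural variability (95th percentile): {simulate_aa_test():.2%}")
```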

By focusing on variance testing and translating the results from this procedure into rules for A/B testing, Extra Space Storage achieved a 45% increase in conversion rate from the previous year. Watch the below excerpt to learn the results of the team’s test, the rules they developed and Emily’s advice on when to start variance testing and how to implement it.

Read more…

Nonprofit Testing: How one small change led to 190% increase in clickthrough

September 28th, 2015

As marketers, we all dream of expensive, radical redesigns of our websites that check off every item on our wish lists. Over the years here at MarketingExperiments, though, we have routinely discovered that with a proper understanding of customers, the smallest changes can often yield the biggest results.

A recent test from our friends over at NextAfter demonstrates this fact.



As Tim Kachuriak, Chief Innovation and Optimization Officer, NextAfter, noted when he joined us for our August Web clinic, Personalized Messaging Tested, NextAfter works exclusively with nonprofit organizations to discover what truly makes donors give.

Dallas Theological Seminary (DTS) is one such organization that NextAfter has partnered with to help answer this question.

In an earlier experiment with DTS, Kevin Peters, Vice President, NextAfter, had found that visitors arriving at the organization’s primary donation page were highly motivated to give. He discovered this by testing two forms of the page.

The first version of the donation page cut immediately to the chase, asking donors to “make a gift online.” The second version of the page posited that perhaps DTS was asking too much, too soon, and prefaced the “ask” with copy highlighting the unique value proposition of DTS. Quotes from well-known figures in the Christian community were also leveraged to build additional credibility. 

Read more…

The A-Z of A/B Testing: Handy guide to nearly 200 conversion optimization definitions

September 14th, 2015

Here at MarketingExperiments, applying the scientific method to digital marketing campaigns is at the heart of what we do.

Digital marketing, A/B testing and optimization have become so important over the last 15 years that we've literally built the MECLABS Institute around them. Along the way, a whole new language has sprung up around these ideas, complete with its own buzzwords, acronyms and industry shorthand.

For marketers, particularly those just starting to test, it can be exhausting just figuring out how to talk the talk.

To help you cut through the jargon and get straight to the heart of these terms, we at MarketingExperiments have created the Marketing and Online Testing Dictionary. This tool not only provides definitions for nearly two hundred marketing terms related to A/B and multivariate testing, but also offers additional resources to help you learn more about many of the ideas and concepts it covers.

Download your copy of the Marketing and Online Testing Dictionary now, or risk being that guy or girl in the next marketing meeting, clumsily Googling optimization terms on your smartphone under the conference table while praying that your CMO doesn’t notice.


Read more…

How a Cloud-Based Video Creation Service Uses Testing to Better Understand What Customers Want

September 10th, 2015

What assumptions do you make about your customers? How do you validate whether those assumptions are true or whether they are actually damaging conversion?

A/B testing can help you discover what really works with your customers.

Animoto, a cloud-based video creation service, usually has at least one test running every week and runs a total of about 250 tests with millions of customers on its website every year. I sat down with Brad Jefferson, CEO, Animoto, to get an inside look at his company’s testing practices and see what he’s learned about his customers along the way.


One of the tests that Brad’s team ran was to determine what type of sample videos would be most effective.

Read more…

Email Marketing: 5 test ideas for personalizing your email campaigns

September 3rd, 2015

Personalization is not new to email marketing, but has it lost some of its appeal with marketers?

Only 36% of marketers said they dynamically personalize email content using first names in subject lines and geo-location, according to the MarketingSherpa 2013 Email Marketing Benchmark Report. The report also revealed that only 37% of marketers segment email campaigns based on behavior.

However, marketers from various industries have seen incredible success with personalization. I dove into the library of MarketingSherpa, MarketingExperiments’ sister company, to find out how marketers have used both tried-and-true personalization tactics and innovative, tech-savvy strategies to better engage their customers and email audience.

No tactic or strategy is foolproof, so we suggest using these campaign tactics as testing ideas to see what works with your audience when it comes to email personalization.


Idea #1. Turn your email into a personal note, not a promotional email

As Flint McGlaughlin, Managing Director, MECLABS Institute, says, “People don’t buy from websites, people buy from people.”

The same applies to emails. As we saw in a recent MarketingExperiments Web clinic, "Personalized Messaging Tested: How little changes to an email send led to a 380% change in response rate," when inviting your customers to take an action or attend an event, sending the email from a real person on your team can have a huge impact on the results of your campaign.

Read more…