Archive

Posts Tagged ‘marketing insights’

Stock Images Tested: Does ethnicity in marketing images impact purchases?

August 4th, 2014

Does ethnicity in marketing images affect a campaign’s performance?

Besides being an important marketing question, it’s also an interesting social question.

The MECLABS research team asked this question because they needed to find the best-performing imagery for the first step in the Home Delivery checkout process for a MECLABS Research Partner selling newspaper subscriptions.

The test they designed was simple enough:

Background: Home Delivery ZIP code entry page for a newspaper subscription.

Goal: To increase subscription rate.

Research Question: Which design will generate the highest rate of subscriptions per page visitor?

Test Design: A/B variable cluster split test

 

Control: Standard image of newspaper on welcome mat


 

Treatment 1: Stock image of African American man reading newspaper


 

Treatment 2: Stock image of older Caucasian couple reading newspaper

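Though the excerpt ends before the results, here’s a minimal sketch of how subscription rates from a three-way split test like this might be compared, assuming scipy is available; every visitor and subscription count below is hypothetical.

```python
# Hedged sketch: comparing subscription rates across three test variants.
# All counts are hypothetical; the post does not report the raw numbers.
from scipy.stats import chi2_contingency

variants = ["Control", "Treatment 1", "Treatment 2"]
visitors = [10000, 10000, 10000]   # hypothetical visitors per variant
subscriptions = [120, 138, 131]    # hypothetical subscriptions per variant

for name, v, s in zip(variants, visitors, subscriptions):
    print(f"{name}: {s / v:.2%} subscriptions per page visitor")

# Chi-square test of independence: does subscription rate depend on variant?
table = [[s, v - s] for v, s in zip(visitors, subscriptions)]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```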

  Read more…

Marketing Analytics: What annotation data can tell you about video subscribers

November 18th, 2013

Over the past year, plenty of charts, projections and infographics have shown video content heading in one general direction – up and to the right.

This makes sense given that more marketers are adopting video into their marketing mix, lest their message be left behind by the projected 77% of all Internet users who will be viewing video content online by 2016.

With the adoption of any new strategy comes the part where the devil is in the details – how do you measure video content and what can the data tell you?

This was a challenge Luke Thorpe, Audio and Visual Manager, MECLABS, was facing in early 2012 when MarketingExperiments Web clinics transitioned to a video format. Shortly after the transition, Luke started implementing best practices into his video editing and uploading to YouTube.

 

One of those practices is adding annotations to the end of videos. Annotations appear when your video ends, allowing users to view other videos you have on YouTube with a single click. Annotations also allow for a certain amount of creativity, as Luke explained.

“I have a friend over at GoPro cameras and they were using annotations that included video thumbnails and music. When I saw their video, I wanted to experiment with something similar on our site. I thought it was cool and so my thinking was in line with that whole expression ‘imitation is the highest form of flattery.’”

 

Creativity is one thing, finding a way to measure it is another

After a few months of adding annotations, Luke started to notice a spike in the number of subscribers to the MarketingExperiments YouTube channel. But, he wasn’t sure why the gain rate was increasing because YouTube’s dashboard analytics don’t allow you to directly measure the impact of annotations on subscriptions.

Also, given that he didn’t have any tracking in place at the time, our options for what we could learn were limited, so he asked me to take a look at the data.

Here’s what we found:

 

First, we compiled the daily views and subscriptions to see if any correlation existed.

We found the likelihood of a correlation existing between daily views and subscribers was around 76%, but the caveat here is the ever-cautionary tale of statistics: correlation is not causation.
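Here’s a minimal sketch of that first correlation check, assuming pandas and scipy; the daily figures and column names are hypothetical stand-ins for the channel’s real export.

```python
# Hedged sketch: correlating daily views with daily subscriber gains.
# The figures below are hypothetical; use your channel's real daily export.
import pandas as pd
from scipy.stats import pearsonr

df = pd.DataFrame({
    "daily_views": [1200, 1350, 980, 1500, 1650, 1420, 1700],
    "daily_subscriber_gain": [8, 10, 6, 12, 14, 11, 15],
})

r, p = pearsonr(df["daily_views"], df["daily_subscriber_gain"])
print(f"Pearson r = {r:.2f} (p = {p:.4f})")
# A strong positive r says the two metrics move together; it does not
# say which one drives the other.
```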

 

Explore every option  

Next, we took subscriber gain stats for the month prior, without the annotations, compared that data with subscriptions after Luke started annotating videos, and again found a correlation between views and subscriptions.
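A before-and-after comparison like the one described here might look like the following sketch; the daily gain figures are hypothetical, and Welch’s t-test stands in as one reasonable way to compare the two periods.

```python
# Hedged sketch: comparing daily subscriber gains before vs. after annotations.
# All figures are hypothetical.
from scipy.stats import ttest_ind

gains_before = [4, 5, 3, 6, 4, 5, 4]      # daily gains, month without annotations
gains_after = [8, 10, 7, 12, 9, 11, 10]   # daily gains, after annotating began

t, p = ttest_ind(gains_after, gains_before, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
# A low p-value suggests the average daily gain rose after annotations,
# though it cannot rule out other causes, such as a general rise in views.
```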

 

What can we learn from this?

There are a few things we can glean from the results:

  • There’s no surprise here that a correlation exists between an increase in views and subscriptions. It makes sense, after all, that they grow together, as more views will likely produce more subscriptions.
  • We need to discover which one of these two variables is the responsible driver for the upward trend.

The only way to truly know this would be to increase our tracking and analysis efforts to really capture the true cause for the change in behavior.

It’s interesting to consider that small changes can have a big impact, but the key rests in identifying the elements of change and understanding how they work together so you can leverage that knowledge to better serve your customers.

When I asked Luke what he thought about the findings in the data, he explained that annotations help to “close the loop” of sorts in your video content.

“Adding annotations keeps people in your channel longer and engages more of your content,” Luke said. “The views via the annotations might not be very high, but without them, the additional views would be zero.”

Read more…

Email Marketing: Promotional vs. letter-style test increases conversion 181%

October 14th, 2013

At the heart of email marketing campaigns, it often seems as if a tug-of-war is being waged.

On one side, you have gaining attention as a tactic, and on the other, you have engaging in conversation.

But, which of these is truly effective?

Let’s take a look at how the MECLABS research team tested a promotional-style email design against a letter-style and what we can learn from the results.

Before we get started, here’s a quick review of the research notes for a little background on the experiment.

Background: A large international media company focusing on increasing subscription rates.

Goal: To increase the number of conversions based on the value proposition conveyed through the email.

Primary Research Question: Which email will generate the highest conversion rate?

Approach: A/B multifactor split test

 

Control 

 

The research team hypothesized that the control’s popular design principles created balance and hierarchy on the page.

The promotional-style email also featured heavy use of images and graphics to catch the readers’ attention and multiple call-to-action buttons for increased points of entry.

 

Treatment

 

In the treatment, a letter-style email was designed to look and feel more like a personal letter. The design limited the use of graphics and images and featured a single call-to-action button.

 

Results

 

What you need to know

By limiting the use of graphics and focusing on engaging the customer in a conversation, the treatment outperformed the control by 181%. To learn more about why the letter-style email beat the promotional-style design, you can watch the free on-demand MarketingExperiments Web clinic replay of “Are Letter-Style Emails Still Effective?”
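To make the headline number concrete, here’s a minimal sketch of how a relative lift like 181% is computed and checked for significance, assuming statsmodels is available; the conversion counts and sample sizes are hypothetical, since the post does not report them.

```python
# Hedged sketch: relative lift and a two-proportion z-test for an email A/B test.
# Counts and sample sizes are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

control_conv, control_n = 45, 25000        # hypothetical control results
treatment_conv, treatment_n = 127, 25000   # hypothetical treatment results

cr_control = control_conv / control_n
cr_treatment = treatment_conv / treatment_n
lift = (cr_treatment - cr_control) / cr_control
print(f"Relative lift: {lift:.0%}")  # ~182% with these made-up numbers

stat, p = proportions_ztest([treatment_conv, control_conv],
                            [treatment_n, control_n])
print(f"z = {stat:.2f}, p = {p:.4f}")
```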

Read more…

Marketing Analytics: 4 tips for productive conversations with your data analyst

September 19th, 2013

Every week, I encounter work orders from research analysts requesting data analysis on our Research Partners’ test results, and many of those requests are difficult to understand.

I think there’s a reason for this – poor communication.

I’ve noticed a lack of understanding of how data analysts think. Research analysts and many marketers do not define projects and goals the same way data analysts do when approaching a data challenge.

Data analysis takes time and resources, so the less time we spend interpreting what you want from the data, the more room there is in the budget for necessary analysis. Better, clearer communication means everybody wins.

I wanted to share with you four tips to boost your team’s communication that will hopefully save you a little time and money in the process.

 

Tip #1. The more specific you are, the faster we can help you

If you’ve had experience working with data analysts, then you may know the conversation can sometimes be like asking someone for the time of day and having them explain how their entire watch works, even if you just wanted to know the time.

But, who is really responsible for the failure to communicate here?

Is it the timekeeper’s fault for over-explaining, or the vagueness of the person who needed the time?

My point here is these kinds of communication mishaps ring especially true in the analytics world. I can attest that an analyst with clear objectives and goals will be able to perform analysis at an accelerated rate, with fewer revisions and meetings necessary to achieve results.

So, instead of asking for general analysis of a webpage, email campaign or other initiative, try asking for the specifics you want to know.

When an analyst hears “general analysis,” it’s like giving us a set of Legos and expecting us to instinctively know you wanted a plane constructed instead of the impressive 40-story futuristic building.

 

For example, let’s look at the following requests that highlight how just a few more details can make all the difference:

  • Request #1: “I need to know the clickthrough rate for new visitors compared to returning visitors.”

This question is going to get you what you need faster than asking for a general analysis of a webpage.

  • Request #2: “I need to know the clickthrough rate for new visitors compared to returning visitors for the second to third step of the checkout funnel.”

The second request would likely deliver the rate from steps two and three for the different visitor types.

Now, if I only had the first request to work off of, I would deliver the clickthrough rate for every step of the funnel, which takes substantially longer and costs more.

This is also because data analysts have a tendency to flex their advanced analytics muscles when given the opportunity. We want to deliver quality work.

But, the time and effort spent achieving those impressive results is a waste when something quick and easy would have been equally beneficial to your needs.
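As a concrete illustration, here’s a minimal sketch of what an analyst might run to answer Request #2, assuming an event-level export; the column names and rows are hypothetical.

```python
# Hedged sketch: step 2-to-3 clickthrough rate, split by visitor type.
# The DataFrame stands in for a hypothetical event-level analytics export.
import pandas as pd

events = pd.DataFrame({
    "visitor_type": ["new", "new", "new", "returning", "returning", "returning"],
    "reached_step2": [1, 1, 1, 1, 1, 1],
    "reached_step3": [1, 0, 0, 1, 1, 0],
})

totals = events.groupby("visitor_type")[["reached_step2", "reached_step3"]].sum()
totals["step2_to_step3_ctr"] = totals["reached_step3"] / totals["reached_step2"]
print(totals)
```

A request scoped this tightly can be answered with a single grouped calculation instead of a full-funnel analysis.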

 

Tip #2. Knowing how you’re going to use the data helps

To better help you with a project, we need to know how you will use the data. 

So, when starting a new project, take some time beforehand to sit down with your analyst (it’s not as bad as you think) and discuss which specific topics or characteristics will help you gain the knowledge you need quickly.

If the data will be used for internal discovery, analysts will likely approach analysis, especially the final reporting, somewhat differently than for external reporting.

 

Tip #3. Creating fancy charts should be the exception, not the rule

Knowing how the data is going to be presented will help your analysts avoid wasting precious computational time making fancy charts and graphs if you only need the information for internal use.

Formatting of charts and graphs can end up taking way more time than one would imagine, so an analyst should worry about pretty charts only when needed.

Another reason it is important to discuss how the data will be used is that your analyst might use a more efficient reporting structure. They may use the graph and chart types you ask for when, in fact, they could have used a more sophisticated technique if they knew what the end reporting needed to show the audience visually.

For instance, conversations that ask, “Do you need bar graphs for each individual variable?” should happen a lot more often than they do.  

This can become cumbersome and meticulous leading up to final presentations, but if the information is represented with clarity and efficiency using the right combination of charts, everyone wins.

Read more…

Online Testing: 6 business lessons I’ve adapted to website optimization

September 16th, 2013

I never intended to become an online marketer. In fact, I never planned to go into anything business related.

My big life plan was to quietly finish my history degree and curate at the most famous museum that would take me. Sometimes, however, things turn out better than we planned and a bad job market pushed me into an MBA.

This is how I came to find myself with the coolest job in the world: optimizing everything I can get my hands on at MECLABS.

Unfortunately, without a marketing or e-commerce degree, I sometimes feel like a fish out of water around my colleagues. As a result, I have learned to adapt what minimal business savvy I do have into concepts I can apply to optimization and testing.

 

Look for bright spots

A bright spot in the business world is a successful element in an otherwise unsuccessful situation. (If you haven’t read the book Switch by Chip and Dan Heath, I definitely recommend it as they nail the concept of “finding the bright spots” as a catalyst for change.)

And for CEOs and executives, finding the bright spots can mean identifying successful employees or programs and then finding a way to spread those practices or patterns company-wide.

We often use a similar method to the one Chip and Dan advocate when working to optimize a page or site.

Bright spot identification in optimization can come in the form of competitor or internal analysis. At the beginning of a MECLABS Research Partnership, we perform an analysis of the Research Partner’s competitors to identify things the competitors are doing well that we could repurpose for our Partners.

Consider, for example, if every competitor in the marketplace is promoting its product with large, dynamic images and the Partner is using smaller ones; we might identify testing larger images as an industry bright spot opportunity. Many of these bright spots can create big learnings and substantial lifts.

For internal analysis, we often review the entirety of a Partner’s site looking for pages, traffic segments or functionality that outperform the other options. To do this type of analysis, we usually employ the data team to pull and analyze engagement and conversion data from throughout the site and perform a Conversion Index Analysis.
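The post doesn’t spell out what a Conversion Index Analysis involves, but one hedged sketch of the general idea – indexing each page’s conversion rate against the site-wide average to surface bright spots – might look like this, with hypothetical data:

```python
# Hedged sketch: index each page's conversion rate against the site average.
# This is an illustration of the general idea, not MECLABS' actual methodology.
import pandas as pd

pages = pd.DataFrame({
    "page": ["landing_a", "landing_b", "landing_c"],
    "visits": [12000, 9500, 11000],   # hypothetical
    "conversions": [240, 310, 165],   # hypothetical
})

site_rate = pages["conversions"].sum() / pages["visits"].sum()
pages["conv_rate"] = pages["conversions"] / pages["visits"]
pages["index"] = pages["conv_rate"] / site_rate  # > 1.0 suggests a bright spot
print(pages.sort_values("index", ascending=False))
```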

Sometimes, the internal bright spots can come from the most unexpected places. For example, we had a Partner once who was seeing a much higher clickthrough rate from one of its landing pages as compared to all of the others.

After running data analysis and reviewing the page, we determined the well-performing page had a video that was not present on the others. We tested the video on the other pages and saw a lift across the board.

 

Be proactive, not reactive

In the business world, it pays to be nimble.

Diversification and flexibility will always be buzzwords for the CEO crowd because being able to stay ahead of rapidly changing industry standards is a coveted skill. In years past, Apple has been hailed as a company with such skills.

When asked about Apple’s ability to define the industry, Steve Jobs famously said, “People don’t know what they want until you show it to them.”

There are many symptoms of a site that needs testing – for example, falling conversion rates, high bounce rates and visitor complaints.

Yet, many marketers will wait until one of these symptoms appears to begin looking at ways to improve their site – but this is a reactive response to problems that could have been prevented.

Performing analysis proactively on how you can make your page better is a step toward a better user experience overall. Steve Jobs did not wait for the marketplace to ask for an iPhone. Instead, he looked at the industry, identified an unfilled need, and made a product that became the new standard for smartphone design.

As online testers, we should be inventing iPhones all over our pages. Don’t wait for your customers to tell you something is wrong. Actively search out ways you can make your site a better user experience and then test until you have something capable of changing the industry.

 

First give value, then get value 

“First give value, then get value,” was a favorite saying of an old business professor of mine.

He used the phrase to mean you have to provide value in a workplace before you should expect to get any back. He also used it as a rather diabolical system of extra credit, but that is beside the point. When it comes to applying business concepts to optimization, this is one of the most cut-and-dried examples.

We must provide our site visitors and customers some sort of value before we can expect to get any back.

The value we provide to a visitor often comes in the form of a well-crafted value expression. It should be one highlighting all of the major elements of value: appeal, exclusivity, clarity and credibility.

We can also provide value via incentives in the form of a free download or extra product offerings. Whatever you use as “value,” you must remember it has to have value in the eyes of a customer. Irrelevant or useless downloads may not provide enough value to persuade customers to part with their information.

For example, a short e-book or buyer’s guide in exchange for a long or multi-step lead gen form including a phone number might not be an equal value exchange. At the same time, asking for a name and email address in exchange for the same incentive might be just right.

  Read more…

Marketing Analytics: 4 tips to boost confidence in your analytics reporting

August 15th, 2013

Have you worked on a project where the data reporting was less than ideal?

An odd conversion rate here, or something fishy about the funnel setup there. The momentum builds and before you know it, a lack of data integrity has become a confidence killer.

Sadly, confidence killers occur more often than I would like when working on our Research Partners’ websites, which brings us to a greater problem …

How can you optimize a website if the integrity of your data is highly suspect?

In today’s MarketingExperiments Blog post, I wanted to share four simple tips to help you gain confidence in your metrics reporting that you can use to aid your testing and analytics efforts.

 

Tip #1. Earn analytics access if you can, and fight for it if you can’t

This one may be a no-brainer, but in some cases, access to a company’s analytics is limited even to the research team that relies on the information to improve performance.

The big problem with this limitation is it can completely undermine the good faith involved in any project. So, how can you avoid this issue in the first place?

A simple answer is to try to negotiate access to any and all information you may need for your project in advance and pore over those analytics with your data analyst. With any luck, you’ll be able to spot any inconsistencies before you start testing.

I’ve seen issues ranging from two sites incorrectly using the same UA code – the tracking ID Google Analytics uses to tie a site to the analytics platform – which renders all traffic numbers inaccurate, to funnels with astronomical conversion rates caused by an overabundance of test orders, and everything in between.

Without gaining access to really dig into the metrics, what guarantee is there that you would ever get to the root of the issue?

But, if you can’t receive access no matter what you try, here are a few things you can do.

  • Request reports for the areas of the site you are working on in advance at the beginning of your project.
  • Analyze any data you can access to establish a baseline, and identify validity threats that could endanger the project.

If you spot inconsistencies, make sure to notify business leaders about those problems to help you effectively manage testing expectations. If you are comfortable with the data, agree on a set of reports and KPIs the project will be judged on.

When testing or launching a major refresh of a page, it is also important to agree on a set schedule of reporting so you are able to check the fidelity of the data and catch anything that could pop up in a reasonable timeframe. I would suggest checking the data within two days of the change and every other day from that point forward.

 

Tip #2. Treat quality assurance as an integral component of every project

I cannot stress enough how important a good QA process is in ensuring the data you are basing your recommendations on is accurate. While it may not be the most glamorous part of the project, it is often one of the most important components to boosting accuracy and mitigating risk.

How do you ensure your funnels are tracking properly or that a test was set up correctly?

It can be as simple as setting up a few test scenarios and recording the selections you make, pages you visit, buttons you click or events that should fire, and then ensuring the analytics platform matches the data you entered as you recorded it. This should be performed in a staging environment first, and then duplicated in a production environment.
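Here’s a minimal sketch of that comparison step, assuming you’ve recorded the expected events by hand and exported the captured events from your analytics platform; all of the event names are hypothetical.

```python
# Hedged sketch: compare the events a test scenario should fire against
# what the analytics platform actually captured. Event names are hypothetical.
expected_events = {"zip_entry_view", "zip_submitted", "offer_view", "order_complete"}

# In practice, this set would come from your platform's report export or API.
captured_events = {"zip_entry_view", "zip_submitted", "offer_view"}

missing = expected_events - captured_events
unexpected = captured_events - expected_events
if missing:
    print(f"Not tracked: {sorted(missing)}")  # here, order_complete never fired
if unexpected:
    print(f"Unexpected events: {sorted(unexpected)}")
```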

 

Tip #3. Set up tracking in another analytics or testing platform

What analytics platforms do you use to track your online sales and marketing efforts? Do you use multiple platforms? Do these platforms integrate with backend sales data?

If you answered “yes” to any of those questions, then you should consider setting up redundant tracking across multiple analytics platforms or within the same platform.

While it’s not realistic to think everything will match up perfectly, this can help your team decide on an acceptable margin of error between platforms that can be monitored for inconsistency.
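A minimal sketch of that monitoring step might look like the following, assuming daily session counts exported from two platforms; the figures and the 5% margin are hypothetical.

```python
# Hedged sketch: flag days where two platforms disagree beyond an agreed margin.
# Session counts and the 5% threshold are hypothetical.
import pandas as pd

daily = pd.DataFrame({
    "platform_a_sessions": [10400, 9800, 11250],
    "platform_b_sessions": [10010, 9950, 10400],
})

MARGIN = 0.05  # the acceptable margin of error your team agreed on
daily["pct_diff"] = (
    (daily["platform_a_sessions"] - daily["platform_b_sessions"]).abs()
    / daily["platform_a_sessions"]
)
print(daily[daily["pct_diff"] > MARGIN])  # days that need investigation
```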

Also, make sure important funnels, events and pages are tagged consistently between platforms and checked for accuracy.

A caveat I would offer here is to be wary of the different definitions of metrics and how they are calculated across analytics platforms, because a specific metric in one platform may not be defined the same way in another. Try to find the metrics most similar in definition and base your comparison of platforms on that selected metric.

Using multiple platforms for tracking can also help with your long-term tracking engagements, where it helps to have another set of eyes on the data.

Read more…