Archive for the ‘Analytics & Testing’ Category

John Rambo or James Bond: What kind of marketing action hero are you?

November 24th, 2014 4 comments

While you may never have to battle gangs of ninjas, jump from flaming helicopters, or defeat eye-patch-laden villains in bloody shootouts, an entirely different type of action is required of today’s marketer.

If you’re facing opposition in the form of endless reporting that never seems to make a difference, it might be time to find the hero within and add some action to your everyday work life. Are you ready to act strategically, rally supporters and face the adventure of changing your organization for the better?

Every week, our sister publication MarketingSherpa holds a free book giveaway featuring volumes that help marketers reach more customers, navigate the workplace, or just generally do their jobs more effectively.

This week’s book is Web Analytics Action Hero: Using Analysis to Gain Insight and Optimize Your Business by Brent Dykes, Evangelist for Customer Analytics, Adobe.

You’re probably wondering, “How can a Web analytics book help me discover my inner Lara Croft or Indiana Jones?” Brent has worked with industry leaders such as Microsoft, Sony, EA, Dell, Comcast and Nike. He also blogs for Adobe and has presented at more than 20 Web analytics conferences around the world.

He is a seasoned expert in using data to transform the way we do business. If anyone can tell you how to start driving actionable, data-driven change and defeat the organizational villains we all face, it’s Brent.

His book keeps an entertaining tone while being packed with informative models and examples of how to drive fact-based marketing decisions. It is essentially a how-to for becoming an action hero of science-based marketing and analysis.

The book did a great job reinforcing some of the things I’ve learned on the job as a MECLABS Optimization Analyst, planning strategy and experiments that fuel the content we produce at MarketingSherpa and MarketingExperiments.



Brent speaks to real-life work situations, and since reading his book, I’ve been able to deliver actionable analyses more effectively and use his models to become better at my job in Web optimization.

I had a chance to speak with the Web Analytics Action Hero himself to pick his brain on what it takes to be the Han Solo of the office.



MarketingExperiments: Who does data-driven analysis apply to, and why should they read your book?

Brent Dykes: Well, I think being data driven and using analysis is really important. The hype around big data has obviously created more of an interest in using and understanding data. I come from a marketing background, and I’ve worked in marketing long enough to remember those days where the vast majority of marketing decisions were being made on intuition—not data.

The digital channel has revolutionized the way we market today. It has ushered in new measurement opportunities that weren’t possible before. In general, it is easier to track digital media than traditional media, and you can get more granular detail as well. With all of the new metrics at the disposal of marketers, the data is susceptible to being misused.

It can’t be, “Hey, I wonder what metrics will make my campaign look good.” Being data-driven isn’t just about using data; it’s about using the right data in the right ways.

I think more people, not just analysts, are realizing they need to understand the data a little bit better.

I haven’t seen it 100% yet, but I think their managers are holding them more accountable to the numbers. So, I think everybody needs to embrace data-driven marketing, and that’s executives all the way down to the interns. Everybody needs to learn.


ME: With all the KPIs out there that can be tracked, how has the role of a marketer changed?

BD: Data is definitely changing the role of marketers. We’re seeing more accountability for marketers. They’re being held more accountable for how they allocate and spend their marketing budgets. Now, every marketer needs to be comfortable with data.

This shift may be uncomfortable for marketers who aren't at ease with data. Articles such as one from the Wall Street Journal have shown how data-savvy marketers are gaining a foothold in the corner office. A lot of the old guard in marketing are now feeling threatened by data. They'll either need to adapt or continue to lose relevance.

It’s true that we have a lot more metrics and data out there. It can generate a lot of unwanted noise for marketers.

However, as I highlighted in the book, our compass for navigating all this data is having a clear understanding of what’s important to the business.

If you really understand your key business goals or objectives, then you’ll have a better sense for which data is important and which data isn’t important.

Also, the data you will need is going to change over time. Businesses are not static. As such, your organization’s goals will change, and as a result, your metrics and KPIs will need to evolve as well.


ME: What are some characteristics that we’ll find in the book on how to become an action hero in Web analytics?

BD: There are three key factors that influence whether someone can become an action hero in Web analytics.

First is ability. Some skills can be taught while others are innate. You can learn about digital marketing or how to use an analytics tool, but someone can’t be taught to be curious or intelligent. It’s also not all about having a sharp analytical mind. You need to have soft skills to go with it, such as interpersonal and communication skills.

The next thing is having the right environment. I’ve seen really smart analysts fail just because they weren’t getting support from their management team. Maybe they didn’t have an executive sponsor for the analytics practice within their organization. Maybe their company wasn’t willing to invest in the right tools or provide any training to employees.

It doesn’t matter how much ability you have if you’re only one person for a very large organization. You can simply be out-manned and not have the resources to really deliver value from the data.

After ability and environment comes the approach. My book really focuses on the approach you take as an action hero. I break it up into three key steps.

First, you need to be efficient in where you focus your analysis time—prioritization is critical. There’s so much you can analyze, but not everything warrants your time.

Next, you need to be systematic in the way that you analyze the data, and so I lay out a methodology for conducting your analysis.

Then, the last step is to mobilize people around your ideas. You could be a data scientist with a Ph.D. in statistics, but if you can’t communicate your findings and insights effectively, then you’ll have very little impact on your organization.

Ultimately, the reason why I focus on action heroes in my book is because if your analysis is not driving action, you're not driving change. If your insights aren't changing things, then you're failing as an analyst or data-driven marketer. And so, that last step of communicating, being a data storyteller, and mobilizing people around your ideas is really critical.


ME: One of the things I want to talk about is the villains that analysts in different departments are going to come across. Can you lay out some of the villains that a Web analytics action hero may have to combat within their organization?

BD: Yes, it was fun to put together a list of villains that Web analytics action heroes will encounter. If you think about the action hero movies that you’ve seen, they usually have a villain or two. I thought it would be helpful to highlight some of the different villains that can wreak havoc for aspiring analytics action heroes.

In the book, I introduce each of the villains and then show tips on how to defeat them. Because every villain has weaknesses, it’s rare that the villain will win in the end. My goal is to ensure my readers can defeat these data villains.

An obvious villain in my book is Analysis Paralysis, where analysts become overwhelmed with the sheer volume of data and potential analysis options. Another familiar villain is The Gut, where intuition is the ruler of the day. It’s not about the numbers but someone’s unsubstantiated opinion.

Doctor Feel Good is where people look at the metrics, but then there’s no accountability. Nobody wants to hurt anybody’s feelings. Even though the metrics or data are showing something’s wrong, nobody’s doing anything about it.

Fire Drill is where analysts are constantly going from one emergency to the next. In this stressful environment, analysts aren’t able to get ahead and generate much value; they’re just putting out one fire after the next.

The Justifier is where somebody has a point they want to prove, and they seek data to justify their idea. If the data actually refutes the point they want to make, then it will be viewed as wrong and ignored or buried.

The Lemming is somebody who chases shiny objects. They follow whatever analytics trends are popular, but they don’t stop and evaluate whether it’s really important to their business or not.

I won’t go into all ten of the villains I cover in my book, but the other ones are the Perfectionist, Stale Data, Suspect Data and the Teflon Man. If you’ve dabbled with data in any way, you’ve probably run into many of these villains already.


ME: You really stress aligning with business goals. How does that encourage action within the company?

BD: Well, when you're going to do analysis, one of the key challenges is that many different paths lie before you. There are all kinds of really interesting things you can investigate. Many of these paths can lead you astray. You can end up investing a lot of hours analyzing things that don't really matter to your business.

The business objectives – really understanding what's important to your business – keep you grounded. They keep you focused on what's important and steer you away from distractions in the data. When you're trying to affect a certain business goal, it will keep you focused and productive in your analysis efforts.

As I said before, we’ve got all kinds of data, and it can be really confusing and really overwhelming. Having a clear idea of your business objectives or goals is really the first step to making sense of which data and metrics to focus on.

Once you understand the business goals, the next question is what’s my ability to influence change? In some areas, you’re not able to influence change due to a lack of resources or time to implement your recommendations. It’s important to know that up front so you don’t waste time on analyzing things you can’t influence.

After I know I can influence change, I then need to ask: What’s the potential impact? Could this have a big influence on the business?

Then, also, what’s my level of effort? Maybe as I start looking at this, I scratch the surface and realize, “Oh, this is going to take me hundreds of hours to analyze,” and if it’s not going to have a good payoff then it’s not worth my time. So you have to balance these considerations so you can be efficient and effective with the data.

We have a finite amount of time, and we can’t look at everything. If you focus on things that matter to the business that you personally can influence, you’re more likely to drive change and action.

Another thing about business goals is they’re usually prioritized. Not all business goals are equally important. Some business goals may pale in comparison to your company’s primary business goal. So prioritization of the business goals will also influence where you should invest your time and what you should analyze.


ME: To wrap things up, what’s a piece of advice that you have for the aspiring action hero that’s kind of all-encompassing for everybody?

BD: I think the key thing is to not be afraid of the data. To get in and start using it as much as you can. In the book, I talk about the concept of Setupland and Actionland. One of the things that I’ve noticed in working in this industry for the last 10-plus years is that too many companies get stuck in Setupland.

They go through all the work of gathering requirements, getting the tags in place and collecting the data. And then, they get caught in a vicious cycle of just doing more tagging and reporting, but never really going beyond that.

I think the key behind the action hero book is emphasizing the importance of analyzing or using the data. OK, reports are great, but they’re fairly static. They can’t do anything on their own.

We must dive into the data to find insights that we can then bring back to the business. Hopefully, through this process, some positive change occurs.

Collectively, we need to move beyond Setupland and spend more time in Actionland, where the real benefits from Web analytics live. More analysis, not more reports, is the key to success.




If you want a chance to pick Brent’s brain in more detail, make sure to enter our weekly book giveaway. He’s also recently written an introductory 192-page Web Analytics Kick Start Guide e-book, which we’re giving away FREE to everyone who enters the book giveaway.




You might also like

Marketing Research Chart: Marketing analytics challenges [MarketingSherpa chart]

Marketing Data: Using predictive analytics to make sense of big data [More from the blogs]

Marketing Analytics: Now that marketers can collect data, interpretation is the top challenge [More from the blogs]

Testing and Optimization: 4 inspirational examples of experimentation and success [More from the blogs]


Copywriting: Brevity is the soul of marketing

November 20th, 2014 1 comment

I’ve always loved this quote:

“Brevity is the soul of wit.” – William Shakespeare

To me, its beauty rests in the powerful meaning packed in six simple words. Brevity can also be used as a tool to aid your marketing, as I discovered from a recent email experiment.

But first, a little more detail about the experiment.

Background: A global producer of high-quality audio equipment and accessories.

Goal: To increase clickthrough rates in an email.

Research Question: Which email will generate the highest clickthrough rate?

Test Design: A/B multifactor, radical redesign split test


Control


In a preliminary review of the control, the MECLABS research team hypothesized the control was at risk of underperforming and could use some strategic tweaks.

Read more…


Testing and Optimization: 4 inspirational examples of experimentation and success

November 6th, 2014 1 comment

At our sister publication, MarketingSherpa, we publish four case study beats – B2B, B2C, Email and Inbound – with stories covering actual marketing efforts from your peers each week. Not every case study features a testing and optimization element, but many do.

For this MarketingExperiments Blog post, I wanted to share a quick summary of several of these case studies, along with links to the entire article (including creative samples) in case any pique your interest and you want to dig into the entire campaign.

So, without further ado, read on for four MarketingSherpa case studies that feature testing and optimization of various digital marketing channels, strategies and tactics.


Case Study #1. 91% conversion lift from new copy and layout

This case study features AwayFind, a company that provides mobile email alerts, and covers an effort to test, and hopefully improve, its homepage performance.

Brian Smith, Director of Marketing, AwayFind, said, “Our primary driver of traffic is our PR efforts. Our homepage is effectively our primary landing page, and we need to convert that traffic into premium users.”

The testing changed both copy and layout elements. The main copy change shifted the focus from features to benefits. The layout tweaks included a shortened headline, splitting the remaining copy between a subhead and a smaller block of text, and modifying the color of the subhead text.

In this test, the treatment achieved:

  • 42% increase in clicks to the sign-up page
  • 91% increase in registrations for the trial
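Results like these can be sanity-checked for statistical significance before declaring a winner. Below is a minimal sketch of a standard two-proportion z-test in Python; the send and conversion counts are hypothetical illustrations, not AwayFind's actual numbers.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both versions convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: control converts at 3.0%, treatment at ~4.3% (a ~42% relative lift)
z = two_proportion_z(300, 10_000, 426, 10_000)
print(round(z, 2))  # |z| > 1.96 indicates significance at the 95% confidence level
```

With these illustrative numbers the z-statistic comfortably clears the 1.96 threshold; with much smaller lists, the same relative lift could easily be noise.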

  Read more…


4 Threats that Make Email Testing Dangerous and How a Major Retailer Overcame Them

October 2nd, 2014 No comments

To test emails, you just send out two versions of the same email. The one with the most opens is the best one, right?


“There are way too many validity threats that can affect outcomes,” explained Matthew Hertzman, Senior Research Manager, MECLABS.

A validity threat is anything that can cause researchers to draw a wrong conclusion. Conducting marketing tests without taking them into account can easily result in costly marketing mistakes.

In fact, it’s far more dangerous than not testing at all.

“Those who neglect to test know the risk they’re taking and market their changes cautiously and with healthy trepidation,” explains Flint McGlaughlin, Managing Director and CEO, MECLABS, in his Online Testing Course. “Those who conduct invalid tests are blind to the risk they take and make their changes boldly and with an unhealthy sense of confidence.”

These are the validity threats that are most likely to impact marketing tests:

  • Instrumentation effects — The effect on a test variable caused by an external variable, which is associated with a change in the measurement instrument. In essence, how your software platform can skew results.
    • An example: 10,000 emails don’t get delivered because of a server malfunction.
  • History effects — The effect on a test variable made by an extraneous variable associated with the passing of time. In essence, how an event can affect tests outcomes.
    • An example: There’s unexpected publicity around the product at the exact time you’re running the test.
  • Selection effects — An effect on a test variable by extraneous variables associated with the different types of subjects not being evenly distributed between treatments. In essence, there’s a fresh source of traffic that skews results.
    • An example: Another division runs a pay-per-click ad that directs traffic to your email’s landing page at the same time you’re running your test.
  • Sampling distortion effects — Failure to collect a sufficient sample size. Not enough people have participated in the test to provide a valid result. In essence, the more data you collect, the better.
    • An example: Determining that a test is valid based on 100 responses when you have a list with 100,000 contacts.
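The sampling distortion effect can be made concrete with a quick calculation. The sketch below uses the standard two-proportion sample-size formula at 95% confidence and 80% power; the 5% baseline rate and one-point lift are illustrative assumptions, not figures from the article.

```python
import math

# Normal quantiles for a two-sided 95% confidence level and 80% power
Z_ALPHA = 1.96
Z_BETA = 0.8416

def sample_size_per_arm(p1, p2):
    """Minimum recipients per email version to detect a shift from rate p1 to p2."""
    p_bar = (p1 + p2) / 2
    numerator = (Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
                 + Z_BETA * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a lift from a 5% to a 6% response rate takes thousands of sends
# per version, far more than the 100 responses in the example above
print(sample_size_per_arm(0.05, 0.06))
```

The smaller the lift you want to detect, the larger the sample each version needs, which is why judging a test on 100 responses from a 100,000-contact list is so risky.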

Subscription Checkouts Optimized: How experimentation led to compounding gains at the revenue level

August 25th, 2014 No comments

Subscriptions have been the lifeblood of almost every media publication since the industry's inception.

But imagine for a moment that you were trying to subscribe to your favorite newspaper and you were presented with something that looked like the page below.


Experiment #1. Reworking disconnected, confusing pages



This was the first step in the checkout process for subscribing to a large media publication.

Editor’s Note: To protect their competitive advantage, we have blurred their identity.

Once a customer entered their ZIP code to determine whether this publication could be delivered to their area, they were taken to this page. Put yourself in the mind of the customer and think about how this page would have been received.

That is precisely what the marketing team did. What they saw was a very disconnected page that gave the customer almost no reassurance that they were still buying from the well-known media publication.

  • The publication logo was almost entirely missing from the page.
  • The colors on the page did not match the brand of the company.
  • The two levels of navigation at the top of the page provided multiple opportunities to click away.
  • The entire process seemed complicated to the customer.

Though there were a number of things the team wanted to change on this page, they needed a new page that changed only a few elements. Every day this page was live on the site, the publication was losing potential revenue from customers finding the process too difficult to complete. A long, arduous Web redesign was not an option. They needed to recover some of that revenue as fast as possible.

So the team ran an experimental treatment in an online test that they thought would require the least amount of time and resources and still achieve a high return on investment. The treatment is displayed below.


Read more…


Marketing Analytics: Show your work

August 14th, 2014 1 comment

Data handling and analytics can sometimes offer shocking results, as global B2B company National Instruments discovered after a surprising decrease in an email campaign’s conversion rate.


Key Obstacle: Concern about the new numbers

“When I first saw the number change, I was a bit freaked out,” said Stephanie Logerot, Database Marketing Specialist, National Instruments.

Stephanie, as a strategist, felt her greatest challenge was communicating the new way of looking at the data to National Instruments’ stakeholders outside of the database marketing team. This meant making certain everyone understood why the numbers dropped after implementing the new, more stringent data criteria.


A little background

A recent MarketingSherpa Email Marketing case study – "Marketing Analytics: How a drip email campaign transformed National Instruments' data management" – detailed this marketing analytics challenge at National Instruments.

The data challenge arose from a drip email campaign set around its signature product.

The campaign was beta tested in some of National Instruments’ key markets: United States, United Kingdom and India. After the beta test was completed, the program rolled out globally.

The data issue came up when the team looked into the conversion metrics.

The beta test converted at 8%, the global rollout at 5%, and when a new analyst came in to parse the same data sets without any documentation on how the 5% figure was determined, the conversion rate dropped to 2%.

While interviewing the team for the case study, as often happens in these detailed discussions, I ended up with some great material that didn't make it into the case study and wanted to share it with you.


The team

For the case study, I interviewed Ellen Watkins, Manager, Global Database Marketing Programs, Stephanie, the database marketing specialist, and Jordan Hefton, Global Database Marketing Analyst, all of National Instruments at the time. Jordan was the new analyst who calculated the 2% conversion rate.

In this MarketingExperiments Blog post, you’ll learn how the team dealt with the surprising drop in conversion, and how they communicated why data management and analytics was going to be held to a new standard going forward.

The team overcame this obstacle with a little internal marketing.

Read more…
