Archive

Posts Tagged ‘analytics’

Web Optimization: 3 strategies to improve testing operations at your company

December 11th, 2014 1 comment

In a previous blog post, we detailed how Felix + Iris, a newly launched eyewear ecommerce site, made simple tweaks to its hero unit to improve home try-on conversion by 72%.

In this blog post, read about how the Felix + Iris marketing team has embraced testing and how it shares results throughout the company. Jon Corwin, User Experience Lead, One Click Ventures (parent company of Felix + Iris), explains the strategies behind the team’s testing and optimization success.

 

Step #1. Integrate testing into company culture

At One Click Ventures, the testing function exists in the marketing department.

“There is very much an iterative approach or kind of a lean methodology that One Click has taken,” Jon said.

Jon explained that, as far as buy-in goes, the team has not had to convince anyone outside of Marketing of testing’s value.

“It’s more of a conversation of what we should test – not whether,” he said.

Marketing team members seek approval from the content team on copy changes, or from the design team for anything creative, typography- or image-related. Jon also explained that the team’s director of marketing will, from a strategic standpoint, help make those decisions.

However, Jon explained the testing function for marketing is autonomous.

“Our testing started off as a skunkworks operation. It was almost like scratching our own itch, and launching small tests and sharing the wins after the fact,” he said.

From there, he explained, testing has grown, and the team has embraced it as another feedback tool that helps keep the company a lean operation.

With the newly launched Felix + Iris brand, the team realized testing can be used as a tool to help manage risk.

Instead of buying into a new feature on one of the One Click Ventures sites, the team can build a small prototype, launch it and use A/B testing to validate whether the feature is helpful.

Once the team has that knowledge, Marketing can send that feature to the tech team and have similar features built out, or use lessons learned from tests to better inform how they should craft future campaigns.

“Right now, it is very much a small operation, but one that has been key in helping make some of these decisions, be it design, messaging, new feature build-out, so on, so forth,” he said.

 

Step #2. Share results constantly

Jon explained there are many different ways the marketing team shares testing results within the organization.

Once tests are completed and the results have been analyzed, Jon will email those results to the stakeholders for that specific test. In addition, weekly conversion meetings, held by Jon, are used to discuss lessons learned from tests.

Jon and the team keep a master ledger of all testing efforts, called the Test Tracker, which is in the form of an easy-to-read spreadsheet.

“That’s where we’ll document all of the testing activity and final test results, with the goal being that that’s our testing bible filled with Felix + Iris best practices based on testing we’ve done in the past,” Jon explained.
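
The post doesn’t share the Test Tracker’s actual columns, but if you want to start a testing ledger of your own, a minimal sketch like the one below is one way to do it. The field names and example values here are my own illustration, not the Felix + Iris spreadsheet:

```python
import csv

# Hypothetical Test Tracker schema -- the actual Felix + Iris spreadsheet
# columns are not published, so these field names are assumptions.
FIELDS = [
    "test_name", "hypothesis", "start_date", "end_date",
    "primary_metric", "control_rate", "treatment_rate",
    "relative_lift", "significant", "lesson_learned",
]

with open("test_tracker.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow({
        "test_name": "Homepage hero unit",
        "hypothesis": "Showing the Fit Kit and the steps clarifies the offer",
        "start_date": "2014-10-01",  # illustrative dates and rates only
        "end_date": "2014-10-21",
        "primary_metric": "home try-on conversion",
        "control_rate": 0.05,
        "treatment_rate": 0.086,
        "relative_lift": "+72%",
        "significant": True,
        "lesson_learned": "Show the product and the process, not just the brand",
    })
```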

Read more…

Hero Unit Testing: 72% lift from simple changes you can implement today

December 8th, 2014 1 comment

Selling a product online that customers would most likely prefer trying on in a brick-and-mortar store is a challenge.

Felix + Iris, an online prescription eyewear retailer, provides a free home try-on option for its products with its Fit Kit.

However, getting customers to take the plunge to try on glasses at home was a challenge, especially because the brand is new, having launched in September 2014.

Right from the get-go, Jon Corwin, User Experience Lead, One Click Ventures (parent company of Felix + Iris), built A/B testing and optimization into every aspect of Felix + Iris’ online presence.

The company falls under the umbrella of One Click Ventures, which owns two other online ecommerce eyewear brands, and testing is a large part of One Click Ventures’ business strategy.

“We really embraced using A/B testing as another quick and easy feedback cycle to validate whether our messaging is in line with our customers’ needs,” Jon said.

In this blog post, we’ll detail one of Felix + Iris’ tests on the site’s homepage, centered on the hero image and copy for starting the free home try-on process.

 

Control

[Image: Felix + Iris homepage hero unit, control]

 

“The control’s hero unit’s design was certainly in line with our audience. It definitely spoke to our audience. It was aligned with our brand,” Jon said.

But, as Jon discovered, a new brand lacks the brand equity to help carry Felix + Iris’ message.

Another issue that Jon identified was ambiguity within the call-to-action to “Get Started.”

“Essentially, the top funnel conversion point we were testing is for them to start the Fit Profile quiz. But we realized with the control, there was some ambiguity around what steps were required, what the value of the Fit Profile is for the customer, and what they get out of it,” Jon explained.

Jon developed his hypothesis: Will replacing the hero image with an actual image of the home try-on kit, and adding the steps in the process, help portray the tangible results of trying Felix + Iris?

Read more…

John Rambo or James Bond: What kind of marketing action hero are you?

November 24th, 2014 4 comments

While you may never have to battle gangs of ninjas, jump from flaming helicopters, or defeat eye-patch-laden villains in bloody shootouts, an entirely different type of action is required of today’s marketer.

If you’re facing opposition in the form of endless reporting that never seems to make a difference, it might be time to find the hero within and add some action to your everyday work life. Are you ready to act strategically, rally supporters and face the adventure of changing your organization for the better?

Every week, our sister publication MarketingSherpa holds a free book giveaway featuring volumes that help marketers reach more customers, navigate the workplace, or just generally do their jobs more effectively.

This week’s book is Web Analytics Action Hero: Using Analysis to Gain Insight and Optimize Your Business by Brent Dykes, Evangelist for Customer Analytics, Adobe.

Editor’s Note: This contest has concluded, but be sure to check back on the MarketingSherpa Book Giveaway page for a new contest every week. You can also get notified of new contests from the Best of the Month Newsletter.

You’re probably wondering, “How can a Web analytics book help me discover my inner Lara Croft or Indiana Jones?” Brent has worked with industry leaders such as Microsoft, Sony, EA, Dell, Comcast and Nike. He also blogs for Adobe and has presented at more than 20 Web analytics conferences around the world.

He is a seasoned expert in using data to transform the way we do business. If anyone can tell you how to start driving actionable, data-driven change and defeat the organizational villains we all face, it’s Brent.

His book keeps an entertaining tone while being packed with informative models and examples of how to drive fact-based marketing decisions. It is essentially a how-to for becoming an action hero of science-based marketing and analysis.

The book did a great job reinforcing some of the things I’ve learned on the job as a MECLABS Optimization Analyst, planning strategy and experiments that fuel the content we produce at MarketingSherpa and MarketingExperiments.

[Image: chart from Web Analytics Action Hero]

 

Brent speaks to real-life work situations, and since reading his book, I’ve been able to deliver actionable analyses more effectively and use his models to become better at my job in Web optimization.

I had a chance to speak with the Web Analytics Action Hero himself to pick his brain on what it takes to be the Han Solo of the office.

 

 

MarketingExperiments: Who does data-driven analysis apply to, and why should they read your book?

Brent Dykes: Well, I think being data driven and using analysis is really important. The hype around big data has obviously created more of an interest in using and understanding data. I come from a marketing background, and I’ve worked in marketing long enough to remember those days where the vast majority of marketing decisions were being made on intuition—not data.

The digital channel has revolutionized the way we market today. It has ushered in new measurement opportunities that weren’t possible before. In general, it is easier to track digital media than traditional media, and you can get more granular detail as well. With all of the new metrics at the disposal of marketers, the data is susceptible to being misused.

It can’t be, “Hey, I wonder what metrics will make my campaign look good.” Being data-driven isn’t just about using data; it’s about using the right data in the right ways.

I think more people, not just analysts, are realizing they need to understand the data a little bit better.

I haven’t seen it 100% yet, but I think their managers are holding them more accountable to the numbers. So, I think everybody needs to embrace data-driven marketing, and that’s executives all the way down to the interns. Everybody needs to learn.

Read more…

Marketing Analytics: Show your work

August 14th, 2014 1 comment

Data handling and analytics can sometimes offer shocking results, as global B2B company National Instruments discovered after a surprising decrease in an email campaign’s conversion rate.

 

Key Obstacle: Concern about the new numbers

“When I first saw the number change, I was a bit freaked out,” said Stephanie Logerot, Database Marketing Specialist, National Instruments.

Stephanie, as a strategist, felt her greatest challenge was communicating the new way of looking at the data to National Instruments’ stakeholders outside of the database marketing team. This meant making certain everyone understood why the numbers dropped after implementing the new, more stringent data criteria.

 

A little background

A recent MarketingSherpa Email Marketing case study, “Marketing Analytics: How a drip email campaign transformed National Instruments’ data management,” detailed this marketing analytics challenge at National Instruments.

The data challenge arose from a drip email campaign set around its signature product.

The campaign was beta tested in some of National Instruments’ key markets: United States, United Kingdom and India. After the beta test was completed, the program rolled out globally.

The data issue came up when the team looked into the conversion metrics.

The beta test converted at 8%, the global rollout at 5%, and when a new analyst came in to parse the same data sets without any documentation on how the 5% figure was determined, the conversion rate dropped to 2%.
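
The case study doesn’t spell out exactly which criteria produced each figure, but the mechanics are easy to demonstrate: the same raw data yields very different conversion rates depending on who counts in the denominator. Here is a minimal sketch with invented data, not National Instruments’ actual rules:

```python
# Same raw data, two different inclusion criteria, two "conversion rates".
# The contacts and criteria below are invented for illustration; the case
# study does not publish National Instruments' actual rules.
contacts = [
    {"opened": True,  "valid_lead": True,  "converted": True},
    {"opened": True,  "valid_lead": False, "converted": True},
    {"opened": True,  "valid_lead": True,  "converted": False},
    {"opened": False, "valid_lead": True,  "converted": False},
]

def conversion_rate(rows, include):
    """Conversions divided by whichever population `include` admits."""
    pool = [r for r in rows if include(r)]
    return sum(r["converted"] for r in pool) / len(pool) if pool else 0.0

loose = conversion_rate(contacts, lambda r: r["opened"])
strict = conversion_rate(contacts, lambda r: r["opened"] and r["valid_lead"])

print(f"loose denominator:  {loose:.1%}")   # 66.7%
print(f"strict denominator: {strict:.1%}")  # 50.0%
```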

While interviewing the team for the case study, as often happens in these detailed discussions, I ended up with some great material that didn’t make it into the case study, and I wanted to share that material with you.

 

The team

For the case study, I interviewed Ellen Watkins, Manager, Global Database Marketing Programs, Stephanie, the database marketing specialist, and Jordan Hefton, Global Database Marketing Analyst, all of National Instruments at the time. Jordan was the new analyst who calculated the 2% conversion rate.

In this MarketingExperiments Blog post, you’ll learn how the team dealt with the surprising drop in conversion, and how they communicated why data management and analytics were going to be held to a new standard going forward.

The team overcame this obstacle with a little internal marketing.

Read more…

Online Testing: How to use A/A testing to break through the noise

June 30th, 2014 2 comments

Getting a lift from your testing efforts can be satisfying and rewarding.

Not to mention, increases in conversion have changed the fortunes of entire enterprises and the careers of the marketers who advocated testing.

But is a lift truly a lift, or is it simply a false positive resulting from natural variation?
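
One classical way to answer that question is a two-proportion z-test on the raw conversion counts. The sketch below is my own illustration with invented numbers, not something from the Summit presentation:

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF
    return z, p_value

# Invented numbers: an apparent 15% lift on modest traffic
z, p = two_proportion_z(conv_a=100, n_a=2000, conv_b=115, n_b=2000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is well above 0.05: could be noise
```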

In this MarketingExperiments Blog post, I wanted to share an excellent example of using A/A testing (and yes, you are reading that correctly) from Emily Emmer, Senior Interactive Marketing Manager, Extra Space Storage, presented at Web Optimization Summit 2014.

 

What does variance in testing look like?

[Image: Extra Space Storage homepage, control and treatment]

 

Here’s the example Emily shared with the audience to help put variance in context using a control and treatment of Extra Space Storage’s homepage.

There is absolutely no difference between these pages except for the 15% difference in conversion.

According to Emily, that’s when you need to start investigating how variance is potentially impacting your testing efforts, because there should be little to no difference in performance between identical pages.

“A 15% lift is more concerning,” Emily explained, “because there should be no difference with the same experience.”
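
You can put a number on that concern with a quick simulation. The sketch below, my own illustration rather than anything from Emily’s presentation, runs thousands of simulated A/A tests and counts how often sampling noise alone produces an apparent lift of 15% or more:

```python
import random

def simulate_aa(true_rate=0.05, visitors_per_arm=1000, runs=2000, seed=42):
    """Run many simulated A/A tests: both arms share the same true
    conversion rate, so any observed 'lift' is pure sampling noise."""
    rng = random.Random(seed)
    big_lifts = 0
    for _ in range(runs):
        a = sum(rng.random() < true_rate for _ in range(visitors_per_arm))
        b = sum(rng.random() < true_rate for _ in range(visitors_per_arm))
        if a and abs(b - a) / a >= 0.15:  # apparent lift (or drop) of 15%+
            big_lifts += 1
    return big_lifts / runs

print(f"{simulate_aa():.0%} of simulated A/A tests show a 15%+ 'lift'")
```

With those assumed numbers, a large share of identical-page tests shows a double-digit swing, which is exactly the noise an A/A test is designed to expose before you trust an A/B result.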

 

A/A testing is not A/B testing

[Image: slide explaining the difference between A/A and A/B testing]

 

Emily also noted a key distinction between A/A and A/B testing that is really important to grasp:

  • A/A testing – Can help you measure the natural variability (noise) of a website by testing an identical experience.

Read more…

Call-to-Action Button Copy: How to reduce clickthrough rate by 26%

March 31st, 2014 8 comments

“Start Free Trial” | “Get Started Now” | “Try Now”

One of the above phrases reduced clickthrough rate by 26%.

 

DON’T SCROLL DOWN JUST YET

Take a look at those three phrases. Try to guess which phrase underperformed and why. Write it down. Heck, force yourself to tell a colleague so you’ve really got some skin in the game.

Then, read the rest of today’s MarketingExperiments Blog post to see which call-to-action button copy reduced clickthrough, and how you can use split testing to avoid having to blindly guess about your own button copy.

 

How much does call-to-action button copy matter anyway?

The typical call-to-action button is small. You have only one to four words to encourage a prospect to click.

There are so few words in a CTA. How much could they really matter?

Besides, they come at the end of a landing page or email, or paired with a powerful headline that has already sold the prospect on the value of taking action. People have already decided whether they will click or not, and that button is a mere formality, right?

To answer these questions and more, let’s go to a machine more impressive than the Batmobile … to the splitter!

 

A/B/C/D/E split test

The following experiment was conducted with a MECLABS Research Partner. The Research Partner is a large global media company seeking to sell premium software to businesses.

The button was tested on a banner along the top of a webpage. Take a look at that banner below. 

[Image: banner with the “Start Free Trial” call-to-action button]

 

Five different text phrases were tested in that button. Since I’ve already teased you on the front-end, without further ado, let me jump right into the findings.

 

Results

[Image: chart of clickthrough results for the five button phrases]

 

Those few words in that teeny little rectangular button can have a huge impact on clickthrough.

As you can see, “Get Started Now” drove significantly more clicks than “Try Now.” Let’s look at the relative changes in clickthrough rate so you can see the relationship between the calls-to-action.
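
For reference, “relative change” here simply means the difference between two clickthrough rates divided by the baseline rate. A quick sketch with invented rates (the test’s actual CTRs appear only in the chart above):

```python
def relative_change(ctr_control, ctr_treatment):
    """Relative lift (or drop) of the treatment CTR versus the control CTR."""
    return (ctr_treatment - ctr_control) / ctr_control

# Invented example rates; the real CTRs appear only in the results chart.
ctr_get_started_now = 0.038
ctr_try_now = ctr_get_started_now * (1 - 0.26)  # a 26% relative drop

print(f"'Try Now' vs. 'Get Started Now': "
      f"{relative_change(ctr_get_started_now, ctr_try_now):+.0%}")  # -26%
```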

Read more…