Marketer’s Intuition Revisited
Thursday, 13 December 2007
Topic: Is There a Place for Intuition in Web Page Optimization?
A subjective factor like intuition may alarm marketers who base new Web site designs on test results that reveal what really works in Internet marketing. It doesn’t seem scientific, and it is certainly not represented in our Conversion Index formula:
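For readers unfamiliar with it, the heuristic is usually stated as follows. This is a restatement from MarketingExperiments’ published materials; the symbol glossary in the comments is a paraphrase, not an official definition:

```latex
% MarketingExperiments Conversion Index heuristic (as commonly published)
C = 4m + 3v + 2(i - f) - 2a
% C: probability of conversion      m: motivation of the user
% v: clarity of the value proposition   i: incentive to take action
% f: friction elements of the process   a: anxiety elements of the process
```

The weights indicate relative influence, not exact arithmetic: motivation matters most, and friction and anxiety subtract from the outcome.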
Yet the very nature of the optimization procedure includes an element of intuition. Where does it fit when identifying the most effective Internet marketing strategy?
Previous MarketingExperiments surveys showed intuition was unreliable when it came to predicting the best page performance, the best headline, the best copy in the test cases we studied. Our survey-takers were wrong at least 50% of the time.
In the interest of updating our previous findings on marketer's intuition, we invited those attending our December 5, 2007, Web clinic to evaluate side-by-side Web site and email optimization choices and vote on which ones they believed performed best in our tests. We then conducted a live poll of the audience, shared the tally, and reviewed the actual test results.
The results were fascinating: the majority of the December 5th audience failed each time to pick the best performing design. Even when Dr. Flint McGlaughlin provided pre-vote commentary on two of the designs, the votes still ran counter to his "hints."
Given that intuition is a subjective "X" factor in marketing decisions, we attempted to answer these questions: How can it be honed to its maximum reliability? And should we ever trust it?
Editor’s Note: We have released the audio recording of our clinic on this topic. You can listen to it here:
Definitions of intuition vary, but it generally means arriving at a conclusion without apparent deductive reasoning.
In Jungian psychology, "intuition" is on one end of a personality axis, and "thinking" is on the opposite end. But using intuition doesn't mean thinking has not taken place.
In the Recognition Primed Decision-Making (RPD) model, psychologist Gary Klein found that intuition means "using previous experience to rapidly interpret perceptions and subconsciously choose feasible solutions."
Marketing intuition, then, may be a refined ability to filter variables and opportunities instantly through a mental library of data, results, and analyses of what has and has not worked in one’s previous testing experience.
Case Study 1: Incentive
Control (without gift card)
We asked our clinic audience:
Which offer performed best in our tests?
A plurality thought Treatment 2 would perform best.
What you need to understand: Both the Control and Treatment 1 performed better without the gift card incentive.
Why? This is especially puzzling since previous test results have shown us that Incentives can be a powerful factor in the Conversion Index formula.
Was this a rare case where an incentive worked against the Conversion rate?
Another difference between the two best performing offer pages: one is two steps, the other just one. Did that somehow make the difference?
Testing experience may lead us to the intuitive conclusion that offering an incentive will increase conversion. But an experienced analyst also knows that we cannot definitively show the impact of the incentive alone (we cannot isolate it) unless we test identical versions of the offer in which the incentive is the only variable.
Our primary analyst also raised a concern that Instrumentation, Sample, or Selection threats may have intruded into the test protocol.
We would need to retest the incentive premise.
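One way to make “retest” concrete is to compute the relative lift between two treatments and check whether the observed difference clears a significance threshold before trusting it. Below is a minimal sketch in Python using a standard two-proportion z-test; the visitor and conversion counts are invented for illustration, since the clinic did not publish raw numbers:

```python
from math import sqrt, erf

def lift(control_rate, treatment_rate):
    """Relative lift of the treatment over the control."""
    return (treatment_rate - control_rate) / control_rate

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference in conversion rates
    likely real, or within the range of random variation?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts -- NOT the clinic's actual data
z, p = two_proportion_z(conv_a=120, n_a=4000, conv_b=150, n_b=4000)
print(f"lift: {lift(120/4000, 150/4000):.1%}, z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the treatment shows a 25% lift, but the p-value hovers near 0.06: suggestive, yet weak enough that an analyst would want to retest rather than declare a winner.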
Case Study 2: Copy and Headline
We asked our audience to answer two questions about Case Study 2:
Which offer page performed best in our tests?
Which headline performed best?
A majority of our audience picked Treatment 2 both times.
Our primary analyst also thought Treatment 2 would perform best. Believing that “ActualMe” lacked brand recognition and that using it in the headline might confuse visitors, the analyst tried “Communicator” in the Treatment 2 headline, positing that, with clearer communication, the shorter page would outperform the longer one.
What you need to understand: Shorter copy performed better than longer copy, and Treatment 1 performed best.
Why didn’t the analyst’s intuition work? Why did the ActualMe headline perform better?
We reached no definitive conclusion, so we tested again...
Case Study 3: Headline and Page Design
Now which offer page would have the highest impact on conversion?
The votes were closer this time, but the majority of our audience still picked the wrong offer page.
What you need to understand: Treatment 1 produced a 25% higher conversion rate than the Control page. In this case the primary analyst’s intuition was correct: the very direct, clear headline performed best.
A clear, concise headline about “what you get” and “how long it takes” was obviously a very good design choice.
But let us dig a little deeper.
Did adding a picture of the “profile” increase Friction and Anxiety or convey Incongruence? Did visitors think this . . .
. . . was all they would get for their time and effort?
The actual personality profile is very detailed and several pages long.
Case Study 4: PPC Ad Copy
Which ad performed better in our tests?
The audience’s intuitive choice was again trumped by the test results:
What you need to understand: The PPC ad without the 5-star credibility indicator outperformed the Treatment by 15%.
Why? Previous test results have shown us that adding credibility indicators is a powerful Anxiety reliever.
One analyst interpreted the results this way:
Another interpreted the results this way:
Still another analyst interpreted the results this way:
Yet another looked at the test from a completely different perspective:
A “sequence of concerns” takes place in the purchasing decision process. Early in the process shoppers want to know if you have the best price, if you have the widest selection, if you have what they need. The credibility concern arises as a shopper gets closer to the buying decision.
In Case Study 4, visitors may have been presented the five-star credibility indicator too early in the sequence.
Anticipate what is on a visitor’s mind at each step of the process, starting with the design of your pay-per-click ad. Then meet each of their concerns, the Anxiety or the Friction factors, with the appropriate relievers as they click deeper into the process.
Case Study 5: Email Subject Line and Copy
We asked our audience to again test their intuition, and to tell us why they voted the way they did:
Which email performed better in our tests?
Some participants thought the Porsche vs. Corolla email seemed more engaging, that cars were an exciting, emotional analogy and that Email 2 was boring. Others thought the “results” approach better explained the course’s benefit.
What you need to understand: The format with the academic approach produced a 29% higher CTR than the informal email.
Case Study 6: Landing Page Design
Our audience’s collective intuition was 0 for 6 so far. We asked one last question:
Which landing page performed better in our tests?
We again solicited audience feedback on why they voted the way they did before revealing the actual test results. Someone thought the Control design was clear and easy to understand. Someone else offered that the Treatment was a better sales approach. Dr. McGlaughlin offered that he didn’t like either design.
What you need to understand: The Control’s Conversion Rate was 114% higher than the Treatment’s.
The Control focuses on getting content for your site, while the Treatment emphasizes earning ad revenue by publishing content. Was getting content a stronger Value Proposition than earning ad revenue?
Was there too little copy in the Treatment, creating Anxiety about the offer and about the decision to continue? The “Call to Action” buttons on the Treatment page were also small and poorly placed.
Related Marketing Experiments Reports
As part of our research, we have prepared a review of the best Internet resources on this topic.
These sites were rated for usefulness and clarity, but alas, the rating is purely subjective.
* = Decent | ** = Good | *** = Excellent | **** = Indispensable
Editor(s) — Frank Green
Writer(s) — Peg Davis
Contributor(s) — Flint McGlaughlin
HTML Designer — Cliff Rainer
Email Designer — Holly Hicks