Small PPC Search Engines Revisited
Friday, 01 December 2006
Can online marketers achieve a worthwhile return on investment with smaller PPC engines like Kanoodle, Miva and others?
We recently released the audio recording of our clinic on this topic. You can listen to it here: Small PPC Engines Revisited
Two years ago we conducted a study of small PPC engines to find out whether they offered online marketers a profitable opportunity to generate more sales.
At that time our research indicated that these smaller engines could indeed generate significant additional revenues.
Today, two years later, and with even more intense bidding for keywords on the major PPC engines – Google, Yahoo! and MSN – we wondered whether these smaller engines still offered a profitable source of additional traffic and income.
If so, the use of these smaller engines, in addition to the big three, could take a great deal of pressure off marketers who are facing diminishing returns with their current campaigns.
We tested seven small PPC engines with four different research partners.
While there were some interesting differences in performance between different partners and engines, the final results may surprise you.
To truly understand what the smaller PPC engines can and cannot achieve for online marketers, we tested no fewer than seven engines with four different research partners.
The PPC engines included in our test were:
The partners we tested represent the following business areas:
By working with multiple engines across a number of different industries our intention was to avoid drawing conclusions based upon industry or product market-specific attributes.
From the outset, we expected that the volume of traffic would be far below those of Google, Yahoo! or MSN.
This proved to be the case.
In addition, there were significant differences in reach among the small engines themselves. As a result, our findings are based principally on percentage differences in conversion rates and cost per sale.
:: Observation 1 – Conversion rates among the different search engines can vary enormously.
One of the first questions we wanted to find an answer to was, “Which of the smaller PPC engines delivers the highest conversion rate?”
As we will see later, the performance of the engines appears to vary according to the industry or company.
For the purpose of comparing conversion rates, we ran the same text advertisement across all seven engines, and used identical keywords.
We began with the specialty job posting site.
One difference between the campaigns, which lay outside of our control, was the minimum bid allowed by each engine.
We had no control over the number of exposures each ad received, as the reach of the various engines varies enormously, though we did collect enough data to compare conversion rates among the engines.
Here are the results:
What You Need To UNDERSTAND: For this company, in the specialty job search market, the measured conversion rate of the best performing engine was more than 50% higher than that of the second best, and almost four times that of the third best.
:: Observation 2 – Using a single PPC engine, the nature of your business can have a major impact on conversion rates.
The next question we asked ourselves, working with the same set of test results, was “Will a single PPC engine perform as well for one company as it does for any other?”
We isolated the data from the highest performing engine from the first test, Enhance.com, and compared its performance across all four companies. Results are for the same number of keywords, using four different text advertisements.
Here are the results:
What You Need To UNDERSTAND: Small PPC performance can vary dramatically across industries.
As with so many tests, these results answer one question, and then raise others.
:: Observation 3 – How does the cost per action vary among the different PPC engines?
In this case we isolated data from a single company, the publisher of a very large national newspaper.
Our purpose was to compare the final cost per action among the different engines. After all, ROI is the single most important measure of campaign success for most companies.
Again, we ran the same ad, with the same keywords and copy, simultaneously across all of the different engines.
The amount we spent on each engine was determined both by reach and minimum bids.
Here are the results:
What You Need To UNDERSTAND: While the results, particularly the conversion rates, are disappointing in any context, the $66.58 CPA for MIVA was 43.4% lower than that for Google over the same period and 59.2% lower than that for Overture.
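The arithmetic behind these comparisons is worth making explicit. CPA is simply total spend divided by the number of conversions, and the percentage differences above express how far MIVA's CPA sits below each competitor's. A minimal sketch (the Google and Overture CPAs below are implied by the stated percentages, not reported figures):

```python
def cpa(spend, conversions):
    """Cost per action: total spend divided by the number of conversions."""
    return spend / conversions

def pct_lower(a, b):
    """How much lower a is than b, expressed as a percentage of b."""
    return (b - a) / b * 100

miva_cpa = 66.58
# Implied by "43.4% lower than Google" and "59.2% lower than Overture":
google_cpa = miva_cpa / (1 - 0.434)    # ~ $117.63
overture_cpa = miva_cpa / (1 - 0.592)  # ~ $163.19

print(round(pct_lower(miva_cpa, google_cpa), 1))    # 43.4
print(round(pct_lower(miva_cpa, overture_cpa), 1))  # 59.2
```

Run the same calculation on your own campaign data before drawing conclusions; the point is the method, not these particular figures.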
While the performance of the small engines was disappointing overall, this testing provided a number of worthwhile insights. We were able to establish profitable campaigns for two of these companies, though the results were tenuous due to low volume.
Because of the limited sample size, it was not possible to achieve statistical validity for the tests within a practical time period. Consequently, any site or business related decisions you would make using your own data would come with a correspondingly higher risk of error. This is a persistent problem that comes with the territory of working with such low volume traffic sources, and makes them less attractive as primary marketing channels.
In our last brief on this topic, from March of 2004, we found the smaller engines at that time to be potentially viable and profitable.
Despite higher bidding on the major PPC engines, it appears that the smaller engines have failed to capitalize on the opportunity and do not offer sufficient quality to build profitable campaigns.
Here is what we recommend:
In assessing the viability of each engine, you may want to use our free Maximum Bid Analysis tool for calculating the highest bid you can make and still break even on the campaign.
You can download the tool here:
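The tool itself is not reproduced here, but the break-even logic it applies is straightforward: the highest bid you can afford per click is your conversion rate multiplied by your profit per sale, since at that bid your expected profit per click exactly equals your cost per click. A minimal sketch with hypothetical figures:

```python
def max_breakeven_bid(conversion_rate, profit_per_sale):
    """Highest cost per click at which a campaign breaks even.

    At this bid, expected profit per click (conversion rate times
    profit per sale) equals the amount paid for the click.
    """
    return conversion_rate * profit_per_sale

# Hypothetical example: a 2% conversion rate and $50 profit per sale
# means you cannot bid more than $1.00 per click and still break even.
print(max_breakeven_bid(0.02, 50.0))  # 1.0
```

If an engine's minimum allowed bid exceeds this figure, the campaign cannot be profitable there regardless of traffic quality.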
As part of our research, we have prepared a review of the best Internet resources on this topic.
These sites were rated for usefulness and clarity, but alas, the rating is purely subjective.
* = Decent | ** = Good | *** = Excellent | **** = Indispensable
Editor — Flint McGlaughlin
Writer — Nick Usborne
Contributors — Jimmy Ellis
HTML Designer — Cliff Rainer