As the creator and designer of a number of popular Search Engine Optimization tools, I have used (and regularly test) a variety of SEO tools and services in order to compare them to my own and see where I can improve (and, to be honest, where I’m beating the pants off the competition). This includes not only direct competitors to my own products, but also the “Big Data” providers in the SEO industry.

Great data is only great if you understand what to do with it. If you're a beginner at ranking your site in Google, then you need guidance more than a bunch of stats and numbers. This is where most SEO tools fail miserably. While the individual data points provided by these tools are often pretty good, when it comes to turning that data into practical advice they almost always fall flat.

Take estimating keyword competition, for example. How difficult will it be to rank in Google for a specific set of keywords? With all of that data at these tools' disposal, you would think they would be pretty good at estimating that difficulty.

They’re not. In fact, they’re usually pretty bad at it.

Let me back this up with an example of some keywords where these tools get it wrong. This example is a "long tail" phrase (that is, a set of keywords that doesn't get searched very often and contains 4 or more words).

Keywords: online acoustic guitar lessons

Difficulty rating from popular tools (scale is 0 to 100):

Moz – 50

SEMRush – 69

SpyFu – 56

KWFinder – 49

Difficulty rating from my soon-to-be-released SEO system:

Keyword Titan – 28

Notice the difference? The four popular tools shown estimate it to be about twice as difficult to rank for “online acoustic guitar lessons” as Keyword Titan does.

Their estimates are so (incorrectly) high because those tools appear to be averaging the authority of Google's top 10 ranking domains and pages for the keywords. That's a mistake, a serious mistake, and it's where pretty much every keyword tool goes wrong.

You see, the true estimation of how difficult it will be to rank for a set of keywords in Google isn’t found in the strength of the top 10 sites ranking for the keywords — it’s found in the weakness of the weakest ranking site in those top 10 results.

That is, if there are 9 very strong sites ranking for a set of keywords and one weak site mixed in among them, that weak site is the true indicator that ranking in the top 10 for those keywords is not so difficult. After all, if it was difficult to rank for then that weak result wouldn’t be there, right?

Almost no other keyword tool gets this right: they average the strength of all top 10 Google results together to come up with their difficulty estimates.
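To make the difference concrete, here's a minimal sketch in Python. The per-site "strength" scores are hypothetical, invented purely for illustration; the point is how dramatically the two approaches diverge when one weak site sits among nine strong ones:

```python
# Illustrative only: hypothetical per-site "strength" scores (0-100)
# for the top 10 Google results of a keyword. Nine strong sites and
# one weak one, as in the scenario described above.
top10_strength = [82, 79, 77, 75, 74, 73, 71, 70, 68, 30]

# How most tools estimate difficulty: average the whole top 10.
average_difficulty = sum(top10_strength) / len(top10_strength)

# The weakest-link approach: difficulty is set by the weakest
# ranking site, since that's the bar you actually have to clear.
weakest_link_difficulty = min(top10_strength)

print(round(average_difficulty))   # prints 70 -- looks "hard"
print(weakest_link_difficulty)     # prints 30 -- actually "moderate"
```

Same data, two very different answers. The averaging approach lets nine strong sites drown out the one signal that matters.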

I designed Keyword Titan to be different. When you analyze a single set of keywords in KT you get what I call a “Snap Analysis”. Here’s the snap analysis for online acoustic guitar lessons:

[Snap Analysis screenshot: online acoustic guitar lessons]

Notice the site ranked #9, acousticguitarlessonsonline.net. The site was clearly created for the simple purpose of ranking for keywords related to online acoustic guitar lessons. It has a keyword-rich domain name (I'll go into detail in a future blog post about why that's helping this site rank).

The TrustFlow of the domain and of the page is somewhat low (in case you’re not aware, TrustFlow is a respected measure of how much “trust” the links coming into a domain or page give it — the higher the TrustFlow, the more likely the domain or page is to rank in Google).

But where that domain really shows its weakness compared to the rest of the ranking sites is in the number of other sites linking in to it (the refdomains (site) metric). While all of the other sites have hundreds or even thousands of external domains linking into them, acousticguitarlessonsonline.net has only 86. Getting 86 quality links takes a little time, but hardly justifies the "hard" ratings the other tools are giving these keywords.

The relatively low number of external linking domains, combined with the site's marginal TrustFlow, leads Keyword Titan to give these keywords a difficulty rating of only 28 (which is on the low side of "moderate" in Keyword Titan).
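Here's a simplified sketch of how a weakest-link score could blend those two signals. To be clear, this is not Keyword Titan's actual formula; the weights and the log scaling are illustrative assumptions, chosen only to show why 86 referring domains scores far below a site with thousands:

```python
import math

def weakest_link_score(trust_flow, ref_domains):
    """Hypothetical difficulty score (0-100) for the weakest top-10 site.

    Illustrative sketch only: blends the site's TrustFlow with a
    log-scaled count of referring domains, then weights them equally.
    """
    # Log-scale referring domains so raw counts don't dominate:
    # 86 domains lands around 48, while 10,000+ approaches the 100 cap.
    link_component = min(100, 25 * math.log10(ref_domains + 1))
    # Equal weighting of link profile and TrustFlow (an assumption).
    return round(0.5 * trust_flow + 0.5 * link_component)

# The weak #9 site from the snap analysis: modest TrustFlow, 86 refdomains.
print(weakest_link_score(10, 86))      # prints 29 -- "moderate" territory
# A strong competitor with high TrustFlow and thousands of refdomains.
print(weakest_link_score(60, 5000))    # prints 76 -- genuinely hard
```

The log scaling matters: the jump from 86 to 5,000 referring domains is enormous in effort, and a linear count would flatten that difference entirely.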

This same scenario plays out again and again any time I run keywords through the popular keyword tools. Because those tools use the strength of the ranking sites to estimate difficulty rather than looking at the weakest site in the results, they are wrong much of the time. That means that SEO professionals and beginners alike are making poor decisions about which keywords to target.

It’s not that these tools don’t have access to the same data that Keyword Titan does — they do — they just interpret it incorrectly. So the next time you’re trying to figure out what keywords you should be trying to rank for in Google, keep that in mind.

I welcome your thoughts and responses in a comment below.