Dealing With Inaccurate Keyword Data

Splitting Up Inaccurate Data Will Make Data Less Meaningful

Splitting up “inaccurate data” will make data less meaningful, not more.

Allow me to elaborate:

As far as searches go, Google gives you ballpark data for national-level search volume, and depending on how you phrase the question it will change the numbers, but they stay in the same ballpark. Sadly it’s the best data we can come by, and it can easily be off by up to 35% in either direction. All I did to verify that was rank consistently on the first page for several reasonable terms (traffic > 1,000 searches/month) and then compare the exposure Webmaster Tools reported for each keyword against the traffic Google estimated.
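That check boils down to a signed percent error between the planner’s estimate and what you actually observed. A minimal sketch (all keywords and numbers below are hypothetical, not real data):

```python
# Compare Google's estimated monthly searches for a keyword against the
# impressions Webmaster Tools (now Search Console) reports once you rank
# on page one. All figures here are made up for illustration.

def percent_error(estimated: float, observed: float) -> float:
    """Signed error of the estimate relative to what was actually observed."""
    return (estimated - observed) / observed * 100

# keyword: (Keyword Planner estimate, page-one impressions observed)
samples = {
    "keyword A": (1000, 1400),   # estimate ran low
    "keyword B": (1200, 900),    # estimate ran high
    "keyword C": (2500, 2450),   # close to reality
}

for kw, (est, obs) in samples.items():
    print(f"{kw}: estimate {est}, observed {obs}, error {percent_error(est, obs):+.0f}%")
```

Run this against a handful of terms you rank for and you quickly see errors in the ±30% range on both sides.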

Add to that the fact that Google now shows local results for national terms. Let’s say there are 1,000 searches across the nation for “SEO”, but Google localizes the results for this query. That means when someone in Phoenix types “SEO”, the query does not count toward the keyword “Phoenix SEO” – rather, it rolls into that 1,000 searches for “SEO”.

Further, a metro area is broken down into its suburbs, so you can have searches for “Mesa SEO”, “Scottsdale SEO”, “Phoenix SEO”, etc. If you query the Google Keyword Planner, some of those will show search volume and others will not.

So let’s say we ask Google how many searches there are for “Mesa SEO” and it says none. You then have to assume that such a large majority of the local people type in just “SEO” that Google does not bother to count the insignificant number of “Mesa SEO” searches.

Start by calculating: there are 300M people in the US and 3M people in your metro area. When you ask Google about all of the different suburban “{local} SEO” terms, it comes back with a total of 3 searches. Now you have to make a decision.

If you take 3M/300M, the population in your metro area is about 1% of the national population. That means about 1% of those 1,000 national searches, roughly 10, should be in your area. BUT instead of 10 you got 3. Now you have to wonder (and here you can only guess): does your area have more interest in SEO than the national average, or less? Is 3 a valid number, or should you go with 10? Or is the real number 13, the 10 from the national term added to the 3 from the local terms?

Is there REALLY a significant difference between 3 and 10? And what happens when you KNOW the real number could be anywhere from 7 to 13 (30% plus or minus)? Or, if you go with the addition theory, it could be as high as 15.
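The back-of-the-envelope math above can be sketched in a few lines. The population figures and search counts are the illustrative numbers from this article, not real data:

```python
# Estimate local search volume as the metro's share of national volume,
# then bracket it with Google's roughly +/-30% reporting error.

national_searches = 1000        # national searches for "SEO" (illustrative)
us_population = 300_000_000
metro_population = 3_000_000
observed_local = 3              # what Keyword Planner reported for the suburb terms

share = metro_population / us_population      # ~0.01: your metro is ~1% of the US
expected_local = national_searches * share    # ~10 searches "should" be local

# Google's numbers can be ~30% off either way, so bracket the estimate:
low, high = expected_local * 0.7, expected_local * 1.3   # roughly 7 to 13

# The "addition theory": the national share plus the separately counted local terms.
combined = expected_local + observed_local               # ~13

print(f"expected {expected_local:.0f}, range {low:.0f}-{high:.0f}, combined {combined:.0f}")
```

Whichever bracket you pick, the honest answer is a range, not a number, which is exactly why agonizing over 3 versus 10 is wasted effort.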

THIS is one of the reasons I started to move away from focusing on a single keyword and started to focus on the entire Keyword DNA Braid. If you are targeting 5-7 keywords per page, and you have a modest site of 20 pages, then you have a total of 120-ish keywords, and instead of trying to figure out the bracket for each keyword you can estimate all of them together.

And because it’s an estimate of the average, it will actually be MORE accurate than any individual estimate: the random errors in the individual numbers partially cancel when you add them up.
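A minimal simulation shows why, assuming each keyword’s reported volume is off by up to ±35% uniformly at random (an assumption for illustration, not Google’s actual error model):

```python
# Why an aggregate over ~120 keywords beats 120 individual estimates:
# independent random errors partially cancel in the sum.

import random

random.seed(42)

true_volumes = [random.randint(10, 500) for _ in range(120)]  # 120 keywords

def noisy(v: float) -> float:
    """A reported volume off by up to +/-35%."""
    return v * random.uniform(0.65, 1.35)

reported = [noisy(v) for v in true_volumes]

# Average per-keyword error vs. error of the aggregate:
per_keyword = sum(abs(r - t) / t for r, t in zip(reported, true_volumes)) / len(true_volumes)
aggregate = abs(sum(reported) - sum(true_volumes)) / sum(true_volumes)

print(f"average individual error: {per_keyword:.1%}")
print(f"error of the total:       {aggregate:.1%}")
```

The individual errors land well into double digits, while the error of the total is a small fraction of that, which is the whole case for budgeting traffic across the braid instead of per keyword.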

Bottom line:

1. Don’t sweat the small stuff.

2. Keep your eye on the website.

3. Build out the blueprint using lots of good keywords and a healthy dose of words that are easy to rank for.

If you do these things, whether you are building the site for yourself or for a client, it will start to rank for the easy terms first, and as it gains traction on the larger keywords, everyone involved will be happy.
