
 
SEO – The Death of Keywords – Part 1
 
Are their days numbered? 

The series of 2011 Panda updates by Google sharpened the discrimination of its search engine technology. This was done in an effort to improve the quality of results returned for users’ searches by filtering out low-quality sites, regurgitated rubbish, scraped content and wordy but, from the users’ perspective, ultimately low-value content. The ultimate objective was, of course, to ensure that the user community maintained a high regard for the Google search enterprise and its associated advertising business model. As an aside, this also means that Google is reaching deeply into knowledge of individual users’ browsing habits – not just superficially, as had been the case.

The Panda enhancements reportedly included: assessment of how long users remain on a site (dwell time) and how they navigate between its pages; users’ readiness to share results with friends and colleagues, directly and through social networks; and their preparedness to comment (on Web 2.0 sites).
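By way of illustration only, here is a toy Python sketch of how such signals might be blended into a single engagement score. The signal names, caps and weights are entirely my own assumptions – nothing here is Google’s actual formula.

# Purely illustrative aggregation of the engagement signals reported above.
# All thresholds and weights are hypothetical, chosen only to show the idea.

def engagement_score(dwell_seconds, pages_viewed, shares, comments):
    # Normalise each signal into a rough 0..1 range (arbitrary caps).
    dwell = min(dwell_seconds / 300.0, 1.0)   # cap at 5 minutes on the page
    depth = min(pages_viewed / 10.0, 1.0)     # cap at 10 pages per visit
    social = min(shares / 20.0, 1.0)          # cap at 20 shares
    discuss = min(comments / 10.0, 1.0)       # cap at 10 comments
    # Weighted blend; the weights are guesses for illustration only.
    return 0.4 * dwell + 0.3 * depth + 0.2 * social + 0.1 * discuss

print(engagement_score(dwell_seconds=180, pages_viewed=4, shares=2, comments=1))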

Some time ago I wrote about whether Google was a neural network (insert reference). It’s not there yet, but the learning capability is increasing by leaps and bounds.

Whilst the emphasis of the model appears to have shifted from content-analytical to consumer-analytical, I cannot believe that the lexical analysis routines Google has employed for several years now have been dumped.

I believe that Google still harbours a vision of being able to take a piece of content and assess its value.

Now ‘value’ is a subjective measure with no inherent scale – it is a relative concept. But what if the content were ‘measured’ relative to, say, Wikipedia or another relevant authority site?

If Wikipedia (or an equivalent) became the ultimate reference baseline, then what would be the point of anyone writing anything else, other than to venture an opinion, to disagree with Wikipedia, or to extend its content?
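To make the idea concrete, here is a minimal sketch of what a ‘relative to an authority site’ measure could look like: plain term-frequency cosine similarity between a page and a reference article. The reference text, the candidate text and the choice of similarity measure are all my own assumptions for illustration – this is not a claim about how Google actually scores content.

# Minimal sketch: score a candidate page against a reference article
# (say, a Wikipedia entry) using bag-of-words cosine similarity.

import math
import re
from collections import Counter

def term_vector(text):
    # Lower-case word counts as a crude bag-of-words representation.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

reference = "Search engine optimisation is the practice of improving a site's visibility."
candidate = "Our guide to improving your site's visibility in search engines."
print(cosine_similarity(term_vector(reference), term_vector(candidate)))

In practice a real system would use far richer representations than raw word counts, but the point stands: once content can be measured against a baseline, the score belongs to the measurer, not to the keywords the author chose.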

Anyway, to get to my main point: up until 2011, the idea of keywords served both Google’s interests and commercial ones. For Google it meant that relatively accurate search results could be obtained, whilst for content creators the concept has been, for the most part, profitable.

As the lexical analysis technology advances, however (and companies such as Booklamp have irons in this fire, too), there will surely come a time when the concept of ‘keywords’ becomes meaningless. Google will decide for itself what the content is about, and measure its ‘absolute’ quality against the Wiki-likes, supplemented by user-community grading – both the automated, invisible grading carried out by Google itself, and direct user engagement with a site.

Fine, you say, but that’s not the whole story! No, it’s not. That’s why this is Part 1.

The next articles deal with:

-> Google Page 2 – Will That Be The New Battleground?
-> Why Adwords Will Have To Change
-> Why Review And Comparison Sites Will Grow…And Grow

© 2011 Phil Marks

Interested in a great thriller with espionage, cyberwarfare and digital invasions? Check out Gate of Tears.
