With moderate search engine optimization knowledge, some common sense, and a resourceful and imaginative mind, one can keep his or her web site in good standing with search engines even through the most significant algorithm changes.
The recent Google update of October/November 2005, dubbed “Jagger”, is what inspired me to write this, as I saw web sites that had previously ranked in the top 20 results for extremely competitive keywords suddenly drop to the 70th page. Yes, the ebb and flow of search engine rankings is nothing to write home about, but when a web site fails to regain much ground after such a drop, it suggests that the SEO done on the site had some long-term flaws. In this case, the SEO team had not done a good job of predicting the direction the search engine would take with its algorithm.
Impossible to predict, you say? Not quite. The ideas behind Google’s algorithm come from the minds of fellow humans, not supercomputers. I’m not suggesting that it’s easy to “crack the code,” so to speak, because the actual math behind it is extremely complicated. However, it is possible to anticipate the general direction a search engine algorithm will take by keeping one principle in mind: any component of SEO that can be manipulated to an abnormal extent will eventually be weighted less and finally rendered obsolete.
One of the first areas of a web site that webmasters abused in an attempt to raise their rankings was the keywords meta tag. The tag lets a webmaster list the web site’s most important keywords so the search engine knows when to display that site as a result for a matching search. It was only a matter of time before people started stuffing the tag with irrelevant words that were searched for more frequently than the relevant ones, hoping to fool the algorithm. And they did fool it, but not for long. The keywords meta tag was identified as an area too susceptible to misuse and was subsequently devalued to the point where Google’s algorithm today doesn’t even consider it when scanning a web page.
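For readers who have never looked at one, the keywords meta tag is a single line in a page’s <head> section. The snippet below is only a minimal illustration; the page title and keyword values are invented for the example:

    <head>
      <title>Example Widget Shop</title>
      <!-- legitimate use: a short list of terms that actually describe the page -->
      <meta name="keywords" content="widgets, blue widgets, widget accessories">
    </head>

A stuffed version of the same tag simply swaps in popular but unrelated search terms, e.g. content="free music, celebrity photos, lottery results, widgets".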
Another early tactic that is all but obsolete is repeating keywords at the bottom of a web page and hiding them by setting the text color to match the background color. Search engines recognized that this text served no purpose for the visitor and red-flagged sites that employed this method of SEO.
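To picture the trick, here is a rough sketch of the kind of markup that was used, with invented keywords and the font tag common at the time (the same effect can be produced with CSS color rules):

    <body bgcolor="#ffffff">
      ... visible page content ...
      <!-- white text on a white background: invisible to visitors, readable by crawlers -->
      <font color="#ffffff">cheap widgets discount widgets buy widgets online</font>
    </body>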
This information is quite basic, but the idea behind those algorithm shifts of several years ago is still relevant today. With the Jagger update in full swing, people in the SEO world are taking notice that reciprocal links may very well be going the way of the keywords meta tag (i.e., extinct). Webmasters across the world have long been obsessed with link exchanges, and many profitable web sites exist offering services that help webmasters swap links with ease. But with a little foresight, one can see that link trading’s days are numbered, as web sites have obtained thousands of incoming links from webmasters who may have never even visited the sites they link to.