Search Engine Ranking Algorithms Overview

Today I want to tell you a bit about the history of search engine ranking algorithms and what we should probably expect in the future. Hopefully this article will help you optimize your site in advance and secure its visibility in both the near and the distant future. Let’s start with the aims and problems search engines have been facing.

Before the Internet became available and accessible to most of the world, there were comparatively few personal and business websites, keyword competition was low, and even fewer sites had well-done on-site optimization. At that time a webmaster could simply specify the meta keywords, the meta title and the URL structure to rank well for his keywords. As the number of such webmasters increased, two or more similar websites would appear with the same keywords and approximately the same content. A new ground for deciding which site was better was therefore needed, one no longer bound to a site’s meta information or content, but rather to its off-site popularity.
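
To make that early era concrete, here is a minimal sketch of how a purely meta-driven ranker might have scored pages; the Page structure and the weights are illustrative assumptions, not a reconstruction of any real engine.

from dataclasses import dataclass

@dataclass
class Page:
    url: str
    title: str
    meta_keywords: list[str]

def meta_score(page: Page, query: str) -> int:
    """Toy on-site score: rewards query matches in the meta
    keywords, the title and the URL (weights are assumptions)."""
    q = query.lower()
    score = 0
    if q in (k.lower() for k in page.meta_keywords):
        score += 3                      # exact meta-keyword match
    if q in page.title.lower():
        score += 2                      # keyword appears in the title
    if q.replace(" ", "-") in page.url.lower():
        score += 1                      # keyword baked into the URL
    return score

pages = [
    Page("http://example.com/cheap-flights", "Cheap Flights", ["cheap flights", "travel"]),
    Page("http://example.org/travel", "Travel Deals", ["travel"]),
]
print(sorted(pages, key=lambda p: -meta_score(p, "cheap flights"))[0].url)

Notice that two pages targeting the same keyword with identical meta data would tie under such a scheme, which is exactly the dead end described above.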

Webmasters started to gather links to their websites from any accessible source, as well as exchanging links with the webmasters of other sites, an effective but obviously disallowed method of website promotion. In this way some site owners were eliminated from the top SERPs on the basis of poor off-site popularity. At this point a website’s ranking came to depend directly on how much time its webmaster spent on his creation, which was a perfect way to get rid of the spammy sites that kept appearing as the Internet grew in popularity as a marketing medium. Once basic off-site optimization became a given, search engines needed to go further… and so the list of filters, incredibly long by now and a headache for so many webmasters, started to grow steadily.
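
As a rough illustration of what off-site popularity means in practice, the sketch below ranks pages by the number of distinct domains linking to them; real engines used far more elaborate signals (PageRank being the best known), so treat the counting rule as an assumption.

from collections import defaultdict
from urllib.parse import urlparse

# (source page, target page) pairs observed while crawling
links = [
    ("http://blog-a.com/post", "http://example.com/"),
    ("http://blog-b.com/post", "http://example.com/"),
    ("http://blog-a.com/other", "http://example.com/"),  # same domain again
    ("http://blog-a.com/post", "http://example.org/"),
]

# Count linking *domains* rather than raw links, so one site
# repeating a link does not inflate the score.
inbound_domains = defaultdict(set)
for source, target in links:
    inbound_domains[target].add(urlparse(source).netloc)

popularity = {url: len(domains) for url, domains in inbound_domains.items()}
print(sorted(popularity.items(), key=lambda kv: -kv[1]))
# [('http://example.com/', 2), ('http://example.org/', 1)]

Counting domains rather than raw links already hints at why mass link swapping and directory listings eventually stopped paying off.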

Your forum, social-bookmarking and non-relevant links are devalued; you cannot have too many links popping up at one time; you cannot use the same keyword in the anchor text of every link to your site; you cannot use the same description in the anchor of every link to your site; your link must sit in the perfect position on a specific page to pass you the maximum SEO juice, and you need to run marathon negotiations to get the webmaster of another site to place it there… and so on and so forth. All of that demanded even more human time and, saddest of all, started to demand money as well. The pursuit of links set in motion whole networks of directories that were spammy yet effective in SEO terms: one could get hundreds of listings for a certain price. Fortunately they would be filtered out too, but they still took part in SERP generation, leaving plenty of homework for those who did their optimization without any money involved. Today, though, links seem to be reaching their capacity limit as a means of SERP calculation, and what the next step will be is a mere surmise…

The supremacy of link popularity that webmasters have been talking about so much will hopefully go into decline. The most sensible next step would be to gather extra usage statistics and generate SERPs on the basis of websites’ user-friendliness. Not every webmaster runs Google Analytics or a similar script, but their number is growing steadily. Such a script makes it possible to determine how many pages a visitor viewed, how much time was spent on each page, and the bounce rate, and thus to collect indirect evidence of how friendly a website is for a given keyword, a conclusion drawn from visitor behavior. At the moment, however, there is no statistics counter applicable to all search engines, nor does one seem possible: such a tactical alliance between all the search engines looks unfeasible, while an “open-source” access mode to the statistics would hardly gladden webmasters, who would then be vulnerable to statistics leakage. On the other hand, no search engine will ever grant priority to one site over another merely because its own stats counter is installed there. Sensible as it is, this method of calculating ranking positions leaves much to be done technically, but it looks like a logical step for search engines to take in the battle for relevant search results and in the attempt to deliver the most user-friendly resources to the Internet user.
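
To show what such behavior-based scoring could look like, here is a minimal sketch that aggregates pages per session, average time on page and bounce rate from a toy event log; the event format and the way the metrics are folded into a single “friendliness” score are assumptions for illustration only.

from collections import defaultdict

# Toy analytics log: (session_id, page, seconds_on_page)
events = [
    ("s1", "/home", 40), ("s1", "/pricing", 90),
    ("s2", "/home", 5),                       # single-page visit -> bounce
    ("s3", "/home", 30), ("s3", "/docs", 120), ("s3", "/pricing", 60),
]

sessions = defaultdict(list)
for session_id, page, seconds in events:
    sessions[session_id].append(seconds)

page_views = [len(s) for s in sessions.values()]
pages_per_session = sum(page_views) / len(sessions)
avg_time_on_page = sum(sum(s) for s in sessions.values()) / sum(page_views)
bounce_rate = sum(1 for s in sessions.values() if len(s) == 1) / len(sessions)

# Assumed scoring: deeper visits and longer dwell time raise the score,
# bounces lower it. The weights are arbitrary, purely illustrative.
friendliness = pages_per_session + avg_time_on_page / 60 - 2 * bounce_rate
print(f"pages/session={pages_per_session:.2f}, "
      f"avg time={avg_time_on_page:.0f}s, bounce={bounce_rate:.0%}, "
      f"score={friendliness:.2f}")

The arithmetic is trivial; the hard part is the one the paragraph above points out: no single counter has the cross-engine reach to supply such data neutrally.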