
BONUS: SEO Competitive Research, Part 3: Measurable Parameters

by: Bradley Leese, February/March 2006

Search Engines

Google, Yahoo! and MSN each place their own emphasis on the factors behind high keyword phrase rankings.

All major search engines claim a staggering 100+ parameters play an active role in how their algorithms evaluate a website. Each major search engine's user support section claims the purpose is to ensure maximum search result relevancy. What are these 100+ parameters that each engine considers important? The truth is that it is impossible to know for certain what each parameter is or how much weight each element merits. To make matters even more confusing, each search engine has its own idea of which parameters are most relevant. The secret to overcoming this hurdle is fairly simple: by studying the user support information and the search engine results, the answers become much clearer.


What does a webmaster have to do to get their site ranked in the absence of clear instructions on what is required?

What stops search engines from publishing clear documentation on what parameters are given the most emphasis? Search engines claim the secrecy surrounding their algorithms is to prevent malicious spammers from skewing the search engine results by altering their site with the sole purpose of higher rankings at the expense of a better user experience.

If search engines are so concerned about these actions, then what constitutes spam, or "deceptive or manipulative behavior" (Google's webmaster guidelines)? Additionally, what are the possible repercussions of performing these tactics? Google defines a beneficial website as one that offers useful information to the user while displaying consistent results to both the user and the search engines alike. Any act that seeks to influence the search engines with information that differs from that which is presented to users is perceived as spam. The punishments for these tactics fall into multiple categories; the most severe is that the site is banned from the search engines and all rankings are dropped. A less severe penalty is that the page is deemed less relevant and rankings may drop a few placements. In general, hidden text is a banning offense, while duplicate content is merely disregarded as irrelevant and ranked accordingly (that is, not at all).

Knowable Parameters

All search engines give obvious clues as to what they define as their major priorities.

How can a webmaster decipher which parameters are most important to a search engine? Begin by visiting each search engine's user support page.

Start by defining what each search engine places the most emphasis on within its webmaster tips or guidelines. From these pages, prioritize the elements that should be included in the construction and maintenance of your site. Continue by defining what each engine considers to be spam, or has tagged as deceptive. This information should act as a set of principles for the quality assurance that must take place on an ongoing basis. These factors change constantly, so revisiting this information frequently is well advised.
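The principles gleaned from each engine's guidelines can be kept as a reusable checklist and run against a site on each quality assurance pass. A minimal sketch follows; the factor names are placeholders for whatever the guidelines actually emphasize, not an official list (though hidden text and duplicate content are flagged per Google's stated policies).

```python
# Illustrative only: factor names are assumptions standing in for the
# items actually found on each engine's guideline pages.
GUIDELINE_CHECKLISTS = {
    "google": {
        "required": ["descriptive title", "crawlable text links"],
        "spam": ["hidden text", "duplicate content"],
    },
    # Entries for Yahoo! and MSN would be built the same way from
    # their own support pages.
}

def audit(site_practices, engine):
    """Compare a site's observed practices against one engine's checklist,
    returning required elements that are missing and spam flags present."""
    checks = GUIDELINE_CHECKLISTS[engine]
    return {
        "missing": [f for f in checks["required"] if f not in site_practices],
        "violations": [f for f in checks["spam"] if f in site_practices],
    }
```

Because the checklist is data rather than prose, updating it when the guidelines change is a one-line edit and every subsequent audit picks up the change automatically.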


With known parameters begin research on the search engine results for confirmation and clarification.

After all the parameters are gleaned from the support sections, the next step is to carefully research the search engines' index results, or Search Engine Results Pages (SERPs). A wealth of information can be gleaned from watching the ebb and flow of the search results. In order to understand why sites are ranking a certain way, take a sample of the top results for a given query and "break down" each site into measurable parts, noting which elements appear to have the greatest impact. The most important factor in this research is consistency: make sure that each website is analyzed against the same factors.
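The "break down" step can be sketched in code. The metrics below (title length, h1 count, word count, presence of a meta description) are illustrative assumptions about what might be measured, not factors any engine has confirmed; the point is that every competing page is reduced to the same set of numbers.

```python
from html.parser import HTMLParser

class PageBreakdown(HTMLParser):
    """Collect a few measurable on-page parameters from raw HTML."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self.word_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        self.word_count += len(data.split())

def analyze(html):
    """Return the same metrics for every page, so competing sites
    are compared on identical factors."""
    parser = PageBreakdown()
    parser.feed(html)
    return {
        "title_length": len(parser.title.strip()),
        "h1_count": parser.h1_count,
        "word_count": parser.word_count,
        "has_meta_description": bool(parser.meta_description),
    }
```

Running `analyze` over the saved HTML of each top-ranked result produces a comparable row per site, which is exactly the consistency the research requires.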

Analysis and Testing

While complete knowledge of every weighting factor is unattainable, clarifying the core elements is entirely within your grasp.

You should now have the most heavily weighted parameters for each search engine confirmed. Remember these factors are in constant flux and must be reevaluated each time the search engine results shift their placements. Analysis and testing will break this process into digestible segments for ongoing evaluation. Each time the rankings seem to be in flux, run through this process on both your site and those of your major competitors. Make note of the discrepancies and adjust for the changes and updates.
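Spotting the discrepancies between two ranking snapshots can be mechanized. A minimal sketch, assuming rankings have been recorded (by hand or otherwise) as ordered lists of URLs for a query:

```python
def ranking_shifts(before, after):
    """Map each URL to its change in position between two snapshots
    (positive = moved up); URLs entering or leaving the results are
    reported as "new" or "dropped"."""
    pos_before = {url: i for i, url in enumerate(before, start=1)}
    pos_after = {url: i for i, url in enumerate(after, start=1)}
    shifts = {}
    for url in set(before) | set(after):
        if url not in pos_before:
            shifts[url] = "new"
        elif url not in pos_after:
            shifts[url] = "dropped"
        else:
            shifts[url] = pos_before[url] - pos_after[url]
    return shifts
```

The sites flagged as "new", "dropped", or sharply moved are the ones worth breaking down again, since whatever changed on those pages hints at which parameters the engine is currently rewarding.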

Next: Part Four, Tools and Techniques


Tools and Techniques: With 100 possible parameters, learn how to acquire information quickly, consistently and efficiently.

For permission to reprint or reuse any materials, please contact us. To learn more about our authors, please visit the Bruce Clay Authors page. Copyright 2006 Bruce Clay, Inc.