Search Engine Optimization Best Practices: Standards and Spam Discussion

As an SEO, you’re no doubt aware that the definition of spam changes over time. Thanks to black-hat SEOs, search engines are constantly updating their definitions of spam. Even worse, these definitions may vary between major search engines like Google, Bing, and Yahoo!.

To be an effective SEO, avoiding spam is key. Because of the severe penalties associated with spamming search engines, it’s always best to play it safe. But to play by the rules, you need to know what the rules are.

This article identifies SEO standards that stand the test of time, defines the most common types of spam you should be aware of, and covers essential ways to optimize your site and its content without relying on tricks.

We’ve divided this article into sections so you can skip around as needed.

Who Fights SEO Spam and Penalties

The SEO is responsible for making sure there is no spam on your website or in your website strategy. SEOs work to protect and build your website’s experience, expertise, authoritativeness, and trustworthiness (E-E-A-T), ensure there are no penalties, and repair the site should penalties appear. Spam runs counter to all of those goals.

The definition of SEO varies from person to person. We have an entire SEO Guide that dives into the specifics of how to do search engine optimization and gives you a good foundation.

To start off our discussion of SEO standards, let’s pose a situation.

Situation: A Story of Two Sites

Site A is quite well written and exceptionally relevant for the search keyword “W.” Site B is not as well written, not as content-rich, and nowhere near as relevant. The search engines will not like site B.

Site B uses search engine optimization (SEO) technology and a few borderline spam tricks. Suddenly site B outranks site A for the search “W.” This hurts the user experience and lowers user satisfaction with the results from that search engine. Search engines see this as a slap in the face, since their job is to ensure that visitors see relevant content and are happy.

Is it any wonder that search engines are always tightening their spam rules? It is one matter to improve the quality, presentation, and general use of keyword phrases on a webpage. It is a different matter entirely to trick the engines into higher rankings without editing the site content.

It is the position of the search engines that the role of the SEO practitioner is to improve the quality, quantity, clarity, and value of content. Quality content allows search engines to select worthy sites based on their unique relevancy factors. SEO practitioners should help search engines by making sites more relevant, clear, and accessible. SEOs should not use spam techniques to inflate the perceived relevancy of inferior sites.

Don’t disguise an inferior site – fix it. Don’t make site B appear to be more relevant than site A; actually make it more relevant.

While some search engines reward off-page SEO technologies, the improvements are often short-lived and of diminishing benefit. The cutoff for what is acceptable also changes daily, and tricks that work today can get you unlisted tomorrow. The goal of SEO is pages that are informative and contribute to the content, usability, and indexability of the site.

For too long SEO practitioners engaged in an arms race. Some saw their role as inventing more and more devious technology to trick search engines and beat competitors. Today, search engines have aggressive anti-spam programs, making this strategy ineffective. The news is out — if you want to get search engine rankings for your clients, you have to play well within the rules. And those rules are “no tricks allowed.”

Simply put, work on honest relevancy and win. All others will fade away.

Case in Point: Doorway Pages

At one time, the “doorway page” was used as a portal to dynamic content; several major engines even endorsed doorway pages as a way to organize and display content on your site. By 2002, however, the search engines had reversed their opinions, and today they consider doorway pages to be spam.

This proves that what works today may not work tomorrow. If you play with fire, you will regret it.

Our advice is to always play in the center of the acceptable area. We also advise you not to experiment with new ways to fool the engines and earn overnight rankings. Even research for the purposes of self-education can cause long-term issues with your rankings.

General SEO Standards to Practice

There are many different technologies and methodologies used by SEO practitioners. It is not the intent of a Code of Ethics to define HOW the code is met, but rather to set the bounds of compliance. Search engine acceptance depends upon meeting these codes plus SEO standards defined by each search engine. In general, if actions are in compliance with the Code of Ethics and meet the SEO standards of the search engines, then they are allowed.

But remember that what is an allowable trick today may be blacklisted tomorrow. It is better to focus on honest SEO than waste your time on something that will need to be abandoned soon.

These are general guidelines that may vary from search engine to search engine:

  1. Keywords should be relevant, applicable, and clearly associated with page body content.
  2. Keywords should be used as allowed and accepted by the search engines (placement, color, etc.).
  3. Keywords should not be utilized too many times on a page (frequency, density, distribution, etc.). They should be used naturally within the page content.
  4. Redirection technology (if used) should facilitate and improve the user experience. But redirection is almost always considered a trick and is a frequent cause for removal from an index. (A minimal sketch of an acceptable server-side redirect follows this list.)
  5. Redirection technology (if used) should always display a page where the body content contains the appropriate keywords (no bait and switch).
  6. Redirection technology (if used) may not alter the URL (redirect) or affect the browser BACK button (cause a loop). It also may not display page information that is not the property of the site owner without sound technical justification (e.g., language redirects).
  7. Pages should not be submitted to the search engines too frequently.
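
Mechanically, the kind of redirect items 4 through 6 allow is just a standard HTTP status code plus a Location header, so the visible URL actually changes and the BACK button keeps working. Below is a minimal sketch using only Python’s standard library; the /old-page and /new-page paths are hypothetical placeholders, not paths from any real site.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their replacements.
REDIRECTS = {"/old-page": "/new-page"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            # A permanent (301) redirect tells browsers and spiders alike
            # that the content has moved; the address bar updates and the
            # BACK button still works (no loop, no bait and switch).
            self.send_response(301)
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>Real page content</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()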

Every search engine should support at least the Robots Exclusion Standard (robots.txt). This is not always the case, but it should be.
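
The standard itself is just a plain-text robots.txt file at the site root. If you want to verify how a compliant crawler would read yours, Python’s standard library includes a parser; the domain below is a placeholder for your own site.

```python
from urllib.robotparser import RobotFileParser

# Parse a site's robots.txt the way a compliant crawler would.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the file

# Check whether a given user-agent may fetch a given path.
print(parser.can_fetch("Googlebot", "https://www.example.com/private/"))
print(parser.can_fetch("*", "https://www.example.com/blog/post-1"))
```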

Guidelines for a search engine or directory may further discuss relevance, spamming, cloaking, or redirection. Usually, these topics are discussed in a way that relates to user experience. In general, revising or adding content is good if it improves the user experience. This is the subjective area we must all interpret, and it is why rules change so often.

We recommend reading Google’s Search Quality Evaluator Guidelines document for specific examples of what the search engine considers high- and low-quality web content for search visitors.

The Players

There are three main players when it comes to search engine optimization:

  1. Clients – Owners of the website. Client emphasis is on sales, holding users (sticky), and user experience. There is also an emphasis on getting the visitor to take a desired action.
  2. Search Engines – Emphasis is on providing a positive user experience. This is achieved through relevance (controlled by algorithms) and minimal negative impact as a result of bait-and-switch technologies.
  3. SEO Firms – Professionals who obtain traffic for client sites as a result of a search engine query. This involves understanding the SE ranking algorithms and beating the competing SEO firms optimizing other clients for the same terms. It is important to follow SEO standards and remain within the “No Spam” boundaries (play within the rules) while doing this. SEO practitioners are paid by clients and rewarded for rankings, which creates pressure to deliver them at almost any price.

Unfortunately, if the rules change, sites may be dropped from search engine indexes. If algorithms change, sites may be lowered in the rankings. If competing SEO firms succeed in finding a new trick within the rules, site rankings may fall. If new competing client sites enter the market, site rankings may drop. And if the client site uploads altered pages or changes server technology, site rankings may drop.

SEO Processes

There are four main page-centric SEO processes used by search engine optimization firms:

  1. Editing Client Webpages: This means making revisions to a client site’s pages so that they can rank higher in the search engines. This is honest SEO work and involves editing real, honest website pages. The improvements better serve users and raise the quality of the page. This is the bread-and-butter of legitimate SEO firms and the clear winner when it comes to obtaining meaningful, long-lasting rankings.
  2. Man-Made Pages: These use a “doorway-like” technology (shadow page) that is keyword intensive. When visited, these pages should present an honest site page. This is a labor-intensive process that copies a real, honest page and then alters it to emphasize keywords found on that page. In some implementations, the shadow page loads the presented page into a frameset; in others, it redirects. This is not to be confused with web design, which adds extra content to a site that is intended for human visitors. ANY man-made page that is not intended for human visitors, no matter how great the content, is considered spam.
  3. Machine-Made Pages: These are often “doorway-like” pages whose content is pulled from other site content based upon keywords and then compiled by a software tool. Some tools generate pages using gibberish or templates that are easily detected by the search engines, and a single tool can generate thousands of pages in minutes. ANY machine-generated page that is not intended for human visitors, no matter how great the content, is considered spam.
  4. Cloaking: This is often associated with sites doing IP- and USER-AGENT-based serving, where the web server presents a page that varies based upon visitor characteristics. This technology can be used to present different content to each search engine or browser, so a search engine seldom sees the same content that is presented to a browser. While there are acceptable reasons to cloak (such as legal restrictions on what content may be shown based on a visitor’s age or location), cloaking that filters content based on whether the visitor is a spider or a human, no matter how great the content, is likely to be considered spam. (A sketch of the telltale user-agent test appears after this list.)
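
To make the cloaking pattern concrete, here is a minimal sketch, written as a Python WSGI app, of the user-agent test described in item 4. It is shown so you can recognize the pattern during an audit, not as something to deploy; the bot names and page bodies are illustrative only.

```python
from wsgiref.simple_server import make_server

def application(environ, start_response):
    """Illustrative only: the user-agent branching that defines cloaking."""
    ua = environ.get("HTTP_USER_AGENT", "").lower()
    is_spider = any(bot in ua for bot in ("googlebot", "bingbot", "slurp"))

    if is_spider:
        # Spiders get a keyword-stuffed page that human visitors never
        # see -- this branching is exactly what engines treat as spam.
        body = b"<html><body>keyword keyword keyword ...</body></html>"
    else:
        body = b"<html><body>The page shown to human visitors.</body></html>"

    start_response("200 OK", [("Content-Type", "text/html"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    make_server("localhost", 8000, application).serve_forever()
```

The acceptable uses mentioned above (age or location gating) branch on visitor attributes that apply equally to humans, not on whether the client is a crawler.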

Editing Focus/Methodology

The primary methods used to improve search engine ranking are discussed on our site. This section lists a couple of areas that are affected by current SEO standards and are called out for special notice.

  1. Navigation: the use of links to encourage spiders to locate content within the website, and to support popularity algorithms.
  2. Content: the inclusion of, or focus on, words, phrases, and themes associated with search engine query strings. (A rough keyword-density check is sketched after this list.)
  3. Transfers: pages that display (or transfer) to a real, honest page. These pages are keyword-rich and theme-based for search engines, yet they provide a reformatted page for the browser. This is very much like a standard frames implementation in conjunction with a search-engine-optimized no-frames area. This practice includes URL switching, where the displayed page has a browser address that differs from the URL in the search engine link (and is thus a redirection). It also includes instances where the browser BACK button causes a loop.
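
As a rough illustration of the frequency/density idea from item 2 (and guideline 3 earlier), below is one common way to compute keyword density. No engine publishes a target number, so treat this purely as a sanity check, and note that the sample text and phrase are made up for the example.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of the page's words accounted for by `phrase` occurrences."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits * len(phrase.split()) / max(len(words), 1)

page_text = "SEO standards matter. Honest SEO work beats SEO tricks."
print(f"{keyword_density(page_text, 'seo'):.1%}")  # 3 of 9 words -> 33.3%
```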

Bad Practice Issues

What makes a bad search engine optimization practice? When asking SEOs this question, spam and cloaking seem to be the leading answers. We present these items as bad practices and encourage others to submit ideas for this list as well. Some of these SEO practices were once accepted by the search engines but have become “bad” over time as the search engines have evolved to combat their individual notions of webspam.

Transparent, hidden, misleading, and inconspicuous links – This includes the use of any transparent image for a link and the use of hidden links (often buried in divs or layers). It also includes any link attached to a graphic with no words or symbols that convey where the link leads. Also included are inconspicuous links such as 1×1-pixel graphics or links placed on punctuation. All of these are considered spam and a cause for removal from a search engine index.
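
If you want to audit your own pages for the inconspicuous-link patterns above, a small parser can flag the obvious cases. This sketch uses Python’s built-in html.parser and checks only two of the signals named here (inline display:none/visibility:hidden styles and 1×1 link images); a real audit would need to cover external stylesheets and scripts as well.

```python
from html.parser import HTMLParser

class HiddenLinkAuditor(HTMLParser):
    """Flags the link patterns this section calls out as spam signals."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.findings = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a":
            self.in_link = True
            style = (a.get("style") or "").replace(" ", "").lower()
            if "display:none" in style or "visibility:hidden" in style:
                self.findings.append(("hidden link", a.get("href")))
        elif tag == "img" and self.in_link:
            if a.get("width") == "1" and a.get("height") == "1":
                self.findings.append(("1x1-pixel link image", a.get("src")))

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

auditor = HiddenLinkAuditor()
auditor.feed('<a href="/x" style="display: none">link</a>'
             '<a href="/y"><img src="p.gif" width="1" height="1"></a>')
print(auditor.findings)
# [('hidden link', '/x'), ('1x1-pixel link image', 'p.gif')]
```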

“Machine-generated” pages – While many content management systems do create webpages, this entry refers to deceptive content pages generated to alter search engine results. These pages are typically created by software that takes keywords and uses them to assemble a high-ranking page. Pages like this are unconditionally spam because of their low quality and lack of relevance to any search.

Cloaking – When cloaking is used to deceive search engine user-agents for the purposes of ranking, it violates SEO standards and is considered spam. The only exception is when there is no difference (by deletion, formatting, or insertion) between the content delivered to the visitor and the content delivered to the search engine spider. Where the stated objective of a tool (filtering by IP number or user agent) is to deliver differing content based upon visitor/search engine identification, the implementation of cloaking technology is considered BAD.

Search engines may remove cloaked sites from their index where deception is involved.

Content Farms/Article Directories – Before Google’s “Panda” update in 2011, article directories were a common way to increase PageRank. An article directory is a site that collects content about a specific subject. While collecting articles on a particular subject is not bad in itself, many of these sites were “content farms” that churned out low-quality content on a topic to trick search engines into increasing their PageRank. The Panda update demoted sites that engaged in this practice, and today search engines continue to filter out these low-quality pages as part of their core algorithms.

Spam – Spam runs the gamut from white-on-white text to flooding the web with free webpages/sites developed to boost popularity through links. This category still needs a clear definition, but it is the one most easily defined in “black and white” rules.

External factors, such as sites with numerous unnecessary hostnames, may also be caught. Other common spam techniques include excessive cross-linking between sites to inflate perceived popularity and the inclusion of obligated links as part of an affiliate program.

What the Engines Think Is Spam

Google

Google frequently updates its definitions of low-quality pages and webspam in its Search Quality Evaluator Guidelines. The latest edition also discusses low-quality pages that are not spam and simply miss the mark. For the purposes of this article, we will continue to focus on the definition as it relates to spam and not poorly made, well-intentioned pages.

Google has directed quality raters to rate a page as “low quality” if it contains low-quality MC (main content) or if the “title of the MC is exaggerated or shocking.” This specifically cracks down on clickbait types of headlines, which are less likely to gain a spot on the Google front page. Google provides additional context on this, noting that:

Exaggerated or shocking titles can entice users to click on pages in search results. If pages do not live up to the exaggerated or shocking title or images, the experience leaves users feeling surprised and confused.

Further explanations of low-quality content focus on whether there is “an unsatisfying amount of website information or information about the creator of the main content.” So contact information for the site owner, such as the business name, address, and phone number, should be easy to locate on the site. And if an author is named for an article, enough background information should be shown to establish the person’s credibility and support the quality of the content.

Bing

Bing defines spam as pages that “have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information.” Bing has stated that if pages are found to contain spam, Bing may remove them at their discretion. They may also adjust the Bing algorithms to focus on more useful information and improve the user experience. Yahoo gets its search results from Bing, so any spam standards that apply to Bing will also apply to Yahoo.

Bing information and reporting tool: https://www.microsoft.com/en-us/concern/bing

Summary

Sites that are not in compliance with SEO standards of quality are in danger of being removed from search engine indexes. Should all search engines enforce the same standards, many websites would be scrambling for honest SEO firms to optimize their sites. This creates an opportunity for SEO practitioners to set a standard for the future.

We encourage all who read this to be vocal with their staff, clients, and SEO providers about working toward compliance.

FAQ: How can I ensure my SEO strategies align with the best practices of SEO standards analysis?

Let’s explore some key insights to help you align your SEO strategies with industry-leading practices.

Understanding SEO Best Practices

To begin, it’s crucial to grasp the essence of SEO best practices. These practices are guidelines established by search engines and digital marketing experts that ensure your website is easily discoverable and provides valuable content to users. They cover aspects like keyword research, quality backlink building, mobile responsiveness, and user experience.

Navigating the SEO Standards Landscape

SEO standards are intricate, often guided by updates from search engines like Google. Keeping up with these changes is imperative. Regularly monitor industry news, follow reputable SEO blogs, and attend webinars to stay informed. Embrace a proactive approach, adjusting your strategies in response to algorithm shifts.

Data-Driven Decision Making

Incorporate data analysis into your SEO strategy. Leverage tools like Google Analytics and Search Console to gain insights into user behavior, keyword performance, and traffic sources. This data empowers you to refine your strategies based on real-time information, improving your site’s ranking and relevance.
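
For example, most of these tools let you export query-level performance data, which you can then mine for pages that earn impressions but few clicks. The sketch below assumes a hypothetical export named queries.csv with query, clicks, and impressions columns; adjust the file name and column names to whatever your export actually contains.

```python
import csv

# Minimal sketch, assuming a hypothetical export "queries.csv"
# with columns: query, clicks, impressions.
rows = []
with open("queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        clicks = int(row["clicks"])
        impressions = int(row["impressions"])
        # A low click-through rate on high impressions often means the
        # title or description misses searcher intent.
        row["ctr"] = clicks / impressions if impressions else 0.0
        rows.append(row)

# Surface the ten queries with the weakest CTR for review.
for row in sorted(rows, key=lambda r: r["ctr"])[:10]:
    print(f'{row["query"]}: {row["ctr"]:.1%} CTR')
```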

Content Quality and Relevance

Content remains the heart of SEO success. Crafting high-quality, relevant content that addresses user intent enhances user experience and signals to search engines that your website is authoritative. Regularly update and optimize existing content to align with current search trends.

Technical SEO Excellence

Technical SEO is the backbone that supports your content’s visibility. Ensure your website is well-structured, with clean URLs, optimized images, and fast loading times. Implement schema markup to help search engines understand your content better. A technically sound website enhances user experience and contributes to better search rankings.
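
Schema markup is usually added as a JSON-LD script block in the page head. As a minimal sketch, the snippet below builds an Article object with a few common schema.org properties; the headline, author, dates, and publisher are placeholders to replace with your own values.

```python
import json

# Placeholder values -- substitute your page's real details.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "SEO Standards and Spam: Best Practices",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-01-15",
    "publisher": {"@type": "Organization", "name": "Example Publisher"},
}

# Emit the <script> tag you would place in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(article_schema, indent=2))
print("</script>")
```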

By integrating these insights, your SEO strategy can go beyond conventional practices and yield sustainable results. Remember, SEO is an ongoing journey, not a one-time task. Stay adaptable, continuously refine your approach, and you’ll confidently navigate the dynamic world of SEO.

Step-by-Step Procedure: Optimizing SEO Strategies with Best Practices of SEO Standards Analysis

  1. Familiarize yourself with fundamental SEO concepts and practices.
  2. Stay updated on the latest industry trends through reliable SEO news sources.
  3. Research and understand the search engine algorithms, especially those of major players like Google.
  4. Attend webinars, conferences, and workshops related to SEO standards analysis.
  5. Regularly monitor your website’s performance using tools like Google Analytics and Search Console.
  6. Analyze user behavior data to identify areas for improvement in your SEO strategy.
  7. Conduct comprehensive keyword research to identify relevant search terms for your target audience.
  8. Develop a content strategy that aligns with user intent and incorporates high-value keywords.
  9. Ensure your website’s technical aspects, like page speed and mobile-friendliness, are optimized.
  10. Implement schema markup to provide search engines with additional context about your content.
  11. Focus on building high-quality backlinks from authoritative websites in your industry.
  12. Create and publish fresh, informative, and engaging content on a consistent basis.
  13. Regularly update and optimize existing content to reflect current search trends and user interests.
  14. Collaborate with web developers to address any technical issues that might impact SEO.
  15. Monitor your website’s ranking and traffic patterns to gauge the effectiveness of your strategies.
  16. Stay adaptable and adjust your strategies in response to changes in search engine algorithms.
  17. Leverage social media and other platforms to amplify the reach of your content and engage with your audience.
  18. Consider investing in paid search advertising to complement your organic SEO efforts.
  19. Engage with your audience through comments, shares, and social interactions to build a strong online community.
  20. Continuously educate yourself about evolving SEO standards and adapt your strategies accordingly.