February 8, 2007

The 950 Penalty or a New Ranking Theory?


I should preface this by saying everything you read below is pure speculation, but the fact that it’s pure speculation is why this post is so fun and you should read every single word without getting up for more water, checking your email, refreshing your MyBlogLog profile or using the restroom.

If you haven’t noticed, the much-hyped debate over the maybe-real/maybe-not-real Google Minus-950 penalty is heating up and actually getting interesting.

We first started hearing talk about a supposed Minus-950 penalty back in January, when several webmasters complained that pages that traditionally ranked very well were now being hidden on the last page of the search engine results. Even odder was that the only sites that seemed to be experiencing this problem were spam sites or high-quality niche sites. Sites that fell in the middle were left unaffected. (Or had just been taught that openly complaining is annoying and not a good way to make friends.)

At the time, the discussion in the forum focused on whether this penalty actually existed and why it was only certain pages that were being affected, not the entire site. Three explanations presented were:

  1. The penalty was a result of over-optimization, most often using too similar anchor text on a page.
  2. A sign that Google can’t differentiate between scraper sites and the original content producers. For example, Google runs across an instance of duplicate content in their index, doesn’t know which site to punish, so both sites are banished to the last page of the SERP.
  3. It’s somehow related to Google’s disengaging of the George W. Bush Googlebomb since both acts moved once high ranking pages out of a searcher’s view.

As you can see, the "explanations" offered read more like outright guesses, and no possibility seemed too outrageous. Like most forum debates, the conversation went largely unresolved.

The thread was rekindled today after WebmasterWorld administrator Tedster read through a patent filed (on my birthday) last June by Googler Anna Lynn Patterson. The patent is named "Detecting spam documents in a phrase based information retrieval system" and discusses a system where word-phrase frequency is used to determine if a page is spam. Some have called it a low-scale version of latent semantic indexing.

The patent’s abstract reads:

"Phrases are identified that predict the presence of other phrases in documents. Documents are then indexed according to their included phrases. A spam document is identified based on the number of related phrases included in a document."

The idea here (I think) is that too many similar or related phrases signal that a page is keyword stuffing rather than providing useful information to the reader. Based on that find, Tedster broke off the original thread and asked: is it a "950 Penalty," or is it phrase-based re-ranking? Basically, is the 950 Penalty real, or is Google re-ranking results due to phrase-based factors?
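To make the patent's core notion concrete, here is a toy sketch in Python. To be clear, everything in it is invented for illustration: the related-phrase table, the threshold, and the scoring are my guesses at the general shape of the idea, and the system the patent actually describes is far more elaborate.

```python
# Toy illustration of phrase-based spam detection: flag a document when it
# contains an unusually high number of mutually "related" phrases.
# The RELATED table and SPAM_THRESHOLD below are invented for this sketch.

# Hypothetical table: phrase -> phrases it statistically predicts
RELATED = {
    "cheap flights": {"airline tickets", "discount airfare", "last minute deals"},
    "airline tickets": {"cheap flights", "discount airfare"},
    "discount airfare": {"cheap flights", "airline tickets", "last minute deals"},
}

SPAM_THRESHOLD = 4  # invented cutoff, purely for the example

def related_phrase_count(text: str) -> int:
    """Count co-occurrences of related phrases for phrases found in the text."""
    text = text.lower()
    present = [p for p in RELATED if p in text]
    count = 0
    for phrase in present:
        count += sum(1 for rel in RELATED[phrase] if rel in text)
    return count

def looks_like_spam(text: str) -> bool:
    """A page packed with mutually predictive phrases trips the threshold."""
    return related_phrase_count(text) > SPAM_THRESHOLD

doc = ("Cheap flights and airline tickets! Discount airfare and "
       "last minute deals on cheap flights.")
print(looks_like_spam(doc))
```

Running this on the stuffed sample text prints True, because nearly every phrase in the document predicts the others, which is exactly the signature a filter like this would be hunting for.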

Tedster believes the latter, that this patent is responsible for the "penalty" site owners have been experiencing:

"My gut is telling me that this isn’t really a penalty, it’s an interactive effect of the way the Google dials have been turned in their existing algo components. It’s like getting a poor health symptom in one area of your body from not having enough of some important nutrient — even though you’ve got plenty of others and plenty of good health in many ways."

It’s hard to tell what’s going on, or if anything is going on at all. Tedster backs up his assertion by reporting that he knows of a site where "one solid new inbound link from a very different domain" solved the site owner’s problem. But another member says it took "de-optimizing" his site and lowering the keyword density of things like page titles and body content before his site regained its rankings. So we’ve come full circle.
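For reference, the keyword density that forum members talk about de-optimizing is just the share of a page's words taken up by the keyword. Here's a minimal sketch of the calculation; the function and sample text are my own invention for illustration, and no particular density figure is documented by Google as a limit.

```python
# Minimal keyword-density calculation: occurrences of a keyword as a
# fraction of total words on the page.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences divided by total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

body = "widgets widgets are great widgets for widget lovers of widgets"
print(f"{keyword_density(body, 'widgets'):.0%}")
```

The sample body scores 40%, which is the kind of number a "de-optimizer" would presumably want to drive down.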

What do you think? Is there really a penalty, or is there a filter that re-ranks results based on a sort of latent semantic indexing? If there is a penalty, is it just the MSSA penalty in disguise, or is it legitimate? Is one thing responsible for everything, or is it just easier for people to make up new Google penalties than to accept responsibility for a crappy site?

The conversation is still going on at WMW, so go check it out.





6 responses to “The 950 Penalty or a New Ranking Theory?”

  1. Nicola Riva writes:

    Very interesting post, Bruce. I believe it is more likely that the penalization effects are due to over-optimization than to a -950 penalty. I have had experiences with websites that ranked worse after a massive change to on-page factors. Looking deeply at the changes, these were all signs of over-optimization, especially built on phrases rather than keyword stuffing.

  2. Peter Dimov writes:

    Carel Translations was established in 1991 as a translation and secretarial services bureau. Our team of dedicated translators is experienced in all major industries such as economics, law, construction, engineering and computer science. Our loyal clients are local firms, foreign companies, universities and many more.

  3. CAD Website Design writes:

    Lisa,

    I love reading your blog postings, and I wish I could find someone like you to help develop my blog for my business.

    I have read the same posting by Tedster at WMW, and while I do believe there exists an over-optimization penalty, after reading through some of the complaints about the -950 penalty, I feel that it is more like the MSSA penalty (love that one by the way) :)

    I know that I look at on-page factors all the time for my personal sites. I still have not found any magic bullet, but I believe that a site can rank well if its owners follow best practices when designing it.

    While we enjoy building sites for others, I despise doing search engine optimization/marketing for our clients and would rather sub-contract it out to someone else.

    Thanks for the wonderful and entertaining articles you guys and gals here at Bruceclay.com post on a regular basis.

    Jeff Phillips -
    President -
    CAD Website Design

  4. Serkant Karaca writes:

    If this is a penalty, it should end in a finite time. People say that after de-optimization they regain their healthy positions, which means it’s a filter rather than a penalty.

  5. Rishi Modi writes:

    I’ve worked with one of my customers who said he had been hit by the 950 penalty. We did the following, and he regained his position in Google within one month’s time. (I don’t know if he was penalized, but when I came across the site there was no ranking, whereas the customer claims he had a good ranking earlier.)
    1) Got his W3C validation done.
    2) Worked on his site structure and removed all orphan and missing links.
    3) Did SEO copywriting for all the supplemental pages.
    4) Got him links from some authoritative websites like Yahoo! and Business.com, and a few others related to his niche.

    This strategy worked for this customer and he was happy. I’m not claiming that this is the way to deal with it, but it worked for one customer, so there’s no harm in trying. And anyway, it’s all white-hat SEO.

    Regards

  6. Man Ray writes:

    Reading on, I’m kind of wondering if there ever was a term like over-optimization. Something that’s over- or under-optimized isn’t optimal at all. There’s a specific penalty for every misdemeanor.


