The 950 Penalty or a New Ranking Theory?

I should preface this by saying that everything you read below is pure speculation, but the fact that it’s pure speculation is why this post is so fun and why you should read every single word without getting up for more water, checking your email, refreshing your MyBlogLog profile or using the restroom.

In case you haven’t noticed, the much-hyped debate over the maybe-real/maybe-not-real Google Minus-950 penalty is heating up and actually getting interesting.

We first started hearing talk about a supposed Minus-950 penalty back in January, when several webmasters complained that pages that had traditionally ranked very well were now being hidden on the last page of the search engine results. Even odder was that the only sites that seemed to be experiencing this problem were spam sites or high-quality niche sites. Sites that fell in the middle were left unaffected. (Or their owners have just been taught that openly complaining is annoying and not a good way to make friends.)

At the time, the discussion in the forum focused on whether this penalty actually existed and why it was only certain pages that were being affected, not the entire site. Three explanations presented were:

  1. The penalty was a result of over-optimization, most often from using overly similar anchor text on a page.
  2. A sign that Google can’t differentiate between scraper sites and the original content producers. For example, Google runs across an instance of duplicate content in its index, doesn’t know which site to punish, and so banishes both sites to the last page of the SERP.
  3. It’s somehow related to Google’s defusing of the George W. Bush Googlebomb, since both acts moved once-high-ranking pages out of a searcher’s view.

As you can see, the "explanations" offered read more like "outright guesses," and no possibility seems too outrageous. Like most forum debates, the conversation went largely unresolved.

The thread was rekindled today after WebmasterWorld administrator Tedster read through a patent filed (on my birthday) last June by Googler Anna Lynn Patterson. The patent is titled "Detecting spam documents in a phrase based information retrieval system" and describes a system in which word-phrase frequency is used to determine whether a page is spam. Some have called it a low-scale version of latent semantic indexing.

The patent’s abstract reads:

"Phrases are identified that predict the presence of other phrases in documents. Documents are the indexed according to their included phrases. A spam document is identified based on the number of related phrases included in a document."

The idea here (I think) is that too many similar or related phrases signal that a page is keyword stuffing and not providing useful information to the reader. Based on that find, Tedster broke off the original thread and asked: Is it a "950 Penalty"? Or is it phrase-based re-ranking? Basically, is the 950 Penalty real, or is Google re-ranking results due to phrase-based factors?
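To make that a little more concrete, here’s a rough, purely hypothetical sketch of the kind of check the patent seems to describe. None of this is Google’s actual system; the related-phrase lists, the threshold and the scoring are all invented for illustration.

```python
import re
from typing import Dict, List

# Hypothetical data: phrases that "predict" the presence of related phrases,
# per the patent abstract. A real system would mine these from its index;
# here they are hard-coded purely for illustration.
RELATED_PHRASES: Dict[str, List[str]] = {
    "digital camera": ["memory card", "optical zoom", "image sensor"],
    "cheap hotels": ["discount rates", "book online", "best price"],
}

# Arbitrary, made-up threshold: how many related-phrase hits an "honest" page
# might carry before it starts to look machine-stuffed.
SPAM_THRESHOLD = 5


def count_related_phrases(text: str) -> int:
    """Count occurrences of related phrases for every base phrase present in the text."""
    text = text.lower()
    total = 0
    for phrase, related in RELATED_PHRASES.items():
        if phrase in text:
            for r in related:
                total += len(re.findall(re.escape(r), text))
    return total


def looks_like_spam(text: str) -> bool:
    """Flag a document whose related-phrase count exceeds the (made-up) threshold."""
    return count_related_phrases(text) > SPAM_THRESHOLD


page = ("cheap hotels, cheap hotels! best price, best price, best price, "
        "book online, book online, discount rates, discount rates")
print(count_related_phrases(page), looks_like_spam(page))  # 7 True
```

In other words, mentioning a topic is fine; mechanically piling up every phrase that tends to co-occur with it is what would trip a filter like this.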

Tedster believes the latter, that this patent is responsible for the "penalty" site owners have been experiencing:

"My gut is telling me that this isn’t really a penalty, it’s an interactive effect of the way the Google dials have been turned in their existing algo components. It’s like getting a poor health symptom in one area of your body from not having enough of some important nutrient — even though you’ve got plenty of others and plenty of good health in many ways."

It’s hard to tell what’s going on, or if anything is going on at all. Tedster backs up his assertion by reporting that he knows of a site where "one solid new inbound link from a very different domain" solved the site owner’s problem. But another member says it took "de-optimizing" his site and lowering the keyword density of things like page titles and body content before his site regained its rankings. So we’ve come full circle.
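For the curious, "keyword density" here just means the share of an element’s words taken up by the target phrase. A quick, hypothetical way to eyeball it (the function and the example numbers are mine, not anything Google or Tedster has published):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` taken up by `keyword` (a rough, illustrative metric)."""
    words = text.lower().split()
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    hits = sum(1 for i in range(len(words) - len(kw) + 1) if words[i:i + len(kw)] == kw)
    return hits * len(kw) / len(words)


title = "Cheap Hotels | Cheap Hotels Deals | Cheap Hotels Online"
print(f"{keyword_density(title, 'cheap hotels'):.0%}")  # 60% -- a title like this screams over-optimization
```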

What do you think? Is there really a penalty, or is there a filter that re-ranks results based on a sort of latent semantic indexing? If there is a penalty, is it just the MSSA penalty in disguise or is it legitimate? Is one thing responsible for everything, or is it just easier for people to make up new Google penalties than to accept responsibility for a crappy site?

The conversation is still going on at WMW so go check it out.

Lisa Barone is a writer, content marketer & VP of strategy at Overit Media. She's also a very active Twitterer, much to the dismay of the rest of the world.


Filed under: SEO

11 Replies to “The 950 Penalty or a New Ranking Theory?”


Reading on, I’m kind of wondering if there ever really was such a thing as over-optimization. Something that’s over or under isn’t optimal at all. There’s a specific penalty for every misdemeanor.

I’ve worked with one of my customers who said he had been hit with the 950 penalty. We did the following and he regained his positions in Google within a month. (I don’t know whether he was actually penalized, but when I came across the site there were no rankings, whereas the customer claims he had good rankings earlier.)
1) Got his W3C validation done.
2) Worked on his site structure and removed all orphan pages and broken links.
3) Did SEO copywriting for all the supplemental pages.
4) Got him links from some authoritative websites like Yahoo, Business.com and a few others related to his niche.

This strategy worked for this customer and he was happy. I’m not claiming this is the way to deal with it, but it worked for one customer, so there’s no harm in trying. And anyway, it’s all white hat SEO.

Regards

If this is a penalty, it should end after a finite time. People say that after de-optimizing they regain their healthy positions, which suggests it’s a filter rather than a penalty.

Lisa,

I love reading your blog postings, and I wish I could find someone like you to help develop my blog for my business.

I have read the same posting by Tedster at WMW, and while I do believe there is an over-optimization penalty, after reading through some of the complaints about the -950 penalty, I feel that it is more like the MSSA penalty (love that one, by the way). :)

I know that I look at on-page factors all the time for my personal sites. I still have not found any magic bullet, but I believe a site can rank well if its owners follow best practices when designing it.

While we enjoy building sites for others, I despise doing search engine optimization/marketing for our clients and would rather sub-contract it out to someone else.

Thanks for the wonderful and entertaining articles you guys and gals here at Bruceclay.com post on a regular basis.

Jeff Phillips
President
CAD website Design


Very interesting post, Bruce. I believe it is more likely that the effects of the penalization are due to over-optimization than to a -950 penalty. I have seen websites that ranked worse after massive changes to on-page factors. And looking closely at the changes, they were all signs of over-optimization, especially involving phrases rather than plain keyword stuffing.
