
FEATURE: Bruce Clay's SEO Action Items for Google's Assault on Aggressive SEO

by Virginia Nussey, April 17, 2012

Estimated reading time: 4 minutes

Top takeaways:
• A reinclusion request response from Google uses the terms "organic" and "inorganic" to describe whether or not a link is topical.
• A future SERP may be made up of 3 organic results, 3 paid results and 3 local results.
• The decreased real estate of organic results points to a need for increased relevance, explaining Google's focus on improving organic ranking algorithms.

Google is taking aggressive action to reduce what it believes to be unfair or manipulative SEO practices. For a report of the events that have unfolded around over-optimization and unnatural linking, read this month's news article by Bruce Clay Australia.

For an analysis of how Google's latest efforts to stamp out spam may shape SEO strategies, industry veteran Bruce Clay weighs in on the over-optimization issue. Here, Bruce describes:

  • What to watch out for when evaluating the long-term security of your SEO strategy.
  • Possible technologies Google may be using to detect over-optimization, such as the Chrome browser.
  • Possible motivations for Google's reinvigorated offensive, including the possible future face of search results.

This video interview on Google's Offense Against Over-Optimization and Aggressive SEO was conducted Thursday, April 12, 2012. The transcript of the video follows.

SEO Newsletter: What's involved in Google's over-optimization considerations?

Bruce Clay: I think that repositioning things to move ads above the fold, ads that are images with links that leave your site, is something Google has openly said is spam and considers the mark of a low-quality site. I think that's part of over-optimization.

I think that if you just have a site and you're not repositioning things and you have a lot of navigation above the fold, that's just normal navigation. But if it all leaves your site, that's where the quality issue is.

I think that Google is also looking at sites from the standpoint of links. They're getting more aggressive on what they're referring to as "organic" and "inorganic" links. The concept of an organic link is that it's something that is earned and appropriate to the subject matter of your page, whereas an inorganic link is something that has a low overlap or incidence of complementary, supplementary or synergistic content.

If I had a site about computers, having links to my computer page from somebody that does Christmas cards is not necessarily considered organic. It's a link. It passes PageRank. They may not be selling links, but because it's inorganic it's immediately considered highly suspect. And I think that those kinds of inbound links draw a penalty. They are considered, certainly, a sign that you are out buying, soliciting or bartering to get links that you wouldn't earn naturally.
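To make the organic-versus-inorganic idea concrete, here is a minimal sketch of a topical-overlap test between a linking page and a target page. It is purely illustrative: Google has not published any such classifier, and the bag-of-words model and the 0.1 threshold are assumptions, not known values.

```python
# Illustrative only: a toy topical-overlap check between a linking page
# and a target page. Google's actual organic/inorganic classification is
# not public; the bag-of-words model and threshold are assumptions.
import math
import re
from collections import Counter

def term_vector(text: str) -> Counter:
    """Lowercased bag-of-words vector for a page's visible text."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine of the angle between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def looks_organic(linking_page: str, target_page: str,
                  threshold: float = 0.1) -> bool:
    """Flag a link as 'organic' when the two pages share enough topical
    vocabulary. The threshold is a made-up illustration."""
    return cosine_similarity(term_vector(linking_page),
                             term_vector(target_page)) >= threshold

# Echoing the interview's example: a Christmas-card page linking to a
# computer page shares almost no vocabulary, so the link looks inorganic.
computers = "laptop desktop computer hardware cpu memory reviews"
cards = "christmas cards holiday greetings handmade paper designs"
print(looks_organic(cards, computers))      # False: inorganic
print(looks_organic(computers, computers))  # True: organic
```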

(Where did you hear this term?)

Organic versus inorganic is in a reinclusion request response from Google.

What factors do you think influence the designation of links as organic or inorganic?

I also think that this is a complex thing, because the organic or inorganic designation might be influenced by multiple factors. One of those factors could be the trust of the sites, or the network of sites, that are linking to you and vice versa.

What we want to do from an SEO point of view is make sure that our inbound links are appropriate to the site, not just in the anchor text: the site linking to me should be organic, an appropriate match site for site. We want to make sure that I link out to sites that are organic, an appropriate match site for site. And we want to make sure that my on-page structure does not look manipulated for SEO purposes. Present the page the way the page should be presented.

The sites known to have been targeted by over-optimization penalties are arguably low quality. Is there any possible long-term impact for white hat SEOs in the future?

What I have as a personal psychological problem is that Google has been so vague about what over-optimization is. And we're going right back to where we were 12 years ago. As soon as Google defines what over-optimization is, they draw the line for the spammers. So, they must intentionally be vague about this. They cannot let the spammers know where the line is drawn in the sand. Being vague, everybody's sort of guessing.

The best that we can do is attempt to determine what over-optimization is. I am personally convinced that over-optimization comes down to the organic-versus-inorganic view of link structures and the network that your inbound links form. That is the biggest risk that Google faces: links are the easiest signal to acquire, and therefore the likeliest target for over-optimization. That's why I think that's a big part of it.

What role might Chrome be playing in Google's capability to detect over-optimization?

One of the things that has been a problem all along is that a spider doesn't really fully emulate a browser. But with Google owning a browser, they can just take part of the browser code and use that to render the page, to resolve CSS and JavaScript, and to pick up on the header stuff. They could even, if they wanted to, go as far as to identify themselves as a browser and not as Googlebot. They can determine what the page is pretty fully, as if it were rendering, and then use that as the basis for the spam filter instead of the old spider technology, which is read a page and pray.
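As a rough illustration of what browser-based evaluation makes possible, the sketch below uses Playwright's bundled Chromium as a stand-in for "part of the browser code" to render a page with CSS and JavaScript resolved, then reports external links positioned above the fold, the kind of layout signal described earlier in this interview. Nothing here is Google's actual pipeline; the 800-pixel fold height and the example URL are assumptions.

```python
# A minimal sketch of browser-based page evaluation, in the spirit of the
# interview. Playwright's Chromium stands in for Chrome's rendering code;
# this is not Google's spam pipeline.
# Requires: pip install playwright && playwright install chromium
from urllib.parse import urlparse
from playwright.sync_api import sync_playwright

FOLD_PX = 800  # assumed fold height; a real system would test many viewports

def external_links_above_fold(url: str) -> list[str]:
    """Render a page with a real browser engine (CSS and JavaScript
    resolved) and return external link targets positioned above the fold."""
    site = urlparse(url).netloc
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page(viewport={"width": 1280, "height": FOLD_PX})
        page.goto(url, wait_until="networkidle")
        # Pull each anchor's rendered position from the live DOM,
        # something a plain HTML spider cannot see.
        anchors = page.eval_on_selector_all(
            "a[href]",
            "els => els.map(e => ({top: e.getBoundingClientRect().top,"
            " href: e.href}))",
        )
        browser.close()
    return [a["href"] for a in anchors
            if a["top"] < FOLD_PX and urlparse(a["href"]).netloc not in ("", site)]

if __name__ == "__main__":
    # Hypothetical URL, for illustration only.
    links = external_links_above_fold("https://example.com/")
    print(f"{len(links)} external links render above the fold")
```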

Major moves by Google are strategic and take the whole search environment into account. Any big-picture theories about why Google's acting now?

You've probably heard me talking about it in training and at conferences in my presentations: the Rule of Threes. Pretty soon the first page of search results will be 3 pay-per-click results, 3 organic results and 3 Places results. That's for 70 to 80 percent of all queries. And the other 10 percent of the page will be engagement: videos, news, images, something like that.

So, the first page is only going to have 3 organic links that matter. The brands and the purely virtual companies, the ones that are entirely on the web, are not going to be able to compete against Places. A brand like eBay that doesn't have Places can't compete. Amazon can't compete in the Places area because there are no local stores. In order for the quality sites to show up, the sites that really belong in that organic space, one of the things that Google must do is get rid of the sites that vie for those rankings but don't deserve them. If Google is going to promote Places more, they're going to have to make the organic results better. I think that's one of the compelling reasons.

What actions, if any, do you recommend SEOs take now?

What we're going to find is Google is going to tighten down on the organic and inorganic concept of linking. If somebody links to you and there's no reason in Google's mind for them to link to you, that's inorganic; you will lose. If people link to you and it's totally expected that they would link to you, that is organic; it counts. I think they're going to apply that.

I think that on our websites, for both inbound and outbound links, we have got to start pruning: reviewing who we link to and who links to us, and cutting what doesn't belong. We have to get in and do pruning projects. I think that we as site owners have also got to be more careful about what our competition can do to us. I think that is an increasing threat, that it's going to get bigger and bigger, and that it's going to be a significant topic over the rest of this year.
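As a starting point for the kind of pruning project described above, here is a minimal sketch that inventories a page's outbound links by domain for manual review. The URL is hypothetical, and a real audit would also pull inbound-link data from a backlink tool, which this sketch does not attempt.

```python
# A minimal sketch of an outbound-link inventory for a pruning project:
# count the external domains a page links out to, so off-topic
# destinations can be reviewed by hand. Illustrative only.
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.hrefs: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def outbound_domains(url: str) -> Counter:
    """Count external domains a page links out to."""
    site = urlparse(url).netloc
    parser = LinkCollector()
    parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    domains = (urlparse(urljoin(url, h)).netloc for h in parser.hrefs)
    return Counter(d for d in domains if d and d != site)

if __name__ == "__main__":
    # Hypothetical URL; review each domain and decide: keep or prune?
    for domain, count in outbound_domains("https://example.com/").most_common():
        print(f"{count:4d}  {domain}")
```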


For permission to reprint or reuse any materials, please contact us. To learn more about our authors, please visit the Bruce Clay Authors page. Copyright © 2012 Bruce Clay, Inc.