Google Updates: A History of SEO from 2000-2010
The Search Engine Optimisation (SEO) industry has changed tremendously in the last ten years. Ever since Matt Cutts stated that Google makes 300 to 400 changes to its algorithm each year, it has been clear that rankings shift quite a bit, for a variety of reasons. For competitive queries like "car insurance" you can see changes on a near-daily basis, as Google continues to chase relevance for users. Over the past ten years, some of these changes have had disruptive impacts not only on the top-ranking results in the SERPs, but also on traffic to the websites behind those results and the businesses behind those websites. This post will cover some of the significant Google updates that have occurred since 2000.
A Brief History of SEO
Before there was even a word for Search Engine Optimisation, webmasters would discuss their strategies for getting websites ranking on forums. Webmaster World has a great post detailing the major events in SEO prior to 2000, dating all the way back to 1995 – the era when search engines were akin to the Yellow Pages, with AAA-style listings at the top. From then on, getting your websites to rank well has been a constant cat-and-mouse game between webmasters and the search engines: SEO was born.
At the time people were not even calling it "SEO", but they realised that they could manipulate the rankings and shared their strategies online (and presumably kept many secret as well). I highly recommend that you check out the post, as it is a fascinating read and gives you a real appreciation for how far we have come as an industry.
Fast forward to 2000, when Google broke onto the scene with its new PageRank algorithm, and it became clear that webmasters around the world had to adapt and change.
Google Updates from 2000 to 2010
Between 2000 and 2003 PageRank would generally be updated monthly, and rankings would fluctuate accordingly. I remember hearing Todd Friesen tell a story about long sleepless nights waiting for the updates to arrive, then panicking (and refreshing like crazy) until the new rankings resolved. Webmasters would post their findings on Webmaster World, and once the updates were complete they knew it was about a month until the next set arrived. It was in 2003 that the people at Webmaster World started naming the updates in the style of hurricanes, with Boston, Cassandra, Dominic, Esmeralda, all the way to the infamous Florida update.
During this time SEO was pretty spammy. It was all about getting high PageRank links, or even just links from wherever and whoever you could. Footer links on high PR pages would catapult you to the top, and link farms to throw PR to your websites were easy to deploy and effective.
Florida Update – November 2003
The Florida update was the first "game changer" update, as many top-ranking sites simply disappeared from the results. Sheer panic erupted across the board because it seemed that Google had finally cracked down on the manipulative tactics being used to get pages to rank.
Barry Lloyd and many others theorised that the engineers at Google had invented a way to detect pages that had been over-optimised and simply removed them from the index. Ian Lurie recalls that the sites that didn't disappear were the natural, content-rich ones with good, well-written copy.
Brandy Update – February 2004
The Brandy Update emphasised Latent Semantic Indexing – the idea of using synonyms on your website. Someone by the name of “GoogleGuy” (aka Matt Cutts) made an interesting point just before the update that webmasters who do not “think about search engines” generally do not bother to include word variants – and spammers can easily create doorway pages full of word variants. What does that mean? You can’t outsmart Google just by throwing keyword variations into your text – it has to be natural.
LSI is not simply opening up a thesaurus and replacing every Xth instance of "dog" with "canine". As Google crawls and indexes billions of pages, it gets a pretty good idea about word associations and what words should appear on a page. If you want to learn a bit more on the subject, read our post on Latent Semantic Indexing.
Alex Walker wrote a post over on SitePoint about the Brandy Update, highlighting the important changes he thought the update brought: an increase in index size; Latent Semantic Indexing (using synonyms); grouping websites into neighbourhoods; and de-emphasising on-page elements like <h1> and <b>.
Allegra Update – February 2005
In a press release, 6S Marketing reported that Allegra was a remedy for the "Sandbox Effect" that many websites had been facing since 2004, and there were many posts over at Webmaster World supporting this theory. The forums over at Search Engine Watch mentioned that LSI factors could have been given more weight; these were the two main changes reported about the update.
Bourbon Update – May 2005
The Bourbon Update was another big change to the algorithm, focused on getting rid of spam from the index. An article on COMMbits stated that its purpose was to tackle duplicate content, non-thematic linking, low-quality reciprocal links, and fraternal linking. In a post on Webmaster World, Matt Cutts had a very long discussion about general updates at the time and mentioned the re-inclusion of sites that had been removed from the index; there were also additional posts about breaking out of the sandbox.
Jagger 1, Jagger 2, and Jagger 3 – October 2005 to November 2005
The Jagger updates, like most of the major updates, were released with the intention of dealing with an increasing amount of webspam. The sites tackled were scraper sites, AdSense directory sites, pages using deceptive CSS techniques, and again dealing with reciprocal linking abuse.
Before the update, Matt Cutts issued a warning about hidden text on sites, and after Jagger he invited webmasters who thought their site had been mistakenly removed for hidden text or text links to ask for re-inclusion. Google holds several patents relating to relevancy, and it may have bumped up their importance in the algorithm. Here is a post that outlines one author's ideas on the changed ranking factors.
There was an interesting discussion over at Tech Patterns in which some webmasters noticed that their rankings did not fluctuate if they used white hat tactics, and that most of the plummets in rankings hit sites that had been reciprocal-linking and spamming their way to the top. Of course, this is one person's opinion on one forum, but it seems in line with the ultimate goal of these updates: Google wants to deliver clean, non-spammy, useful results.
Personalized Results – June 2005
In June 2005, Google made its first mass release of Personalized Results. The purpose of this change was to tailor the results shown to a user logged into their Google Account based on the websites they had visited and their previous searches. You can read more about this change over at Wikipedia.
BigDaddy – December 2005
The BigDaddy update was a software upgrade to Googlebot and affected the way Google dealt with links. Matt Cutts said that the kinds of sites affected "had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling." Web Workshop reported that this is when outbound links became an important factor, as linking to sites in spammy niches like omega-3 fish oil or ringtones could affect your rankings. Matt also mentioned the importance of having relevant links pointing to your site in order for Google to crawl more of its pages.
Reducing the Impact of Googlebombs – Jan 2007
We have all seen Googlebombs before – where a group of people tries to influence the rankings for an obscure term as a joke, such as the results for "Weapons of Mass Destruction" returning a fake 404 error page – and in January 2007 Google announced that it had tweaked the algorithm to detect them.
Universal Search – May 2007
Universal Search brought images, videos, news, maps, and websites together into a single set of results. Its impact on Search Engine Optimisation was that the SERPs now integrated material from Google's multiple channels, opening "back doors" to the first page: if you could get a video ranking, it could sneak to the top of the results for very competitive keywords.
Vince – February 2009
The Vince update, named after the Google engineer who developed it, is known as the update that boosted the rankings of popular brands. Matt Cutts went on the record saying that the change wasn't about boosting brands per se, but rather about putting more weight on domain authority, trust, and reputation (which big brands generally have). This change really highlighted the importance of working to establish credibility and build an authoritative domain.
Real Time Search – December 2009
Real Time Search was about including fresh, topical content in the SERPs. If there is an earthquake or other major event, it makes sense for queries about that location to bring up links to relevant news articles, even if those articles lack long-standing high-quality links or other time-tested signals of authority. RTS brought with it the idea of Query Deserves Freshness (QDF), as Google had to determine which queries need frequently updated results. Here's a video from Google about the change.
Caffeine – August 2009
On August 10, 2009, Google began inviting people to test its "next generation infrastructure". It finished rolling out in June 2010, touting "50 percent fresher results" and the largest index Google had ever collected. Google went from having layers of its index that each updated at a different rate (each requiring an entire re-crawl of the web before updates could be rolled out) to smaller portions that update on a continuous basis.
Mayday – May 2010
At the end of April / start of May 2010, Google made a significant change to its algorithm, looking for higher-quality sites to surface for long-tail queries. Search Engine Land reported that the sites hit hardest by the change were those with many product pages lacking strong links pointing to them. In a Google Webmaster Help video, Matt Cutts described the change and suggested ways people could improve the quality of their sites by asking themselves the following questions: "What sort of things can I do in terms of adding great content… [and] do people consider me an authority?"
Instant Search – September 2010
Google Instant is the most recent major change to the search engine, and it is all about updating results as you type. There was an abundance of speculation that this would have huge effects on search engine optimisation, but so far those fears appear to be exaggerated. You can read our post on Google Instant Search for more information.
In November, Instant was updated with "Instant Previews", showing users an image of a page before they click through to it.
“Decor My Eyes” Update – December 2010
A story erupted around the web this week about a website that was using bad customer service to generate incoming links, which in turn supported its rankings for many competitive terms. Today, Google announced that it has tweaked its algorithm to respond to cases like this and try to prevent them from happening in the future.
After looking at the changes Google has made in the past ten years, I think the biggest conclusion to draw is that you need flexibility in your approach and definitely should not focus on a single ranking factor. The algorithm is tweaked hundreds of times each year, and what works today might not work tomorrow or, in extreme cases, may even be considered spam. It's clear from Google's relentless pursuit of removing spam from the index that your efforts should go into producing quality content and establishing credibility and authority by attracting natural, relevant links from authoritative sites. At Bruce Clay we break these down into Technical, Expertness and Content.
What do you think? Have I missed any big updates? Anything you would like to add? I would love to hear your thoughts in the comments.