SEO Factors & Trends 2010 – July Update

In January 2010, Bruce Clay Australia released its annual SEO report including key 2009 ranking factors and major trends for 2010. Six months later, the search engine landscape has evolved dramatically, confirming these trends and introducing new dimensions to them.

The key factors with increased importance in 2010 are speed of site, mobile search, online brand presence, social media optimisation, personalised search, linking, richer search listings, spidering and new developments like Google’s infrastructure update, Caffeine, and the Mayday algorithm update.

Speed of Site

Earlier this year, page speed became a ranking factor in the Google algorithm, so monitoring the load speed of (at least) your key ranking pages is critical to keep them ranking. Tools such as Yahoo YSlow or Google PageSpeed give webmasters solid recommendations for improving page speed, and running these tests on your own and your competitors’ key ranking pages is a good way to understand what needs to be done to make your website faster. Google Webmaster Tools also benchmarks your site against other sites, and has recently rolled out new features breaking down the load speed (in Google’s eyes) of some key pages. Site speed is an ongoing race involving all of a website’s stakeholders. Don’t let your competitors outrun you.
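
Before reaching for YSlow or PageSpeed, a rough baseline can help. The sketch below is a minimal example: the URLs are placeholders for your own and your competitors’ key ranking pages, and it simply times how long each page’s HTML takes to download.

```python
import time
from urllib.request import urlopen

# Placeholder URLs: swap in your own and your competitors' key ranking pages.
PAGES = [
    "http://www.example.com/",
    "http://www.example.com/key-landing-page",
    "http://competitor.example.org/",
]

def download_seconds(url):
    """Time how long it takes to fetch the full HTML of a page."""
    start = time.time()
    with urlopen(url, timeout=30) as response:
        response.read()  # force the whole body to download
    return time.time() - start

for url in PAGES:
    print("{:<45} {:.2f} s".format(url, download_seconds(url)))
```

This only measures raw HTML download time; the dedicated tools also look at images, scripts and caching headers, but even a crude benchmark run regularly will show whether you are falling behind.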

Mobile Search

Mobile search has also evolved in interesting ways since last year. While Google has enjoyed a near monopoly on search on the world’s leading mobile phone, the recent addition of Bing as a search engine option on the iPhone could shake the search giant’s well-established position. Rumours have been circulating about Bing becoming the default search engine on the iPhone, but they remain unsubstantiated. Research into mobile search behaviour has shown that searches on high-end phones are as complex as desktop searches, and in some ways even more diverse. Optimise your site so that it displays well on mobile browsers to lower your bounce rate, and optimise for local searches to capture the “instant searcher” looking for something right here, right now.
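
As one illustration, a common approach at the time was to detect mobile browsers by user agent and serve a lighter template. The sketch below is a minimal, hypothetical example; the user-agent substrings and template names are illustrative only, not an exhaustive or authoritative list.

```python
# A minimal sketch of user-agent based mobile detection, assuming your site
# serves a separate mobile-optimised template. The substrings below are
# illustrative, not an exhaustive or authoritative list.
MOBILE_UA_HINTS = ("iphone", "ipod", "android", "blackberry", "opera mini", "windows phone")

def is_mobile(user_agent):
    """Crude check for a handful of common mobile browser user agents."""
    ua = (user_agent or "").lower()
    return any(hint in ua for hint in MOBILE_UA_HINTS)

def choose_template(user_agent):
    # Hypothetical template names: swap in whatever your platform uses.
    return "page_mobile.html" if is_mobile(user_agent) else "page_desktop.html"

print(choose_template("Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_0 like Mac OS X)"))
```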

Online Brand Presence

We believe online brand presence remains a key strategy for all brands. It can be complex to execute in large organisations, where education becomes essential to ensure that all external communication follows a set of SEO guidelines, securing the optimal association of your brand with key ranking keyword phrases. Google recently released a new brand suggestion feature for generic queries, aimed at helping users find the most popular brands on the Web related to their search query. The feature is determined algorithmically, so improving your online brand presence could get your brand suggested to search engine users for your most relevant keyword phrases.

Social Media Optimisation

Social media can be a love / hate relationship for marketers. These new marketing channels are exciting and can be used in interesting ways, but in many cases social networking sites do not drive significant traffic volumes to your site and will cost you serious time if not implemented well. Whether you adopt an offensive or defensive approach, you have to be there. And why wouldn’t you, when Hitwise released a report in March 2010 showing that Facebook attracted more visits than Google over that month?

This has become even more important since Facebook rolled out a “Like” feature for all websites. We have already seen the benefits of the Facebook Connect tool, but the Open Graph opens new doors for the social media giant. I laid out my theory over a year ago in a blog post about Facebook Search, and it looks like Facebook is getting even more ambitious. Since then, Bing has started powering Facebook search, the social network’s first step into the search market, but most importantly, Facebook has now started to leverage its entire community to “Like” pages.
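
Implementing the Like button involves adding Open Graph meta tags to your page head so Facebook knows what is being liked. The sketch below is a minimal illustration; the property names follow the Open Graph protocol, but all the values are placeholders for your own pages.

```python
# A minimal sketch that renders Open Graph meta tags for a page head.
# Property names follow the Open Graph protocol; all values are placeholders.
def open_graph_tags(title, url, image, site_name, og_type="website"):
    properties = {
        "og:title": title,
        "og:type": og_type,
        "og:url": url,
        "og:image": image,
        "og:site_name": site_name,
    }
    return "\n".join(
        '<meta property="{}" content="{}" />'.format(prop, value)
        for prop, value in properties.items()
    )

print(open_graph_tags(
    title="Example Product Page",
    url="http://www.example.com/product",
    image="http://www.example.com/images/product.jpg",
    site_name="Example Store",
))
```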

We all know that relevance is the holy grail of search engines, supported mainly by three pillars, one of which is the expertise of your site. So far, the best proxy search engines have found for measuring a website’s authority has been links, and that made sense. But what if search engines started to introduce, or even substitute in, human reviews as a ranking signal; wouldn’t that be more relevant? Facebook has a major asset here: it receives votes of confidence from its massive user base about how relevant a website is to them. This new signal also has the advantage of being harder to spam, with the obvious constraint that you need a Facebook account to vote. The democratic power of a closed network may be questionable, but it has the potential to bring more relevancy to search results, and social votes may become the new links from an SEO perspective. So make sure you stay up to date with all these social media innovations, because if you don’t get in now, your competitors will.

Personalised Search

Have you recently typed in an important keyword and been surprised at how high your rankings are? This can be confusing for people unaware of the recent personalisation of search results released by Google. We have all seen it, to the point where, for a few seconds, we may have mistakenly thought our website was starting to rank for all sorts of highly competitive search queries. Well, this is not the case. The important point here is to make sure that all stakeholders in your SEO project are aware of this and don’t start drawing conclusions based on false results. To check non-personalised search results, some people use “incognito” browsing or other less personalised methods such as clearing the cache and cookies, but we have seen discrepancies in these results and would advise using only tools that pull data straight from the APIs, such as our SEO tools suite, to create your ranking reports.
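
For a quick informal check, one widely cited trick at the time was appending the pws=0 parameter to a Google query URL to switch personalisation off. The sketch below simply builds such a URL; it is not a substitute for API-based ranking reports.

```python
# A minimal sketch that builds a Google query URL with personalisation
# switched off via the pws=0 parameter, a widely cited (if informal) check
# at the time of writing. For proper ranking reports, use API-based tools.
from urllib.parse import urlencode

def depersonalised_query_url(keyword, domain="www.google.com.au"):
    params = {"q": keyword, "pws": "0"}
    return "http://{}/search?{}".format(domain, urlencode(params))

print(depersonalised_query_url("sydney seo services"))
```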

Linking

Google has recently been granted a new patent based on the “reasonable surfer” model. This patent explains how Google may give more or less weight to the different links on a page based on certain criteria.

These are:

  • User behaviour based on a set of documents = Support your content with other relevant content (Bruce’s famous siloing concept.)
  • User behaviour related to the links = Make sure your important links are prominent.
  • Model based on user behaviour data = Only link to relevant content to avoid bounces.
  • Weight within a set of documents = Organise your content into a hierarchy to ensure the most important content ranks.
  • Model based on a link associated with a linking document = Use good anchor text.
  • User behaviour data relating to navigational actions = Choose your navigation wisely and limit the number of “templated” links on the page.
  • Weight based on user behaviour data and document features = Create good content that users will actually like.
  • Weight for a link based on features of the link, linking document or linked document = Choose descriptive anchor text for your links and use contextual links to link to other relevant documents.
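
To make the idea concrete, here is a toy illustration of how links on the same page might be weighted differently based on their features. This is emphatically not Google’s algorithm, just a sketch of the concept the patent describes.

```python
# A toy illustration of "reasonable surfer" style link weighting. This is
# NOT Google's algorithm; it simply shows how links on the same page can
# carry different weight depending on their features.
def link_weight(link):
    weight = 1.0
    if link.get("in_main_content"):      # contextual links tend to matter more
        weight *= 2.0
    if link.get("in_footer_or_nav"):     # templated links tend to matter less
        weight *= 0.3
    if link.get("descriptive_anchor"):   # "blue widgets" beats "click here"
        weight *= 1.5
    if link.get("above_the_fold"):       # prominent placement
        weight *= 1.2
    return weight

links = [
    {"href": "/widgets/blue", "in_main_content": True,
     "descriptive_anchor": True, "above_the_fold": True},
    {"href": "/privacy-policy", "in_footer_or_nav": True},
]
for link in links:
    print(link["href"], round(link_weight(link), 2))
```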

Google is pushing harder and harder in the battle against spammers, even calling on everyone to submit external spam reports to help fight link spam.

This confirms that getting links is not about spamming directories, forums or discussion boards; it’s about creating content that is worth linking to. While you still need to make sure your website is well referenced across the Web, do not spam. You may get caught, and that could be the end of your organic search engine traffic and, in some cases, your business; spam isn’t worth it. You could spend an hour spamming the hell out of Yahoo Answers, or you could spend an hour writing great content for your website. Which one would you choose?

Richer Search Listings

Search engines are introducing new dimensions in search, bringing more diversified and relevant types of results to users. One of our directors at Bruce Clay Australia recently spoke at SMX Sydney about Universal Search results. Last year, I blogged about the potential of Rich Snippets, and since then HTML5 has introduced a new structured data format named Microdata, while Google continues to recognise more mark-up types such as Events, Recipes and even Video.

Getting your website to rank for the different types of search options increases your presence in the search results, and implementing structured data on your site increases the chance of triggering Rich Snippets in your listings, providing valuable additional information to users.

When it comes to Product search, Google has now become the most visited shopping comparison site on the Web, so if you have an e-commerce site, implementing these new mark-up standards can give a significant boost to your traffic and revenues. So mark up your site to diversify your search results and increase your click-through rate and revenue.
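
As an illustration, the sketch below renders a simple HTML5 Microdata block for a product. The vocabulary URL and property names are indicative only; check the search engines’ rich snippets documentation for the exact fields they support.

```python
# A minimal sketch that renders HTML5 Microdata for a product listing.
# The vocabulary URL and property names are illustrative; check the search
# engines' rich snippets documentation for the exact supported fields.
def product_microdata(name, price, currency="AUD"):
    return (
        '<div itemscope itemtype="http://data-vocabulary.org/Product">\n'
        '  <span itemprop="name">{name}</span>\n'
        '  <span itemprop="price">{currency} {price}</span>\n'
        '</div>'
    ).format(name=name, price=price, currency=currency)

print(product_microdata("Example Blue Widget", "49.95"))
```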

Spidering

Search engines are getting better and better at finding and indexing content, but some technologies remain more spider-friendly than others, so make sure you are using the most appropriate technology for search engines on your website. Website architecture is also key: a good structure makes it easier for spiders to find your content and maximises the proportion of your site that gets indexed.

New Developments

Since January, Google has rolled out new features to optimise the spidering of images using image XML Sitemaps. Google now also supports multi-content XML Sitemaps, where you can specify the different types of content for each URL, all in one XML Sitemap.
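
As an illustration, the sketch below assembles a simple image XML Sitemap. The image namespace is the one Google documents for image sitemaps; the page and image URLs are placeholders.

```python
# A minimal sketch that builds an image XML Sitemap. The image namespace is
# the one Google documents for image sitemaps; all URLs are placeholders.
SITEMAP_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
{entries}
</urlset>"""

ENTRY_TEMPLATE = """  <url>
    <loc>{page}</loc>
    <image:image>
      <image:loc>{image}</image:loc>
    </image:image>
  </url>"""

pages = [
    ("http://www.example.com/widgets", "http://www.example.com/images/widget.jpg"),
]
entries = "\n".join(ENTRY_TEMPLATE.format(page=p, image=i) for p, i in pages)
print(SITEMAP_TEMPLATE.format(entries=entries))
```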

Additional major changes in the search engine world over the last six months include:

Google Caffeine

The full release of Google Caffeine has impacted SEO, as Google is now much faster at discovering and indexing content. This means that websites with more diversified and fresher content will be spidered and indexed more regularly.

Mayday Update

In an attempt to improve relevance for longer-tail search queries, Google has updated its algorithm to surface higher quality results for these queries, reshuffling some rankings in the process. Matt Cutts said at Google I/O in May 2010 that “this is an algorithmic change in Google, looking for higher quality sites to surface for long-tail queries. It went through vigorous testing and isn’t going to be rolled back.” Essentially, this has negatively impacted websites that were ranking for long-tail queries without great content and links, while rewarding websites that have created good content for their users and generated quality links. Matt Cutts also mentioned that it was affecting long-tail queries “more”, implying that the change could also impact head-term queries. He confirmed it is an improvement in search quality, and that is for the best.

So far, this year has been a busy one in the search engine world, and we will publish another issue of our SEO Factors & Trends report early next year to summarise what we believe were the key SEO factors of 2010 and what the key trends will be for 2011.
