SEO Hot Tub
Consumer Focus Is On Price: consumer trends show that shopping demand is no longer driven by brand as much as it used to be. Today, consumers online are looking for deals, discounts, incentives, free shipping and coupons. We have yet to see how this will impact SEO. While Google offers its own competing product in Google Shopping, comparison shopping engines gained ground over the past two years, reflecting the growing trend of users hunting for deals and discounts online before purchasing. While free shipping has historically been a strong differentiator, today roughly 51% of online shoppers are at least “somewhat likely” to cancel their entire order if free shipping is not provided.
Video & Visual Search Statistics: according to Bing, web visitors are able to process information up to 30% more rapidly when it is presented as a combination of video and text rather than text alone. Additionally, 65% of us are visual learners, more likely to digest information and learn from it when it is in a visual form. Most types of ecommerce, notably apparel and products suited to emotional and personal tastes, excel with visual additions.
Trends In Technical SEO: When Google Instant was announced, half of the SEO community panicked. The assumption was that the long tail would deteriorate as searchers naturally clicked on Instant's first suggestions, which are normally 1–3 terms in length. That has not been the case. Sampling about 600,000 Google search visits across a dozen ecommerce clients in the four weeks before Instant launched (September 8, 2010) and the four weeks after, the results showed very little, if any, impact on query length. The “May Day” update and other algorithm changes had a much more pronounced impact on search engine optimisation. Ecommerce companies need to focus on offering unique, high-quality content and resources as well as custom-written product text. Even more importantly, large ecommerce sites need to be fast.
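As a rough illustration of the kind of before/after check described here, this sketch compares the average number of terms per query across two periods. The query samples are toy data standing in for the real visit logs, not figures from the study:

```python
from statistics import mean

def avg_query_length(queries):
    """Average number of terms per search query."""
    return mean(len(q.split()) for q in queries)

# Toy samples standing in for the pre- and post-Instant visit logs.
pre_instant  = ["red running shoes", "cheap flights to rome", "laptop"]
post_instant = ["running shoes", "flights rome cheap deals", "gaming laptop"]

delta = avg_query_length(post_instant) - avg_query_length(pre_instant)
print(round(delta, 2))  # → 0.0, i.e. no shift in query length
```

A delta close to zero over a large enough sample is exactly the "very little, if any, impact" result the study reported.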
Google Places Search is now showing more map pins. Google appears to no longer arbitrarily limit local results to a fixed number per page when there are relevant blended Places results for the query.
Searchers found that merchants conducting bad business were still ranking high in SERPs. The NY Times and Bloomberg ran articles about a case in which the owner of DecorMyEyes.com leveraged negative publicity to garner links to the site. Google reacted quickly with an algorithmic solution that removed from the SERPs hundreds of merchants that “in our opinion, provide an extremely poor user experience.” Google isn't revealing the underlying signals and data points behind the algorithm, to keep bad merchants from deciphering a loophole.
Sentiment tracking is a method of tracking the public’s collective opinion on a subject based upon the frequency of keywords in news stories, comments, reviews and social media.
Sentiment tracking would take negative commentary about a service and translate it into strikes against that service. The reliability of a source must be established to determine the “weight” of a strike, and the degree of negativity per keyword would also be weighted, e.g. “average” vs. “abysmal”.
Google is already working with a world-class sentiment analysis system; however, Google says: “So far we have not found an effective way to significantly improve search using sentiment analysis. Of course, we will continue trying”.
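The weighting scheme described above can be sketched as follows. The reliability values and the negativity lexicon are illustrative assumptions, not Google's actual weights:

```python
# Illustrative negativity weights: "abysmal" counts for more than "average".
NEGATIVITY = {"average": 0.3, "poor": 0.6, "abysmal": 0.9}

def strike_score(mentions):
    """Total strikes against a service.

    mentions: list of (source_reliability, keyword) pairs, where
    reliability is in [0, 1]. Each strike is weighted by both the
    reliability of the source and the negativity of the keyword.
    """
    return sum(rel * NEGATIVITY.get(word, 0.0) for rel, word in mentions)

mentions = [
    (0.9, "abysmal"),  # reliable review site, strongly negative word
    (0.4, "average"),  # low-trust comment, mildly negative word
    (0.8, "great"),    # positive word, contributes no strike
]
print(round(strike_score(mentions), 2))  # → 0.93
```

Note how the reliable source using “abysmal” contributes far more (0.81) than the low-trust source using “average” (0.12), which is the double weighting the passage describes.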
Michael Gray covered this topic with a post on how to optimise an ecommerce site with category pages. He suggested:
- Listing all products on one page: if you have fewer than 200 products, list them on a single page. But what about user experience? You can display fewer by default and offer a “view all” option. Keep the bot away from the page that shows fewer products (nofollow links, a noindex meta tag, and a canonical tag pointing to the version you want indexed), and make sure this is configured correctly.
- Using AJAX: serve the bots an HTML version with 100 products, but serve people an AJAX version with 10 or 20 products, then use AJAX to reload/shift/change the products. Technically it's cloaking, but user intent is also considered (what Gray called “white hat cloaking”).
- Breaking down your category: if you have a lot of products per category, break the category into subcategories.
- Using pagination: if you cannot break the category into subcategories, list the most important products first so the search engine spiders reach them first. Make sure you interlink all the paginated pages without creating a bad user experience.
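The view-all setup from the first suggestion can be sketched as a small tag-building helper. The helper name and example URL are hypothetical; a real implementation would also nofollow the links pointing at the shorter pages, as Gray suggests:

```python
def head_tags(is_view_all, view_all_url):
    """Head tags for a category page under the view-all scheme (sketch)."""
    if is_view_all:
        # The full listing is the version we want indexed.
        return ['<link rel="canonical" href="%s">' % view_all_url]
    # Shorter paginated pages: keep bots from indexing them and point
    # them at the view-all version via the canonical tag.
    return [
        '<meta name="robots" content="noindex">',
        '<link rel="canonical" href="%s">' % view_all_url,
    ]

# Example: the default page showing 20 of 180 products.
for tag in head_tags(False, "https://example.com/shoes?view=all"):
    print(tag)
```

Whichever way this is wired up, verifying the output on both page types is the "make sure this is configured correctly" step: a noindex accidentally served on the view-all page would drop the whole category from the index.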
1) If an article is retweeted or referenced much in Twitter, do you count that as a signal outside of finding any non-nofollowed links that may naturally result from it?
Bing: We do look at the social authority of a user.
Google: Yes, we do use it as a signal.
2) Do you try to calculate the authority of someone who tweets that might be assigned to their Twitter page? Do you try to “know,” if you will, who they are?
Bing: Yes. We do calculate the authority of someone who tweets.
Google: Yes we do compute and use author quality
3) Do you calculate whether a link should carry more weight depending on the person who tweets it?
Google: Yes, we do use this as a signal, especially in the “Top links” section [of Google Real-time Search].
4) Do you track links shared within Facebook, either through personal walls or fan pages?
Bing: Yes. We look at links shared that are marked as “Everyone,” and links shared from Facebook fan pages.
Google: We treat links shared on Facebook fan pages the same as we treat tweeted links. We have no personal wall data from Facebook.
5) Do you try to calculate the authority of someone on Facebook, either say via their personal wall or their fan page?
Bing: We don’t do this on Facebook. On Facebook, we only get what’s public, only updates and things you’ve posted to everyone as viewable. We don’t get things only shared with friends, so we don’t know how authoritative you are on Facebook. There isn’t the whole convenient retweet mechanism we see on Twitter.
We do see valuable content shared by Facebook users, even though we only get what's public. For example, when Gary Coleman died we saw a video from Diff'rent Strokes, featuring his favorite line “what ya talk'in ’bout Willis”, gain popularity. It happened to be what a lot of people were sharing on the day he passed away.
Google: Again, the treatment is the same as for Twitter. And we have no personal wall data from Facebook.
6) Do you calculate whether a link should carry more weight depending on the person who shared it on Facebook?
Bing: We can tell if something is of quality on Facebook by leveraging Twitter. If the same link is shared in both places, it’s more likely to be legitimate.
Google: Same as question 5.
7) And just to be really clear, the new Facebook data is not yet being used in ordinary web search, right? (asked only of Bing, because it was only relevant to them)
There are three new ways to filter and refine your keyword list. With the new updates, you can:
- Choose specific terms to include or exclude from your keyword list.
- Use the ‘More like these’ button to search for terms that are similar to the specific keyword ideas you’ve selected from the table.
- Get only results that include the exact words or phrases (and their close synonyms) you’ve typed in the search box.
A new Google search feature lets you filter or annotate search results by reading level: basic, intermediate and advanced. You can have Google annotate the results with those labels, or show only basic, only intermediate or only advanced results.
Google is bringing its seller ratings ad format to the mobile platform, allowing searchers using Google on the mobile web to see ratings of merchants within a search ad. Google Mobile searches were up 130 percent year-over-year in Q3 of 2010.
Bing Facebook Integration: Bing has integrated Facebook ‘Likes’ into its results for a couple of months now — if a friend has ‘Liked’ an article that’s relevant to your query, Bing will note that. But these ‘Likes’ have been shown in a separate widget — soon, they’ll actually impact the search algorithm itself, so links will be reordered based on social signals.
Bing Maps: For its Maps products, Bing has shifted its focus from Silverlight to HTML5 and AJAX (this was announced last month). Bing Maps will start to show interior maps of shopping centres as part of its Maps, and it’s also partnering with EveryScape to showcase interior panoramas of venues.
Yahoo’s answer to YouTube will no longer accept video uploads, and all user-generated content (UGC) will be removed on March 15th, 2011. No word yet as to what will become of Yahoo Video; this comes at the same time as Yahoo announced a round of layoffs.
The factors Google takes into account when serving up local listings are:
- Distance to Search Query
BCI research suggests that the maximum distance a business can be from a searcher’s query is 30 miles (48 km). The closer the query is to the business, the more priority it will be given.
Participating in Boost and adding Tags does not result in preferential treatment.
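A minimal sketch of such a distance cut-off, assuming businesses are stored as simple (name, latitude, longitude) records and using the 30-mile radius BCI suggests. Google's actual distance weighting is of course unknown; this only illustrates the hard-radius part:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    R = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def within_radius(searcher, businesses, radius_miles=30):
    """Businesses close enough to the searcher to be local candidates."""
    return [b for b in businesses
            if haversine_miles(searcher[0], searcher[1],
                               b[1], b[2]) <= radius_miles]
```

Ranking by proximity within that radius (the "closer gets more priority" part) would then just sort the survivors by the same distance function.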
Bing (finally) Releases Backlink Data
The new functionality allows you to:
- Understand the number of inbound links over a period of time to your site
- See number of inbound links by page including URL and Anchor Text details
- Export link data for offline analysis
It is important to note that the count of inbound links is based on content stored in the Bing index, rather than a complete, comprehensive count of links between every page on the Internet.
Webmaster Tools has updated the “Search queries” and “Links to your site” features.
- The Search queries feature can now display the “Top pages” based on their performance in the search results.
- This will show the queries driving traffic to that page.
There are also pie charts to show the proportions of search type, location and traffic.
The “links to your site” feature now shows when a URL redirects.