Ask the SEOs – SMX East 2009
Danny’s assembled a grizzled group of veteran SEOs. They’re running through intros. Here’s yours:
Moderator: Danny Sullivan, Editor-in-Chief, Search Engine Land
Greg Boser, President and CEO, 3 Dog Media
Bruce Clay, President, Bruce Clay, Inc.
Vanessa Fox, Contributing Editor, Search Engine Land
Todd Friesen, VP of Search, Position Technologies
Rae Hoffman, Owner, Sugarrae Internet Consulting
Stephan Spencer, President & CEO, Netconcepts
Aaron Wall, Author, SEO Book
Q: The Canadian portion of our site is under a different root folder but has the same content as the U.S. portion of the site. Is that duplication a problem?
Vanessa: Is it on a .ca site?
Stephan: Did you set the geographic target in Webmaster Central?
Vanessa: Do that first. But you may want to move it to a .ca domain.
Aaron: And redirect the Canada folder to the new domain.
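Aaron's folder-to-domain redirect could look something like this; a minimal sketch assuming an Apache server, a Canadian section under /ca/, and placeholder domains (example.com, example.ca):

```apache
# Hypothetical .htaccess on example.com: 301-redirect everything under
# the old /ca/ folder to the new .ca domain, preserving the path.
RewriteEngine On
RewriteRule ^ca/(.*)$ https://www.example.ca/$1 [R=301,L]
```

The permanent (301) redirect is what signals the engines that the Canadian content has moved for good, rather than existing in two places.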
Q: How do you deal with content when it’s been scraped from your site?
Stephan: In the bio of the article, link to the original URL of that article. When it’s ripped off it will be linking to the original version. Try to get all the duplicates to point to the canonical version of that article.
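A sketch of Stephan's link-back trick, with a placeholder site and URL: put an absolute link to the canonical article URL in the byline, so a scraper that rips the markup wholesale carries the link with it.

```html
<!-- Article byline with an absolute link back to the canonical URL.
     A scraper that copies the HTML copies the link too, pointing
     readers (and engines) back at the original. -->
<p class="byline">
  Originally published at
  <a href="https://www.example.com/articles/duplicate-content-faq">example.com</a>.
</p>
```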
Greg: If you have your site ripped off from a site with more authority, then that site will probably rank for it. But typically that’s not the case.
Rae: If a scraper site is outranking you, you have bad SEO.
Vanessa: File a DMCA complaint.
Aaron: You can use it to your advantage as a name-and-shame.
Vanessa: Yeah, like, look at this big brand that obviously likes my content.
Bruce: Register the copyright on your content so you have recourse.
Q: Will bounce rate or time on site ever become a factor in the ranking algorithm?
Vanessa: If you’re talking about analytics data, and Google using analytics data to rank your site, it would be hard for Google to rank things that way because not everyone has Google Analytics on their site. If you think about why you want to rank on search engines, it’s to get people engaged on your site. High bounce rate is a signal that you’re not doing that. That’s a bigger issue to me.
Todd: There are sites where a user clicks through, gets what they want in 10 seconds, then hits the back button. There are too many cases like this for Google to roll bounce rate into the algorithm. If you have a ridiculously high bounce rate, don’t worry about your ranking. Worry about why you have a ridiculously high bounce rate.
Rae: I would just assume that they use everything. They say they don’t use the analytics data, and maybe they don’t, but the things they can tell about your site are beyond anything we can comprehend. So just assume they know everything.
Greg: There’s a period when your page sits around positions 12 to 17 and it’ll sometimes jump onto the first page. We call that the audition period. It’s when Google is testing your site in place of something else in a front-page position. We’ve found that during that time, if you concentrate on improving bounce rate, it’ll stick on the front page faster.
Bruce: It would be an easily spammed factor.
Danny: Google has lots of ways to cross check the data, though. So they can tell if there’s behavior out of the ordinary.
Q: My site recovered from a Google penalty six months ago for spammy backlinks. Is it safe to launch a sub-domain now?
Unanimous: Adding content won’t hurt you.
Q: What do you think of the value of sub-domains in terms of ranking power? In the last few months I’ve noticed that Google has devalued keywords on sub-domains.
Vanessa: I think that value is mostly in the anchor text of links.
Todd: Think of why you’d use a sub-domain. I don’t think using it just to add another keyword is that useful.
Vanessa: When monitoring someone else’s site, it can be hard to know what really caused the effect you’re noticing. You don’t know what else is happening on the site.
Greg: Any large scale project we do, we typically sub-domain. You don’t want to be overly granular with it — engines don’t like that. It’s seen as spam. But if you have a site with a very large topic base, you’ll find you can rank for head-related terms by breaking top-level categories into sub-domains. It’s also good for brand protection. For your brand name it will give you more listings.
Todd: For your brand, you should have at least two listings: your site and an indented listing. With a couple sub-domains you can have 40 percent of the page.
Q: Any insights on Google Caffeine?
Danny: Google hasn’t really said anything interesting about it yet. They’ve said they’re trying some new crawling, maybe some new ranking things.
Vanessa: I know that Google said it’s primarily an infrastructure change, not a ranking change. Though some people I’ve talked to have noticed a ranking change.
Todd: I’ve seen Universal Search results missing, like video results missing. But generally with ranking, we haven’t seen any big changes.
Greg: I’ve got a tool that collects data on both Google standard and Caffeine every day and have noticed a lot of changes. A trend toward home pages — trying to return the best site, not necessarily the best page.
Bruce: I’ve found pages in Caffeine that are older than the standard index. So it appears that the regular index is updating faster than the Caffeine index. If you see ranking changes, it may be that your competitor’s site has been spidered more recently than yours. It’s hard to compare the two indices because they’re not in sync.
Todd: Keep in mind that the only people that know about Caffeine and are clicking on things are a bunch of search marketers.
Danny: I think Google’s also feeling a lot of pressure over Bing. No one thinks Google does search anymore. This is something they can point to and say, look, we’re still working on search.
Q: Any tips for Google Suggest?
Danny: In some cases you’ll get sites that come up. Typing “rhyme” into the search box suggests www.rhymezone.com.
Aaron: We actually were able to create phrases. If your domain name matches your keyword, you probably get a boost. A lot of Suggest looks to be navigational, as well.
Todd: I’d like to be able to clean up the suggestions from a brand management standpoint. Google Suggest is hurting you before the user even gets to the results page.
Vanessa: You can take advantage of the knowledge of the intent of the query. For instance, Bing breaks the query down into categories. You can drive interest by speaking to those categories.
Q: What’s the difference between Alexa ranking and Google PageRank?
Vanessa: Well, they’re the same in that they’re both mythical numbers.
Danny: It’s like two independent critics reviewed a movie. There’s no relation.
Vanessa: Alexa is skewed because it only counts users that have a toolbar. A better way of gauging your site than PageRank is looking at your inbound links.
Rae: Don’t waste your time monitoring that number. Spend your time improving your site.
Todd: The only thing to be concerned about with toolbar PageRank is whether it’s white or grey. If there’s even a pixel of green, you’re fine.
Q: What do you recommend for Sitemaps and making sure your site is crawled well?
Vanessa: I think you may as well submit an XML Sitemap, because it doesn’t hurt you and it gives the engines a better picture of your site. It doesn’t replace good information architecture. Submitting it to Google can also help you with diagnostics, since you’ll get more accurate metrics than the site: operator will give you.
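For reference, a minimal XML Sitemap per the sitemaps.org protocol (the URLs are placeholders; only the loc element is required for each URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2009-10-06</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```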
Aaron: If you block a page in robots.txt and someone links to it, it can still end up in the index. Instead, include a noindex meta tag in the head of the page. But if you use the noindex tag and also exclude the page through robots.txt, the page won’t be crawled, so the engines will never see that it’s supposed to be noindexed.
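A sketch of Aaron’s point in markup: the noindex directive only works if the crawler is allowed to fetch the page and read it.

```html
<!-- On the page you want kept out of the index: -->
<head>
  <meta name="robots" content="noindex">
</head>
<!-- And make sure robots.txt does NOT Disallow this page's path;
     a blocked page is never fetched, so the noindex is never seen. -->
```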
Q: Should you pull old URLs from Sitemaps after you’ve redirected pages?
Wait for the redirects to get picked up. It’ll tell you in Webmaster Tools.
Q: For Flash, when should I use SWFObject?
Vanessa: The biggest problem is that you need a separate URL for each interaction, otherwise it’s a mass of interactions.
Todd: You’ll come across sites that paid $2 million for an amazing Flash site. And they come to you and say we need SEO help. You can’t look at them and say, “You’re going to have to scratch that.” Copy the site in HTML.
Vanessa disagrees that you should just cloak your site as the solution.
Bruce: You can copy your top nav as links in the footer.
Q: How do you determine if a site is authoritative?
Rae: Does it rank well for main keywords?
Stephan: Does it have Sitelinks for non-brand keywords?
Vanessa: If you’re asking about sites that are linking to you: is there active stuff happening on the site? Is it referring lots of visitors?
Greg: The time factor is one that you can’t fudge. Sometimes you may want to acquire a domain.
Vanessa: Acquiring a site and not changing the whois info is a little risky.
Q: Are the guidelines the same for optimizing dynamic sites?
Vanessa: Google wrote a post on their blog this morning about dynamic parameters.
Q: What are the dumbest SEO mistakes you’ve each seen?
Todd: Pier One Imports launched a new site with breadcrumbs that followed the click path. It was totally unspiderable. They ended up just shutting off the ability to buy products on the site altogether because they didn’t want to fix it. You still can’t buy anything from their site.
Rae: At a site clinic several years ago, one of the sites someone submitted was a scraper site.
Bruce: During a site review, one site had linked to all their other sites in the footer. In one of their services phrases, each letter linked to a different site.
Vanessa: One site had been around a year and wasn’t indexed but they didn’t know why. The host had actually blocked the site with robots.txt.
Greg: In the gaming industry there’s a lot of geo-targeting. More than once, a site will bounce Googlebot to the U.S. site because it comes from a U.S. IP, and so their UK site will never get indexed.