SMX West 2012: Evening Forum with Danny Sullivan
From the SMX West agenda: The audience is the panel in this session. Search marketers share thoughts, ideas and knowledge. The session is moderated by Search Engine Land editor-in-chief Danny Sullivan, and the audience shapes the agenda.
Audience: Dave Naylor wrote on his blog today about what might have changed with respect to link value in Google.
Danny: They may have stopped following nofollow. They may be changing the weight. He has no idea what they changed, but suspects it's related to spamming rather than changing what was working well. It was interesting they said it was a signal they'd used for a long time. They could be looking at words around a link to associate with the link itself. People will look into it further and we'll see what emerges. It's also difficult because we don't know what they do with link signals to begin with. It might end up being a very specific thing you won't want to do, which people aren't really doing much of anyway.
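For context, nofollow is just a rel attribute on an ordinary link; a minimal sketch (example.com is a placeholder):

```html
<!-- A standard link, which can pass link value -->
<a href="http://example.com/">Example</a>

<!-- A nofollowed link: rel="nofollow" asks engines not to count it as an endorsement -->
<a href="http://example.com/" rel="nofollow">Example</a>
```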
Audience: What would happen with search engine results if there was no SEO?
Danny: Matt Cutts would be working for Facebook… ;) That's not possible. Even if you're not doing SEO, you've done SEO on your pages. Say everyone just published a bunch of pages; the search engine still has to come up with an algorithm that uses signals. The result would be accidental SEO. Maybe it's better to think of the debates in the industry about whether or not SEO is ethical, black hat, fair, etc. He finds this debate boring; he's heard it all so many times. There are people who understand SEO means working with good content and making that content visible to search engines. It's always been akin to public relations. You have a good story to tell. If you're smart, you package the story so a reporter wants to hear more about it. Ultimately the story succeeds not because you tricked the reporter, but because the story is a good story. If content deserves to do well, then the content should do well from there. Results in a world without SEO wouldn't be better; they'd be different or worse, most likely.
Audience: We recently revamped our site. We’ve reduced the number of crawl errors significantly. Is there a good way to manage crawl errors for large sites that have been revamped? Is there a number that the crawl errors should be under?
Danny: SEs will spend a certain amount of time on your site. Reducing errors helps make sure more of your site is visible. Also, you may have errors but still have great traffic. Worry more if you have a drop in traffic. Check out the technical SEO track tomorrow.
Audience: SEOmoz will still rank my page, but one failing grade I get is for lack of a meta description.
Danny: Meta descriptions aren’t a ranking signal. But it’s one of the few things we have left that we have control over. Google or Bing, in their infinite wisdom, may use it as an excerpt in the SERP.
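As a reminder, the meta description lives in the page head; a minimal sketch (the title and content text are placeholders):

```html
<head>
  <title>Page Title Shown in Search Results</title>
  <!-- Not a ranking signal, but engines may use it as the snippet in the SERP -->
  <meta name="description" content="A concise summary of the page, roughly 150-160 characters.">
</head>
```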
Audience: Do you think there will be a time when Google stops using the link graph and replaces it with social signals?
Danny: Social signals are growing in importance and will potentially eclipse links. Before Google came along, engines ranked pages via the words on a page. That was gamed, so Google decided to let people vote – that was the link model. Counting links, as a way to vote for what's popular on the web, is like when the country said everyone can vote, as long as they're a white male landowner. Not all of the Internet population is a landowner. Do they rent space from WordPress, which automatically nofollows links? Participation has now gotten simpler, as you can just click a "Like" button.
Audience: What's the best tool you recommend to measure the authority of a domain?
Danny: He understands the use of such metrics in order to prioritize where to put your effort. But third-party tools (Compete, Alexa) are just estimates, and you can look at PageRank, but that's outdated.
Audience: Google’s given a fair amount of screen real estate to Google+. It’s a lot of hype. Given the importance of the real estate, is it overhyped or underhyped?
Danny: A string of articles this week have suggested Google+ is a ghost town. But even if it's a ghost town, to participate in Search Plus results and other Google features, you need a Google+ account. It's having a direct impact on rankings; it's one of Google's main ranking signals. Who wants to ignore that? It's not hard – slap the button on your page. Even a read-only presence is better than not being there at all. The influence of +1s adds to this issue. I don't think it's hype when you're a search marketer. If you care about Google, you've got to care about Google+. If you're a general user, maybe you don't care about Google+.
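The button Danny mentions was embedded roughly like this at the time (based on Google's circa-2012 documentation; treat the exact markup as approximate):

```html
<!-- Google +1 button embed, circa 2012 -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<div class="g-plusone"></div>
```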
Audience: If a U.S. site has had a few penalties, not great rankings, but great rankings overseas, should I rel=”canonical”?
Danny: The fact it’s doing well overseas shows they don’t think it’s doing completely bad. It makes me think it’s not actually penalized in the U.S., maybe something else is wrong with it. I’d reach out to Google to find out if it’s got standing penalties. Leave the sites that are working alone and keep them away from the site that’s not doing well. Try to catch lunch with a Google engineer this week.
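For reference, rel="canonical" is a link element placed in the head of the duplicate or variant page, pointing at the preferred URL (example.com is a placeholder):

```html
<head>
  <!-- Tells engines which URL should receive credit for this content -->
  <link rel="canonical" href="http://example.com/preferred-page/">
</head>
```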
Audience: This one's for domainers/developers/affiliate marketers. It's regarding the documentation about Google quality raters. When manual raters see .net and .org sites with the same keyword structures pop up, how likely are they to flag them?
Danny: Google will give new sites a chance at the top to see how they’ll be met by users. Quality raters generally adjust algos overall. If a site drops out of results in a scenario like this it’s more likely an engineer came by and saw the site is part of a spammy network and so will drop the whole network. If there was something decided to be wrong with the company, all related sites would go. Probably one site was doing better with click-thrus and conversions, etc., and one was doing worse and didn’t stay.