Ask the Search Engines: SMX East

The last day of SMX East was hectic as I tried to touch base with new friends and contacts, attend all the sessions and make it to my flight in time. I managed to pull off most of those objectives, with the exception of posting a couple of live-coverage entries to the blog. So without further ado, here’s what happened at the last session I attended. As my momma always says, “Better late than never!”

Moderator Danny Sullivan has got to feel good right now. Another awesome conference nearly complete! For the final leg of the marathon, let’s go straight to the source and talk to the search engine representatives about all the things on our minds.

Danny says that he used to do a session called “I’m So Confused” because of all the conflicting information that is shared at conferences. But this panel will give us the official take from the search engines.

The reps are Nathan Buggia, Live Search Webmaster Central, Lead Program Manager, Microsoft; Aaron D’Souza, Software Engineer, Search Quality, Google Inc.; and Sean Suchter, VP of Engineering, Yahoo.

Sean says that sites should submit sitemaps, either .txt or .xml, and that overall it helps with inclusion. He also heard a question about keyword order in Titles. He says it’s important to get right, not because of ranking but because of how the result is presented in the SERP. Users react well to seeing the keywords they searched for in the Title, so those keywords should be placed toward the beginning.

Aaron says that he’s involved in trying to get rid of spam. He hears a lot about companies wanting to put up different versions of content for different countries. They wonder if it’s going to be a duplicate content issue. He says that if the URL and the path to the content are reported to Google as specific to a certain location, Google won’t see it as duplicate content.

Nathan says that he hears a lot about URLs. He says that session tracking parameters on a page’s URL will result in multiple versions of the same page in the index, and competing against your own pages for space in the index can be harmful. He recommends submitting a sitemap with one URL for each page, consistently using the shortest, canonical form of the URL. He also hears a lot about metrics and thinks that people are worrying about metrics that aren’t the most important. He thinks it’s all about conversions and trying to find the most valuable action. Finally, he doesn’t believe enough people are using the tools the search engines provide.
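If you want to see what that looks like in practice, here’s a rough Python sketch — my own illustration, not something from the panel. It assumes parameters like sessionid and sid are tracking-only, uses made-up example.com URLs, and writes one clean URL per page to a plain-text sitemap file:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameter names assumed to be session/tracking-only for this illustration.
TRACKING_PARAMS = {"sessionid", "sid"}

def canonicalize(url):
    """Strip tracking parameters so each page maps to one short, consistent URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept), fragment=""))

urls = [
    "http://www.example.com/widgets?sessionid=abc123",
    "http://www.example.com/widgets",
    "http://www.example.com/widgets?color=blue&sid=42",
]

# One URL per page, in its shortest consistent form, one per line (plain-text sitemap).
with open("sitemap.txt", "w") as f:
    for url in sorted(set(canonicalize(u) for u in urls)):
        f.write(url + "\n")
```

The three input URLs collapse to just two entries, so the engine sees one version of each page instead of several competing copies.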

Now comes the Q&A part that you’ve all been waiting for. Be kind; Q&A can be hard to blog.

Are there best practices for running A/B tests so search engines don’t think you’re trying to cloak?

Aaron says that the way they look at it, cloaking is only a problem if the intent is malicious. So A/B testing is fine because the same type of content is being served. While they don’t encourage cloaking, penalties only happen after a human review, so no penalty will be issued if it’s clearly just testing and not malicious.

Nathan says that A/B generally looks different than cloaking, and while they don’t recommend cloaking, it really isn’t a problem.

Sean says that the bad situation happens when there are large differences between versions, not the little ones that are common in testing.

Do you count affiliate links?

Sean says it depends where and in what context the links are coming up. If they are coming up in random, irrelevant places, that’s not good. But if affiliates are making them of value to users, it’s probably going to be a fine signal.

Nathan says that each link is evaluated independently, and whether or not it’s an affiliate link isn’t necessarily a factor.

We’re currently redesigning our site and the only thing staying the same is the domain. The old site had ten pages and the new one will have 100,000. Are we going to have a problem?

Nathan says that the search engine always tries to find the most relevant page for a query, so if your page has content about a product that’s similar to the manufacturer’s page, which has been around longer, your page may not show up because it’s considered a duplicate. One way to work around this is to add something beyond what’s already out there, like pictures or reviews. Since the question came from someone who runs a weapons site, he suggests that she could maybe do videos of tasing pets… The whole audience laughs and groans and I’m pretty sure Nathan is turning pink. Maybe not the best example!

Aaron says that when you have a unique offering in a market, you will stand out by doing something different. He doesn’t think that the reputation credited to the old site will be devalued on the new site, but he does warn to be aware of duplicate content from the old site.

When will Yahoo and Microsoft get country-specific targeting? And what’s your advice if you want your site seen in another country?

Sean says that you should use a ccTLD because it’s a huge signal. The other big signal is where the users and links are coming from.

Nathan says that you should make sure the international site is all located in the same sub-domain or sub-directory because it’s easier to identify. If a whole sub-directory looks like it’s in German, it’s a signal that it’s targeted for Germany.
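To put Nathan’s tip into practice, here’s a rough sketch — again my own illustration, not from the panel — that checks whether each localized page actually sits under the sub-directory you expect for its language. The directory names and the (url, language) pairs are made up for the example:

```python
from urllib.parse import urlparse

# Hypothetical mapping from language code to the sub-directory it should live in.
EXPECTED_DIR = {"de": "/de/", "fr": "/fr/"}

def misfiled_pages(pages):
    """Return (url, language) pairs whose path isn't under the expected sub-directory."""
    problems = []
    for url, lang in pages:
        expected = EXPECTED_DIR.get(lang)
        if expected and not urlparse(url).path.startswith(expected):
            problems.append((url, lang))
    return problems

print(misfiled_pages([
    ("http://www.example.com/de/produkte.html", "de"),         # fine: under /de/
    ("http://www.example.com/products/de-widget.html", "de"),  # German page outside /de/
]))
```

Pages flagged this way are the ones diluting the “this whole directory is German” signal Nathan describes.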

What percentage of false positives do you have in spam protection?

Aaron says that it’s low, but it’s an algorithm, so there are sometimes mistakes. Spam algorithm changes are treated the same way as any other algorithm change: they test changes on a large sample, and if they see an overwhelmingly positive result, they roll it out.

Sean says that it’s low, but if you think your site is being treated incorrectly or if it has been cleaned up, submit a webmaster support form for consideration by the right people.

Danny says that Microsoft and Google will report to you if they think you’re spam, except in cases where Google feels the site is so obviously spammy that you should already know it. Yahoo is working on it.

Should people bother with nofollow to try to flow their PageRank around, or not?

Sean says that in terms of designing for users, it’s not helpful at all, so in the long term your energy could probably be better put into other areas. Aaron says that for the most part the issue comes up when there are way more links on a page than are useful to a user. In that case you have to consider whether the page itself is good for the user. He doesn’t think it’s going to cause an issue one way or another. Nathan asks who in the audience is doing sculpting (maybe five) and then asks who has measured a positive change (maybe two). He says that was higher than he thought, but he still doubts the long-term value.

Aaron says that sculpting seems like a lot of effort to put into the one signal of the link equity algorithm. He thinks it’s something to try only if there’s nothing else left to do. Danny recommends testing it yourself to see if you see a difference.
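If you do take Danny up on testing it, you’ll want a baseline first. Here’s a quick sketch — my own, not from the panel — using Python’s built-in html.parser to count how many of a page’s links carry rel="nofollow". The sample markup is made up:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count total links and nofollowed links in an HTML document."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        self.total += 1
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.lower().split():
            self.nofollow += 1

html = """
<p><a href="/products">Products</a>
   <a href="/login" rel="nofollow">Log in</a>
   <a href="/terms" rel="nofollow">Terms</a></p>
"""

counter = LinkCounter()
counter.feed(html)
print(f"{counter.total} links, {counter.nofollow} nofollowed")
# -> 3 links, 2 nofollowed
```

Run it on a page before and after a sculpting change and you at least know what actually changed when you go looking for a ranking difference.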

It was suggested in a link building session that you could make donations to charities to get a link on their .org site.

Danny says that, to make it more uncomfortable, Matt Cutts has said that’s fine. Sean says that if a charity is offering links for sale, he would think that they’d be getting links from bad guys as well as good guys, which will quickly get them flagged. Then the site will be in the universe of people who are bad and that link will be worthless.

Aaron says that if they were to see that 60 percent of spam comes from charities, then they’ll go after it. If it’s rampant and makes up a large portion of spam, they’ll see it as low-hanging fruit. Nathan says that if you’re giving money to charity, then it’s good anyway. But really, a charity that is aggressively selling links is probably going to draw other attention as a result of its marketing techniques and see an increase in traffic.

Do you ever do direct intervention to penalize spam, as opposed to changes to the algorithm?

Aaron says absolutely. If it’s hurting the results right now then they’re going to do something manually. But they want to make the algorithm better, too, which they do by learning about the ways people are spamming.

Do reports that come in from a Google account have more weight?

Aaron says that reports that come in through Webmaster Central are considered before external submissions because they’re a cleaner data set.

Does the Yahoo algorithm in Japan work in a significantly different way than in the U.S.?

Sean says that there are slightly different signals but that it is the same back-end search engine and system, just tweaked for the market.

In natural search, do you offer some sort of endorsement or certification for SEOs?

Nathan says no. He says that he wouldn’t want to endorse vendors because there’s so much behind it. Sean says that it’s the second time he’s heard the question and says it’s an interesting suggestion.

Is there a conflict of interest in your content networks showing up in your search engines?

Sean says that the reason Yahoo has SEOs is because they’re trying to avoid a conflict of interest. There’s search and there’s content and it’s not the same thing. So, for the content they have to compete for their user base and thus they need SEO. Aaron says that there’s no Google policy to boost Google properties, but for certain properties like YouTube they have more information on them than they have on other sites, so they may show up more. Nathan says that Microsoft tries to keep a firewall between all of their businesses. Even advertisers that spend tons of money get no preferential treatment. AdCenter and the search engine are separate.

Are links still the primary signal for popularity and importance?

Aaron says links are a good measure of reputation. Clicks are a noisy signal, so the absence of a click on a result is actually more useful because it signals that it’s not the most relevant result. Sean isn’t sure if links are the most important signal or not, but he will say that it’s a larger signal than Title tags, for instance.

What’s happening with personalized search?

(Okay, I actually didn’t hear the question, but this is the answer.)

Aaron says there’s a lot of data they have access to because of the way people use the search engine. But in personalized search, one policy is that whatever is used will be disclosed to the user. The user can go in and control what is being used for personalization. They want to give you the ability to say “I don’t want you to use this.”

And that’s a wrap for SMX East! Thanks to Cindy Krum, Eric Lander and Kate Morris, who took time out of their whirlwind schedules to come on SEM Synergy and, of course, thanks to all the great speakers who didn’t hold anything back when it came to sharing with hungry audiences. All that’s left to post from the conference is the highly-attended Give It Up: White Hat Edition panel, which will be hitting the blog November 7.

Virginia Nussey is the director of content marketing at MobileMonkey. Prior to joining this startup in 2018, Virginia was the operations and content manager at Bruce Clay Inc., having joined the company in 2008 as a writer and blogger.

See Virginia's author page for links to connect on social media.

Filed under: SEO

3 Replies to “Ask the Search Engines: SMX East”

The part of the discussion I liked best was the discussion on affiliate links with nofollows. Some people where I work seem to think that SEO companies can’t develop a decent affiliate marketing program without being spanked by Google. I have advocated the opposite for a long time and I feel vindicated by the discussion at SMX East.

Richard,
Let me take a crack at clarifying and adding context. When the panel started, each of the three speakers introduced themselves. Aaron mentioned what he does (gets rid of spam) and a question that he often hears when he talks to the public (duplicate content due to country-specific versions). Your reading of what Aaron said on the panel is correct. Google has said that you can put content on a country-targeted directory if you report it as such in Webmaster Tools. Aaron said that doing such would not result in duplicate content filtering.
However, for best results, we and most SEOs recommend putting the country-specific content on the corresponding ccTLD and hosting the site in that country as well. This is especially important because Google is not the only search engine out there, and depending on the country you are targeting, you will want your site to be recognized across all engines.
Thanks for your question. I hope that helps!

Hi Virginia

Great recap, but a request for some clarification if I may:

Aaron says that he’s involved in trying to get rid of spam. He hears a lot about companies wanting to put up different versions of content for different countries. They wonder if it’s going to be a duplicate content issue. He says that if the URL and the path to the content is reported to Google as specific for a certain location, Google won’t see it as duplicate content.

Can you give this some further context pls?

My reading is that if I publish dupe content on say http://www.mydomain.com/us/index.htm and http://www.mydomain.com/uk/index.htm, and geotarget those 2 folders to US and UK respectively via webmaster console, then they won’t be considered dupe?

Can you clarify any further pls?

Rgds and thanks in advance
Richard

