SEO: Advanced Q&A

We did it. We’ve once again made it to the final session of another jam-packed show. And if that wasn’t exciting enough, we’ve finally reached an Ad:Tech session on search engine optimization, and Bruce is moderating it. Finally, I get to listen to a conversation I actually understand, all while proving to Bruce that I really do work while he and the rest of the Bruce Clay gang are in the booth. I swear they think I go back to the hotel and nap while they’re talking to clients and explaining our services. I’m busy; I swear it!

But anyway, here we are at the SEO: Advanced Q&A session. Bruce "Lisa’s Not Done Until I Say So" Clay is moderating, with speakers Aaron D’Souza (Google) and Sandor Marik (CondeNet) both fighting for his attention, I mean presenting.

Okay, let’s get to it.

Bruce starts off explaining that this session is on SEO, search engine optimization. (I love that he defines SEO for the non-geeks in the room.) This is the first time Ad:Tech has tried a panel that is geared towards Q&A. This session will focus on three different SEO perspectives – a company (Bruce), an engine (Aaron) and an end user/large publisher (Sandor). Bruce goes down the line and introduces the panel members. For some reason I don’t get a shout out. Whatever, Bruce.

Bruce says the panelists have opted for audience-driven content. The panel will make short comments on things that impact search engine optimization, but the real intent is to let the audience ask their questions. He’s so generous, our Bruce.

The first topic Bruce touches on is spiderability. Bruce says your site absolutely has to be spiderable. He’s a big supporter of site maps.

From a search engine’s perspective, Aaron says that making sure the engines have access to a Web site is a really big problem. There are sites where it’s incredibly hard to find pages because they don’t have links, and robots are bred to follow links. Aaron mentions the new Sitemaps.org protocol that helps site owners show the engines where the content is on their Web sites. Aaron calls it "a great thing" — it makes it easier for the search engine spiders to find those pages. Site owners don’t have to worry about links not being present or about deep pages not being crawled.
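Quick aside from me, since nobody shows one on stage: a Sitemaps.org file is really just an XML list of your URLs. Here’s a rough sketch of one way you could generate one; the URLs, priorities and file name are made up for illustration, so swap in your own.

```python
# Rough sketch: generate a Sitemaps.org-style sitemap.xml.
# The URLs, changefreq/priority values, and output file name are placeholders.
import xml.etree.ElementTree as ET

pages = [
    {"loc": "http://www.example.com/", "changefreq": "daily", "priority": "1.0"},
    {"loc": "http://www.example.com/services/", "changefreq": "weekly", "priority": "0.8"},
    {"loc": "http://www.example.com/articles/deep-page.html", "changefreq": "monthly", "priority": "0.5"},
]

# <urlset> is the root element, namespaced per the Sitemaps.org schema.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    for tag, value in page.items():
        ET.SubElement(url, tag).text = value

# Write the file, then submit it to the engines (or reference it from robots.txt).
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Drop the resulting file at the root of your site (and submit it to the engines, or point to it from robots.txt) and the spiders get a map to your deep pages even when the links are thin. Okay, back to the panel.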

Sandor says that from the publisher’s perspective, Flash is one of the biggest challenges. For those just tuning in, the search engines can’t read Flash. Shocking, I know. There are plenty of techniques to get around it, but you have to know how to implement Flash in a search engine friendly way. You don’t want to throw away valuable content.

Next the panelists move on to the topic of duplicate content.

Bruce says that the Bruce Clay Web site gets stolen about twice a week, and I giggle. It really does. He talks about clients who write articles, offer them through syndication, and then before you know it the syndicated version outranks the original content. This is bad. He says there are ways around duplicate content, but it is something you’re going to have to contend with. It’s hard for the engines to identify the originator of the content.

Aaron says that when Google looks at duplicate content in the index, they try to group it into one big cluster so they can then pick out the most relevant version. The advent of things like RSS has made it much more difficult for the engines to take a time-based approach and say that Person A was the one who first published the content. Someone can grab your content before the engines have indexed either copy. However, if your site has historically been the producer of original content, it is easier for the engines to identify (read: guess) that your site was probably the first to publish.

There are both tools and companies that can help with monitoring for duplicate content, says Sandor. The biggest challenge is that you often don’t know your content has been stolen or reproduced. Tools can help you find out.

If you’re producing original content, you really have to pay attention to duplicate content. Content is what drives your traffic. Since the algorithms are complicated, it’s quite likely that good content that ends up on a different site may outrank you in the engines. Sandor says it’s definitely top of mind for him.

Bruce says he creates Google Alerts to monitor for duplicate content. Throw a few sentences from popular pages into an alert, and each time another site uses that content you’ll get a URL to the offending site. Then you can track down that person and poke them with sharp sticks. Huzzah!
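Another aside from me: once an alert (or a vigilant reader) hands you a suspect URL, confirming the lift takes only a few lines of script. Here’s a rough sketch; the sentence and URLs are placeholders, and it only does a naive exact-text check against the raw HTML.

```python
# Rough sketch: check whether suspect pages contain a distinctive sentence
# lifted from your own content. The sentence and URLs below are placeholders.
import urllib.request

ORIGINAL_SENTENCE = "a distinctive sentence copied straight from one of your popular pages"
SUSPECT_URLS = [
    "http://www.example.com/copied-article.html",
    "http://www.example.org/blog/suspicious-post/",
]

def page_contains(url, text):
    """Fetch a page and do a naive, case-insensitive check for the exact text."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="ignore")
    return text.lower() in html.lower()

for url in SUSPECT_URLS:
    try:
        if page_contains(url, ORIGINAL_SENTENCE):
            print("Possible duplicate content:", url)
        else:
            print("No exact match found on:", url)
    except OSError as error:
        print("Could not check", url, "-", error)
```

If something matches, that’s when the sharp sticks come out.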

Next topic – Linking strategies.

Bruce identifies the three kinds of links: inbound links (when others link to you), interior links (when you link to yourself), and external links (when you link to other experts). He says you need all three kinds of links to be accepted and well-ranked in the search engines. One by itself won’t do it. This is quite important.

Aaron thinks of linking as another signal to help them determine where pages fall in the rankings. There are so many different ways of linking that its use as a signal can vary over time. It’s like any other signal — if the use of that signal starts to be corrupted, then its importance diminishes. In other words: stop spamming, you cheats!

Sandor says figuring out the best interlinking strategy is a complex question. The considerations are somewhat internal.

For example, every page of Sandor’s site includes a link to all of their other Web sites. This is an internal branding policy that they think helps users discover their other properties. There have been lots of discussions about whether this hurts them from a search engine perspective. This is something they have to think about when they’re wearing their SEO hat.

Next topic – Spam.

Bruce says he runs into constant fights with clients over spam. (I’ve seen these fights. Everyone wears sumo suits.) He says that sometimes BC will make edits to a site and send that info to the client, and then the client’s webmaster will change our recommendations to "improve" them and inadvertently end up spamming. Bruce says that some of that has to do with the way people learn to do search engine optimization in general. They pick up spam by listening to how other people do SEO. Spam just creeps in there. He uses the telephone-game example: you tell someone something, they pass it down the line, and along the way the message changes drastically.

Bruce says that best practices are the way to go. Over the past 11 years, the spam rules have changed over 80 million times. The biggest problem we face is that a lot of people implement things not knowing it’s spam. They’re innocent errors, but they’re clear violations of best practices. Spam has to be paid attention to.

Aaron says that when he first joined Google he went into Matt Cutts’ spam group (hi, Matt!). Aaron lies and says that working with Matt has been fun. When people ask him what the difference is between spam and a bad site, he says he knows spam when he sees it. It’s people who are doing things not because they are misinformed but because they’re evil. For example, JavaScript redirects — that’s not being misinformed, that’s evil. These are the things Google classifies as spam, and they’ll penalize you for that.

Aaron lets us know that Google has started to integrate more info into the Webmaster Console about when and why your site has been penalized. When they think a site may have been hacked, they’ll drop an email to the webmaster. They’re trying to increase the amount of information they share.

Sandor says that none of his sites have been banned, but he has found questionable techniques even on his own sites. Sometimes the people who work on sites are not well informed and they wander into shady areas. It’s difficult to know which techniques are really going to get you banned and what you can get away with. He recounts a funny story about finding a section of one of his pages that was invisible to the engines. If you looked at the source code, there was a comment that said "hidden text for SEO." Heh! This was done by an uninformed developer.

If you have a good site with good content, you don’t need the spammy techniques. It may be tempting at times to spam, but for long-term results, it’s worth it to go the white hat route. Let’s hear it for the good guys!

Next topic – Social, behavioral and local issues

Bruce asks how many sites are seeing Wikipedia outrank them for their keywords. Lots of people raise their hands and I begin growling and foaming at the mouth. [Down, girl. –Susan]

Bruce says it’s just a matter of time before the engines understand behavioral factors. They’ll know when you search for "java" whether you’re looking for coffee or the programming language (or the island! Vacation, anyone?). What we’re running into now is that future buyers (people who are now 15, 16 years old) don’t see most of the traditional advertising. They’re not setting brands in their brains. They don’t see who you are. Bruce looks at it as a layer of ice that’s melting from the bottom up. Five years from now you’ll find that your brand has melted away or is eroding. You need to play in the social space so that future buyers get your brand in their brains.

Bruce talks about the amount of traffic we get from sites like del.icio.us and StumbleUpon. He says lots of his larger clients have staff on board solely for social networking.

Aaron says the new marketing technique will be satisfying user intent. If you have content but don’t know how to satisfy the user’s need, you have nothing. Being able to understand users’ intentions is going to become really important. That’s why social is exciting – we’re getting implicit signals from the users themselves about what they think is important content. Use that.

Sandor says social is a very interesting topic for publishers. It used to be that users were just an audience, now they’re becoming contributors. We need to find a balance for how to use this new content without letting it destroy the brand. An article with 200 comments is better content than the same article without the comments. It shows Google and the engines that this is interesting content.

From here, Bruce opens the session up to user questions. He says to speak loudly. Yeah, yeah, do that.

The first question-poser says that to get a site to rank you have to pick a keyword and build content around it. He’s working on a piece of content that focuses on several keywords, and he wants to know how to get that piece to rank for multiple targeted keywords.

Bruce says it’s possible to get many keywords ranked on a page. The interconnectivity, how smartly you construct your titles, etc., will help you rank one page for several keywords. It’s about how that page is connected to other content on your site. Do you focus on these keywords or are you vague about everything?

Aaron says to look at the other pages ranking for your terms and ask yourself why your page should displace that page.

Another audience member asks Bruce if he feels there is corruption in the engines’ results.

Bruce says there’s no reason for the engines to risk their stature, position and reputation to help one site rank in the search results. That is the dumbest thing any search engine could do. Heh. He does believe results can be biased by gaming the algorithm. That’s what spam is all about. There was a case a few years ago when Yahoo admitted their algorithm was flawed and began hard-coding the top ten search results for certain queries. He doesn’t see that today. He doesn’t think the engines are fixing the results.

And that’s it. Lots of optimization knowledge covered in this session. My poor baby fingers.

On an unrelated side note, I want to say that hopefully Susan has treated you all to a Friday Recap by now. If not, I sincerely apologize for her suckiness. You’ve worked hard all week and you deserve a reward. If the girl hasn’t delivered, I give you my full permission to toilet paper Susan’s desk and steal her action figures. (Please don’t touch the stuff on my desk. I’ve been good). Thanks for tuning in to read the Ad:Tech coverage this week and I’ll see you on Monday when we’ll be back to our regularly scheduled blogging.

[Lisa, today is Thursday. Stop threatening me with your loyal hordes and go take a nap. Folks, I promise there will be a pathetic imitation of Lisa’s usual Friday recap tomorrow. There may even be puppies. –Susan]

Lisa Barone is a writer, content marketer & VP of strategy at Overit Media. She's also a very active Twitterer, much to the dismay of the rest of the world.


