SMX Liveblog: Ask the SEOs
Alright everyone, it’s the last session of the conference – this marathon is almost over for my tired fingers. They’ve saved one of the best for last – it’s time for Ask the SEOs. At the speaker table are SEO powerhouses including:
- Greg Boser, Independent Consultant (@GregBoser)
- Rae Hoffman, CEO, PushFire (@sugarrae)
- Jeff Preston, Senior Manager, SEO, Disney Interactive (@jeffreypreston)
- Marshall Simmonds, Founder and CEO, Define Media Group, Inc. (@mdsimmonds)
- Ellen White, SEO Director, Ford Motor Company
Search Engine Land Editor Danny Sullivan is moderating … here we go.
Ellen talks about a client who didn't get SEO; she took him on the journey of explaining it and getting his buy-in. By making simple changes to the site, like replacing images of text with real heading tags, they saw large SEO improvements.
Do you think internal linking is still effective? Is PR sculpting dead?
Greg says he doesn’t think it’s dead. When you “move the juice” the right way you can be rewarded and he feels it’s still hugely important.
Marshall feels that information architecture is still really important. PR is supposed to emulate how a user uses your site. Moving content is ok but it can be difficult and time consuming.
Rae suggests that when redoing URLs, you check all your internal links and update them to the new URLs – don't rely on 301 redirects to capture those. You can lose a little bit of juice by not updating them.
Marshall throws in that you also need to make sure you are updating the XML sitemap when you move content.
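Rae's point about not leaning on 301s can be checked mechanically during a migration. Here's a minimal sketch of that audit step; the redirect map, link list, and function name are all hypothetical examples, not anything the panel showed:

```python
# Sketch: flag internal links that still point at pre-migration URLs.
# The redirect map and link list below are made-up examples.

redirect_map = {
    "/old-widgets": "/products/widgets",
    "/old-about": "/company/about",
}

internal_links = [
    "/products/widgets",
    "/old-widgets",   # stale: currently relies on a 301 redirect
    "/blog/launch",
    "/old-about",     # stale
]

def find_stale_links(links, redirects):
    """Return (stale_url, new_url) pairs for links that should be updated."""
    return [(url, redirects[url]) for url in links if url in redirects]

for old, new in find_stale_links(internal_links, redirect_map):
    print(f"update {old} -> {new}")
```

In practice the link list would come from a site crawl (Screaming Frog exports work fine for this) and the redirect map from your migration plan.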
Will the new global TLDs impact search?
Greg thinks those things are stupid, LOL. (Thank you, I share that sentiment.) Most of the panel does not feel that the new TLDs, such as pizza.pizza as Danny suggested, will impact search much. Danny thinks it's just a sales game for domain brokers. We've seen this all before with other types of TLDs – no reason to believe these new ones will be any different.
Can you share your experiences on how to get somebody out of a Penguin penalty?
Rae says there are several things that are difficult to make a client understand. One is that Google may or may not update Penguin in the next five years … it's already been nine months. Even if you do the work, you won't see the rewards until they do update. Another is explaining that there will be a dip in rankings, because you'll be removing links that may or may not have helped you rank well. Greg adds that you have to prepare the client for some really tedious, rewardless work. They went out and bought the links, which was sometimes a long process, but clients don't understand that it can take even longer to get those links removed.
Greg warns about the 'big one' coming in regard to a Penguin update. He feels it's not far around the corner and that it'll be pretty devastating to a lot of sites. Rae adds that if you're tinfoil-hat paranoid like her, you have to realize people have been feeding Google list after list of bad, spammy sites, so the next Penguin will surely cast a wide net.
When do you decide to just give up in a Penguin penalty situation?
Greg talks about how, in the first meeting, he'll evaluate the situation to decide how bad it is. Oftentimes companies have done a lot of bad stuff, including crappy content, so getting out of the mess means changing their overall business practice on the web, and that is often harder for the client.
Any tips for a multi-brand ecommerce site that features its sister brands side by side?
The discussion on the panel turns to whether you should have multiple sites or put all the assets into one site. Marshall points out that there are 10 blue links in the SERPs … having multiple sites can still be beneficial. Greg suggests considering merging all the sites into the one domain whose age and domain authority are higher than the others'. It really depends on the situation, though. Marshall says there is a point to consolidating, while there can also be a case made for staying separate.
How do you discover and develop different SEO tests? How do you measure the success and how often do you test?
Greg makes a comment about how this is where 'black hat' comes in handy, because they did a lot of 'testing' to see what was working. Greg has always been one who loves to test the boundaries and rules, so he runs tests that would make other folks faint, such as removing all title tags to see if it negatively impacts the listings. I'm pretty sure that's not the type of test the person was looking for, LOL.
Rae recommends that people put up multiple dummy sites that you can run very specific tests on without the risk of hurting a client’s site.
Aside from all this, the panel says you can test things like CTR by changing title or description tags. Change the tags, run the test, compare the results to your baseline data, and make a decision. Greg warns against testing too many things at the same time; otherwise you won't be able to isolate the data you need.
Jeff recommends that for any test you decide to run, if you are managing multiple sites, start small before deploying it across all sites or across whole large sites.
Ellen recommends testing for local search because it’s easy to do.
Is there a tool to test responsive design?
Greg says that responsive isn’t always the best way to do things and that people need to be aware that responsive sometimes kills the user experience once it’s reduced to the smallest screen size. Bloated code is definitely one of the biggest issues with responsive design.
Marshall adds that there are a few mobile testing sites, and he names Cindy Krum’s MobileMoxie in particular.
Panda 4.0 – what tactics were targeted?
Greg says ‘high volume crap’. He says it’s not about the page, but it’s about the site. If you have 10,000 pages but a majority of your traffic is going to only 500 pages that means you’re feeding a lot of crap to the search engines that people aren’t finding useful.
Marshall loved 4.0. He says Panda forced companies to pay more attention to search again. It refocused everyone back to putting out good content. He says you can resurrect the health of a site that may have previously been hit by Panda just by fixing the problem.
What is the best way to align social with SEO?
Marshall says that what’s important is that social pays attention to search and search pays attention to social. They share the same types of signals. This means there are opportunities for sites to capitalize on. It’s important to coordinate the two teams in order to really win.
Greg says it’s a great collaborating signal. He says the more effective your social is the more the ripple effect will positively impact your search and traffic.
Marshall also adds that there is an opportunity to take a piece of content that isn’t necessarily seeing the love and figure out how to resurrect it and get it the social love it needs.
Do you believe it is possible to get the links you need simply by creating great content?
Greg says, 'NO! The build-it-and-they-will-come myth doesn't work.' Simply putting great content on your site and pushing it out on social accounts that no one knows about will not result in great rankings. You've got to leverage someone else's audience to get your foot in the door and start those links coming in.
Rae says everything you do has to be done strategically, with a great game plan. Put time into building social networks so you can push your great content out and see results. Great content is required for a good website, but if no one knows about it, it's worthless.
Can you share some of your favorite enterprise level tools?
Nutch is an open source crawler that Marshall highly recommends. Ellen suggests Moz's site crawler and Xenu. Greg mentions Deep Crawl out of the UK. Screaming Frog is mentioned too, but it has its limitations: it isn't enterprise level, though it gives you a ton of great information that you can use in many different ways.
Is rel=canonical across a network, or across domains, really important?
Yes, you can use it without seeing any negative effects.
Suggestions for SEO and PPC integration?
Greg says you should always use PPC data because you can match up organic content to certain phrases to get quick wins. Ellen says they use integration to send underperforming SEO terms over to PPC and have that team beef up the ads in that area, or vice versa. Rae recommends making your PPC landing pages non-indexable in order to prevent bad or duplicate content from getting indexed.
Thoughts on link sabotage?
Greg says there is a whole side industry doing this and it's definitely on the rise. That's why it's important to be proactive and watch those links so you spot problems early. He does not recommend link sabotage as a tactic anyone should employ. Marshall suggests using Link Detox regularly (LinkResearchTools.com). You do have to manually check the links that Link Detox reports and categorizes; I've seen several mis-categorized, so take that data with a grain of salt. Just because a link is reported as Toxic doesn't mean it really is.
That’s it for this conference folks. Hope you were able to glean a nugget or two from the posts!