What Technical SEO Metrics Are Important? — SMX East 2012
We just covered which SEO metrics are important to track in the last session. Now we’re talking technical SEO metrics. So, are you ready to get all jiggy with yummy tech metrics? Me too. You can follow tidbits from this session at #12b on Twitter.
Vanessa Fox (@VanessaFox) is moderating, and she is starting things off. It’s easy to look at rankings and traffic. It’s much harder to see how a search engine sees your site. Google and Bing now offer tools to do this. So look at this level before rankings become a problem.
Look at server log analysis. How many URLs are the search engine crawling? How many unique pages are being crawled? If you are doing a site migration, for example, this can help you set up expectations.
Another thing you can learn from server logs is whether anyone is scraping you. You can also see whether redirects have been implemented properly, as in 301s, not 302s.
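To illustrate the kind of log mining described above, here's a minimal Python sketch. It assumes the common Apache combined log format; the regex and field positions are assumptions you'd adjust to your own server's configuration. It counts Googlebot hits, tallies unique URLs crawled, and flags any 302s served to the bot (which should usually be 301s for permanently moved pages):

```python
import re

# Assumed Apache combined log format; adjust the regex for your server's config.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def crawl_report(log_lines):
    """Count Googlebot hits, unique URLs crawled, and any 302s served to it."""
    hits = 0
    urls = set()
    temp_redirects = []
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # skip non-matching lines and non-Googlebot traffic
        hits += 1
        urls.add(m.group("url"))
        if m.group("status") == "302":  # moved pages should usually 301, not 302
            temp_redirects.append(m.group("url"))
    return {"hits": hits, "unique_urls": len(urls), "temp_302s": temp_redirects}
```

Run this over a day's log before and after a site migration and the hit/unique-URL counts give you exactly the expectation-setting baseline the panel mentions.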
You should take all of these issues and ask: How much is this really impacting the site? Some of them you might just have to let go.
Google Webmaster Tools alerts in your email are sometimes vague. Sometimes they are just a heads-up, not a crisis. Instead of freaking out, pull together all the technical data you have to determine whether you actually have a problem.
Dixon Jones (@Dixon_Jones) of Majestic SEO is up. The search metrics that matter are hidden in plain sight. There are more than 500 pages in Google Webmaster help. Read between the lines there, even though it seems vague. They change and update it regularly, too.
Crawling issues are important.
Avoid shared hosting. If you can't, check the other sites sharing your server's IP; they will be stealing your share of Googlebot's attention. Minimize the number of URLs Googlebot sees, since crawling is more efficient that way.
Within the back end of server logs for a site, you can mine a lot of information, as Vanessa was pointing out. See if there is enough bandwidth for your site. Check for 500 server errors. Track soft 404s: pages that return a 200 status but display an error message, so search engines never learn the page is actually gone.
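As a rough illustration of tracking soft 404s, here's a hedged sketch: it flags a page as a likely soft 404 when the HTTP status is 200 but the body reads like an error page. The phrase list is illustrative, not exhaustive; in practice you'd run a check like this against a deliberately nonsense URL on your own site:

```python
def looks_like_soft_404(status_code, body_text):
    """Heuristic: a page that says 'not found' but returns HTTP 200 is a soft 404.

    The phrase list below is an illustrative assumption; extend it with the
    error wording your own templates actually use.
    """
    error_phrases = ("page not found", "404", "no longer available", "does not exist")
    body = body_text.lower()
    return status_code == 200 and any(p in body for p in error_phrases)
```

A real 404 (status 404) is fine; it's the 200-with-error-copy combination that misleads crawlers.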
Know what great content is. Google tries to explain what that is, and they do a pretty good job; he is referring to the leaked quality rating manual. The problem is that great content is hard to measure. But you need to have quality content.
[Side note: check out Bruce Clay, Inc.’s SEO Newsletter this month on creating great content.]
David Burgess (@DavidBurgess00) from Ayima is next. He is going to be talking about how to catch early warning signs before issues become big problems. This is about being prepared.
There are always reasons for drops in traffic. Look at on-page factors and then look at the off-page factors.
First, make friends with people who can help in your organization, AKA "hug a techie" within your company. If you can, set up automatic email alerts with your techie to monitor what's going on with the site.
Then ramp up both the types and the quantity of alerts you are receiving from the tools available to you. Have these coming twice per day.
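As one way to decide when those alerts should actually fire, here's a small sketch. The function name and thresholds are hypothetical examples, not anything prescribed in the session; tune them to your site's normal baseline:

```python
def should_alert(status_codes, max_error_rate=0.05, min_requests=100):
    """Fire an alert when the share of 5xx responses in a window exceeds a threshold.

    status_codes: HTTP status codes from a recent window of server log entries.
    max_error_rate and min_requests are illustrative defaults, not recommendations.
    """
    if len(status_codes) < min_requests:
        return False  # too little data to judge; avoid noisy alerts
    errors = sum(1 for s in status_codes if 500 <= s < 600)
    return errors / len(status_codes) > max_error_rate
```

Wiring a check like this into a twice-daily scheduled job that emails your techie keeps the alert volume high enough to catch issues without paging anyone over a single stray error.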
If your site has more authority, you tend to have more problems. Google is continually looking to discover URLs. If you have problems, create a "disaster timeline" to see how the problems usually come about and how you deal with them.
Once you have all the alerts, it’s about response; your production environment needs to be able to “live patch” problems. You can reduce the crawl rate manually in Webmaster Tools while you fix problems.
Once you have early warning signs in place, you have a "new" disaster timeline: you know about issues as they come up and can work to fix them more efficiently and proactively.
Closing thoughts by Vanessa: Always look at the context. Having 404s isn't important in itself; what matters is how they impact the site. For example, if you have a ton of 404s, Google won't back off crawling, but check whether broken links on your own site are causing them.