Analytics Every SEO Needs To Know
On to the session. We’ve got Rand Fishkin, Co-Founder and CEO, SEOmoz, Inc., moderating. Our speakers are Brian Klais, Executive Vice President, Search, Netconcepts; Laura Lippay, Group Program Manager, Search Strategy, Yahoo!; Jonah Stein, Founder, ItsTheROI; and Richard Zwicky, President, Enquisite.
Brian Klais is up first and is going over a list of metrics that you need to know. Identify your non-yielding pages, and know the rate at which each yielding page generates traffic. Look for inequality in page placement. Measure the engine yield rate. ROI and brand reach are critical metrics. The missed opportunity cost can be calculated, based on the assumptions that all of your unique pages should be yielding and that the average number of visitors per page applies to all pages. For more on the subject, search for “natural search KPIs”.
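Based on the two assumptions Brian describes, a rough sketch of the missed-opportunity math might look like the following. Every number here is an illustrative assumption, not real site data:

```python
# Hypothetical missed-opportunity-cost calculation, per Brian Klais's
# assumptions: every unique page should yield search traffic, and the
# average visitors per yielding page applies to non-yielding pages too.

unique_pages = 10_000             # unique indexable pages on the site (assumed)
yielding_pages = 3_500            # pages that got at least one search visit (assumed)
visitors_per_yielding_page = 12   # avg monthly search visitors per yielding page (assumed)
value_per_visitor = 0.75          # assumed value per search visitor, in dollars

non_yielding = unique_pages - yielding_pages
missed_visitors = non_yielding * visitors_per_yielding_page
missed_revenue = missed_visitors * value_per_visitor

print(f"Non-yielding pages: {non_yielding}")
print(f"Missed visitors per month: {missed_visitors}")
print(f"Missed opportunity: ${missed_revenue:,.2f}/month")
```

The point isn't the exact dollar figure; it's that the gap between yielding and non-yielding pages can be expressed as a number your stakeholders can react to.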
Laura Lippay is going to share how to prove the worth of SEO. She says that the Grid is the way to show what you’ve accomplished. It’s a collection of keyword-based data, stored in an Excel file or database, which can be pulled into multiple varied reports.
What can you do with it?
- Balance SEO, PPC, and Paid Inclusion (PI)
- Find SEO referral gaps
- Find SEO content opportunities
- Make SEO traffic and value projections
What do you need?
- Number of Searches
- Conversion metric ($ per PV, LTV, etc.)
- PPC data
- Search engine CTR by position
What to use it for?
- Gather keywords (and optionally, the number of searches)
- Add keyword data for performance comparisons
- Or just to see performance of one channel
- With search volume and CTR by position, make assumptions
Once you have the data, showcase your skills in the grid. Anything keyword based can be part of this grid.
Jonah Stein says he’s going to talk about the five forgotten metrics:
- Customer lifetime value
- Crawl frequency – the new page
- Page views to conversion
- External links
- Webmaster Central all query stats
To manage your customer relationships, write a permanent cookie on First Touch, write First Touch data to CRM on conversion event, and capture missing data at every touch point. With that data you get more accurate LTV & ROI.
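The first-touch approach Jonah describes can be sketched in a few lines. This assumes a cookie store and a CRM record are available as simple Python objects; the function names are hypothetical, not from any particular platform:

```python
# Minimal first-touch attribution sketch. The cookie jar and CRM are
# modeled as plain dict/list stand-ins; names here are hypothetical.
import time

def record_touch(cookies: dict, source: str) -> None:
    """Write a permanent first-touch cookie only if none is set yet."""
    if "first_touch" not in cookies:
        cookies["first_touch"] = {"source": source, "ts": time.time()}

def on_conversion(cookies: dict, crm: list) -> None:
    """On a conversion event, copy first-touch data into the CRM record."""
    crm.append({"attribution": cookies.get("first_touch")})

cookies = {}
record_touch(cookies, "organic:google")  # first visit sets the cookie
record_touch(cookies, "ppc:brand")       # later visits leave it unchanged
crm = []
on_conversion(cookies, crm)
print(crm[0]["attribution"]["source"])   # organic:google
```

The key design choice is that the cookie is written once and never overwritten, so the conversion is credited to the channel that originally acquired the customer.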
Crawl frequency is one of the most meaningful metrics. Crawl frequency is a product of the search engine’s score of your page, and obfuscating it lowers index quality. Relative crawl frequency is great for diagnostics. Crawl frequency is governed by the number and quality of inbound links, content update frequency, server performance, and Sitemap settings.
Metrics for crawl frequency include:
- How often crawler visits
- Crawl depth
- Crawl saturation
- Crawl frequency rank by page (infrequent, supplemental)
- Pages that don’t get crawled have issues
How to measure crawl frequency:
- Crawl rate tracker
- Log file analysis
- Develop a custom solution
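For the log-file-analysis option, a custom solution can start as simply as counting bot hits per URL. The log lines below are fabricated examples in combined log format, and the bot check matches on user-agent string only (a real implementation should also verify the crawler's IP):

```python
# Rough crawl-frequency sketch: count Googlebot requests per URL from an
# access log. Log lines are fabricated examples in combined log format.
import re
from collections import Counter

log_lines = [
    '66.249.66.1 - - [10/Mar/2008:10:00:00 +0000] "GET /products/a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [11/Mar/2008:09:30:00 +0000] "GET /products/a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [11/Mar/2008:09:31:00 +0000] "GET /about HTTP/1.1" 200 256 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [11/Mar/2008:09:32:00 +0000] "GET /products/a HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

pattern = re.compile(r'"GET (\S+) HTTP/[\d.]+" \d+ \d+ "[^"]*" "([^"]*)"')
crawl_counts = Counter()

for line in log_lines:
    m = pattern.search(line)
    if m and "Googlebot" in m.group(2):
        crawl_counts[m.group(1)] += 1

for url, hits in crawl_counts.most_common():
    print(url, hits)  # /products/a 2, then /about 1
```

URLs that never appear in the counts are exactly the uncrawled pages the panel says have issues.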
The little-used metric of page views to conversion will show that good SEO gets the user closer to conversion. Decreased page views can be good or bad. Decreased time on page can be good or bad. And decreased time on a site can be good or bad. To measure this, you need to know the funnel for each landing page and measure success against that funnel. As for external links, measure the total number of external links and the external links per page.
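A minimal version of the page-views-to-conversion measurement, grouped by landing page, might look like this. The session records are fabricated for illustration:

```python
# Toy pages-to-conversion metric per landing page, using made-up sessions.
from statistics import mean

sessions = [
    {"landing": "/guide", "pageviews": 3, "converted": True},
    {"landing": "/guide", "pageviews": 7, "converted": True},
    {"landing": "/home",  "pageviews": 5, "converted": False},
]

by_landing = {}
for s in sessions:
    if s["converted"]:
        by_landing.setdefault(s["landing"], []).append(s["pageviews"])

for landing, views in by_landing.items():
    # Average page views a converting visitor needed from this landing page.
    print(landing, mean(views))
```

A falling average here could mean better-targeted landing pages, or it could mean visitors bailing early, which is exactly why the panel warns that the number is good or bad only in context of the funnel.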
Webmaster Central for all query stats should be looked at. Sometimes the content is not human readable. A converter for Google WMC stats can help you decipher the data. The keyword ranking is shown by directory. CMS developers and large site publishers should integrate into the back end by importing WMC query stats, importing external linking data, and displaying internal linking data.
Richard Zwicky is next, and will base his presentation in part on a recent blog post he published. Using SEOmoz as an example, he’s walking through their analytics data in Enquisite software. The program allows you to look at the locations queries are coming from and to look at multiple keywords. Once he knows what terms are bringing people to a page, he can look at what’s not performing in his target markets.
What tips or best practices do you have for measuring 2.0 data?
Jonah says that you can tag AJAX states to get a map of what’s going on and create an alternate value. That way you can tag which user has completed which action.
Laura, what method do you use to measure the lift between different search engines?
She says she bases it on models that are out there, available online. She doesn’t get special search data just because she’s at Yahoo; she has access to the same data as everyone else.
When performing human evaluation of the data, is it meaningful to look at the top 200, or the top 1,000?
Brian says that you have to position yourself to implement scalable solutions. It’s like looking at business metrics: it’s a signal of the health of the business. Richard says that the tail end is what needs to be focused on. Jonah says that if your site has a hierarchical structure, just look at the landing pages and then delve deeper into your site. It’s easier than looking at a list of 200,000 pages.
Does the panel have any recommendations for log file analysis to track spider behavior if you don’t have the resources to custom build?
Marty Weintraub, sitting in the audience, says that ClickTracks has an API for log files; however, that still requires custom development. Someone else in the audience says ClickTracks Pro can do it. Brian says some clients use programs like Omniture or Hitwise.
Is Omniture worth the cost?
Laura says it’s always worth it. Richard says some people run their whole company around it, though some customers hold back over data privacy. There’s more sampling and more opportunities to slice and dice your data in Omniture. Jonah says the threshold is whether or not there’s a full-time analyst looking at the data all day. If so, then it’s probably worth it. If it’s only looked at for two hours a month, then maybe it’s not worth the cost.