Analytics Every SEO Needs To Know

I’m stepping in late after a lunchtime interview for SEM Synergy with the awesome Jeremiah Andrick. Check that out next Wednesday on WebmasterRadio.fm.

On to the session. We’ve got Rand Fishkin, Co-Founder and CEO, SEOmoz, Inc., moderating. Our speakers are Brian Klais, Executive Vice President, Search, Netconcepts; Laura Lippay, Group Program Manager, Search Strategy, Yahoo!; Jonah Stein, Founder, ItsTheROI; and Richard Zwicky, President, Enquisite.

Brian Klais is up first and is going over a list of metrics you need to know. Identify non-yielding pages and know the rate at which each yielding page generates traffic. Look for inequality in page placement. Measure the engine yield rate. ROI and brand reach are critical metrics. The missed opportunity cost can be calculated based on the assumptions that all your unique pages should be yielding and that the average number of visitors per page applies to all pages. For more on the subject, search for “natural search KPIs”.
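
To make the missed-opportunity math concrete, here’s a quick back-of-the-envelope sketch (mine, not Brian’s) in Python. All of the numbers are hypothetical placeholders; the point is the arithmetic.

```python
# Back-of-the-envelope version of the missed-opportunity math described above:
# assume every unique page could yield visitors at the same average rate as the
# pages that already do. All numbers are hypothetical placeholders.

total_unique_pages = 50_000      # hypothetical: indexable pages on the site
yielding_pages = 12_000          # hypothetical: pages that drew at least one search visit
search_visits = 360_000          # hypothetical: total natural search visits in the period

page_yield_rate = yielding_pages / total_unique_pages            # share of pages that yield
avg_visits_per_yielding_page = search_visits / yielding_pages    # 30 visits per yielding page

non_yielding_pages = total_unique_pages - yielding_pages
missed_visits = non_yielding_pages * avg_visits_per_yielding_page  # the "missed opportunity"

print(f"Page yield rate: {page_yield_rate:.1%}")
print(f"Missed opportunity: {missed_visits:,.0f} visits")
```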

Laura Lippay is going to share how to prove the worth of SEO. She says that the Grid is the way to show what you’ve accomplished. It’s a collection of keyword-based data, stored in an Excel file or database, which can be pulled into multiple varied reports.

What can you do with it?

  • Balance SEO, PPC, and Paid Inclusion (PI)
  • Find SEO referral gaps
  • Find SEO content opportunities
  • Make SEO traffic and value projections

What do you need?

  • Keywords
  • Number of Searches
  • Conversion metric ($ per PV, LTV, etc.)
  • PPC data
  • PI
  • Algo
  • Search engine CTR by position

What to use it for?

  1. Gather keywords (and optionally, the number of searches)
  2. Add keyword data for performance comparisons
  3. Or just to see performance of one channel
  4. With search volume and CTR by position, make assumptions

Once you have the data, showcase your skills in the grid. Anything keyword-based can be part of this grid.
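
To illustrate the kind of projection the grid enables (my own rough sketch, not Laura’s actual spreadsheet), here’s a small Python example that combines search volume with an assumed CTR-by-position curve to estimate the traffic and dollar lift of moving a keyword up the rankings. The CTR curve, keywords, and values are made up.

```python
# Hypothetical CTR-by-position curve; plug in whichever published model you trust.
ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

# One row per keyword in the "grid": searches/month, current rank, target rank, $ per visit.
grid = [
    ("blue widgets",   40_000,  8, 3, 0.25),   # placeholder data
    ("widget reviews", 12_000, 15, 6, 0.40),   # placeholder data
]

for keyword, searches, current, target, value_per_visit in grid:
    current_visits = searches * ctr_by_position.get(current, 0.01)  # beyond page 1, assume ~1%
    target_visits = searches * ctr_by_position.get(target, 0.01)
    lift = target_visits - current_visits
    print(f"{keyword}: +{lift:,.0f} visits/month, +${lift * value_per_visit:,.2f}/month")
```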

Jonah Stein says he’s going to talk about the five forgotten metrics:

  1. Customer lifetime value
  2. Crawl frequency – the new page
  3. Page views to conversion
  4. External links
  5. Webmaster Central all query stats

To manage your customer relationships, write a permanent cookie on First Touch, write the First Touch data to the CRM on the conversion event, and capture missing data at every touch point. With that data, you get more accurate LTV and ROI.
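
Here’s a rough illustration (my own, not Jonah’s) of where that First Touch data pays off once it lands in the CRM: roll revenue and acquisition cost up by the original channel to get LTV and ROI per source. The record layout and figures are assumptions.

```python
from collections import defaultdict

# Hypothetical CRM rows: each converted customer carries their First Touch channel
# plus the revenue and acquisition cost attributed to them.
crm_records = [
    {"channel": "organic", "revenue": 420.0, "cost": 10.0},
    {"channel": "ppc",     "revenue": 380.0, "cost": 55.0},
    {"channel": "organic", "revenue": 150.0, "cost": 10.0},
]

totals = defaultdict(lambda: {"revenue": 0.0, "cost": 0.0, "customers": 0})
for rec in crm_records:
    bucket = totals[rec["channel"]]
    bucket["revenue"] += rec["revenue"]
    bucket["cost"] += rec["cost"]
    bucket["customers"] += 1

for channel, t in totals.items():
    ltv = t["revenue"] / t["customers"]            # lifetime value per customer
    roi = (t["revenue"] - t["cost"]) / t["cost"]   # return on acquisition spend
    print(f"{channel}: LTV ${ltv:.2f}, ROI {roi:.1f}x")
```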

Crawl frequency is one of the most meaningful metrics. CF is a product of the search engine’s score of your page. Obfuscating CF lowers index quality. Relative CF is great for diagnostics. Crawl frequency is governed by the number and quality of inbound links, the content update frequency, server performance and Sitemap settings.

Metrics for crawl frequency include:

  • How often crawler visits
  • Crawl depth
  • Crawl saturation
  • Crawl frequency rank by page (infrequent, supplemental)
  • Pages that don’t get crawled have issues

How to measure crawl frequency:

  • Crawl rate tracker
  • SEOmeter.com
  • Log file analysis
  • Develop a custom solution
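
For anyone taking the log file or custom-solution route, here’s a bare-bones Python sketch of the idea: count Googlebot hits per URL per day from a combined-format access log. The log path and the simple user-agent check are assumptions; in practice you’d also verify bot IPs.

```python
import re
from collections import Counter

# Matches a "combined" Apache/Nginx access-log line and pulls out date, path, and user agent.
LOG_LINE = re.compile(r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "GET (?P<path>\S+)[^"]*" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"')

crawl_counts = Counter()                 # (day, path) -> Googlebot hits
with open("access.log") as log:          # hypothetical log file path
    for line in log:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            crawl_counts[(m.group("day"), m.group("path"))] += 1

# Least-crawled URLs are the ones to investigate. Pages with zero hits won't
# appear here at all, so compare against a full URL list to find them.
for (day, path), hits in sorted(crawl_counts.items(), key=lambda kv: kv[1])[:10]:
    print(day, path, hits)
```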

The little-used metric of page views to conversion illustrates that good SEO gets the user closer to conversion. Decreased page views can be good or bad. Decreased time on page can be good or bad. And decreased time on site can be good or bad. To measure this, you need to know the funnel for each landing page and measure success against that funnel. As for external links, measure the total number of external links and the external links per page.
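
To make the page-views-to-conversion idea concrete, here’s a minimal sketch (again mine, with fabricated session data and URLs) that assumes you have per-session page paths and a known goal page for each landing page’s funnel.

```python
from collections import defaultdict

GOAL = "/checkout/complete"              # hypothetical end of the funnel

# Fabricated sessions: landing page plus the ordered pages viewed.
sessions = [
    {"landing": "/widgets/blue",     "pages": ["/widgets/blue", "/cart", "/checkout/complete"]},
    {"landing": "/widgets/blue",     "pages": ["/widgets/blue", "/widgets", "/widgets/red"]},
    {"landing": "/blog/widget-tips", "pages": ["/blog/widget-tips", "/widgets/blue", "/cart", "/checkout/complete"]},
]

stats = defaultdict(lambda: {"sessions": 0, "conversions": 0, "views_to_goal": 0})
for s in sessions:
    entry = stats[s["landing"]]
    entry["sessions"] += 1
    if GOAL in s["pages"]:
        entry["conversions"] += 1
        entry["views_to_goal"] += s["pages"].index(GOAL) + 1   # page views it took to convert

for landing, e in stats.items():
    rate = e["conversions"] / e["sessions"]
    avg_views = e["views_to_goal"] / e["conversions"] if e["conversions"] else None
    print(f"{landing}: conversion rate {rate:.0%}, avg page views to goal: {avg_views}")
```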

All the query stats in Webmaster Central should be looked at. Sometimes the content is not human readable; a converter for Google WMC stats can help you decipher the data. The keyword rankings are shown by directory. CMS developers and large-site publishers should integrate this into the back end by importing WMC query stats, importing external linking data, and displaying internal linking data.

Richard Zwicky is next and is basing his presentation in part on a recent blog post he published. Using SEOmoz as an example, he walks through their analytics data in Enquisite’s software. The program allows you to look at the locations queries are coming from and to look at multiple keywords. Once you know what terms are bringing people to a page, you can look at what’s not performing in your target markets.

Q&A

What tips or best practices do you have for measuring Web 2.0 data?

Jonah says that you can tag AJAX states to get a map of what’s going on and create an alternate value. That way you can tag which user has completed which action.

Laura, what do you use to measure the lift between different search engines?

She says she bases it on models that are out there, available online. She doesn’t get special search data just because she’s at Yahoo; she has access to the same data everyone else does.

When performing human evaluation of the data, is it meaningful to look at the top 200 or 1,000?

Brian says that you have to position yourself to implement scalable solutions. It’s like looking at business metrics: it’s a signal of the health of the business. Richard says that the tail end is what needs to be focused on. Jonah says that if your site has a hierarchical structure, just look at the landing pages that delve deeper into your site; it’s easier than looking at a list of 200,000 pages.

Does the panel have any recommendations for log file analysis to track spider behavior if you don’t have the resources to custom build?

Marty Weintraub, sitting in the audience, says that ClickTracks has an API for log files; however, that still requires custom development. Someone else in the audience says ClickTracks Pro can do it. Brian says some clients use programs like Omniture or Hitwise.

Is Omniture worth the cost?

Laura says it’s always worth it. Richard says some people run their whole company on it. He says that for some customers it comes back to the privacy of their data. There’s more sampling and more opportunities to slice and dice your data in Omniture. Jonah says the threshold is whether or not there’s a full-time analyst looking at the data all day. If so, then it’s probably worth it. If it’s only looked at for two hours a month, then maybe it’s not worth the cost.

Virginia Nussey is the director of content marketing at MobileMonkey. Prior to joining this startup in 2018, Virginia was the operations and content manager at Bruce Clay Inc., having joined the company in 2008 as a writer and blogger.

See Virginia's author page for links to connect on social media.


3 Replies to “Analytics Every SEO Needs To Know”

Nice article, has anyone got an example of the Excel grid file to highlight SEO opportunities? It sounds like an interesting creation. Thanks.

Virginia Nussey

Joe, I’m glad you found it useful. This was a very valuable conference, and for those who couldn’t make it, we bloggers hope these posts were helpful.

Joe Woods

Thank you for the great notes. I sure wish my boss would have let me go to this conference, but it was just too far away. This one session would have been worth the trip.
