SMX East 2011: Google Survivor Tips
Matt McGee is moderating. SMX asked speakers from favorite sessions at SMX Advanced to update their presentations. That’s what this track is all about.
For this session, the Panda update is the main focus. What was the impact and how did some sites turn around?
Mark Munroe, @markemunroe, starts us off.
Feb. 24: BOOM. Panda strikes.
It was the most significant change to organic search algorithms since the introduction of PageRank and link reputation. Whether or not you were hit by Panda, you need to pay attention.
Factors that got sites hit: content explosion, thin content, heavy content aggregation, a heavy ad-to-content ratio, article and Q&A farms.
Pre-panda Google: search relevancy defined by content and links (anchor text).
SEOs were always looking for ways to create new content and new pages; even if the users those pages attracted were less targeted, the pages still made money (some conversions and clicks).
Post-Panda: SERP positions are modified based on the site’s overall quality. He believes it’s primarily determined by how users interact with your site from the SERP.
Besides the block button, what else can a user do to your site?
When it comes to the SERP user experience, Google knows more than you.
What do you know?
What does Google know?
A user searches for Android app reviews > clicks on topandroid.com > finds no reviews > returns to the Google SERP.
To understand metrics that Google is interested in, think like a Google product manager. Likely metrics indicating user experience:
- G bounce
- Query behavior after a G bounce
- Avg. time before a G bounce
- Click-through rates
- Repeat visits
Does it matter whether the signal is behavioral (vs. content-based)? Yes, because behavioral signals depend on data collection, and it takes time to gather enough new, good data for statistical significance. Recovery is therefore slow, so address the pages that drive you traffic first.
SEO vs. user experience engineers: in the past they’ve had conflict, but it’s more important now that they work together than ever.
Survey your users, and make sure the respondents represent your overall traffic coming from Google. Surveys are cheap and fairly easy to implement.
Do actual usability testing starting on the SERP. Use keywords from analytics representative of your traffic. Create scenarios that test the key questions people have when they come to your site.
Do they have a sense of trust and understanding of what your site’s about? Change content or delete keywords where traffic isn’t delivering. If you allow comments and UGC, get good spam filtering. Spammers can bring unwanted traffic. Beware hidden content behind buttons that draw traffic that will bounce.
- Content should be very focused on the title.
- Avoid bringing less-relevant traffic.
- If you have content that doesn’t deliver, apply noindex, follow to the page – but carefully, so the tag doesn’t end up on the wrong page.
- Link freely to relevant content – if you can’t give the user what they want, send them where they can get it.
- Don’t annoy the user, like with too many ads.
- Site uptime and performance are extremely important.
- Give a good mobile experience. This can dramatically improve the experience and metrics.
- Standard bounce rates as reported by most analytics packages are extremely flawed, and time on site is equally flawed. You still need to track and analyze these metrics, but understand their limitations.
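To make the noindex, follow advice concrete, here is a minimal sketch of checking whether a page already carries that robots meta tag before you change it. The helper and the sample HTML are assumptions for illustration, not anything from the talk.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Pull the content of a <meta name="robots"> tag, if present."""

    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "").lower()

def robots_directives(html_doc):
    """Return the robots directives string, or None if the tag is missing."""
    finder = RobotsMetaFinder()
    finder.feed(html_doc)
    return finder.robots

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
directives = robots_directives(page)  # "noindex, follow"
```

Running a check like this across templates before and after a change is one way to catch the “tag on the wrong page” mistake Mark warns about.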
Metrics he’d like to see:
- 15 sec bounce
- 30 sec bounce
- 1 min bounce
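A minimal sketch of how these timed-bounce buckets could be computed from raw per-visit dwell times. The record format and the sample numbers are assumptions for illustration; Mark only named the thresholds.

```python
def timed_bounce_rates(dwell_times, thresholds=(15, 30, 60)):
    """Return {threshold_seconds: fraction of visits that left within it}."""
    total = len(dwell_times)
    if total == 0:
        return {t: 0.0 for t in thresholds}
    return {
        t: sum(1 for d in dwell_times if d <= t) / total
        for t in thresholds
    }

# Hypothetical dwell times in seconds before the user returned to the SERP.
rates = timed_bounce_rates([5, 12, 45, 200, 8, 90])
# rates[15] == 0.5 — half the visits bounced within 15 seconds
```

Bucketing by time like this sidesteps the single, flat bounce-rate number that most analytics packages report.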
Shopping sites, directory sites, affiliate sites are industries that got hit significantly.
It appears subdomains are treated as distinct sites.
Alan Bleiweiss, @alanbleiweiss, is next with info on how to inoculate your sites so you don’t feel the effect after the next update.
Myopic SEO: focuses only on Google-based magic-bullet methods.
Sustainable SEO: focuses on user experience as seen through the eyes of search bots and algorithms.
- Topical cross contamination
- Text anorexia
- Internal link poisoning
- Unnatural anchor text
- Consistent signals on topical focus?
- Does this confirm or confuse focus?
- Does this page overwhelm the senses?
- Off-site diversity (links, social, reviews)
This page has tons of links and no topical focus. It overwhelms the senses. The main content is small and dwarfed by the links and while it’s good to give users options, that’s not what the page should be about.
A page with sustainable SEO:
- Section specific sub-navigation
- Microdata breadcrumbs
- High quality topic focused unique content
- Main area content
Your keywords tell a story of myopic vs. sustainable SEO. If brand terms and competitive phrases dominate, you’re probably not helping users. Understand your site. Sustainable SEO keywords have a close link-to-root ratio, a mix of high value and mid-value phrases.
With sustainable SEO you’ll find traffic coming from sites besides Google (20% to 40%).
Sustainable SEO cares about the future.
- What are users doing on the Web today?
- What myopic SEO technique is Matt’s next target?
- What are the biggest emerging tech trends?
The next big SEO thing?
- Identify authority tweeters
- More diversity of deep info
- People profiles
- Page segmentation
- Better topical identification
- Better spammy behavior detection
He thinks schema.org will be a ranking factor – helping search engines understand what a site’s about.
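As a rough illustration of the schema.org markup Alan expects to matter, here is a sketch that emits a microdata snippet for an article. The `name` and `author` properties come from schema.org’s vocabulary; the helper function itself is hypothetical, not from the talk.

```python
from html import escape

def article_microdata(name, author):
    """Build a schema.org Article microdata block (illustrative only)."""
    return (
        '<div itemscope itemtype="http://schema.org/Article">\n'
        f'  <h1 itemprop="name">{escape(name)}</h1>\n'
        f'  <span itemprop="author">{escape(author)}</span>\n'
        '</div>'
    )

snippet = article_microdata("Surviving Panda", "Alan Bleiweiss")
```

The point is simply that typed properties tell search engines explicitly what the page is about, rather than leaving them to infer it.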
Micah Fisher-Kirshner is next and will talk about dealing with algo updates. He has a checklist for events.
1. Is the data fully in?
- Maintain good relationship with ops
- Massive flexibility within the org
- Utilize Google Analytics hourly reports with advanced segments
2. Who else is affected?
- Communication with other teams is essential
- Find the limits of the event
3. Algo update rumors?
- Read, read, read
- Focus on forums and quickly updated news sites
4. What was recently launched?
- Keep a log
- Bug the engineers
- Watch for rollbacks that undo critical changes
- Even if you find one issue, it may not be the only problem
5. What areas are affected?
- Segment any way you can
6. Did anything break?
- Broken or forgotten processes can lead to a broken site that looks like black hat SEO
- Back-end functions are easiest to miss
- Worker transitions always miss certain processes
- Go back to ops team throughout
7. Who’s talking?
- If an algo update is confirmed, look for what’s available on it
- Recognize regulars, scrutinize strangers, and know that jerks will be jerks – look beyond that and listen
8. Which sites are dropping?
- Your ranking data shows severity of impact
- Competitive ranking data is essential, especially for the sites that are surviving
9. What are the theories?
- Think like a Black Hat to understand what the search engine is trying to address
- Look at strong factors and find confirmation examples
- Where you fit will help you determine what to fix on your site
10. What theories fit?
- Find out everything you can
- Work your biz connections
- Read blogs for in-depth analysis
- Jot down likely possibilities – make sure you have enough data and get out in front fast before others
11. What can we do to recover?
- Think like a search engine
- Learn to love statistics
- Look at coding clues
- Do a/b testing, not just guessing and seeing what sticks
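Putting “learn to love statistics” and A/B testing together, here is a minimal two-proportion z-test sketch for deciding whether a variant’s bounce rate really differs. The sample counts and the 0.05 cutoff are assumptions for the example, not figures from the session.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

def two_sided_p(z):
    """Two-sided p-value from the standard normal CDF."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: variant A bounced 520 of 1000 visits, variant B 460 of 1000.
z = two_proportion_z(520, 1000, 460, 1000)
significant = two_sided_p(z) < 0.05  # True for these made-up numbers
```

A check like this is the difference between “guessing and seeing what sticks” and knowing a change actually moved the metric.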
Now it’s Q&A time and Alan wants to start by disagreeing with the recommendation to de-index content. Don’t miss out on opportunities to have good content.
Mark agrees that you don’t want to throw out content that has potential to rank – he was talking about algo generated content.