SEO in a Two Algorithm World: Pubcon Keynote by Rand Fishkin

Rand dedicates this presentation to Dana Lookadoo, who will always be with us.

This author’s TL;DR take: On top of traditional SEO factors (ranking inputs such as keyword targeting, quality and uniqueness, crawl/bot friendliness, snippet optimization, and UX/multi-device optimization), SEOs need to optimize for searcher outputs (such as CTR, long clicks, content gap fulfillment, amplification and loyalty, and task completion success).

Here’s where you can get the presentation: http://bit.ly/twoalgo

Remember when we only had one job? We had to make perfectly optimized pages, and the search quality team would get them ranked, using links as a major signal. By 2007, link spam was ubiquitous. Every SEO is obsessed with tower defense games because we love to optimize. Even in 2012, it felt like Google was making liars out of the white hat SEO world (a line Rand credits to Wil Reynolds).

Rand says that today, that statement isn’t true anymore. Authentic, great content is rewarded by Google better than it ever has been. Google has erased old-school practices by combating things like link spam, and they’ve leveraged fear and uncertainty around penalization to keep sites in line. Disavowing has become so fraught that many of us are killing links that provide value to our sites because we’re so afraid of penalties.

Google Has Become Smarter

Google has also become good at figuring out intent. They look at language and not just keywords (example search shown: “Rand F movie Star Trek”). They predict diverse results (example: “books for startups”). They’ve figured out when we want freshness (example: “best conferences”).

They can separate navigational from informational queries. They connect entities to topics and keywords. Even brands have become a form of entity. Bill Slawski has noted that Google mentions brands in a number of their filed patents.

Google’s actions are now much more in line with their public statements. For the most part, their policy matches the best way to do search marketing today.

Google’s Stance on Machine Learning Has Changed

During these advances, Google’s search quality team underwent a revolution. Early on, Google rejected machine learning in their organic ranking algorithm. Google said that machine learning didn’t let them own, control and understand the factors in the algorithm. But more recently, Amit Singhal’s comments suggest some of that has changed.

In 2012, Google published a paper on how they use machine learning to predict ad click-through rate. Google engineers call it their SmartASS system (apparently that’s ACTUALLY the name of the system!). By 2013, Matt Cutts was talking publicly at Pubcon about how Google could be using machine learning (ML) in organic search.

As ML takes over more of Google’s algorithm, the underpinnings of the rankings change. Google is public about how they use ML in image recognition and classification. They take factors they could use to classify images and then add training data (things that tell the machine something is a cat, dog, monkey, etc.), and there’s a learning process that gets them to a best-match algorithm. Then they can apply that pattern to live data all over.

Jeff Dean’s slide presentation on Deep Learning is a must-read for SEOs; Rand says it’s essential reading and not too challenging to consume. Jeff Dean is a Google Fellow and the subject of a running in-joke at Google: “The speed of light in a vacuum used to be about 35 miles per hour. Until Jeff Dean spent a weekend optimizing the physics.”


Bounce rate, clicks, dwell time: all of these can serve as quality signals in the machine learning process, and the algorithm tries to emulate the good SERP experiences. We’re talking about an algorithm that builds algorithms. Googlers don’t feed in ranking factors; the machine determines those itself. The training data is good search results.
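
To make that pattern concrete, here is a toy sketch of a “learned” ranker in Python with scikit-learn. The engagement features, the numbers, and the choice of model are all this recap’s assumptions for illustration; this is not Google’s system.

```python
# Illustrative only: a toy "learned ranker" trained on engagement signals.
# Features, data, and model choice are made up for demonstration.
from sklearn.ensemble import GradientBoostingClassifier

# Each row: [relative_ctr, long_click_rate, bounce_rate, dwell_seconds]
training_results = [
    [1.3, 0.70, 0.20, 95],   # result that satisfied searchers
    [1.1, 0.62, 0.30, 80],
    [0.6, 0.25, 0.75, 12],   # result searchers quickly abandoned
    [0.4, 0.18, 0.80, 8],
]
labels = [1, 1, 0, 0]  # 1 = good SERP experience, 0 = poor experience

model = GradientBoostingClassifier().fit(training_results, labels)

# The "algorithm" that emerges is whatever the model learned, not a
# hand-tuned list of ranking factors fed in by engineers.
print(model.predict_proba([[1.0, 0.55, 0.35, 60]]))
```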

What Does Deep Learning Mean for SEO?

Googlers won’t know why something ranks or whether a variable is even in the algorithm. Just between us, doesn’t that sound a lot like the things Googlers say now? ;)

The query success metrics will be all that matters to machines (a rough sketch of computing a couple of them follows the list):

  • Long to short click ratio
  • Relative CTR vs. other results
  • Rate of searchers conducting additional related searches
  • Sharing/amplification rate vs. other results
  • Metrics of user engagement across the domain
  • Metrics of user engagement on the page (How? By using Chrome and Android)

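None of these metrics are exposed directly by Google, but here is a rough sketch of how a couple of them could be computed from your own hypothetical click-log data; the field names and the 30-second long-click cutoff are assumptions, not anything Google has published.

```python
# Rough sketch: computing a few "searcher output" style metrics from a
# hypothetical click log. Field names and thresholds are assumptions.
LONG_CLICK_SECONDS = 30  # assumed cutoff separating long from short clicks

clicks = [
    {"query": "best conferences", "dwell_seconds": 4,   "shared": False},
    {"query": "best conferences", "dwell_seconds": 210, "shared": True},
    {"query": "best conferences", "dwell_seconds": 95,  "shared": False},
]

long_clicks = sum(1 for c in clicks if c["dwell_seconds"] >= LONG_CLICK_SECONDS)
short_clicks = len(clicks) - long_clicks

long_to_short_ratio = long_clicks / max(short_clicks, 1)
amplification_rate = sum(c["shared"] for c in clicks) / len(clicks)

print(f"Long-to-short click ratio: {long_to_short_ratio:.2f}")
print(f"Amplification rate: {amplification_rate:.0%}")
```
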
If results on a SERP perform well on metrics like these, Google will keep surfacing them. We’ll be optimizing more for searcher outputs. These are likely to be the criteria on-site SEOs optimize for in the future.

OK, but are these metrics affecting us today? In 2014, Moz did a queries-and-clicks test. Since then, it’s been much harder to move the needle with raw queries and clicks; Google is catching on to manipulation through raw clicks and queries.

At SMX Advanced, Gary Illyes said that using clicks directly in rankings would not make much sense with that much noise. He said there were people producing noise in clicks, calling out Rand Fishkin by name. Case closed! Or is it …?

But what if we tried long clicks vs. short clicks? At 11:39 a.m. on June 21st, Rand asked people to run a test: click result No. 1 and quickly bounce back, then click and dwell on the No. 4 result. The No. 4 result stayed at SERP position No. 1 for about 12 hours. This tells us that searcher outputs affect rankings. (P.S. This is hard to replicate. Don’t do it, because it’s dark magic.)

What you should be doing instead is making your result something people naturally want to click in the SERP.

A Choice of Two Algorithms

This is why Rand says we’re optimizing for two algorithms. We have to choose how we’re balancing our work. Hammer on signals of old? They still work. Links still work. Anchor text still moves the needle. But we can see on the horizon more clearly than ever before where Google is going.

Classic on-site SEO optimizes for ranking inputs:

  • Keyword targeting
  • Quality and uniqueness
  • Crawl/bot friendliness
  • Snippet optimization
  • UX/multi-device optimization

New on-site SEO optimizes for searcher outputs:

  • CTR
  • Long clicks
  • Content gap fulfillment
  • Amplification and loyalty
  • Task completion success

Using both matters because there are two algorithms.

New Elements of SEO

Let’s talk about the five new elements of modern SEO.

1. Punching above your average CTR

Optimizing the title, meta description and URL a little for keywords but a lot for clicks. If you rank No. 3 but you can boost your CTR, you can earn a boost in ranking. Every element counts. Do searchers recognize and want to click your domain? Does the URL seem compelling? Do you get a brand drop-down?


Drive up CTR through branding or branded searches and it may give you an extra boost. Branding efforts (such as advertising on TV, radio and PPC) have an impact on CTR. Brand budget lifts relative click-through rate and all sorts of other ranking signals, and that lift likely accounts for part of the ranking benefit.
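
One practical way to act on this is to compare each query’s click-through rate against a baseline CTR for its average position, for example from a Google Search Console export. A minimal sketch follows; the baseline curve and the rows are invented for illustration.

```python
# Sketch: flag queries whose CTR beats the average for their ranking position.
# The baseline curve is invented; derive a real one from your own data.
BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

# (query, avg_position, impressions, clicks) as exported from Search Console
rows = [
    ("seo keynote recap", 3, 1200, 168),
    ("two algorithm seo", 4, 900, 27),
]

for query, position, impressions, clicks in rows:
    ctr = clicks / impressions
    expected = BASELINE_CTR.get(round(position), 0.02)
    verdict = "punching above" if ctr > expected else "below"
    print(f"{query}: CTR {ctr:.1%} vs expected {expected:.1%} -> {verdict} average")
```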

With Google Trends’ more accurate, customizable ranges, you can actually watch the effects of events and ads on search query volume. For example, there’s a spike in “fitbit” queries after Fitbit runs ads during NFL Sunday games.

2. Beating fellow SERP listings on engagement

Together, pogo-sticking and long clicks might largely determine where you rank (and for how long). What influences them? Here’s an SEO’s checklist for better engagement:

  • Content that fulfills the searcher’s conscious and unconscious needs
  • Speed, speed, and more speed
  • Delivering the best UX on every browser
  • Compelling visitors to go deeper into your site
  • Avoiding features that annoy or dissuade visitors

Example: The New York Times has high-engagement graphics that ask visitors to draw their best-guess finish of a graph.

3. Filling gaps in visitors’ knowledge

Google’s looking for signals that show a page fulfills all a searcher’s needs. ML models may note that the presence of certain words, phrases and topics predict more successful searches. Rankings go to pages/sites that fill the gaps in searchers’ knowledge. TIP: Check out Alchemy API or MonkeyLearn. Run your content through them to see how it performs from an ML perspective.
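
Short of running pages through those services, a crude stand-in for the same idea is to check which related terms and subtopics (pulled from your own keyword research) a page never mentions. The term list and page text below are invented for illustration.

```python
# Crude content-gap check: which researched subtopics does the page never mention?
# The term list and page text are placeholders for illustration.
related_terms = {"dwell time", "pogo-sticking", "long click", "task completion"}

page_text = """
Optimizing titles and meta descriptions for clicks can lift CTR,
and long click behavior tells Google the visit satisfied the searcher.
""".lower()

missing = {term for term in related_terms if term not in page_text}
print("Subtopics the page never covers:", missing or "none")
```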

4. Earning more shares, links and loyalty per visit

Data from Buzzsumo and Moz show that very few articles earn shares/links, and that these two have no correlation. People share a lot of stuff they have never read. Google almost definitely classifies different kinds of SERPs differently. A lot of shares on medical information, for instance, won’t move the result up in ranking; accuracy will be more important.

A new KPI: shares and links per 1,000 unique visits, i.e., (shares + links) divided by unique visits, then multiplied by 1,000.
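
As a quick worked example of that KPI (all numbers invented):

```python
# Example calculation of the "shares + links per 1,000 unique visits" KPI.
# All numbers are invented for illustration.
unique_visits = 25_000
shares = 180
links = 20

kpi = (shares + links) / unique_visits * 1000
print(f"{kpi:.1f} shares + links per 1,000 unique visits")  # 8.0
```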

Knowing what makes people return or prevents them from doing so is critical, too.

We don’t need better content; we need 10X content (i.e., content that is 10 times better than the best currently out there).

5. Fulfilling the searcher’s task (not just their query)

Task = what the searcher wants to accomplish when they make that query. Google doesn’t want a multi-search path of continually refined queries. They want a broad search for which they fill in all the steps so you can complete your task.

The search engine might use the clickstream data to help rank a site higher even if it doesn’t have traditional ranking signals. A page that answers the initial query may not be enough, especially if competitors do allow task completion.

Algo 1: Google

Algo 2: Subset of humanity that interacts with your content (in and out of search results)

“Make pages for people, not engines” is terrible advice.

Engines need a lot of the things we’ve always done, and we’d better keep doing them. People need additional things, and we’d better do those, too.

Bonus links:

Virginia Nussey is the director of content marketing at MobileMonkey. Prior to joining this startup in 2018, Virginia was the operations and content manager at Bruce Clay Inc., having joined the company in 2008 as a writer and blogger.



Replies to “SEO in a Two Algorithm World: Pubcon Keynote by Rand Fishkin”

More and more, SEO can be summed up in the term user engagement. Change is the only constant!

John Alexander

Great post, Virginia! A couple of questions:
1. Is a “long click” the same as a visit with relatively high dwell time?
2. Are searchers doing the training that “teaches” the algorithm, or is all of that done at Google before they publish the update? Machine learning is largely dependent on feedback, which dwell time/CTR and other “searcher outputs” provide; however, are those outputs simply teaching these algorithms about the quality of the results, or are the outputs actually helping the algorithm to learn? If it’s the latter, then personalization will begin to increase exponentially, and the usefulness of API ranking data will become less and less meaningful, and on-site metrics will start to become more important.

Virginia Nussey

Hi John! Thanks! To your first question, as I understood it, yes: “long click” = a long time before hitting back to the search results. And to your second question, what timing, considering that RankBrain has just been confirmed. I don’t know how any of it works, but for what it’s worth, it appears that the machine learning bits are not algorithm-wide, but rather involved in defined areas … or in some way its effects are predictable by the engineers (see Gary Illyes’ comment: https://twitter.com/methode/status/658735811886628864). I also suspect, as you do, that personalization is only going to soar from here.

