SEO Strategy for Semantic Search
The advent of blended search in 2007 forced SEOs to rapidly reevaluate their strategies. Blended search, which incorporates video, images, news and other result categories into general Web search results, meant that along with keyword-rich content, engagement objects represented an additional opportunity for SERP real estate.
Another potential game changer, semantic search, has been buzzed about for years, and with big boy Google beginning to implement some semantic technology, it looks like semantic search could be making its move to the mainstream. So is it time to rethink SEO strategy again, this time for semantic search?
Yesterday I wrote about Ask.com and their progressive search technology, including impressive strides in semantic search. One commenter, Phill Midwinter (aka SEO Phill), brought up the implications semantic search could have on the search engine optimization community:
What I find odd is that despite the improvements taking place with semantic search technology, the majority of SEOs still insist on using a keyword strategy that doesn’t take semantics into account.
ASK is doing some great work and has been innovating steadily for some years now – but the recent slew of semantic improvements from Google are also being taken at face value and strategy isn’t changing.
You only need to look at the recent PageRank sculpting fiasco […] to see that SEOs are losing touch instead of keeping up.
Well isn’t that a frightening prediction. I decided that a semantic search SEO strategy was worth a deeper look, so I pinged Phill to find out more. And omgosh, I had a lot to learn.
Where’s Semantic Search Today?
First, some background. Latent Semantic Analysis (LSA) uses statistical algorithms to figure out what Web content is about based on the words it uses and their relationships with the other words on the page. By comparing all the words and the way they're used, LSA can determine what a page is actually about and base result rankings on that, rather than on the keyword matching that powers search today. In this way, a search engine using LSA can return the most relevant results regardless of query word order, and regardless of whether the result actually includes the queried terms.
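To make that a little more concrete, here's a minimal sketch of the math behind LSA: build a term-document matrix from a toy corpus (the documents and words below are made up for illustration), factor it with singular value decomposition, and keep only the top few latent "concepts." Documents about the same topic end up close together in concept space even when their exact wording differs.

```python
import numpy as np

# Toy corpus: two documents about car engines, two about baking.
docs = [
    "the car engine needs oil and regular maintenance",
    "check the car engine oil before a long drive",
    "bake the cake until the top turns golden",
    "mix flour and sugar then bake the cake",
]

# Build a plain term-document count matrix.
vocab = sorted({word for doc in docs for word in doc.split()})
row = {word: i for i, word in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, doc in enumerate(docs):
    for word in doc.split():
        A[row[word], j] += 1

# SVD factors the matrix into term-concept and concept-document spaces.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the top k latent "concepts" and represent each document
# as a vector in that reduced concept space.
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k, :]).T

def cos_sim(a, b):
    """Cosine similarity between two concept-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

With this in place, the two car documents score as more similar to each other than to either baking document, which is exactly the "what is this page actually about" signal the ranking layer would use.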
Of course, none of today's major search engines works solely from latent semantic indexing (LSI, the application of LSA to a search index). Rather, LSA is being layered on top of the more traditional ranking factors. Phill explained to me (emphasis mine):
The slew of recent upgrades [Google] released are all based on this LSI layer over the top of their existing index. What that means, is that Google is storing an “ontology” or a semantic profile. […] They expose their ontology in four key areas right now: the Wonder Wheel (limited), search suggestions at the bottom of the SERP, AdWords Keyword Tool and their search-based keyword tool.
Off-Page SEO Recommendations
So this is where things get science-y. Phill explained that if you collate the data from the four sources above, you'll have yourself a map of Google's semantic profile for a specific Web site. When Phill took this profile and compared it to a map of the site's inbound links, it turned out the two shapes matched. From that, Phill deduced that Google is using LSA to analyze link equity as well as to serve queries. If that's the case, an SEO can use a site's semantic profile as a guide for the link building strategy. Of course, SEOs have always been concerned with getting the most relevant links to a site, but with a semantic profile one can tailor the shape of a site with the help of a handy guide map. Oh, and of course, as LSA becomes more predominant, anchor text won't be the end-all-be-all of link value; all the content on the linking page (or even the linking site) will hold sway.
On-Page SEO Recommendations
And this is where it gets a little scary. Search engine optimizers have become pretty accustomed to using a keyword list as a guide for their on-page SEO efforts. But because LSA looks at the content of the site as a whole and at the relationships between different concepts, firing at a narrow list of keywords is no longer the best approach. Specific keyword mentions aren't the key, because as far as LSA is concerned, a truly strong site covers a topic in a natural and holistic way. Phill recommends transforming the concise keyword list into a list of thousands of long tail keywords that guides the content of the site rather than dictating it. Given broader content topics instead of mandatory phrases, copywriters can write naturally, following the guidelines of the site's ideal semantic profile. According to Phill, this approach results in a site that reflects the way people actually think and search. With the broad shape of the site in place, fine-tuning can be done through the site's inbound links.
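Going from a concise keyword list to thousands of long tail phrases sounds daunting, but one common way to bootstrap it is combinatorial expansion of seeds and modifiers. The seeds and modifiers below are made-up placeholders; in practice they'd come from the site's semantic profile and keyword research tools.

```python
from itertools import product

# Hypothetical seed topics and modifiers for illustration only.
seeds = ["running shoes", "marathon training"]
prefixes = ["best", "beginner", "lightweight"]
suffixes = ["for flat feet", "under 100 dollars", "reviews"]

# A handful of seeds and modifiers expands into a long tail list meant
# to guide (not dictate) what the copywriters cover.
long_tail = [f"{pre} {seed} {suf}"
             for pre, seed, suf in product(prefixes, seeds, suffixes)]

# 3 prefixes x 2 seeds x 3 suffixes = 18 phrases here; realistic inputs
# scale the same way into the thousands Phill describes.
```

The point isn't to stuff every phrase into the copy; the list sketches the shape of the topic so writers can cover it naturally.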
Search may not depend on semantic technology now, but I do believe it’s worth thinking about today.