SEO is in the Details: Bruce Busts the Boondoggle
SEO is not snake oil, not a boondoggle, not a waste. That SEO is snake oil is a point of view that has been argued over the years, sometimes even endorsed by well-known industry insiders. That's not to say that no one is selling sub-standard SEO services and giving the profession a bad name. But I believe most poor SEO can be traced back to a lack of education and a poor understanding of search marketing concepts.
A few weeks ago, the article Is Most of SEO Just A Boondoggle? sparked some lively conversation within the SEO community. While I don't often enter public debates online, this one gave me much pause for thought, as it bears on the future of search engine optimization and how our business will be conducted online.
As you all know, I have been doing SEO for many years. In that time I have seen the good, the bad and even some ugly excuses for search optimization services as they play out on Web sites both large and small.
But those points made in the article above suggest to me a different kind of confusion — this one between the objectives of SEO and the development of clean, spiderable code. I believe the remedy for this issue is to promote greater understanding through testing, observation and education.
Over the years, search engines have treated various tags in different ways. Take the Meta Keywords tag for instance. It was once an important ranking indicator. Today it holds little sway in the ranking algorithm.
This serves to illustrate just how fickle the search engines can be. Rather than trying to predict how important each HTML tag is to search engine rankings, I find more consistent results come from using HTML as intended.
It used to be that a site needed every Meta tag in order to rank. Now that’s not the case. No one element will cause a site to rank, but it’s crucial to remember that SEO is effective as a whole. When I read that an optimized H1 tag is a waste of time, I immediately wonder if an H1 tag can hurt your site. How could it ever be wrong to use a valid HTML construct the way it was supposed to be used? The search engines have certainly changed their minds before. Make your site all that it can be by using HTML as it was intended.
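To ground the point, here is a hypothetical fragment showing headings used the way HTML intended: a single descriptive H1 as the page's true headline, with subheadings and body copy supporting it (the product names are purely illustrative):

```html
<!-- One descriptive H1 per page, acting as the page's headline -->
<h1>Handmade Leather Hiking Boots</h1>

<!-- Subheadings break the topic into supporting sections -->
<h2>Why Full-Grain Leather Lasts Longer</h2>
<p>Body copy that expands on the headline's subject...</p>
```

Used this way, the markup is valid, readable and semantically honest, so there is nothing for an engine to penalize even if the ranking weight of any one tag shifts over time.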
As Vanessa Fox pointed out in her response to the article above, XML Sitemaps serve an important and necessary function. An XML file does not promote PageRank transfer since the search engine bot will not be brought to pages through a link; a crawlable site map is much better suited for this purpose. However, an XML file often causes pages to be added to the index, which is an important objective of search engine optimization.
There’s something to be said here about the difference between optimizing a small site versus a large site. While the methodology is much the same, there are special considerations that large sites have — a prominent one being the challenge of getting pages found and indexed by search engines. In these situations, nothing is as helpful as an XML Sitemap.
Also, consider the use of an XML Sitemap when redirecting one site to another. A 301 is great, but the old page needs to be spidered for the 301 to be found and for the new target page to acquire the PageRank. By submitting your old XML Sitemap, you help the 301s get found faster. That's worth something to me!
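As a sketch of that redirect scenario: an XML Sitemap for the old domain, listing the URLs that now 301 to the new site, prompts the bot to recrawl them and discover the redirects (the domain and date below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old URL that now 301-redirects to the new site;
       listing it prompts a recrawl so the redirect is found -->
  <url>
    <loc>http://www.old-example.com/products.html</loc>
    <lastmod>2009-06-01</lastmod>
  </url>
</urlset>
```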
Keywords in URLs and Image Links
In SEO, nothing is ever black and white. Keyword-rich URLs are helpful to ranking goals, and so is domain age. These two factors must be weighed and balanced before making any changes to a site. However, it would be misleading to say that keyword-rich URLs serve no purpose.
First of all, SEOs have a responsibility to make the best Web site possible, not just for search engines. Keyword-rich URLs stand out as relevant to human users scanning search results, thus serving a purpose for human users. Furthermore, when a link uses a URL as anchor text, any keywords included in that URL will add increased relevancy as a link value signal to the search engine. Keyword-rich URLs add value to a site in more than one way.
As for the point about image links, they are followed, but our research suggests that the image's ALT text may not carry the same weight as anchor text. This topic requires a much longer explanation than this post would allow.
As the body copy makes up the majority of a page’s crawlable content, it makes sense that it would be an important consideration of the search engine optimization process. The inclusion of targeted keywords throughout the body copy is a signal of relevance for search engine rankings.
In the past, keyword density was used as a hard-line way of specifying keyword use objectives for a page. Today, keyword density is only valuable when characterizing populations of pages and determining the target for a specific phrase.
There is no magic density, but there are winning page footprints. Through keyword distribution analysis — which uses density as just one of many considerations — an SEO can compare their site's page with a competitor's page and understand how keyword usage on a page relates to rankings. Few people use keyword distribution and density metrics correctly or understand how to apply them, and density alone is nearly useless, but it does have its place.
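As a minimal sketch of the density half of that analysis (this is not our full distribution tool, and the page text and phrase below are made up), a phrase's density can be computed as the share of a page's words that belong to occurrences of the phrase:

```python
import re


def keyword_density(text, phrase):
    """Fraction of the page's words that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count whole-phrase matches with a sliding window over the word list
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)


page = "Widgets for sale. Our blue widgets are the best blue widgets around."
print(round(keyword_density(page, "blue widgets"), 3))  # prints 0.333
```

Run against your own page and a handful of top-ranking competitors, the same function lets you compare footprints rather than chase a single magic number.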
Linking Structure and Site Architecture
The proper use of the nofollow tag has been discussed at great length recently. Overall, nofollow links are a waste… unless you know when to use them. Selective nofollow use is reasonable if used to manage the dilution of theme-architected content; this is a technique used in the practice of siloing a site.
I think this is a nice opportunity to address arguments that theming a site is a waste. We have documented cases of tripling traffic within hours of uploading siloed, or themed, sites that are properly architected for SEO.
According to Google’s head of Web spam, Matt Cutts: “Spend your time on making good site architecture so PageRank just flows wherever you want.”
In our experience, using selective nofollow links has been shown to help support a theme-architected site. Removing nofollow tags where the targeted page is link poor has also been known to help.
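Here is a hypothetical fragment of what that looks like in a silo's navigation (the URLs are illustrative, not from any real site): in-theme links are left followed, while a link out to an off-theme page carries nofollow to limit dilution of the silo's theme.

```html
<!-- In-theme link within the "boots" silo: followed, keeps PageRank on topic -->
<a href="/boots/waterproofing.html">Waterproofing Your Boots</a>

<!-- Off-theme utility page: nofollow to limit theme dilution -->
<a href="/privacy-policy.html" rel="nofollow">Privacy Policy</a>
```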
As always, there is no easy rule to follow to get successful SEO results. The best results require a skilled professional with background, experience and a strong ability to analyze a site’s strengths and weaknesses.
Unfortunately, many SEOs do not understand how to perform the duties associated with the job. This young industry is gaining mainstream attention, attracting a wave of inexperienced newcomers to the profession. Many are junior webmasters who claim to be SEOs, and clients simply do not know to challenge those claims.
Some of them go to forums, read posts that are ancient, and then take it as gospel. They seldom run their own experiments, they do not have firsthand knowledge gained through observing cause and effect, and they often regurgitate the “knowledge” gained by playing telephone.
As more and more new blood enters the SEO arena, the availability of high-quality education will become increasingly important. Do-it-yourself learning plays a critical role in the development and advancement of SEO methodology, but as with anything available on the Internet, credibility and reliability can be suspect. Knowledge transfer and sharing by those with trustworthy history and experience in the industry will contribute much to the ethical and educated growth of Internet marketing.
The uneducated, those who make a living guessing and doing only part of the job, and the few who teach but cannot do: these are the people aiming to knock down SEOs who succeed. Fear, uncertainty and doubt are great tools for the snake oil SEO. But trust that SEO does work; if it did not, the snake oil sellers would have no real results to imitate.
Search engine optimization is only a boondoggle and a waste if done by someone lacking SEO skills.
The Big Picture
In a very competitive space, every little thing can make a difference. There is no HTML element too small or tactic too insignificant when the strength of SEO comes down to the combined whole.
I often say that optimization is about making a site the least imperfect, a strategy which must take into account every last bit of a site. Why object to on-page code optimization when doing it right can’t hurt you?
Education and experimentation will prove the most reliable road to SEO success. And you can’t seek to understand SEO only as it is today — you must also try to predict the SEO of the future. Every week SEO is a new industry. In the end it comes down to this: SEO is not the blind application of hard-and-fast rules, but rather SEO aims to make every page the best it can be.