SEO is in the Details: Bruce Busts the Boondoggle
SEO is not snake oil, not a boondoggle, not a waste. The claim that SEO is snake oil has been argued over the years, sometimes even endorsed by well-known industry insiders. That's not to say no one is selling sub-standard SEO services and giving the profession a bad name. But I believe most poor SEO can be traced back to a lack of education and a poor understanding of search marketing concepts.
A few weeks ago, the article Is Most of SEO Just A Boondoggle? sparked some lively conversation within the SEO community. While I don't often enter public debates online, this one gave me pause because it bears on the future of search engine optimization and how our business will be conducted online.
As you all know, I have been doing SEO for many years. In that time I have seen the good, the bad and even some ugly excuses for search optimization services as they play out on Web sites both large and small.
But those points made in the article above suggest to me a different kind of confusion — this one between the objectives of SEO and the development of clean, spiderable code. I believe the remedy for this issue is to promote greater understanding through testing, observation and education.
Over the years, search engines have treated various tags in different ways. Take the Meta Keywords tag for instance. It was once an important ranking indicator. Today it holds little sway in the ranking algorithm.
This serves to illustrate just how fickle the search engines can be. Rather than trying to predict how important each HTML tag is to search engine rankings, I find more consistent results come from using HTML as intended.
It used to be that a site needed every Meta tag in order to rank. Now that’s not the case. No one element will cause a site to rank, but it’s crucial to remember that SEO is effective as a whole. When I read that an optimized H1 tag is a waste of time, I immediately wonder if an H1 tag can hurt your site. How could it ever be wrong to use a valid HTML construct the way it was supposed to be used? The search engines have certainly changed their minds before. Make your site all that it can be by using HTML as it was intended.
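To make the "use HTML as intended" point concrete, here is a hypothetical page outline (the topic and headings are invented for illustration): one H1 naming the page's subject, with H2s for its subtopics. Nothing exotic, just valid markup doing the job it was designed for.

```html
<!-- One H1 stating the page topic; H2s for subtopics. -->
<!-- Valid, semantic HTML used the way it was supposed to be used. -->
<h1>Fresh Roasted Coffee Beans</h1>
<h2>How We Roast</h2>
<h2>Choosing a Grind</h2>
```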
As Vanessa Fox pointed out in her response to the article above, XML Sitemaps serve an important and necessary function. An XML file does not promote PageRank transfer since the search engine bot will not be brought to pages through a link; a crawlable site map is much better suited for this purpose. However, an XML file often causes pages to be added to the index, which is an important objective of search engine optimization.
There’s something to be said here about the difference between optimizing a small site versus a large site. While the methodology is much the same, there are special considerations that large sites have — a prominent one being the challenge of getting pages found and indexed by search engines. In these situations, nothing is as helpful as an XML Sitemap.
Also, consider the use of an XML Sitemap when redirecting one site to another. A 301 is great, but the old page needs to be spidered for the 301 to be found and for the new target page to acquire the PageRank. Submitting the old site's XML Sitemap helps those 301s get found faster. That's worth something to me!
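For readers who haven't built one, a minimal XML Sitemap looks like the sketch below (the domain and URL are hypothetical). In the redirect scenario just described, you would keep the old URLs listed so the bots recrawl them and discover the 301s.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- An old URL kept in the Sitemap so the bot revisits it
       and follows the 301 to the new page. -->
  <url>
    <loc>http://www.example.com/old-page.html</loc>
    <lastmod>2009-06-01</lastmod>
  </url>
</urlset>
```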
Keywords in URLs and Image Links
In SEO, nothing is ever black and white. Keyword-rich URLs can help ranking goals, but so can domain age, and rewriting established URLs can trade one benefit for the other. These two factors must be weighed and balanced before making any changes to a site. However, it would be misleading to say that keyword-rich URLs serve no purpose.
First of all, SEOs have a responsibility to make the best Web site possible, not just for search engines. Keyword-rich URLs stand out as relevant to human users scanning search results, thus serving a purpose for human users. Furthermore, when a link uses a URL as anchor text, any keywords included in that URL will add increased relevancy as a link value signal to the search engine. Keyword-rich URLs add value to a site in more than one way.
As for the point about image links: they are followed, but our research suggests that an image's ALT text may not be credited as anchor text the way linked text would be. This topic requires a much longer explanation than this post allows.
As the body copy makes up the majority of a page’s crawlable content, it makes sense that it would be an important consideration of the search engine optimization process. The inclusion of targeted keywords throughout the body copy is a signal of relevance for search engine rankings.
In the past, keyword density was used as a hard-line way of specifying keyword use objectives for a page. Today, keyword density is only valuable when characterizing populations of pages and determining the target for a specific phrase.
There is no magic density, but there are winning page footprints. Through keyword distribution analysis — which uses density as just one of many considerations — an SEO can compare their site's page with a competitor's page and understand how keyword usage on a page relates to rankings. Few people use keyword distribution and density metrics correctly or understand how to apply them; density alone is nearly useless, but it does have its place.
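As a minimal sketch of the density piece of that analysis (the sample page text and phrase are invented, and real keyword distribution tools weigh many more signals), density is just phrase occurrences over total word count:

```python
import re

def keyword_density(text, phrase):
    """Return the density of `phrase` as a percentage of total words.

    Density = (occurrences of the phrase) / (total word count) * 100.
    This is one simple metric among many, as the post notes.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the phrase appears as a word sequence.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits / len(words)

# Hypothetical page copy: 10 words, 2 occurrences of the phrase.
page = "Fresh roasted coffee beans. Our coffee beans are roasted daily."
print(keyword_density(page, "coffee beans"))  # → 20.0
```

Comparing this number across your page and a competitor's page is where it starts to mean something; in isolation it tells you almost nothing.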
Linking Structure and Site Architecture
The proper use of the nofollow attribute has been discussed at great length recently. Overall, nofollow links are a waste… unless you know when to use them. Selective nofollow use is reasonable when it manages the dilution of theme-architected content; this is a technique used in the practice of siloing a site.
I think this is a nice opportunity to address arguments that theming a site is a waste. We have documented cases of tripling traffic within hours of uploading siloed, or themed, sites that are properly architected for SEO.
According to Google’s head of Web spam, Matt Cutts: “Spend your time on making good site architecture so PageRank just flows wherever you want.”
In our experience, using selective nofollow links has been shown to help support a theme-architected site. Removing nofollow tags where the targeted page is link poor has also been known to help.
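As a hypothetical sketch of selective nofollow inside a silo (the page names are invented): in-theme links are left followed so PageRank flows within the silo, while an off-theme utility link is nofollowed.

```html
<!-- Navigation on a hypothetical coffee-silo page:
     in-theme links followed, off-theme utility link nofollowed
     to manage dilution of the theme. -->
<a href="/coffee/espresso.html">Espresso Guide</a>
<a href="/coffee/french-press.html">French Press Guide</a>
<a href="/about/privacy.html" rel="nofollow">Privacy Policy</a>
```

Whether to nofollow any given link depends on the site's architecture and the target page's link strength, as noted above; this is a judgment call, not a rule.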
As always, there is no easy rule to follow to get successful SEO results. The best results require a skilled professional with background, experience and a strong ability to analyze a site’s strengths and weaknesses.
Unfortunately, many SEOs do not understand how to perform the duties associated with the job. This young industry is gaining mainstream attention, attracting a wave of inexperienced newcomers to the profession. Many are junior webmasters who claim to be SEOs, and clients simply do not know to challenge those claims.
Some of them go to forums, read posts that are ancient, and then take it as gospel. They seldom run their own experiments, they do not have firsthand knowledge gained through observing cause and effect, and they often regurgitate the “knowledge” gained by playing telephone.
As more and more new blood enters the SEO arena, the availability of high-quality education will become increasingly important. Do-it-yourself learning plays a critical role in the development and advancement of SEO methodology, but as with anything available on the Internet, credibility and reliability can be suspect. Knowledge transfer and sharing by those with trustworthy history and experience in the industry will contribute much to the ethical and educated growth of Internet marketing.
The uneducated, those that make a living guessing and doing only part of the job, and the few who teach but cannot do — these are the people aiming to knock down SEOs who succeed. Fear, uncertainty and doubt are great tools for the snake oil SEO. But trust that SEO does work; if it did not, the snake oil sellers would have nothing real to imitate and no one to fool.
Search engine optimization is only a boondoggle and a waste if done by someone lacking SEO skills.
The Big Picture
In a very competitive space, every little thing can make a difference. There is no HTML element too small or tactic too insignificant when the strength of SEO comes down to the combined whole.
I often say that optimization is about making a site the least imperfect, a strategy which must take into account every last bit of a site. Why object to on-page code optimization when doing it right can’t hurt you?
Education and experimentation will prove the most reliable road to SEO success. And you can’t seek to understand SEO only as it is today — you must also try to predict the SEO of the future. Every week SEO is a new industry. In the end it comes down to this: SEO is not the blind application of hard-and-fast rules, but rather SEO aims to make every page the best it can be.
14 Replies to “SEO is in the Details: Bruce Busts the Boondoggle”
SEO can be tedious and boring work, but it plays a very important role for any newly launched business.
A very thoughtful, well written post, and like everyone else who has commented here, I enjoyed it very much.
I think it’s important to note that the Boondoggle article, which I have been following closely since it was published, is essentially making the same point you are: there is no one solution or magic trick to SEO.
That post’s author was just being a little more controversial with her rhetoric, and it turned out to be an effective way to generate debate and probably traffic to her website/blog.
Bruce, you said everything so well and you clearly and simply explained the differences between those that really know and those that offer it, but have no clue. I wish I could put this post in my local paper. Great post!
Finally someone with authority speaks up and gives a logical and rational viewpoint. Thanks Bruce.
Great comment Alex.
Great post. I have been following you (as well as other credible SEO people) as a valuable source of SEO information since the late '90s.
SEO in my view is common sense: thinking like a customer, avoiding cheating (staying white hat), and knowing what works and what doesn't by experimenting and listening to experts such as yourself.
Well put! I'd have to say that most SEOs are overcharging for their services compared with the true benefit derived, but it certainly is true that much of SEO is client education and getting the client to agree to make the recommended changes. Easier said than done.
Great post Bruce!
Having been through your SEO and SEOToolSet training courses it becomes clear that, as you mentioned in the post, it’s the accumulation of many small correctly produced elements that creates the whole.
Thanks for the reminder that it’s the attention to the small details and expert body of knowledge (based on quality training) that produces good SEO results.
I've been watching your site for the last ten years Bruce and I couldn't agree more with this post. Build a great site, from top to bottom, and people will come. Try to approach SEO with a get-traffic-quick mentality and you're not going to do anyone any good.
Awesome, awesome article.
As I was reading Jill’s article, I felt like I was reading old news. I agree with some of the stuff Jill mentioned in her article, but it was obvious stuff that any good SEO wouldn’t do.
This article on the other hand explains how and why the little things DO count. Small things don’t always take that long to implement and I think any little bit helps as well.
Great post. One of things that I’ve always stressed to those I’ve worked with is that SEO is not a single activity, but a thousand different details that need to be continually addressed for success in the search engines.
All things being equal, will submission of an XML sitemap bring me to the top of the engines? A link from a relevant PR7 site? Modification of an img alt attribute? Removal of twenty lines of code on a page to reduce bloat? Microscopic manipulation of a title tag? Singly, no. In conjunction with one another, yes.
For those practitioners that claim so-and-so technique “doesn’t matter,” I don’t think anybody in the biz has either enough insider knowledge of search engine algorithms or the wherewithal to conduct sufficient testing to say that for sure in regard to any single tactic (in the case of testing, not necessarily for lack of expertise, but because of inherent problems with testing for optimization). So the “all things being equal” law applies. E.g., is a div rather than table structure better for rankings? All things being equal, it may help, and since it’s good web architecture anyway, why not use it?
I had a discussion on whether or not to bother with the keywords tag. I continue to do so as a best practice, though some in the discussion considered it a waste of time.
This sums up my feelings exactly! “Rather than trying to predict how important each HTML tag is to search engine rankings, I find more consistent results come from using HTML as intended.”
Well said, Bruce. Your point about designing for the user is particularly important – as Google et al. have improved their ranking algorithms over the years, SEO and user-centric design are no longer mutually exclusive, and indeed are often the same.
A lovely article thanks Bruce. Quite contrary to that Boondoggle article on almost every point.
This part summed it up:
“The uneducated, those that make a living guessing and doing only part of the job, and the few who teach but cannot do — these are the people aiming to knock down SEOs who succeed.”
“SEO aims to make every page the best it can be.”
On that, we agree! Actually most of what you said was covered in the caveats of the articles on this subject.