BACK TO BASICS: The Fundamentals of Search Engine Optimization

by Susan Esparza, June 30, 2007

Most search engine optimization professionals focus on writing about
the very latest thing, the bleeding edge of optimization. They’re
looking for the thing that’s going to put their client at number one.
What’s the problem with that?

The trouble is that most sites aren’t
fighting for number one. The vast majority of Web sites are looking
just to get in the game. They’re not training to be a gold medalist;
they’re just trying to get into shape. The objective of the Back to
Basics series is to look at the ways in which site owners can begin their
quest for rankings.

The tendency of SEO practitioners is to develop and discuss advanced
techniques for improving positioning. For a site owner new to search
marketing, this can look very much like the experts are advocating these
advanced techniques as surefire ways to ranking success. A beginner
learns about social media, linkbait and media optimization and
concentrates their efforts on those, neglecting the so-called ‘old
school’ SEO techniques. This leads to frustration as their efforts
require a great deal of work but yield very little in the way of results.

The trouble with this approach is that when you try to leap into
advanced techniques without doing the basics it is very often like
trying to build a castle on air. The foundations of your search engine
optimization campaign–a well-built site on a stable, optimized
server–have never been put into place and thus cannot support the
weight of whatever other efforts you’re engaged in. If you don’t get the
basics right in navigation, structure and content in the first place
(see how to do Search Engine Optimization for how to prepare for expert SEO),
you’re going to have a very hard time getting anywhere.

While it is not impossible to rank for a term with a site that has
zero optimization (for example, a site built entirely in a single Flash
file), it will require exponentially more work. Instead of handicapping
your campaign with a site that just doesn’t work, take the time to
establish best practices. While many of these techniques are old and do
not contribute much individually to keyword rankings, overall the
combination will yield greater return on efforts than neglecting them in
favor of newer, more advanced techniques.

Optimizing the Head Section

In the old days, all that it took to get ranked in the search engines
was five minutes of tweaking the Head section by adding keywords to the
Meta tags. You would submit the page and shortly thereafter be number
one for your query. Obviously this is no longer the case, but that does
not mean that optimization of the Head section of a page is a worthless
endeavor. Optimizing the Head section of a Web page doesn’t take much
time but the investment of those minutes per page is training your site
for the big show.

The first step when creating a new site or re-engineering an older
site for search engine ranking success is to take a look at the Head
section of each page on the site. Are your Title, Meta Description,
Meta Keywords and Meta Robots tags in place with unique, relevant
information that is targeted to the topic of the page as well as the
theme of the silo? Is the code clean and understandable, without spider
traps or overly long JavaScript or CSS code?
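The checklist above can be automated. Here is a minimal sketch, using only Python’s standard library, of a Head-section audit over a page’s HTML. The `audit` function and the specific checks are illustrative assumptions for this article, not a standard tool.

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collects the title text and meta name/content pairs from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.metas = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.metas[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of Head-section problems found in one page."""
    parser = HeadAudit()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("missing or empty <title>")
    if not parser.metas.get("description", "").strip():
        problems.append("missing meta description")
    if not parser.metas.get("keywords", "").strip():
        problems.append("missing meta keywords")
    return problems
```

Run over every page of a site, a report like this also makes duplicate Titles and Descriptions easy to spot, which is the other half of the “unique, relevant information” requirement.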

Search engine spiders read every page, top to bottom, parsing the
code to see what the page really looks like. They seek relevancy based
on the type of criteria they can understand–text-based content. The
first words they encounter are in the Title tag. It is intuitive that
these words should therefore set the theme of the Web site and give the
search engine an idea of what sort of content the site will provide.

To be clear, you will never rank for a competitive term on the basis
of your Title tag alone, nor on your Description or Keywords tag. But
repeating key phrases in these tags, without spamming them, begins to
form the basis for further ranking efforts.

Creating Worthy Content

It is an old and well-accepted cliché that “Content is king”.
Content is also time-consuming, resource-intensive and utterly vital to
the success of a Web site in the search engines. Without keyword-rich
text content, search engine spiders have to rely only on secondary
sources of data to discover what a Web site is about. They look at
anchor text (both internal and inbound), read alt attributes and try to
determine a theme through guesswork, but none of these factors will
substitute for unique supporting content.

Develop a strategy for adding at least 250 words, preferably more, to
each page. Setting a schedule and producing X number of pages every
week will eventually build up a site that can serve as a subject matter
expert in the areas that are important to your business. Focus not on
gimmick pages, linkbait Top 10 lists and other flash-in-the-pan
strategies but on developing content that will satisfy a researcher and
convert them into a buyer.
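As a rough way to enforce the 250-word guideline above, a script can strip the markup and count what remains. This is a sketch under the assumption that pages are available as HTML strings; the crude tag-stripping is for illustration only, not a substitute for real HTML parsing.

```python
import re

def body_word_count(html):
    """Count words left after removing script/style blocks and tags."""
    text = re.sub(r"(?s)<(script|style)[^>]*>.*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", text)  # drop remaining tags
    return len(text.split())

def needs_more_content(html, minimum=250):
    """True if the page falls short of the word-count guideline."""
    return body_word_count(html) < minimum
```

A weekly run of a check like this keeps the content schedule honest by listing exactly which pages are still thin.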


The best and easiest way to display mastery over a long tail of
keywords is to structurally build your site so that generic keywords are
supported by more specific keywords. Sites with a clear directory
structure, siloed content and easy-to-follow navigation are at an
advantage over sites without these foundational elements.

Construct information silos on your key topics with each silo’s
landing page supported by at least five sub-pages of information rich
content. Search engines are research-oriented; if you present them with
information in a way that they can understand, they will be better able
to determine what your site is about and will find it relevant for that
many more queries. Failing to organize your site will lead to a
muddled theme that generally will have a hard time ranking for anything
but the most generic keywords.
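The five-sub-page rule above can be sketched as a quick check over a site’s URL paths. The /silo/sub-page layout assumed here is one common way to express silos in a directory structure; adjust to however your site is organized.

```python
from collections import defaultdict

def silo_report(paths, minimum=5):
    """Group /silo/sub-page paths under their silo and flag thin silos.

    Returns {silo_name: True/False}, True when the silo has at least
    `minimum` supporting sub-pages.
    """
    silos = defaultdict(list)
    for path in paths:
        parts = [p for p in path.strip("/").split("/") if p]
        if len(parts) >= 2:  # only paths of the form /silo/sub-page
            silos[parts[0]].append(path)
    return {silo: len(pages) >= minimum for silo, pages in silos.items()}
```

Silos flagged False are the ones whose landing pages need more supporting content before they can carry a theme.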

Server Stability

A key and yet often overlooked point of failure for a Web site is the
server environment on which it resides. A slow server or a server that
often fails can prevent a site from ever being indexed in the first
place or cause it to drop out of the index if it is present. Sites that
frustrate a user by not being available are not the type of sites that
any search engine wants to present in their results pages.

Google’s Webmaster Console provides information in the event that Google
is having trouble spidering your site. Fixing these issues is
imperative to developing and maintaining search engine rankings, even if
server issues do not feature at all among the roughly 200 ranking signals
in Google’s algorithm.
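In the same spirit, a simple monitoring script can flag the failures described above before a spider does. The status and latency thresholds here are illustrative assumptions, not documented search engine limits; in practice you would feed this function from `urllib.request` and `time.monotonic()`.

```python
def server_health(status_code, response_seconds, max_seconds=2.0):
    """Classify one page fetch: errors and slow responses hurt crawling."""
    if status_code == 0 or status_code >= 500:
        return "failing"   # connection failure or server error
    if response_seconds > max_seconds:
        return "slow"      # spiders may give up on slow responses
    return "ok"
```

Anything other than "ok", seen repeatedly, is worth raising with your host before it shows up as missing pages in the index.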

None of the issues covered in this article will shoot a site to
number one. There are no quick fixes presented here. However, doing
these things in the beginning of your SEO project will get your Web site
into shape in order to compete for top rankings. Once you’ve developed
a strong site, you can get into the advanced techniques that will take
you to number one.

For permission to reprint or reuse any materials, please contact us. To learn more about our authors, please visit the Bruce Clay Authors page. Copyright 2007 Bruce Clay, Inc.

Serving North America based in the Los Angeles Metropolitan Area
Bruce Clay, Inc. | 2245 First St., Suite 101 | Simi Valley, CA 93065
Voice: 1-805-517-1900 | Toll Free: 1-866-517-1900 | Fax: 1-805-517-1919