11 Technical SEO Elements to Help Your Site Win a SERP Rank Gold Medal

SEO is an extremely competitive sport. So competitive, in fact, that sometimes competing for a page one ranking in the SERPs can feel a lot like competing in the Olympics, with the coveted number one SERP spot shimmering as a distant dream like one of Michael Phelps’s 18 gold medals.

Photo by tableatny (CC BY 2.0)

Like the sizable number of factors that contribute to whether an athlete is able to snag the gold for her home country, there’s a corresponding list of ranking factors that contribute to whether a web page is able to beat out the competition and seize the golden first SERP spot.

At the top of both lists, and not to be overlooked, is qualifying to compete. For athletes this means training and paperwork; for optimizers it means technical SEO. In both cases, you can’t just show up: a series of requirements must be met before you’re eligible to stand at the starting line.

In other words, just like a swimmer can’t possibly win a race if they never qualify to compete, a web page can’t possibly beat out the competition and win a number one SERP position if crawl, server or indexing errors prevent it from being discovered, cached or recalled.

Technical SEO is all about making sure your website is eligible to take that first step off the starting line.

Because I know your ranking reports deserve to be filled with high-quality content decorated with bushels of number one SERP rankings (metaphorical gold medals), I’ve compiled this 11-point technical SEO guide to help you make sure your content gets to the game on time.

Please use this guide as a technical SEO primer, and then feel free to join me for a discussion of the topic on Google+, or – if you’re ready to make a big step toward being a SERP gold medalist – consider joining Bruce Clay in March 2014 for SEOToolSet Training or an SMX SEO Workshop. Both are comprehensive ramp-up training sessions with ample one-on-one time.

11 Technical SEO Elements That Help Your Site Rank

1. Create an HTML Sitemap

An HTML sitemap is a regular page on your website that contains a collection of links intended to help both humans and search spiders navigate your site. Since web crawlers use links to navigate from one page to another, having an HTML sitemap in the footer of every page of your website allows the search spider to enter your site at any page and then, from that page, systematically discover a significant portion of your other pages quickly via the sitemap. Human users also reference the HTML sitemap and use it to navigate your site, so human-friendly presentation and organization is recommended.

Learn what Matt Cutts has to say about the value of HTML sitemaps.
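To make this concrete, here’s a minimal sketch of what an HTML sitemap page might look like. The URLs and labels below are hypothetical, and your own sitemap should mirror your actual site structure:

```html
<!-- A minimal HTML sitemap page; URLs and labels are hypothetical -->
<h1>Site Map</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo/">SEO Services</a></li>
      <li><a href="/services/training/">SEO Training</a></li>
    </ul>
  </li>
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```

Nesting the lists to match your site hierarchy helps both spiders and humans see how your pages relate.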

2. Create an XML Sitemap

An XML Sitemap lists all of the pages on your website that you want a search spider to crawl and index. A Sitemap is only for search spiders, so it doesn’t have to be pretty; Google even accepts a plain text file with one URL per line, though the XML format lets you attach helpful metadata such as last-modified dates. To help ensure that all the important pages on your site get crawled and indexed, it’s important that you keep your Sitemap up to date. While an XML Sitemap doesn’t guarantee that all your pages will be crawled or indexed, it definitely can help.

Learn how to build and submit an XML Sitemap.
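For reference, a bare-bones XML Sitemap follows the sitemaps.org protocol and looks like this (the URLs are hypothetical; `lastmod`, `changefreq` and `priority` are optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services/seo/</loc>
  </url>
</urlset>
```

Save it as `sitemap.xml` at your site root and submit it through Webmaster Tools.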

3. Keep Code Clean and Make JavaScript and CSS External

Search spiders only spend a limited amount of time crawling your web pages, so you don’t want to waste that time having the spider crawl hundreds of lines of useless clutter code. To make your website’s underlying code more spider-friendly consider minimizing inline markup, putting JavaScript code in an external .js file, and externalizing design-oriented CSS.
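In practice, externalizing scripts and styles is as simple as swapping inline blocks for file references in your page head. The file names below are placeholders for your own assets:

```html
<!-- Instead of hundreds of lines of inline <style> and <script> clutter, -->
<!-- reference external files so spiders get straight to your content: -->
<head>
  <link rel="stylesheet" href="/css/styles.css">
  <script src="/js/main.js"></script>
</head>
```

As a bonus, external files can be cached by the browser, so repeat page loads get faster too.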

4. Make Your Site Speedy

Since spring 2010, Google has used site speed as a confirmed ranking factor. Google loves speed; Google Senior Vice President Amit Singhal has said it himself many times. One way to make your website faster is to clean up your code, since less code means smaller file sizes and faster load times.

Learn more about how to optimize your website’s speed, or analyze your site’s speed with the Google PageSpeed Insights tool.
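Another quick speed win is compressing text resources before they leave your server. As one illustration (assuming an Apache server with mod_deflate enabled; check your own stack’s equivalent), a few lines in `.htaccess` can gzip your HTML, CSS and JavaScript:

```apacheconf
# Compress text resources on the fly (Apache with mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Smaller transfers mean faster load times, which both users and Google appreciate.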

5. Include a Robots.txt File

A Robots.txt file is a publicly accessible text file that gives crawling directives to search spiders. It is placed at the root of a website host, and is commonly used to stop search spiders from crawling specific directories and designated files. It’s important that this file exists, even if it’s empty. Approach your Robots.txt file with caution and make sure you don’t accidentally exclude any important files!

Learn more about the Robots.txt file and how to use it.
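A typical Robots.txt is only a few lines long. The directories below are hypothetical examples of the sort of thing you might block, and the optional `Sitemap` line points spiders at your XML Sitemap:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Sitemap: http://www.example.com/sitemap.xml
```

Remember this file is public, so don’t use it to “hide” anything sensitive; it only asks well-behaved crawlers to stay out.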

6. Be Thoughtful About Your Internal Linking Structure

Implementing a website siloing strategy can help search spiders more easily understand the theme of your content and its perceived relevance in relation to keyword phrases.

Learn more about website siloing for SEO and the importance of site structure in the absence of keyword data.
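To picture what siloing looks like in a URL structure, consider this hypothetical layout, where each directory groups tightly themed pages under a keyword-focused landing page:

```
example.com/bicycles/                 <- silo landing page
example.com/bicycles/road-bikes/      <- supporting page, links within silo
example.com/bicycles/mountain-bikes/  <- supporting page, links within silo
example.com/helmets/                  <- separate silo, separate theme
```

Internal links should mostly stay within a silo, reinforcing each theme rather than diluting it across unrelated pages.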

7. Check Your Server Configuration for Errors

Search engines may reduce the rankings of a website if search spiders encounter web server errors. In severe cases, server errors can cause web pages to be dropped from the index altogether. In less severe cases, they can negatively affect PageRank, since spiders are always looking for the “least imperfect” option and are likely to rank a cleaner, error-free site above a site laden with server errors. To aid your content’s rankability, make sure to regularly check your server for errors that need to be resolved.

Learn how to detect and resolve server issues or check your server’s response with the free Bruce Clay, Inc. Check Server tool.
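When you’re reviewing server logs or crawl reports, the main thing to watch for is the status-code family: 2xx is healthy, 3xx should be audited for redirect chains, 4xx wastes crawl budget, and 5xx is the kind of error that can hurt rankings. A small sketch in Python (the category labels are illustrative, not from any particular tool):

```python
# Sketch: bucket HTTP status codes the way a crawl-health report might.
# The category names are illustrative assumptions, not a standard.

def classify_status(code):
    """Return a rough crawl-health category for an HTTP status code."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"      # fine, but audit chains and loops
    if 400 <= code < 500:
        return "client error"  # e.g. 404s wasting crawl budget
    if 500 <= code < 600:
        return "server error"  # the kind that can get pages dropped
    return "unknown"

# Example: status codes pulled from server logs or a site crawler
statuses = [200, 301, 404, 500, 503]
report = {code: classify_status(code) for code in statuses}
print(report)
```

Anything landing in the “server error” bucket deserves immediate attention before it costs you indexed pages.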

8. Avoid Flash and Text Contained in Images

An old lesson that still remains valuable: Search spiders can’t “see” Flash content or text contained in images, so don’t use them to convey important information! Instead, use HTML text and alt attributes to make your content crawlable.
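For example (the file name and wording here are hypothetical), a promotional banner image should at minimum carry descriptive alt text, and better yet the message should also live in real HTML text:

```html
<!-- Text trapped inside an image is invisible to spiders;
     the alt attribute makes it crawlable -->
<img src="/images/sale-banner.png"
     alt="Spring sale: 20% off all SEO training">

<!-- Better still: put the message in real HTML text -->
<h2>Spring sale: 20% off all SEO training</h2>
```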

9. Use the Canonical Tag to Make Sure Dynamic URLs Aren’t Creating Duplicate Content

Google can see and index dynamic URLs, like those that contain session IDs, but there is a chance the search engine will crawl and attempt to index each of your dynamic URLs as a unique page – which, if not prevented, could trigger a Panda penalty for duplicate content. To prevent this, make sure you use the canonical tag and Webmaster Tools to indicate the primary page you want Google to return in search results, and to tell Google to ignore the other dynamic versions of your page URL. Google calls this “setting your preferred domain.”

Read what Google has to say about canonical URL optimization.
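The canonical tag itself is a single line in the `<head>` of each dynamic variant. For instance (hypothetical URL), every version of a product page reached with session or sorting parameters would point back to the clean URL:

```html
<!-- Placed in the <head> of dynamic variants like
     /shoes?sessionid=123&sort=price -->
<link rel="canonical" href="http://www.example.com/shoes">
```

This consolidates the duplicate URLs’ signals onto the one page you actually want ranked.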

10. Make Sure Your Site is Optimized for Mobile

User experience is the number one priority of Google, and the search engine has been very open about their preference for responsive websites that seamlessly adapt or respond to multiple devices.

With that in mind, since Google sees not having a mobile-optimized website as a major user experience flaw – and it is always looking to rank the “least imperfect” websites in top SERP positions – it’s safe to deduce that having a website optimized for mobile is essential to seeing improved rankings.

Google has several resources to help you improve your mobile optimization including this YouTube video explaining how to improve mobile pages, a Webmaster Tools checklist for mobile website improvement and recommendations for building smartphone-optimized websites.
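At its simplest, a responsive setup combines a viewport meta tag with CSS media queries. The class name and breakpoint below are hypothetical placeholders:

```html
<!-- Tell mobile browsers to use the device width, not a desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* On narrow screens, stack the sidebar under the content */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

Because the same URL serves every device, responsive design also avoids splitting your link equity across separate mobile URLs.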

11. Consider Using Schema Markup

Disclaimer: This recommendation is based on predictive intuition, not actual ranking-factor facts. Last year Matt Cutts publicly stated flat out that Schema markup is not currently a ranking factor. In other words, Schema markup makes SERP listings more prominent – which can undoubtedly increase CTR – but the addition of Schema markup does not send any signals to Google that help a web page rank any higher.

That said, here is why I am going out on a limb to suggest making Schema one of your technical optimization priorities for 2014:

We are in the era of the semantic web, where Google is hungry for context and for the ability to deliver page one results that answer queries rather than repeating them back to searchers. Schema markup gives Google additional, crawlable information about the contents of web pages, as well as advanced information about a page’s theme and contextual purpose (consider, for instance, product/offer Schema markup). So, in my speculative opinion, Schema markup may be able to help Google further determine a web page’s relevance in relation to a search query – which could also help Google see your content as “less imperfect” than a competitor’s. Why wouldn’t Google take into consideration all the available crawlable clues? If they aren’t already using Schema as a secret ranking factor, I see a good chance they will be in the future. (And even if they don’t, I consider implementing Schema markup a no-lose SEO strategy, since Schema is indisputably an incredible click-through driver.)

Learn more about Schema or watch Matt Cutts’s Google Webmaster Help video that discusses Schema as a theoretical ranking algorithm.
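As a taste of the product/offer markup mentioned above, here’s a small schema.org microdata sketch; the product name and price are hypothetical:

```html
<!-- Product/offer markup with schema.org microdata; values are hypothetical -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Road Bike 3000</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="899.00">$899.00</span>
    <meta itemprop="priceCurrency" content="USD">
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>
```

You can check your markup with Google’s structured data testing tool before pushing it live.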

Technical SEO is Hard…

Like winning a gold medal, earning a top spot in the SERPs is hard – and so is technical SEO. If you’re new to the left-brain side of optimization and feel lost in the abyss of robots, siloing, canonicals, and Sitemaps, don’t feel discouraged; even Michael Phelps had to start somewhere!


Chelsea Adams Brooks is a long-distance cyclist, aspiring cob house builder, schema/analytics/algorithm obsessor, and a former senior content writer at Bruce Clay, Inc. Go to Chelsea's author profile to read more of her articles.

9 Replies to “11 Technical SEO Elements to Help Your Site Win a SERP Rank Gold Medal”

I’ve heard a lot about structured data, but I still don’t understand how to implement it on a website.

I agree with tip #11 – structured data will most assuredly become more and more important to Google as we move forward, and sites that have it properly implemented will be rewarded by the search engines… in time :-)

Chelsea Adams

Yes! Thanks for chiming in, Andy. Always good to hear some community chime-in one way or another when you throw out a recommendation based on “predictive intuition.” 2014 (and the future) is all about more information, more data, and the sunset of the anonymous web. Schema tells Google more about your site, your content, and your authors — this can only help them in their quest to rank the most relevant high-quality web pages highest (and your quest as an SEO to make your pages more and more “less imperfect” than your competitor’s).

But, like you said, it’s all “…in time.”

To keep the conversation going, join me on Google+ if you’re a G-pluser: https://plus.google.com/u/0/+ChelseaAdamsWrites (Or you can also join all of the Bruce Clay, Inc. team over at https://plus.google.com/+Bruceclayinc)

If a site starts with the right on-page optimization, loads quickly and validates, it has all the preconditions to rank well. But if there is no quality content or links, it won’t retain visitors, and the search engines won’t like it.

You have made some excellent points here. The ones that stick out the most for me are making your site speedy and making it optimised for mobile. These two things are probably the most important for customers, because this is what happens first. If your site doesn’t load quickly enough, it won’t matter if everything else is perfect, because they won’t stick around to wait.

Chelsea Adams

Absolutely, Jayne. I think optimizers sometimes forget that rank and CTR don’t matter if clicks never turn to conversion because people hate your website — or worse yet, can’t even navigate through it via Mobile. Speed matters! The Internet has made us an incredibly impatient society, and if I have to wait more than two seconds for your page to load, I’m gone. I know there is another option just one click away and I simply don’t have time to watch the pinwheel spin on my computer while your graphics load…

Bruce, good article. It’s refreshing, and all the information is still current and still working :)

Chelsea great post!
Thanks – I think you have covered all the technical angles in these 11 points. I have also shared this story on Inbound and tweeted it!
Keep up the great work!

Chelsea Adams

Thanks for reading, Zeshan — and extra thanks for sharing!


