Step 16: Technical SEO Tips
Wow, we sure have covered a lot of ground so far! Up to this point, the SEO tutorial has primarily focused on developing quality content that earns links naturally, and optimizing that content for search. Now, we’re going to shift gears and give you technical SEO tips on various issues that can each be critical to ranking success. Without keeping an eye on a few technical things, you could watch the hard work you put into optimizing your website go to waste — like a leak that ends up sinking an otherwise seaworthy vessel.
The search engines must be able to find, crawl and index your website properly. In this lesson, we've assembled a list of technical topics you need to be aware of, with tips to avoid mistakes that could sink your SEO ship:
- Cloaking
- Redirects
- Duplicate content
- Custom 404 error pages
- Plagiarism (content theft)
- Site performance
- robots.txt
- Hacked content and user-generated spam
- Structured data
Checking Your Instrument Panel
Before we cast off and start talking technical, let's make sure your instruments are working. To do SEO well, you must have analytics installed on your website. Analytics data is the driving force of online marketing, helping you better understand how users are interacting with your site.
We recommend you install free analytics software: Google Analytics, and possibly Bing Webmaster Tools or a third-party tool as well. Set up goals in your analytics account to track activities that count as conversions on your site. Your analytics instrument panels will show you: which pages are visited most; what types of people come to the site; where visitors come from; how traffic changes over time; and much more. Getting analytics set up is one of the most important technical SEO tips, since it will help you keep your search engine optimization on course.
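As a rough illustration, here is the kind of tracking snippet Google Analytics asks you to paste into every page, typically inside the <head> tag. This is only a sketch: GA_MEASUREMENT_ID is a placeholder you replace with the ID of your own Analytics property, and the exact snippet Google gives you may differ.

```html
<!-- Google Analytics tracking snippet (sketch): paste into every page's <head> -->
<!-- GA_MEASUREMENT_ID is a placeholder; use the ID from your own Analytics property -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){ dataLayer.push(arguments); }
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>
```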
Casting Off ... Technical Issues to Watch Out for
1. Avoid Cloaking
Keep your site free from cloaking (i.e., showing one version of a page to users, but a different version to search engines). Search engines want to see exactly what users see, and they treat any discrepancy with suspicion. Avoid hidden text, hidden links and cloaking of any kind; these deceptive web practices frequently result in penalties.
You can check your site for cloaking issues using our free SEO Cloaking Checker tool. We suggest you run your URL through it on a regular basis, such as monthly (so bookmark this page).
2. Use Redirects Properly
When you need to move a web page to a different URL permanently, make sure you're using the right type of redirect, and that you're also redirecting users to the most appropriate page. As a technical SEO tip, we recommend using a 301 (permanent) redirect for any permanent move. A 301 tells the search engine to drop the old URL from its index and replace it with the new one. Search engines transfer most of the link equity from the old page to the new one, so you shouldn't suffer a loss in rankings.
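Here is a minimal sketch of what a 301 redirect can look like on an Apache server, in the site's .htaccess file. The paths and example.com URLs are placeholders, and the second variant assumes the mod_rewrite module is enabled; your own hosting setup may call for a different method.

```apache
# .htaccess sketch: 301 (permanent) redirects on an Apache server
# Redirect a single moved page to its new URL (placeholder paths)
Redirect 301 /old-page.html https://www.example.com/new-page/

# Or, with mod_rewrite enabled, redirect an entire renamed directory
RewriteEngine On
RewriteRule ^old-directory/(.*)$ https://www.example.com/new-directory/$1 [R=301,L]
```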
Read more: How to Properly Implement a 301 Redirect
3. Prevent Duplicate Content
You need to fix and prevent duplicate content issues within your site. Search engines get confused about which version of a page to index and rank if the same content appears on multiple pages. Ideally, you should only have one URL for one piece of content. When you have duplicated pages, search engines pick the version they think is best and filter out all the rest. You lose out on having more of your content ranked, and also risk having "thin or duplicated" content, something Google's Panda algorithm penalizes. (See Step 14 in this tutorial for more detail on penalties.)
If your duplicate content is internal, such as multiple URLs leading to the same content, then you can decide for the search engines by deleting the duplicate page and 301-redirecting its URL to the original page. Alternatively, you can use a canonical link element (commonly referred to as a canonical tag) to communicate which is the primary URL. Either solution should be used with care.
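For illustration, here is what a canonical link element looks like in a page's HTML; the example.com URL is a placeholder for your own preferred version of the page.

```html
<!-- Canonical tag placed in the <head> of every duplicate or variant page -->
<!-- It points search engines at the one URL you want indexed and ranked -->
<head>
  <link rel="canonical" href="https://www.example.com/original-page/">
</head>
```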
Read more: Plugging Duplicate Content Holes
4. Create a Custom 404 Error Page
When someone clicks a bad link or types in a wrong address on your website, what experience do they have? Let's find out: Try going to a nonexistent page on your site by typing http://www.[yourdomain].com/bogus into the address bar of your browser. What do you get? If you see the server's ugly, standard 404 "Page Not Found" error message, then this technical SEO tip is for you!
Most website visitors simply click the Back button when they see that standard 404 error, leaving your site forever. Since it's inevitable that mistakes happen, and people will get stuck sometimes, you need a way to help them at their point of need. To keep people from jumping ship, create a custom 404 error page for your website.
First, make the page. A custom 404 page should do more than just say the URL doesn’t exist. While some kind of polite error feedback is necessary, your customized page can also help steer people toward pages they may want with links and other options. Additionally, you want your 404 page to reassure wayward visitors that they’re still on your site, so make the page look just like your other pages (using the same colors, fonts and layout) and offer the same side and top navigation menus. In the body of the 404 page, here are some helpful items you might include:
- Apology for the error
- Home page link
- Links to your most popular or main pages
- Link to view your sitemap
- Site-search box
- Image or other engaging element
Since your 404 page may be accessed from anywhere on your website, be sure to make all links fully qualified (starting with http:// or https://).
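To make this concrete, here is a bare-bones sketch of the body of such a page. The example.com URLs, page names and search form are placeholders; your real page would use your own design, navigation and site-search setup.

```html
<!-- Sketch of a custom 404 page body (all URLs are placeholders) -->
<h1>Sorry, we can't find that page.</h1>
<p>The address may have been mistyped, or the page may have moved.</p>
<ul>
  <li><a href="https://www.example.com/">Home page</a></li>
  <li><a href="https://www.example.com/products/">Our most popular pages</a></li>
  <li><a href="https://www.example.com/sitemap/">View the sitemap</a></li>
</ul>
<!-- A site-search box helps keep stranded visitors on the site -->
<form action="https://www.example.com/search" method="get">
  <input type="text" name="q" placeholder="Search this site">
  <input type="submit" value="Search">
</form>
```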
Next, tell your server. Once you’ve created a helpful, customized error page, the next step is to set up this pretty new page to work as your 404 error message. The setup instructions differ depending on what type of website server you use. For Apache servers, you modify the .htaccess file to specify the page’s location. If your site runs on a Microsoft IIS server, you set up your custom 404 page using the Internet Information Services (IIS) Manager. WordPress sites have yet another procedure.
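On an Apache server, for example, pointing the server at your custom page can be as simple as one line in .htaccess; the /404.html path below is a placeholder for wherever your page actually lives.

```apache
# .htaccess sketch: serve the custom page whenever a URL returns a 404
ErrorDocument 404 /404.html
```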
We should note that some smaller website hosts do not permit custom error 404 pages. But if yours does, it’s worth the effort to create a page you’ve carefully worded and designed to serve your site visitors’ needs. You’ll minimize the number of misdirected travelers who go overboard, and help them remain happily on your site.
Read more: Create useful 404 pages (Google Support article)
5. Watch Out for Plagiarism (There are pirates in these waters ...)
Face it: there are unscrupulous people out there who don't think twice about stealing and republishing your valuable content as their own. These villains can create many duplicates of your web pages that search engines have to sort through. Search engines can usually tell whose version is the original in their index. But if your site is scraped by a prominent site, it could cause your page to be filtered out of search engine results pages (SERPs).
We suggest two methods to detect plagiarism (content theft):
- Exact-match search: Copy a long text snippet from your page and search for it within quotation marks in Google. The results will reveal all web pages indexed with that exact text.
- Copyscape: This free plagiarism detection service can help you identify instances of content theft. Just paste the URL of your original content, and Copyscape will take care of the rest.
Try to remedy the plagiarism issue before it results in having your pages mistakenly filtered out of SERPs as duplicate content. Ask the site owner to remove your stolen content from their website. You could also consider revising your content so that it’s no longer duplicated. (SEO Tip: If you can't locate contact information on a website, look up the domain on Whois.net to find out the registrant's name and contact info.)
Read more: About Scraper Sites
6. Protect Site Performance
How long does it take your website to display a page? Your website’s server speed and page loading time (collectively called "site performance") affect the user experience and impact SEO, as well. It's a site accessibility issue for users and spiders. The longer the web server response time, the longer it takes for your web pages to load. Slow page-loading times can reduce conversion rates (because your site visitors get bored and leave), slow down search engine spiders so less of your site gets indexed, and hurt your rankings.
You need a fast, high-performance server that allows search engine spiders to crawl more pages per visit and that satisfies your human visitors, as well. Web design issues can also sink your site performance, so if page-loading speed is a problem, talk to your webmaster. (SEO tip: Use Google's free tool PageSpeed Insights to analyze a site's performance.)
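As one illustration of server-side tuning, here is a sketch of two common speed-ups on an Apache server: compressing text responses and letting browsers cache static files. It assumes the mod_deflate and mod_expires modules are available, and the file types and cache lifetimes are only examples; check your own hosting setup before applying anything like this.

```apache
# .htaccess sketch: two common Apache speed-ups (assumes mod_deflate and mod_expires)

# Compress text-based responses before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets so repeat page loads are faster
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
```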
SEO TUTORIAL BONUS VIDEO
In this bonus video, Matt Cutts reveals that page speed can be a factor in Google's algorithm, particularly as a tie-breaker between otherwise equal web or mobile results.
In our SEO consulting and testing, we've seen that there is an SEO optimization benefit to fast site performance — and conversely, great harm to your users and bottom line if your site is too slow.
7. Use robots.txt Appropriately
What's the first thing a search engine looks for upon arriving at your site? It's robots.txt, a text file kept in the root directory of a website that instructs spiders which directories can and cannot be crawled. With simple "disallow" commands, a robots.txt file is where you can block crawling of:
- Private directories you don't want the public to find
- Temporary or auto-generated pages (such as search results pages)
- Advertisements you may host (such as AdSense ads)
- Under-construction sections of your site
Every site should put a robots.txt file in its root directory, even if it's blank, since that's the first thing on the spiders' checklist. But handle your robots.txt with great care, like a small rudder capable of steering a huge ship. A single disallow command applied to the root directory can stop all crawling — which is very useful, for instance, for a staging site or a brand new version of your site that isn't ready for prime time yet. However, we've seen entire websites inadvertently sink without a trace in the SERPs simply because the webmaster forgot to remove that disallow command when the site went live. (SEO Tip: Some content management systems (e.g., WordPress) come with a prefabricated robots.txt file. Make sure that you update it to meet your site's needs.)
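Here is a minimal sketch of a robots.txt file; the directory names are placeholders, and the commented-out lines at the end show the site-wide disallow described above.

```
# robots.txt sketch: lives at https://www.example.com/robots.txt (placeholder domain)
User-agent: *
Disallow: /private/
Disallow: /search-results/
Disallow: /under-construction/

# Danger zone: the two lines below, uncommented, block ALL crawling.
# Useful for a staging site; disastrous if left in place when the site goes live.
# User-agent: *
# Disallow: /
```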
Google offers a robots.txt Tester in Google Webmaster Tools that checks your robots.txt file to make sure it's working as you desire. We also suggest running the Fetch as Google tool if there's any question about how a particular URL may be indexed. This tool simulates how Google crawls URLs on your website, even rendering your pages to show you whether the spiders can correctly process the various types of code and elements you have on your page.
Read more: Robots Exclusion Protocol Reference Guide
8. Be on the Lookout for Hacked Content & User-Generated Spam
Websites can attract hacked content like a ship's hull attracts barnacles — and the bigger the site, the more it may attract.
Hacked content is any content that's placed on your website without your permission. Hackers work through vulnerabilities in your site's security to try to place their own content on your URLs. The injected content may or may not be malicious, but you don't want it regardless. Some of the worst cases happen when a hacker gains access to a server and redirects URLs to a spammy site. Other cases involve bogus pages being added to a site's blog, or hidden text being inserted on a page.
Google recommends that webmasters look out for hacked content and remove it ASAP.
A related problem is user-generated spam, which needs to be kept to a minimum. Your website's public-access points, such as blog comments, should be monitored. Set up a system to approve blog comments, and keep watch to protect your site from unwanted stowaways. Google often gives sites the benefit of the doubt and warns them, in Webmaster Tools, when it finds spam. However, if there's too much user-generated spam, your whole website could receive a manual penalty.
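One widely recommended safeguard, beyond moderating comments, is to mark links in user-submitted content so they pass no link equity. A sketch of what that looks like in a comment template is below; the URL is a placeholder.

```html
<!-- Sketch: a link inside a user-submitted comment, marked rel="nofollow" -->
<!-- so spammers gain no ranking benefit from dropping links on your site -->
<a href="https://www.example.com/commenter-site/" rel="nofollow">commenter's link</a>
```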
9. Use Structured Data
Structured data markup can be a web marketer's best mate. It works like this: you mark up your website content with additional bits of HTML code, and the search engines read these notes to learn what's what on your site. The markup code gives search engines the type of context only a human would normally understand. The biggest SEO optimization benefit is that search results may display more relevant information from your site — those extra "rich snippets" of information that sometimes appear below the title and description — which can increase your click-through rate. Structured data markup is available for many categories of content (based on Schema.org standards), so don't miss this opportunity to improve your site's visibility in search results by helping your SERP listings stand out.
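For illustration, here is one common way to add Schema.org markup: a JSON-LD block placed in a page's HTML. All of the business details and URLs below are placeholders, and the right @type and properties depend on what your page is actually about.

```html
<!-- Sketch: Schema.org structured data in JSON-LD (all values are placeholders) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-5555",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "90210"
  }
}
</script>
```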
Just two more lessons to go! In the next lesson, we'll cover essential mobile SEO tips for the ever-growing number of mobile searchers.