Step 9: Sitemaps – How to Create a Sitemap

Give search engines a map

Photo by Christopher Bulle (CC BY 2.0), modified 

Creating a sitemap for your website and keeping it up-to-date are important SEO best practices. Search engines can crawl and index your website more completely if they have an XML sitemap for reference.

This essential SEO tutorial lesson covers how to create a sitemap so you can welcome search engine spiders and help them find their way around when they visit.

What Is a Sitemap?

An XML sitemap is a text file webmasters create that tells search engines like Google and Bing what a website contains.

Basically, it’s a list of all the URLs (the page addresses) that you want indexed for your site — URLs of web pages, images, videos and other content files on the site — formatted with a few XML tags.
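To make this concrete, here is a minimal sitemap file (using a hypothetical example.com domain). Per the sitemaps.org protocol, only the <loc> tag is required for each URL; the <lastmod>, <changefreq> and <priority> tags are optional hints:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-05-10</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
  </url>
</urlset>
```

Save the file (typically as sitemap.xml) at the root of your website so its URLs can cover the whole site.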

In this lesson, you’ll learn how to set up your sitemap correctly so that the search engines refer to it again and again.

XML vs. HTML Sitemaps

XML sitemaps should not be confused with HTML sitemaps, which are regular web pages created to help human visitors get around a website. Each has SEO benefits, so you should create both XML and HTML types for your site. Here are the differences:

                                          XML Sitemaps                        HTML Sitemaps
Crawlable by search engines               Yes                                 Yes
Read by human visitors                    No                                  Yes
Maximum size                              50,000 URLs or 10MB uncompressed    Not specified, but keep it user-friendly
Format                                    XML file (plain text)               Web page (can be pretty)
Linked from                               Robots.txt file                     Site navigation (footer)
Can be manually submitted to engines      Yes                                 No
Recommended for SEO                       Yes                                 Yes

How Many Sitemaps Should I Create?

Every site needs at least one XML sitemap. Having an up-to-date XML sitemap is really an essential SEO best practice.

(By contrast, submitting your site manually is an optional task. You only need to do a submission occasionally, such as when you launch a new site, add a new site section, or change content and don’t want to wait for the crawlers to find it.)

Large websites may need to break their list of URLs into multiple XML sitemaps. This ensures that the number of page URLs per sitemap doesn’t exceed the limit.
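When you split your URLs across multiple sitemaps, the sitemap protocol lets you tie them together with a sitemap index file, which is simply a list of your individual sitemap files. A sketch, again using hypothetical example.com file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2016-05-10</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```

You can then point the search engines at the single index file rather than submitting each sitemap separately.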

It is also recommended (for any size website) that certain types of files be listed in their own specialized sitemap: videos and news are two examples. Therefore, if you have videos on your site, create a specialized video XML sitemap to help make sure the search engines find your video files.
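A video sitemap uses Google’s video extension namespace on top of the standard sitemap format. A minimal sketch with hypothetical URLs and titles (Google requires a thumbnail, title and description for each video entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/intro.html</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/intro.jpg</video:thumbnail_loc>
      <video:title>Introduction to Our Product</video:title>
      <video:description>A two-minute overview of what the product does.</video:description>
      <video:content_loc>https://www.example.com/media/intro.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

Note that the <loc> tag points to the page where the video appears, while <video:content_loc> points to the media file itself.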


In another useful video from Google Webmaster Help, Matt Cutts answers why it’s important to offer an HTML site map AND an XML Sitemap.

Listen as he explains that since they meet different needs, both are important, especially for search engine crawling.

How to Create an XML Sitemap

You can create a sitemap manually, but using a sitemap generator makes the job easier. There are many good third-party tools for creating XML sitemaps automatically. One is Microsoft Bing’s free server-side Bing XML Sitemap Plugin, which can automatically generate two types of XML sitemaps that can be read by any search engine:

  • Comprehensive sitemap, which includes all files (except any you disallow in your robots.txt file)
  • Recently updated sitemap, which includes URLs of changed files only (useful for your own tracking or for prioritizing the pages that search engines should crawl)

NOTE: Any search engine can read your XML sitemap files because they comply with protocol.

How to Submit Your Sitemap to Search Engines

You can submit your XML sitemap(s) to Google and Bing using the Sitemaps feature within their webmaster tools:

  • Google: Log in to your Google Search Console account. Under the Crawl menu, choose Sitemaps.
  • Bing: Log in to Bing Webmaster Tools. You can use the Sitemap widget on your Dashboard or go to the Sitemaps feature, located under the Configure My Site section.

The above methods let you proactively submit your XML sitemap file(s) to the search engines if you want to. Regardless, make sure you specify your XML sitemap’s location in your robots.txt file, where the spiders are sure to find it the next time they come crawling. (A robots.txt file is simply a text file saved at the root of your website that gives instructions to visiting search engine spiders.) Your robots.txt file should look similar to this, with a Sitemap directive line for each of your different XML sitemaps:

User-agent: *
Disallow: /tmp/
Disallow: /filename.html

# Replace with the full URL(s) of your own sitemap file(s)
Sitemap: https://www.example.com/sitemap.xml

That’s it! Once you create your XML sitemaps and tell search engines where to find them using your robots.txt file, the search engine spiders should do the rest. If you need more details on creating a Sitemap, see Google’s Search Console Help.

Next in the SEO tutorial, you’ll learn how to use rich media elements properly to make your site more engaging and more rankable.


Related blog posts and articles:
XML Sitemaps in SEO – Part 1
10 Video SEO Tips to Improve Rank and User Experience
How to Set Up Google Search Console (Webmaster Tools)

