3 Reasons to Always Have Structured URLs
• The concept of flat site architecture is often misunderstood; structured URLs are the way to go.
• Structured URLs: 1) Help semantics 2) Offer the best indexing control and 3) Give better SEO traffic analysis.
• E-commerce sites require special attention when a product exists in several categories at once.
Since the flat site architecture concept appeared on the SEO horizon and gained traction around 2010, many SEO consultants have gotten it wrong. Flat site architecture is about the click distance between pages on a site and how relevancy is distributed through the internal link structure; it has nothing to do with URLs.
The main misunderstanding was, and unfortunately still is, that you have to get rid of directories in your URL structure. Although it is widely agreed that you may want to keep URLs short and place keywords close to the root (the left part of the URL), there are many reasons to keep a certain structure of folders or directories there. That is what I'm going to explain in this post.
After the flat site architecture concept was introduced, many SEO consultants freaked out and flattened their URL structures, turning something like example.com/concerts/rock-pop/the-rolling-stones/ into example.com/the-rolling-stones/, or even worse, into a single keyword-stuffed slug with no structure at all.
A little warning: if, for any reason, you do decide to remove directories from content page URLs, never get rid of category pages. Following the previous example, do not destroy URLs/pages like example.com/concerts/ or example.com/concerts/rock-pop/, because they represent a great opportunity for a better content strategy, keyword allocation and internal link building. This is SEO 101, but just in case.
1. Structured URLs Help Semantics
You already know that, from a search engine's perspective, a site is not a big bag of unordered words. Search engines try to make sense of text by analyzing how it is organized into a main topic and subtopics. The URL is the ID of every page, so the more it reveals about how the content is structured, the better. One example of how relevant directories in URLs can be is the breadcrumb trail you frequently see on search engine results pages (SERPs).
It is true that typical HTML breadcrumbs on a page can trigger them to appear, but I've seen many cases where the only reason for them was a clear, organized URL structure (showtimetickets.com/concerts/rock-pop/) with no HTML breadcrumbs at all.
Tip: Add specific semantic markup to breadcrumbs in combination with a coherent URL structure and your chances to get breadcrumbs in the SERPs skyrocket.
Take the URL: http://www.overstock.com/Clothing-Shoes/Womens-Shoes/692/cat.html. Have a look at the HTML code of page breadcrumbs to see the semantic markup:
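As a sketch of what such markup can look like, here is breadcrumb markup using schema.org's BreadcrumbList vocabulary in microdata form. The actual markup on the Overstock page may differ, and the first crumb's URL is an assumption derived from the path above:

```html
<!-- Illustrative sketch only: schema.org BreadcrumbList microdata.
     The real page's markup and the /Clothing-Shoes/ URL are assumed. -->
<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="http://www.overstock.com/Clothing-Shoes/">
      <span itemprop="name">Clothing &amp; Shoes</span></a>
    <meta itemprop="position" content="1" />
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item"
       href="http://www.overstock.com/Clothing-Shoes/Womens-Shoes/692/cat.html">
      <span itemprop="name">Women's Shoes</span></a>
    <meta itemprop="position" content="2" />
  </li>
</ol>
```

Notice how each crumb maps one-to-one onto a directory level in the URL; that coherence between markup and URL structure is exactly what makes the rich breadcrumb easy for search engines to trust.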
You don’t want to miss that extra click-through rate, right?
2. Exhaustive Indexation Control
A common task for SEOs is checking how many of their site's pages are indexed by search engines. It should be simple: take the list of URLs on the site, take the list of URLs indexed by search engines, and compare them. Not so, especially on large sites. Trying to make an exhaustive inventory of indexed URLs to find the non-indexed ones can be a real pain. Even worse, Google's site: command is not going to show more than 1,000 URLs.
The no-brainer trick here is to use the site: command on sections of the site delimited by directories. Once a section's number of indexed pages is smaller than 1,000, it is not hard to scrape them all from the SERPs, using the OutWit Hub Firefox plugin, for example.
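As a minimal sketch of the trick (the domain and section names below are invented), structured URLs let you mechanically split one oversized site: query into per-directory queries:

```python
# Hypothetical example: example.com and its sections are made up.
# Split one huge site: query into per-directory queries, each of which
# is more likely to stay under Google's ~1,000-result display limit.
sections = ["/concerts/", "/sports/", "/theater/", "/comedy/"]
queries = [f"site:example.com{section}" for section in sections]
for query in queries:
    print(query)
```

Without directories there is simply nothing to split on, and you are stuck with a single truncated result set.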
Of course, it takes time to collect all URLs indexed by section, but this is one of the reasons they pay us SEOs, right?
The second step is comparing the indexed URLs with the ones available on the site. Your XML sitemap should work like a charm and list absolutely all URLs; unfortunately, this doesn't happen frequently. Use one of the tools that simulate a bot crawling your site, like the Screaming Frog SEO Spider or Xenu's Link Sleuth. Both are able to crawl and list all URLs below a certain directory; Screaming Frog does that by default, while Xenu requires you to configure it before the crawl job.
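The comparison itself is then a simple set difference, and directories let you report the gaps section by section. A minimal sketch, with invented URLs standing in for your crawler export and your scraped SERP list:

```python
# Hypothetical data: crawled = URLs found by your crawler,
# indexed = URLs scraped from the SERPs. All URLs are invented.
from urllib.parse import urlparse

crawled = {
    "http://example.com/concerts/rock-pop/the-rolling-stones/",
    "http://example.com/concerts/jazz/diana-krall/",
    "http://example.com/sports/nba/lakers/",
}
indexed = {
    "http://example.com/concerts/rock-pop/the-rolling-stones/",
    "http://example.com/sports/nba/lakers/",
}

def section(url):
    # First directory of the path, e.g. "/concerts/".
    parts = urlparse(url).path.strip("/").split("/")
    return "/" + parts[0] + "/" if parts[0] else "/"

# Set difference: everything crawled but not indexed.
not_indexed = crawled - indexed
for url in sorted(not_indexed):
    print(section(url), url)
```

The `section()` helper is the whole point: with directories in the URL, grouping non-indexed pages by site section is one line of string handling.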
In any case, you can probably see how handy structured URLs can be for the finest indexing control.
3. Better SEO Traffic Analysis
Measuring SEO performance takes a remarkable chunk of time in any project. It was always important to me, but after the Google Panda algorithm update rocked our world, it became essential to analyze organic traffic by site section.
Content drilldown reports are nothing without the proper URL structure.
Reports based on pages are far more useful and easier to manage when we can filter groups of URLs to break traffic down by section, and again, having directories in URLs makes the regular expressions easy as pie; this is a real advantage. Otherwise, you end up with the mother of all regexes, and the uncertainty of leaving part of the URLs out of the bucket you are analyzing.
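To make that concrete, here is a hedged sketch (all paths are invented): with directories, one trivial anchored prefix isolates a whole section, while flat URLs would force you to enumerate keyword patterns and hope nothing falls out of the bucket.

```python
import re

# Invented pageview paths from a structured e-commerce site.
pageviews = [
    "/engine-parts/cooling/audi-a6-quattro-engine-belt-kit.html",
    "/engine-parts/ignition/bmw-e46-spark-plug-set.html",
    "/body-parts/mirrors/audi-a6-side-mirror.html",
]

# With directories: one simple prefix regex per section.
engine_parts = [p for p in pageviews if re.match(r"^/engine-parts/", p)]
print(engine_parts)

# Without directories you would need something like the "mother of all
# regexes", enumerating brands and part types by hand, e.g.:
#   ^/(audi|bmw|ford)-.*-(belt|plug|pump|filter)-.*\.html$
# and you could never be sure it catches every product.
```

The same one-line prefix pattern drops straight into an analytics advanced segment or content filter.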
If your URLs have not been carefully crafted, traffic analysis is going to be the worst of your nightmares (believe me, I've been through that) unless you make perfect use of virtual pageviews. That requires your CMS to have extra database fields for readable names, plus the code logic to populate those virtual pageviews; both are unlikely to happen, and it is an expensive solution.
For example, take a URL with no directories:
- myautoparts.com/audi-a6-quattro-engine-belt-kit.html
A virtual pageview can insert directories at the tracking level, with category names coming from the database to cover the lack of real categories in the URL:
- _gaq.push(['_trackPageview', '/engine-parts/cooling/audi-a6-quattro-engine-belt-kit.html']);
An E-Commerce Scenario
It is quite common on e-commerce sites to have URLs reflecting category and subcategory directories, for example /engine-parts/cooling/, but when it comes down to product level, all products are allocated under something like /products/audi-a6-quattro-engine-belt-kit.html, completely outside their natural place under the corresponding categorized URLs.
I usually dislike this solution, but it is handy for solving duplicate content issues when one product belongs to several categories at the same time. If you must do this, use something more descriptive of what you sell than just “products”; it could be /auto-parts/. And at least use one directory for all products: do not place them directly under the root domain.
One of the advantages is that you can easily see at a glance how much traffic product pages get. I always include a chart in SEO e-commerce dashboards that shows:
- Total visits from organic traffic
- Visits that viewed product pages
- Visits that added products to the shopping cart
- Visits that completed the checkout process
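Because each funnel stage lives under its own directory, building that chart reduces to prefix matching on the paths each visit viewed. A sketch under invented data (the /auto-parts/, /cart/ and /checkout/ paths are assumptions, not from any real site):

```python
# Invented example data: each visit is the list of URL paths it viewed.
visits = [
    ["/auto-parts/audi-a6-belt-kit.html", "/cart/", "/checkout/complete/"],
    ["/engine-parts/cooling/", "/auto-parts/bmw-e46-plug-set.html"],
    ["/", "/about-us/"],
]

def stage_count(visits, prefix):
    # Count visits that hit at least one URL under the given directory.
    return sum(any(path.startswith(prefix) for path in visit) for visit in visits)

total = len(visits)
product_views = stage_count(visits, "/auto-parts/")
added_to_cart = stage_count(visits, "/cart/")
checkouts = stage_count(visits, "/checkout/")
print(total, product_views, added_to_cart, checkouts)  # 3 2 1 1
```

With products scattered at the root instead, "visits that viewed product pages" has no cheap definition at all.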
This gives me a very nice perspective on how my SEO performs at converting visitors into customers.
In sum, don't fool yourself with misconceptions. Before making any decision about your SEO, think through the pros and cons. We have reviewed three primary reasons to make clever use of directories in URLs for better SEO:
- They reinforce search engines' semantic understanding of your content.
- Indexation control is easier to manage with directories in place.
- They let you take full advantage of your analytics capabilities, and thus get better insights from your data.