Changing Site Structures with Care
Website structure refers to the framework by which a website's information and content is organised and presented. Developing a sound website structure provides a guideline for everything else to fall into place, as it lays the foundation for a solid navigation scheme for both users and search engines.
When developing the framework, give consideration to the URL structure, internal linking patterns, folder/directory structures, siloing and theming. You can read about this in further detail in our how to build a website with silos post.
Rarely would somebody develop a website structure that can handle every type of content required over the lifespan of the website. There are a number of reasons for this:
- Addition of new technologies, products or services that were not catered for in the initial IA
- Change of focus of the website, usually driven by a change in the focus of the business
- Change in terminology or categorisation used to describe the technologies, products or services
- Website being sold to a new company
- Buying a new company and wanting to incorporate their products and services into one website
- Developing partnerships with another website or brand and catering for this on site
- Broadening the scope of topics covered to further build out the site and compete for rankings
- Implementing a different navigation structure to improve the ease of conversion steps
Ultimately, most of the issues above stem from insufficient time spent developing a logical, structured information architecture (IA) and considering its SEO implications.
As you can imagine, changing a website structure that has gained rankings, trust and authority over a number of years can be risky business. Webmasters need to be especially careful when chopping and changing a site around, as doing so carelessly can leave you stung by Google and the other major search engines.
Why so?
Because amongst other search engine optimisation issues, playing around with site structure and the movement of pages without considering the SEO implications increases the likelihood that your site will suffer from:
- broken internal links
- loss of PageRank from external links
- indexation issues
- reduced rankings
- reduced SEO traffic
- reduced user experience (e.g. increased 404 errors)
When crawlers or users come to your site, your number one priority should be to ensure that they have a hassle-free experience when navigating your pages. This is why precautions must be taken both before and after making any drastic changes to a website.
Depending upon the scenario, you may be moving to a completely new URL structure, a new domain, or merely incorporating new themes within your website.
Organisations should consider the following in light of their own needs and requirements, but for the sake of example, the following focuses on changing the URL structure on the same domain.
Before The Site Goes Live
Prior to making any changes, you should consider:
Documenting Key Site Characteristics
If you haven’t already, set up Google Search Console and document key site characteristics such as:
- Top search queries
- Links to your site
- Most common keywords
- Internal links
- Crawl errors
- Crawl stats
- Site performance
Outside of Google Search Console
It’s also a good idea to:
- Check the pages indexed and inlinks from Yahoo! Site Explorer
- Check the pages indexed by Google with a site: search, e.g. site:www.example.com
- If you haven't already, set up Bing Webmaster Tools (BWT), export all data and take a screenshot of your site's stats in the BWT interface.
Staggering Site Changes
If you intend to make quite a lot of changes to pages, try to minimise the impact on search engine rankings by staggering the changes over time. Smaller changes made gradually will have a smaller impact on search results than a bull-at-a-gate approach. Put together a brief plan of what you'll be changing and when over a set period to keep track of this. This also allows you to measure the impact of each change and roll back any changes that appear to have a long-term negative impact.
Analytics Code
Before you put any new pages up, be certain that all pages include the appropriate code for your analytics platform to track data. Without analytics data, it will be impossible to measure the impact of the changes you are making to your website (if you’re using Google Analytics, then see our how to build a tracking code instructions). It’s also a good idea to include annotations within your analytics package or a change log that records the date and details of the changes made.
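As a rough pre-launch check, a short script can confirm that the tracking snippet appears on every page. The sketch below assumes the requests package is installed; the page list and tracking ID are hypothetical examples, not values from this article.

```python
# Sketch: confirm each page's HTML contains your analytics tracking ID.
# PAGES and TRACKING_ID are hypothetical examples; substitute your own.
import requests

TRACKING_ID = "UA-12345678-1"
PAGES = [
    "http://staging.example.com/",
    "http://staging.example.com/products/",
    "http://staging.example.com/contact/",
]

def pages_missing_tracking(pages, tracking_id):
    missing = []
    for url in pages:
        html = requests.get(url, timeout=10).text
        if tracking_id not in html:
            missing.append(url)
    return missing

if __name__ == "__main__":
    for url in pages_missing_tracking(PAGES, TRACKING_ID):
        print(f"Missing tracking code: {url}")
```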
Conversion Steps
If your site's primary goal is conversion, such as making a sale, generating a lead or encouraging an action, review the changes you will be making to ensure that the new structure won't adversely affect the site's conversion steps.
503 Status Header
If the site is going to be down for a number of hours, it should be set up to return a 503 status header, with a Retry-After header, for all pages requested. This tells the search engines that the server is down for maintenance and asks them to return after the specified time. This should only be in place while the site is down and then removed.
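In practice the 503 is usually configured at the web server level, but as a rough, self-contained illustration of the response crawlers should see during the outage, here is a minimal Python sketch; the port and retry window are assumptions for the example.

```python
# Minimal maintenance responder sketch: every request receives a 503 with a
# Retry-After header so crawlers know the outage is temporary.
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 7200  # assumed two-hour maintenance window

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)  # service unavailable
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1><p>Please check back soon.</p>")

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```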
Internal Linking Structures and Anchor Text
Make sure that all of your internal links follow the correct linking strategy. You should be linking within silos and not across them, and up to key parent pages from child pages. Ensure that you've used good anchor text with all of your links; the anchor text should include primary or associated keywords of the page you're linking to. You can read more here about both internal linking structures and anchor text.
Testing Broken Links
The last thing you need is a bunch of broken internal links messing up the experience for both users and crawlers on your site. Xenu Link Sleuth is your best friend for avoiding this.
This tool crawls your website and identifies any broken links in a quick report. Run this on the new site or pages before they go live and fix any troublemakers.
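If you would rather script this check yourself, a rough crawler along the following lines can flag broken internal links before go-live. It assumes the requests and beautifulsoup4 packages are installed, and the staging URL is a hypothetical example.

```python
# Rough broken-link checker sketch: crawls internal pages breadth-first and
# reports any URL that returns a 4xx/5xx status code.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

STAGING_ROOT = "http://staging.example.com/"  # hypothetical example

def check_internal_links(root, max_pages=500):
    host = urlparse(root).netloc
    seen, queue, broken = {root}, deque([root]), []
    while queue and len(seen) <= max_pages:
        page = queue.popleft()
        resp = requests.get(page, timeout=10)
        if resp.status_code >= 400:
            broken.append((page, resp.status_code))
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        # queue up any internal links found on the page
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(page, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return broken

if __name__ == "__main__":
    for url, status in check_internal_links(STAGING_ROOT):
        print(f"{status}  {url}")
```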
Styling and Formatting
Although less important from an SEO perspective, ensure all pages across the site have consistent styling with headings, title casing, white space, bolding and bullets. Most content management systems provide templates to ensure this; however, if you're not using a CMS, it's something to think about.
Robots.txt
If you use a publicly accessible staging site, prevent the staging version from being indexed by adding an exclusion statement to its robots.txt file:
User-agent: *
Disallow: /
When The Site Goes Live
301 Redirects
If you have existing pages that you have moved to new URLs, make sure you implement 301 permanent redirects to preserve backlinks and PageRank that the page originally had. These need to be done immediately at go-live and are the most critical consideration in this entire process.
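Once the redirects are in place, a short script can sanity-check that each old URL returns a 301 pointing at its intended new location. This sketch assumes the requests package is installed, and the old-to-new mapping shown is a hypothetical example.

```python
# Sketch: verify each old URL 301-redirects to the expected new URL.
import requests

REDIRECT_MAP = {  # hypothetical example mapping of old URLs to new URLs
    "https://www.example.com/old-products/widget.html": "https://www.example.com/products/widgets/widget/",
    "https://www.example.com/about-us.html": "https://www.example.com/about/",
}

def verify_redirects(mapping):
    problems = []
    for old, expected in mapping.items():
        resp = requests.get(old, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code != 301:
            problems.append(f"{old}: expected 301, got {resp.status_code}")
        elif location.rstrip("/") != expected.rstrip("/"):
            problems.append(f"{old}: redirects to {location}, expected {expected}")
    return problems

if __name__ == "__main__":
    for issue in verify_redirects(REDIRECT_MAP):
        print(issue)
```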
HTML and XML Sitemaps
Update your HTML sitemap that appears on your new website. If you don’t have one, you must create one to help visitors navigate your site.
In addition, creating an XML Sitemap and submitting it to Google is crucial for helping spiders understand the site's key pages and structure. Manually generating an XML Sitemap is a little tedious, but there are many sitemap generator tools online to make it easier.
The last step here is to ensure you submit a sitemap of the old URLs and maintain it to make sure all of the 301 redirects get followed.
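If you'd prefer to script the generation rather than rely on an online tool, a minimal sketch like the one below produces a valid Sitemap file using only the Python standard library; the same approach works for the sitemap of old URLs mentioned above. The URL list here is a hypothetical example.

```python
# Sketch: build a minimal XML Sitemap from a list of URLs (standard library only).
from datetime import date
from xml.etree import ElementTree as ET

URLS = [  # hypothetical example URLs
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/products/widgets/",
]

def build_sitemap(urls, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(URLS)
```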
Robots.txt
Once the new site is live, revisit your robots.txt file and revise it to allow access to the new pages, for example:
User-agent: *
Disallow:
Of course, your robots.txt file may have more instructions to include such as including or excluding certain directories, but this is just an example.
If your old site still exists, implement a robots.txt file to exclude crawlers from its pages.
See here for a more detailed insight into the robots exclusion protocol.
Breadcrumbs
Breadcrumbs are a trail of hyperlinks that usually appear toward the top of a page to help users understand where on the website they are, and how to get back.
Make sure that these are structured correctly throughout the trail and that they include the right keywords. Each breadcrumb should generally be no more than one to five words.
Basic SEO
Although it may be overlooked at times, basic SEO considerations such as the following are still important and should be included in all of your website's new pages (a quick automated check is sketched after this list):
- Meta title
- Meta description
- Meta keywords
You can read here in further detail about the ins and outs of meta data.
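As a quick automated check, you can confirm that every new page carries a title and meta description. The sketch below assumes the requests and beautifulsoup4 packages; the page list is a hypothetical example.

```python
# Sketch: flag pages missing a <title> tag or meta description.
import requests
from bs4 import BeautifulSoup

PAGES = [  # hypothetical example URLs
    "http://staging.example.com/",
    "http://staging.example.com/products/",
]

def audit_meta(pages):
    for url in pages:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        desc_tag = soup.find("meta", attrs={"name": "description"})
        description = desc_tag.get("content", "").strip() if desc_tag else ""
        if not title:
            print(f"Missing meta title: {url}")
        if not description:
            print(f"Missing meta description: {url}")

if __name__ == "__main__":
    audit_meta(PAGES)
```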
After You Make the Changes
Now that you’ve gone live with the new site or content, it’s not time to sit back and kick your shoes off. There’s still some work to do.
After making any changes, you should immediately consider:
404 Error Page
During this sensitive process, the likelihood of 404s appearing increases, usually due to broken internal links or missed 301 redirects, so remember to check Google Search Console immediately after the site goes live for any 404 errors. You can read here in further detail about 404 pages.
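Server access logs are another quick way to spot the 404s that users and crawlers are actually hitting. The sketch below assumes an Apache/Nginx combined-format access log; the log path is a hypothetical example.

```python
# Sketch: count 404 responses per URL from a combined-format access log.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical example path
# Matches the requested URL and status code in common/combined log format
LINE_RE = re.compile(r'"(?:GET|HEAD|POST) (\S+) [^"]*" (\d{3})')

def count_404s(path):
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.search(line)
            if match and match.group(2) == "404":
                counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for url, hits in count_404s(LOG_PATH).most_common(20):
        print(f"{hits:5d}  {url}")
```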
Broken Links
Run Xenu again as a precaution to find any remaining broken internal links.
Check Analytics
Remember how I said that before going live you must implement analytics code on all new pages? Now it's time to check that it is actually working. Across most analytics platforms, data generally won't appear for a few days, but once you think it should be tracking, jump into your account and make sure that it is.
Keep an Eye on Site Performance
All of the key site characteristics outlined in point one, which you documented before going live, should now be monitored immediately at go-live and then regularly for at least 30 days.
Make note of any drastic changes to traffic, backlinks and pageviews, but more importantly make sure there are no crawling issues or duplicate content. You can read here in detail about monitoring, measuring and analysing web analytics.
Remember that it's better to be safe than sorry: by considering these factors before and after any site changes, you will avoid what would otherwise be a nightmare for you and your website's rankings to recover from.
5 Replies to “Changing Site Structures with Care”
Thank you Chee Chun,
It's a clear answer, but honestly...
I'm a little bit surprised about using nofollow links in 2011 (from a PageRank perspective).
On the one hand, we've heard that using nofollow causes an evaporation of PageRank; on the other hand, I'm sure you have strong knowledge to share, and if you recommend (in some cases) the use of nofollow, I'm sure you know what you're saying, so:
Do you recommend using nofollow (between supporting pages of different silos) to control the flow of PageRank?
or,
Do you recommend using nofollow to control the flow of relevance/trust within the pages (accepting the loss of a few points of PageRank)?
Hi Pay Per Results,
According to the Bruce Clay guidelines, we recommend that you link between supporting pages within a silo. Since the /ice-cream/black-strawberry.html page and the /ice-cream/strawberry.html page both lie in the Ice Cream silo, and both point to the Ice Cream landing page, it is perfectly fine to add a link on the /black-strawberry.html page pointing at the /strawberry.html page.
It is treated differently, however, if you link from a supporting page in Silo A to a supporting page in Silo B (linking supporting pages across different silos). In this case, you can link between them, but remember to include the rel=nofollow attribute, as the destination URL is not a landing page and you don't want to cause PageRank bleed from the Ice Cream silo.
Ideally, you should move your /ice-cream/black-strawberry.html page to /ice-cream/strawberry/black.html to create a stronger silo around strawberry.
Hope that answers your question!
Sorry,
let me give you an example of a situation that I find problematic.
Let's say that I have a landing page about "ice cream":
URL: /ice-cream/index.html for "ice cream"
And these subsection pages:
URL: /ice-cream/strawberry.html for "strawberry ice cream"
URL: /ice-cream/chocolate.html for "chocolate ice cream"
URL: /ice-cream/black-strawberry.html for "black strawberry ice cream"
I know this example is a funny one, but could I add a link on black-strawberry.html that points to strawberry.html (of course using its specific anchor text), and vice versa?
Thanks for your help
Hi there, I'm French, so please excuse my English ;-)
I hope you'll help me understand one point concerning siloing in 2011.
The historical rules of siloing say: do not link between subsection contents (even if they are part of the same directory?)
Let's say that I have a landing page that I'm trying to rank for a single keyword. Then I have other pages (subsection contents) that:
> 1st, are supposed to support my favourite landing page
> 2nd, I'm trying to rank each for a more specific "keyphrase"
Let's focus on these specific pages now. Knowing that they have a very strong "relation" between them (and not only with the landing page), why couldn't I also link between them (of course using specific mid-tail anchor text for each one of them)?
If it's really relevant in the context, why couldn't I do it?
Is it possible to think about a new and a little more "flexible" siloing approach that authorises (only if it's really relevant) linking between these subsection pages?
Thank you so much.
PS: If not, don't you see a risk that these pages will not be crawled often by robots?