5 Pillars of SEO-Friendly Site URL Structure

URL – what does the abbreviation stand for? For most of us, a URL (short for Uniform Resource Locator) is simply the address of a file on the Web.

But everyone with some grasp of SEO knows that a properly organized URL structure is crucially important to a site's search performance.

A rule of thumb states that a keyword-rich domain name, sound information architecture, and proper page naming can significantly improve a website's rankings, while a poorly organized URL pattern can not only trigger indexation issues but also cost you traffic and rankings.

Below are 5 practical tips on how to create an SEO-friendly URL structure for a site.

1. Integrating the non-www and www versions of a domain

Having both www and non-www versions resolve separately is a common issue many domains are diagnosed with. Some site owners simply forget to choose a preferred domain, so their site keeps serving two versions of every URL to search engines. That, in turn, triggers duplicate content issues. There are three common ways to fix this:

  1. Using a 301 redirect. This redirect permanently points one version of a site to the other.
  2. Editing an .htaccess file. This is quite a tricky procedure that requires knowledge of coding. You may contact your hosting provider for more instructions.
  3. Setting a preferred domain through your Google Webmaster Tools account. You can specify the preferred version in Google Webmaster Tools under the Configuration -> Settings -> Preferred Domain menu.
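As an illustration, the 301 redirect from option 1 can be set up in an .htaccess file roughly like this. This is a hypothetical sketch for an Apache server with mod_rewrite enabled; domain.com stands in for your actual domain:

```apache
# Permanently (301) redirect all www requests to the non-www version.
# Assumes Apache with mod_rewrite enabled; replace domain.com with your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]
```

To prefer the www version instead, swap the host names in the condition and the target URL.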

2. Changing dynamic URLs to static

Dynamic URLs usually appear during website development. This is what a dynamic URL may look like: www.domain.com/?p=7463. And this is an example of a static one: www.domain.com/about-us.

Generally, search engines have no trouble accessing dynamic URLs and index both URL types perfectly well. Still, static URLs are the wiser choice from an SEO standpoint, for two main reasons:

  • Static URLs are more visitor friendly compared to dynamic ones;
  • Adding keywords to URLs helps search engines and website visitors better understand what a webpage is about.

You can rewrite the URLs of your site pages by editing an .htaccess file (or its equivalent), or right in your content management system if it supports this option.

And remember: when rewriting your site URLs, make them meaningful and sensibly optimized for your keywords. Avoid long numbers, unusual punctuation marks, and underscores (_).
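To illustrate the .htaccess route, here is a minimal mod_rewrite rule that lets the static-looking URL from the example above serve the content of the dynamic one. This is a hypothetical sketch: it assumes Apache with mod_rewrite enabled and an index.php front controller handling the p parameter:

```apache
# Serve the friendly URL /about-us from the underlying dynamic page ?p=7463.
# Assumes Apache with mod_rewrite enabled and an index.php front controller.
RewriteEngine On
RewriteRule ^about-us/?$ /index.php?p=7463 [L]
```

In practice you would also 301-redirect the old dynamic URL to the new static one, so that search engines and existing links consolidate on a single address.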

3. Creating and managing an XML Sitemap

Basically, there are two types of sitemaps: HTML and XML. The former is designed for website visitors and helps them navigate through site pages, while the latter is designed for search engine crawlers.

XML sitemaps basically serve two purposes:

  • they help search engines crawl your site and index its pages faster;
  • search engines use an XML sitemap as a reference when choosing canonical URLs on a website.

Nowadays, the software market offers a wide range of SEO tools that automate creating and managing this type of sitemap. For instance, WebSite Auditor can automatically generate an XML sitemap of your website and upload it via FTP.
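For reference, a minimal XML sitemap listing a single page follows the sitemaps.org protocol and looks like this (the URL and date are hypothetical examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page; only <loc> is required -->
    <loc>http://www.domain.com/about-us</loc>
    <lastmod>2013-09-19</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The file is typically placed in the site root (e.g. www.domain.com/sitemap.xml), referenced from robots.txt, or submitted directly in Google Webmaster Tools.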

4. Hiding irrelevant pages with robots.txt

Sometimes it may be more SEO-prudent to hide some pages from search engine crawlers. These could be pages that contain personal information, details about a company's alpha products, private correspondence, pages for internal company use, or pages under development.

As a rule, these pages have little or no value for site visitors and don't contain your keywords.

There are two simple ways to hide such pages from indexation: robots.txt files and robots meta tags. A noindex meta tag is the more reliable way to keep pages out of search results; however, managing these tags takes more effort, as they are applied on a page-by-page basis. Using a robots.txt file is a lot easier, as all crawling instructions are stored in one file.
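For example, a robots.txt file that hides a couple of internal sections from all crawlers could look like this (the directory names are made up for illustration):

```text
# robots.txt - placed in the site root; the paths below are hypothetical examples
User-agent: *
Disallow: /internal/
Disallow: /drafts/
```

The meta-tag alternative is a `<meta name="robots" content="noindex">` line placed in the `<head>` of each individual page you want to keep out of the index.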

5. Specifying canonical URLs with the help of a special tag

Using the rel=canonical tag is another way to point out canonical URLs on a website. Simply put, this tag tells search engines that a page is actually a copy of some other page on the site. This way you can deal with any duplicate content that may be present on your site.

But note that the rel=canonical tag should be applied for one purpose only: helping search engines decide on your canonical URLs. To redirect your site pages, use proper redirects. As far as paginated content is concerned, it makes sense to employ the rel="next" and rel="prev" tags.
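As an illustration, the tag is placed in the `<head>` section of the duplicate page and points to the preferred version (the URL here is a hypothetical example):

```html
<!-- On the duplicate page, point search engines to the preferred (canonical) version -->
<link rel="canonical" href="http://www.domain.com/about-us" />
```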

That’s basically it.


To sum it all up, a well-organized URL structure improves your site rankings and makes the site more user-friendly. It's like killing two birds with one stone: you improve your site's visibility on the one hand and provide a better user experience to your visitors on the other.

  • Beth
    September 19, 2013

    Really useful advice! I agree that it is important to make a URL both search engine and visitor friendly.

  • Sean Maki
    September 26, 2013


    What are your thoughts on hierarchy for URL structure, specifically for categories/products? Aside from breadcrumb links, do you think there is value for parent categories from URL structure? Just looking for others' opinions on the topic.

    For example:

    or domain.com/category/subcategory/product

    Hope the question makes sense.


    • admin
      October 11, 2013

      Hi Sean,

      I always supported the following structure:


