How To Create SEO-Friendly Site URL Structure


You can make your site more SEO-friendly with these effective suggestions. Read on to learn how to create an SEO-friendly site URL structure.


Some people say there is no such thing as SEO-friendly URL structure. They claim that search engines are perfectly capable of making sense of any type of URL and pretty much any URL structure. In most cases, the people who say this are web developers.


I’ve noticed that sometimes web developers and SEOs live in two parallel universes, each with its own center of gravity. While web developers basically care about crawlability, site speed, and other technical things, SEOs are mostly focused on what constitutes their holy grail: website rankings and ROI.

Hence, what may be an OK site URL structure to a web dev could be a totally SEO-unfriendly URL architecture to an SEO manager.


What is an SEO-friendly URL structure?

First of all, let me start by saying that it is always better to call in an SEO manager early in the development stage, so that there is no need to make sometimes hard-to-implement tweaks afterwards.



From an SEO point of view, a site’s URL structure should be:

Straightforward: URLs with duplicate content should have canonical URLs specified for them; there should be no confusing redirects on the site, etc.

Meaningful: URL names should have keywords in them, not gibberish strings of numbers and punctuation marks.

With emphasis on the right URLs: SEO-wise, not all URLs on a site are equally important. As a rule, some should even be concealed from the search engines.



So, here is what one can do to achieve an SEO-friendly site URL structure:

Consolidate the www and non-www versions of your domain

As a rule, there are two major versions of your domain indexed in the search engines: the www and the non-www version. These can be consolidated in more than one way, but I’d mention the most widely accepted practice.
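The widely accepted practice referred to above is a site-wide 301 (permanent) redirect from one version to the other. As a minimal sketch, assuming an Apache server with mod_rewrite enabled, here is how the non-www version could be redirected to the www version; example.com is a placeholder for your own domain:

```apache
# .htaccess — redirect non-www requests to the www version with a 301 redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status code tells search engines the move is permanent, so the SEO value of both versions is consolidated into one.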


Alternatively, you can specify your preferred version in Google Webmaster Tools in Configuration >> Settings >> Preferred Domain. However, this has certain drawbacks:


1. This takes care of Google only.

2. This option is restricted to root domains only. If your site lives on a subdomain, this method is not for you.

But why worry about the www vs non-www issue in the first place? The thing is, some of your backlinks may be pointing to your www version, while others could be going to the non-www version.


So, to make sure that both versions’ SEO value is consolidated, it’s better to explicitly establish the link between the two.



Avoid dynamic and relative URLs

Depending on your content management system, the URLs it generates may be “pretty”, for example:

http://www.example.com/how-to-create-seo-friendly-urls/

or “ugly”, for example:

http://www.example.com/index.php?page=578&cat=12

As I said earlier, search engines have no problem with either variant, but for certain reasons it’s better to use static (prettier) URLs rather than dynamic (uglier) ones. The thing is, static URLs contain your keywords and are more user-friendly, since one can figure out what a page is about just by looking at its URL.


Some web devs make use of relative URLs. The problem with relative URLs is that they are dependent on the context in which they occur. Once the context changes, the URL may not work. SEO-wise, it is better to use absolute URLs instead of relative ones, since the former are what search engines prefer.
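To illustrate the difference, here are two hypothetical links pointing at the same page; the domain and paths are placeholders:

```html
<!-- Relative URL: resolved against the current page's location,
     so it can break if the page is moved or served from another path -->
<a href="../blog/seo-tips/">Read more</a>

<!-- Absolute URL: unambiguous in any context -->
<a href="http://www.example.com/blog/seo-tips/">Read more</a>
```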


Now, sometimes different parameters can be added to the URL for analytics tracking or other reasons (such as sid, utm, etc.). To make sure that these parameters don’t make the number of URLs with duplicate content grow over the top, you can do either of the following:


1. Ask Google to disregard certain URL parameters in Google Webmaster Tools in Configuration > URL Parameters.

2. See if your content management system allows you to consolidate URLs with additional parameters with their shorter counterparts.
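For instance, many content management systems let you point a parameterized URL at its clean counterpart with a rel="canonical" tag; the URL below is a placeholder:

```html
<!-- Served on http://www.example.com/article/?utm_source=newsletter -->
<link rel="canonical" href="http://www.example.com/article/" />
```

This tells search engines that the parameterized variants are copies of one canonical page, so their SEO value is consolidated rather than split.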



Create an XML Sitemap

An XML Sitemap is not to be confused with the HTML sitemap. The former is for the search engines, while the latter is mostly designed for human users.


What is an XML Sitemap? In plain words, it’s a list of your site’s URLs that you submit to the search engines. This serves two purposes:


1. This helps search engines find your site’s pages more easily.

2. Search engines can use the Sitemap as a reference when choosing canonical URLs on your site.
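For reference, a minimal XML Sitemap looks roughly like this; the URLs and date are placeholders (the full protocol is documented at sitemaps.org):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/blog/seo-friendly-urls/</loc>
  </url>
</urlset>
```

Only the loc element is required for each URL; once the file is ready, you can submit it via Google Webmaster Tools or reference it in robots.txt.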



Close off irrelevant pages with robots.txt

There may be pages on your site that should be concealed from the search engines. These could be your “Terms and conditions” page, pages with sensitive information, etc. It’s better not to let these get indexed, since they usually don’t contain your target keywords and only dilute the semantic whole of your site.


The robots.txt file contains instructions for the search engines as to which pages of your site should be ignored during the crawl. Note, however, that robots.txt only blocks crawling: a disallowed URL can still appear in the search results if other sites link to it. To keep a page out of the index entirely, use a noindex robots meta tag instead, and leave the page crawlable so the search engines can see the tag.
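As a sketch, a robots.txt file that blocks a couple of hypothetical sections would look like this (the paths are placeholders):

```
User-agent: *
Disallow: /terms-and-conditions/
Disallow: /admin/
```

And to keep a crawlable page out of the index entirely, you can add a robots meta tag to its head section:

```html
<meta name="robots" content="noindex">
```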
