A URL (Uniform Resource Locator) lets web users access the web pages, documents, and other resources hosted on your server. How should URLs be structured for a blog with 40 pages, a site with thousands of pages, or an e-commerce site with over 10,000 pages? Each type of site has its own way of presenting a unique page to the user. Beyond helping bots figure out what your web page is about, it is important to structure your URLs for your visitors. Here are some best practices:
1) Power of the Hyphen: Google, in its Webmaster blog, recommends the use of hyphens. For example, http://www.example.com/home-insurance is much better than http://www.example.com/homeinsurance. Although Google's algorithm has evolved and can recognize words and phrases in URLs, make it easy for Google's bots. Using hyphens to separate words also makes URLs much more readable for visitors, as the sketch below shows.
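As an illustration, here is a minimal Python sketch of how a CMS might turn a page title into a hyphenated slug; the slugify function name is our own, not any particular CMS's API:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a lower-case, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace every run of non-alphanumeric characters with a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Home Insurance"))    # home-insurance
print(slugify("Top 10 SEO Tips!"))  # top-10-seo-tips
```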
2) Static over Dynamic URLs: For websites run on a CMS (Content Management System), a common problem is the generation of dynamic URLs. Most CMSs have plugins that convert unreadable dynamic URLs into static, easy-to-understand ones; make use of them. If you don't have the flexibility to use plugins, set up 301 redirects that map the dynamic URLs to readable URLs, as in the sketch below.
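If you have to wire up the redirects yourself, the idea looks roughly like this minimal Flask sketch (Flask is our choice for illustration; the page IDs and target paths are hypothetical):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical mapping from dynamic page IDs to readable static paths.
READABLE_PATHS = {
    "132": "/home-insurance",
    "245": "/car-insurance",
}

@app.route("/page")
def dynamic_page():
    page_id = request.args.get("id", "")
    target = READABLE_PATHS.get(page_id)
    if target:
        # 301 tells browsers and search engines the move is permanent.
        return redirect(target, code=301)
    return "Not found", 404
```

With this in place, www.example.com/page?id=132 permanently redirects to www.example.com/home-insurance.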
3) Session IDs and URL Parameters: Another common issue is the generation of session IDs and URL parameters, which are difficult to read. Such URLs are unavoidable on sites with search and sort features, and most of them do not need to be indexed. Exclude them using robots.txt.
Say you have a search page that passes a page ID in the URL and also sorts the results, like
www.example.com/search?id=132&sort=asc
For such cases, block crawling of those URLs with robots.txt.
Add the following entry to robots.txt:
User-agent: *
Disallow: /search
Because robots.txt rules match by prefix, the entry above blocks any URL that begins with www.example.com/search, including the parameterized variants.
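You can sanity-check the rule with Python's built-in robots.txt parser:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse the rules directly; on a live site you would instead call
# rp.set_url("https://www.example.com/robots.txt") followed by rp.read().
rp.parse([
    "User-agent: *",
    "Disallow: /search",
])

print(rp.can_fetch("*", "https://www.example.com/search?id=132&sort=asc"))  # False
print(rp.can_fetch("*", "https://www.example.com/home-insurance"))          # True
```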
4) Mind the Length
Although Internet Explorer allows URLs of up to 2,083 characters, follow the best practice recommended by Matt Cutts (head of Google's web spam team): stick to 3-5 words in your URL. This can be tricky when you have category and sub-category pages; the sketch below shows a quick way to check.
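As a rough self-check, this small sketch counts the hyphen-separated words in the final path segment of a URL (the slug_word_count helper is ours):

```python
def slug_word_count(url: str) -> int:
    """Count the hyphen-separated words in the last path segment of a URL."""
    slug = url.rstrip("/").rsplit("/", 1)[-1]
    return len([word for word in slug.split("-") if word])

url = "http://www.example.com/cheap-home-insurance-quotes"
print(slug_word_count(url))  # 4, within the recommended 3-5 word range
```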
5) Keywords in URLs
Using keywords in the initial part of the URL can improve the relevance of your web page and improve the click-through rate in the SERPs.
For a category page: www.example.com/category
For a sub-category page: www.example.com/category/sub-category
For the actual page, use one of the following variations:
www.example.com/keyword1-sub-category-category
www.example.com/keyword1-category-sub-category
Depending on whether the page resonates more closely with the category or the sub-category, use the corresponding structure after the keyword (see the sketch after this list).
For pages that target long keywords, avoid categories and sub-categories altogether:
www.example.com/keywordWord1-keywordWord2-keywordWord3
A URL crammed with too many words can come across as spam to the visitor.
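Put together, a helper like the hypothetical page_url below builds a keyword-first URL and lets you choose which half of the hierarchy follows the keyword:

```python
def page_url(domain: str, keyword: str, category: str, sub_category: str,
             closer_to: str = "category") -> str:
    """Build a keyword-first URL, ordering the hierarchy by relevance."""
    if closer_to == "category":
        tail = f"{keyword}-{category}-{sub_category}"
    else:
        tail = f"{keyword}-{sub_category}-{category}"
    return f"{domain}/{tail}"

print(page_url("www.example.com", "cheap-quotes", "insurance", "home"))
# www.example.com/cheap-quotes-insurance-home
```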
6) Case Consistency
Although there is no single solution for all websites, avoid ALL CAPS in your URLs.
How many times have you clicked such a URL when searching for a topic? It has spam written all over it.
If you are going to use first-letter caps for all category and sub-category pages, stick with it; don't change the convention after 100 pages.
www.example.com/Category/Sub-category
As a general practice, lower-case URL keywords have been found effective, not for SEO but for improving click-through rates in the SERPs.
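If you settle on all-lowercase URLs, you can enforce the convention with a permanent redirect; here is a minimal Flask sketch (query strings are ignored for brevity):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def enforce_lowercase_path():
    # Redirect any mixed-case path to its lower-case equivalent so
    # search engines only ever index one variant of each URL.
    if request.path != request.path.lower():
        return redirect(request.path.lower(), code=301)
```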
If you want expert help in creating search-engine-friendly URLs, contact us or call us at: India (Ph): +91 9497189032, UK (Ph): +44 (0)20-3371-9976 or US (Ph): +1 650-491-0004.
ByteFive provides Internet marketing services, including Search Engine Optimization (SEO), Search Engine Marketing (SEM), development of Internet marketing processes and best practices, management and analysis of web analytics packages, landing page optimization, e-mail newsletter management, competitor research, and content creation and marketing, as well as training to help you achieve your revenue and business goals.