With Google's Panda updates, webmasters and businesses across the world have realized the importance of maintaining quality pages for higher SERP rankings. You might not intentionally create low-ranking pages, but once you start publishing regularly, low-quality pages will inevitably accumulate.
Steps to Identify Low-Quality Pages
1) List the Indexed pages
The first step is to find out which pages from your website Google has indexed. Start with the search operator
site:<websitename.com>
Go through each indexed page and find the ones that do not add value to readers – images, tag pages and thin pages.
Add a Disallow directive for each such page or file in robots.txt:
User-agent: *
Disallow: /*.gif$
Disallow: /*.swf$
Disallow: /tag1/
Disallow: /tag2/
Disallow: /thin-pages
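The Disallow patterns above use the `*` wildcard and the `$` end-anchor. One way to sanity-check such rules before deploying them is to translate each pattern into a regular expression and test sample URL paths against it – a minimal sketch (the sample paths are illustrative, not from a real site):

```python
import re

def rule_to_regex(rule: str) -> "re.Pattern":
    """Translate a robots.txt Disallow pattern into a regex.
    '*' matches any sequence; a trailing '$' anchors the end of the path."""
    anchored = rule.endswith("$")
    pattern = re.escape(rule.rstrip("$")).replace(r"\*", ".*")
    return re.compile("^" + pattern + ("$" if anchored else ""))

# The rules from the robots.txt example above.
RULES = ["/*.gif$", "/*.swf$", "/tag1/", "/tag2/", "/thin-pages"]

def is_disallowed(path: str) -> bool:
    """True if any Disallow rule matches the URL path."""
    return any(rule_to_regex(r).match(path) for r in RULES)

print(is_disallowed("/images/banner.gif"))   # blocked by /*.gif$
print(is_disallowed("/tag1/seo"))            # blocked by /tag1/
print(is_disallowed("/blog/quality-post"))   # not blocked
```

Note that Python's built-in urllib.robotparser does simple prefix matching and does not handle the `*`/`$` patterns, which is why this sketch builds its own regexes.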
In some cases, pages will remain indexed even after you add the directives to robots.txt, because robots.txt blocks crawling rather than indexing of URLs that have already been discovered. For such pages, add a page-level robots meta tag:
<html>
<head>
<title>Page Title</title>
<meta name="robots" content="noindex, nofollow">
</head>
</html>
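To verify the tag actually made it into each page template, you can scan your rendered HTML for a robots meta directive – a small sketch using Python's standard html.parser (the sample markup is illustrative):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = {k: (v or "") for k, v in attrs}
        if a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def is_noindexed(html: str) -> bool:
    """True if any robots meta tag on the page contains 'noindex'."""
    checker = RobotsMetaChecker()
    checker.feed(html)
    return any("noindex" in d for d in checker.directives)

page = '<html><head><META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"></head></html>'
print(is_noindexed(page))   # True
```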
2) Perform Page Audit
Create a custom analytics report with four columns – Page Title, Keyword, Time on Page and Bounce Rate. List the pages that have a high bounce rate and a low time on page. What purpose do those pages serve on your website?
Find out the keywords that brought the visitor to a page. For example, the keyword: “how to find low quality page” might bring visitors to this page. After we start getting visitors for this keyword, our team has to review the page on a weekly or monthly basis. We have to ask ourselves the following questions:
- Has ByteFive done justice to this topic?
- What does analytics say about user engagement (time on page and bounce rate)? If the time on page is above 3 minutes, we can reasonably say that the reader has found value in this page.
- Are the readers reaching the page using the right keyword? If they are not, then we have to optimize the title, sub-headings and build links to this page with the right keyword.
- Are we getting social shares? No? Spread the word! If the content has not wowed the audience then we have to keep updating this page until the audience feels the need to share the article with the Internet Marketing community.
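The engagement check in the audit boils down to a simple filter: flag pages whose bounce rate is high and whose time on page is low, then review them by hand. A minimal sketch (the thresholds and sample rows are illustrative assumptions, not real analytics data):

```python
# Each row mirrors the custom report: title, keyword, seconds on page, bounce rate.
report = [
    ("How to identify low quality pages", "find low quality page", 210, 0.35),
    ("Tag archive: seo",                  "seo tag",               15,  0.92),
    ("About us",                          "bytefive",              40,  0.80),
]

def flag_low_quality(rows, min_seconds=60, max_bounce=0.75):
    """Flag pages with a low time on page AND a high bounce rate for review."""
    return [title for title, _kw, seconds, bounce in rows
            if seconds < min_seconds and bounce > max_bounce]

print(flag_low_quality(report))   # ['Tag archive: seo', 'About us']
```

Pages the filter returns are candidates for review, not automatic removal – the questions above still apply.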
3) Page Audit Plan
Even after careful examination, low-quality pages will crop up on your website. Create a plan to review them and repeat steps 1) and 2).
- Create a schedule for page audit – Weekly, Monthly or Quarterly
- How is the quality of the pages analysed?
- Define the parameters for analysing a page – Clarity of thoughts, formatting, actionable tips and overall value
- Do a high bounce rate and a low time on page always mean that the quality of the page is low? In some cases, the query that brought the visitor to the page might be an informational query. For such visitors, the information and tips offered in the rest of the article might not be immediately relevant. For example, a visitor reaching this page looking for robots.txt syntax will just read section 1) and leave. Does that mean the visitor didn't get any value from this page? No! They got the answer in the first section. So before classifying a page as low quality, find the keywords that brought the visitor to the page.
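That caveat can be folded into the audit itself: before flagging a page, check whether the queries that brought visitors look like quick-answer informational queries. A rough heuristic sketch (the marker words and thresholds are illustrative assumptions):

```python
# Illustrative markers for quick-answer queries; tune this list for your niche.
INFORMATIONAL_MARKERS = ("syntax", "what is", "definition", "example of")

def looks_informational(keyword: str) -> bool:
    """Heuristic: quick-answer queries often contain one of the marker words."""
    kw = keyword.lower()
    return any(marker in kw for marker in INFORMATIONAL_MARKERS)

def should_flag(keyword: str, seconds_on_page: float, bounce_rate: float) -> bool:
    # A short visit on an informational query may still have delivered value,
    # so only flag short, high-bounce visits on non-informational queries.
    if looks_informational(keyword):
        return False
    return seconds_on_page < 60 and bounce_rate > 0.75

print(should_flag("robots.txt syntax", 20, 0.95))    # False: quick answer found
print(should_flag("seo strategy guide", 20, 0.95))   # True: likely low quality
```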
4) Dealing with [secure search]
With Google's SSL secure search, you might not be able to analyse each page by keyword. You will see results like the one below for every visitor who performs a search while logged into a Google product:
Title: How to Identify low quality pages in your website
Keyword: [secure search]
But not all visitors reach the page while logged into a Google product; find the keywords those visitors used. If you still can't find the keywords, do a reverse analysis: analyse the pages for keywords and entities using OpenCalais or AlchemyAPI.
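When the referring keyword is hidden behind [secure search], one crude fallback is to extract the page's own dominant terms and treat them as candidate keywords. A simple term-frequency sketch (the stop-word list and sample text are illustrative; services like OpenCalais or AlchemyAPI do proper entity extraction):

```python
import re
from collections import Counter

# Illustrative stop-word list; extend it for real use.
STOP_WORDS = {"the", "a", "to", "of", "and", "in", "for", "is", "on", "your", "each"}

def candidate_keywords(text: str, top_n: int = 3):
    """Return the most frequent non-stop-word terms on the page."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [word for word, _count in counts.most_common(top_n)]

page_text = ("How to identify low quality pages in your website. "
             "Low quality pages hurt ranking. Review each page for quality.")
print(candidate_keywords(page_text))
```

The top terms give you a rough idea of which queries the page could plausibly have matched, which you can then compare against your optimized titles and sub-headings.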
5) Social Media
The disadvantage with social media is that we have no control over how the visitor presents the article. The title and description they use might not represent the page the way we expected, but in most cases readers maintain the original structure of the information.
Read: Social media monitoring tools for your Business
6) Links
We face a similar problem with links. But with the recent Google Penguin update, the lack of control over anchor text can actually help us. The only thing you have to keep tabs on is whether competitors are building links for negative SEO. Bing and Google Webmaster Tools have both added back-link monitoring features in recent updates. Register your websites and keep a close eye on link growth.
Above: Links to Your Site in Google Webmaster Tools
7) AdWords
With AdWords, it becomes extremely easy to see the keywords, ad copy and titles that attracted visitors to a page. Let us say that after we published this page, we created an ad spreading the word about how to identify low quality pages.
We can easily monitor user engagement from the AdWords section in Analytics. By adding Page Title as a secondary dimension, we can map the keywords to each page that we are reviewing. This is a fast way to monitor user engagement for your recent articles.
Before you remove a low-quality page, ask one question – has the page informed or entertained the reader? If it has not, then the page does not deserve to be on your website.
We hope this article has provided value for you. Please share this post with your friends and team members. If you would like to see other information added to this article, please use the contact form. We would be happy to update it.