Tuesday 18 November 2014

Top 5 Ways to Fix Duplicate Content Issues on Web Pages Effectively

Duplicate content is one of the most common problems on the web. It occurs when the same piece of content is accessible to search engine bots and to users on more than one URL. As a result, search engines have to do a great deal of extra work to remove duplicate results from their index. Moreover, because of the growth of spam built on mass-duplicated content, Google launched the Panda update, which penalized sites containing duplicate or near-duplicate content. It has now become essential for webmasters to keep their sites safe from any kind of duplicate content penalty applied by Google. There are several ways webmasters can protect their sites from being penalized under Google's "duplicate content penalty".



1- Add Rel=canonical Tag 

The rel=canonical link element was introduced in 2009 to address the problem of similar web documents. It lets search engines identify a preferred version of a URL to be treated as the original. For example, suppose a site has 3 URLs, namely:

Example.com/camcorder (the original URL)

Example.com/gadgets/camcorder (duplicate URL 1)

Example.com/electronics?item="camcorder" (duplicate URL 2)

All 3 URLs above lead to the same piece of information about the camcorder. This can cause a genuine duplicate content issue for the site. We can add a rel=canonical tag to the 2 duplicate URLs as shown below:

<head>

<link rel="canonical" href="http://www.example.com/camcorder/" />

</head>

Adding the above rel=canonical tag to the duplicate URLs tells the search engine crawlers to attribute the content of those pages to the original URL, thus saving the site from being penalized over the duplicate content issue.



2- Assign a 301 Redirect 

A 301 redirect tells search engines that the page has permanently moved to a different location, thereby passing all the link equity and value to the main page. This should be the solution of choice when the duplicate page has backlinks and traffic coming to it.

The 301 redirect should be set up in the .htaccess file. A sample rule, which redirects an entire site to a new domain, is given below:

Redirect 301 / http://mt-example.com/
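To redirect only the duplicate camcorder URLs from section 1 to the original (rather than a whole site), per-path rules can be used instead. This is a sketch assuming an Apache server with mod_alias and mod_rewrite enabled:

```apache
# Path-based duplicate: a plain mod_alias Redirect handles it
Redirect 301 /gadgets/camcorder http://www.example.com/camcorder

# Query-string duplicate: Redirect ignores query strings, so use mod_rewrite
RewriteEngine On
RewriteCond %{QUERY_STRING} ^item=camcorder$
RewriteRule ^electronics$ http://www.example.com/camcorder? [R=301,L]
```

The trailing ? in the RewriteRule target drops the original query string, so visitors land on the clean canonical URL.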

3- Remove the Duplicate Pages 

Often the simplest and best solution is to remove the duplicate pages from your site altogether. This makes both your job and the search engine crawlers' job much easier. You can remove the pages and return 404s for them.
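Once the files are deleted, the server returns 404s for them automatically. If you prefer to signal that the removal is permanent, Apache's mod_alias can return a "410 Gone" instead. A sketch, with hypothetical paths:

```apache
# Return "410 Gone" for duplicate pages that were removed on purpose,
# telling crawlers these pages will not be coming back
Redirect gone /duplicate.html
Redirect gone /gadgets/camcorder
```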

4- Use robots.txt or Meta robots 

Another favored way of fixing the duplicate content issue is to use either robots.txt or the Meta robots tag.

Through robots.txt

Add the following code to block search engine crawlers from accessing the duplicate content. This ensures the duplicate content can still be seen by users but remains blocked for search engines.

User-agent: *

Disallow: /copy

Disallow: /duplicate.html

Disallow: /unique/duplicate.html

Change the lines according to the file names and locations of your duplicate URLs.

Through Meta robots tag

The Meta robots tag is a directive placed in a page's head section that tells search engines how to treat the content of that page, according to the directives specified in the tag.

A simple directive like noindex tells search engines not to index the content of the web page. An example is given below:

<head>

<meta name="robots" content="noindex, nofollow" />

</head>



5- Use Parameter Blocking 

For large ecommerce sites, parameter blocking can be an effective solution for blocking duplicate content. To set up parameter blocking, follow the steps below:

a- Log into Google Webmaster Tools.

b- Go to "URL Parameters" under the "Crawl" tab.

c- Click "Edit" and select "No" from the drop-down list. Choosing "No" tells Google that the selected URL parameter does not change the page content, so URLs differing only in that parameter are treated as duplicates.

A word of caution: be 100% sure when you are using URL parameters to block similar content, because a mistake can cause non-duplicate pages to get blocked by search engines.
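Before telling Google that a parameter is insignificant, it is worth verifying that URLs differing only in that parameter really serve the same content. A minimal sketch in Python (the parameter names in INSIGNIFICANT are hypothetical examples, not values from this article):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical parameters assumed not to change page content
INSIGNIFICANT = {"sessionid", "sort", "ref"}

def normalize(url):
    """Strip insignificant query parameters so duplicate URLs compare equal."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in INSIGNIFICANT]
    return urlunparse(parts._replace(query=urlencode(sorted(kept))))

# URLs that differ only in an insignificant parameter normalize identically,
# which makes that parameter a candidate for blocking
a = normalize("http://www.example.com/electronics?item=camcorder&sessionid=42")
b = normalize("http://www.example.com/electronics?item=camcorder")
print(a == b)  # True
```

URLs that normalize to different strings genuinely differ in content-affecting parameters and should not be blocked.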

For me, the preferred options are the rel=canonical tag and the Meta robots tag. Both options are less risky and solve the duplicate content issue effectively.
