Google Webmaster Tools is one of the most powerful free SEO tools you can use for your blog or website. With it you can submit a sitemap to Google, find URLs that return 404 errors, and much more, which makes it very helpful for webmasters and bloggers. I have also shared a tutorial about custom permalinks in Blogger.
Google makes every effort to ensure that websites are crawled efficiently by its crawlers, or
‘spiders’. This helps it detect duplicate content and keep it from gaining prominence by slipping through. Google Webmaster Tools now includes a feature called Parameter Handling; you can find it under the Site Configuration settings. A site owner can specify up to
fifteen parameters that Google will remember to ignore while the site is crawled and indexed. This ensures that your site does not lose prominence, since Google's spiders will not mistake URLs that differ only in those fifteen parameters for duplicate content.
What Are Permalinks?
For all the details on what a permalink is and where you can find this option in your post editor, please read the following tutorial:
Google registers the parameters it finds in the URLs on your site and marks each one to be either ignored or not ignored, which makes all the difference. As the site owner, you can contest and reject these markings, and you are also free to add new parameters or delete existing ones. This feature is genuinely useful because it improves data standardization, or what Google calls canonicalization, which in turn helps prevent duplication of data. Better canonicalization lets Google's spiders crawl a site more efficiently.
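The idea behind canonicalization by ignored parameters can be sketched in a few lines of Python. This is only an illustration, not Google's actual implementation; the parameter names in `IGNORED_PARAMS` are hypothetical examples of the kind of session and tracking parameters a site owner might list.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical ignore list: parameter names the site owner has told
# the crawler to disregard (e.g. session IDs and tracking tags).
IGNORED_PARAMS = {"sessionid", "utm_source", "sort"}

def canonicalize(url: str) -> str:
    """Drop ignored query parameters so duplicate URLs collapse to one canonical form."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonicalize("http://example.com/shoes?color=red&sessionid=abc123"))
# http://example.com/shoes?color=red
```

Two URLs that differ only in an ignored parameter now map to the same canonical string, so they can no longer be mistaken for two distinct pages with duplicate content.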
The spiders' primary goal is to ensure that content gets the page rank it deserves. If many links lead to different URLs for the same page, each of those URLs loses value in terms of prominence, in other words, page ranking. Though there are other solutions to this issue, Google's answer is URL parameters. These parameters are effective only where Google chooses to ignore them; you cannot force Google to remember any of them.
The greatest benefit of this option is that crawl efficiency is greatly enhanced. Whenever Google encounters a new URL, it simply compares the URL's parameters against its pre-compiled list and strips the irrelevant ones before crawling. This reduces the time taken to crawl all the pages of a site and at the same time prevents duplication, which is the ultimate goal. Its ease of use makes the option all the more desirable: it takes Google very little time to scan a list of parameters and remove the optional or unwanted ones, and it spares site owners from having to modify the source code of their web pages, which saves time.
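The crawl-efficiency gain described above comes from deduplicating the crawl queue before any page is fetched. The following sketch shows that idea under assumed conditions: the `IGNORED` set and the example URLs are hypothetical, and a real crawler would be far more involved.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED = {"sessionid", "ref"}  # assumed ignore list, for illustration only

def canonical(url: str) -> str:
    """Reduce a URL to its canonical form by dropping ignored parameters."""
    parts = urlsplit(url)
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED])
    return urlunsplit(parts._replace(query=query))

queue = [
    "http://example.com/page?id=1",
    "http://example.com/page?id=1&sessionid=xyz",  # same page, different session
    "http://example.com/page?id=2",
]

# Keep only one entry per canonical URL, so each page is fetched once.
seen = set()
to_crawl = []
for url in queue:
    c = canonical(url)
    if c not in seen:
        seen.add(c)
        to_crawl.append(c)

print(to_crawl)  # only two distinct pages remain to be fetched
```

Here three queued URLs collapse to two actual fetches, which is exactly the saving the feature promises: less time spent re-crawling the same page under different parameter spellings.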
One drawback with this option is that it works only with Google. On other search engines such as Yahoo and Bing, which together account for roughly 20 to 30% of the market, site owners' URLs are not optimized at all. Another risk is the loss site owners will incur if they instruct Google to ignore important parameters. This can only happen by accident, but if it does, the loss is theirs alone to bear. The only consolation is that Google has likely anticipated such mistakes and put appropriate safeguards in place.
Here is a detailed video from the Google team explaining what URL parameters are and how they affect the crawling and indexing of your site.
By Guest Author - This guest post is brought to you by timewarnercable.wedocable.com, a site that offers savings and current information on Time Warner internet.