A robots.txt file gives search engine bots signals about how to crawl your blog or website's pages, posts, images, etc. For example, you can remove blog images from search results using this feature, as well as improve your site's Search Engine Optimization (SEO) ranking. In this short tutorial I'm going to show you how to set up robots.txt to improve your blog's SEO.
In the old Blogger interface this feature was not available. On a self-hosted website we create a separate robots.txt file and put the rules in it. Let's see how to do this on Blogger:
Important - Do Everything Very Carefully!
How To Create A Custom Robots.txt On Blogger Blog
To set up robots.txt on your Blogger blog, follow the steps given below. It is a very easy task :)
1. Go to Blogger Dashboard >> Settings >> Search Preferences
2. Now click on Edit under Custom robots.txt
Now you can add crawl instructions for different crawlers. In this post I have added several useful robots.txt snippets.
Block Label and Search Pages Crawling.
The following code allows crawlers to access the entire blog but disallows crawling of label and search pages (on Blogger, both live under /search).
User-agent: *
Disallow: /search
Allow: /
Block Some Blog Page(s).
Sometimes you might need to hide a selected page or pages from search engines. Use the following code for that purpose.
User-agent: *
Disallow: /p/page-name1.html
Allow: /
For more than one page, add their URLs one by one in the Disallow section, like below.
User-agent: *
Disallow: /p/page-name1.html
Disallow: /p/page-name2.html
Allow: /
Block a Specific Crawler
If you want to block only one crawler, you can use the following code (replace <bot name> with the crawler's user-agent name).
User-agent: <bot name>
Disallow: /
Conversely, leaving the Disallow line empty allows that crawler to access everything. For example, to let Googlebot-News crawl the whole blog:
User-agent: Googlebot-News
Disallow:
Setup AdSense Crawler Instruction
You can improve your Google AdSense revenue performance by specifying how the AdSense bot crawls your website or blog. Use the following code:
User-agent: Mediapartners-Google
Disallow:
Block Images Indexing.
Many people don't like to see their blog posts' images in Google Image search results; you can remove them by blocking Google's image crawler.
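A common way to do this (this rule only affects Google's image crawler, not regular web search) is:
User-agent: Googlebot-Image
Disallow: /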
You can find more crawlers and their user-agent information here and here.
All In One Code For Your Blog
If you want good search engine visibility for your blog, with the whole blog crawled except label and search pages, use the following code as your robots.txt:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.blogsdaddy.com/feeds/posts/default?orderby=UPDATED
Note - Replace the sitemap URL above with your own blog's URL.
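Before saving, you can sanity-check these rules with Python's standard-library urllib.robotparser. This is just a quick sketch; the page paths used below are example values, not pages from your blog.

```python
from urllib.robotparser import RobotFileParser

# The "all in one" rules from above (the Sitemap line is omitted here,
# since it does not affect allow/disallow decisions).
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label and search pages are blocked for ordinary crawlers...
print(parser.can_fetch("*", "/search/label/SEO"))  # False
# ...but regular posts and pages are allowed.
print(parser.can_fetch("*", "/p/about.html"))  # True
# The AdSense bot (Mediapartners-Google) may crawl everything.
print(parser.can_fetch("Mediapartners-Google", "/search/label/SEO"))  # True
```

If any of these checks return a value you don't expect, fix the rules before pasting them into Blogger.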
Congratulations! You are all done. After a few days you should see better search performance from your blog.
Warning! Use with caution. Incorrect use of these features can result in your blog being ignored by search engines.
If you need any help with this tutorial, ask via the comments. See you in the next useful article. Stay Blessed and Happy Blogging :)
Author - Gagan Masoun is the owner of the Blogs Daddy Blog. Gagan lives in India, has been blogging since 2010 and writing the Blogs Daddy Blog since 2011. You can find him on the usual social networks.
Thank you for this wonderful post. I always have a problem understanding this, both on Blogger and in Webmaster Tools.
Gireesh, these are important things for bloggers, so please understand them completely :)
Hello Gagan, my question is: when I use the "All in One" code, should I use that sitemap URL (http://www.blogsdaddy.com/feeds/posts/default?orderby=UPDATED) or my own blog's URL? I'm confused.
Verne, you should use your own blog's name. See example - http://www.YOURBLOGNAME.com/feeds/posts/default?orderby=UPDATED
For further inquiries, leave your comment any time.