Saturday, December 29, 2012

URLs Restricted By ROBOTS.TXT On My Blogspot Site


First, you should know What Is a ROBOTS.TXT File and How To Use It. Before Googlebot crawls your site, it accesses your robots.txt file to determine whether your site is blocking Google from crawling any pages or URLs.



If you are getting errors or messages in your Google Webmaster Tools account like
   "URLs Restricted by robots.txt"

and the affected URLs look like the following:
   http://latesthacksandtricks.blogspot.com/search/



If yes, you do not need to panic; this is normal for blogs.

Blogger blocks the /search section of your site from bots to stop duplication, i.e. the same content appearing on many URLs, which could result in strange rankings and search results. These URLs are blocked automatically.
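For example, a single post on a hypothetical blog (the blog name, post slug, and label here are only placeholders) could be reachable at all of these addresses, which is exactly the duplication the block prevents:

   http://yourblog.blogspot.com/2012/12/some-post.html        (the post's own URL)
   http://yourblog.blogspot.com/search/label/SomeLabel        (a label listing page)
   http://yourblog.blogspot.com/search?updated-max=...        (an archive / older-posts listing)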


But these are new?

  • You never had this before?
  • You have an old blog and have never seen these sorts of errors?

There are two very likely reasons:
   1) You have recently added links to your site that point to your search pages.
   2) Someone else has just started linking to your search pages.


This is also not a serious matter. The problem will resolve itself after some time; Googlebot simply takes a while to process these changes.

By default, every blog that uses the Blogger platform will have a robots.txt as follows:
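Roughly, it is along these lines (the blog address in the Sitemap line is a placeholder for your own blog, and the exact file Blogger generates for you may differ slightly):

   User-agent: Mediapartners-Google
   Disallow:

   User-agent: *
   Disallow: /search
   Allow: /

   Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED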


But if, after all this, you still want every section of your blog to be indexed, including the search sub-directories, then you can edit the robots.txt as shown below (WHICH I DON'T SUGGEST).
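A sketch of such a modified robots.txt (again, the blog address is a placeholder) simply drops the Disallow rule for /search so that nothing is blocked:

   User-agent: Mediapartners-Google
   Disallow:

   User-agent: *
   Disallow:
   Allow: /

   Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED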

With a configuration like the one above, all of the articles as well as the label and search pages will be indexed.
WARNING -
Problems with duplicate content may arise.



