How To Set Up A Robots.txt File In Blogger
The robots.txt file tells search engines which pages to crawl and which pages to skip. A properly configured robots.txt file helps Google crawl your blog efficiently and can improve how your site ranks in Google search.
You can set it up in Blogger under Settings > Search preferences > Custom robots.txt.
Here is a sample robots.txt file:
____________________________________________________
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://********.blogspot.com/feeds/posts/default?orderby=UPDATED
____________________________________________________
Replace ******** with your blog's subdomain, then copy and paste the whole thing into the robots.txt box in Blogger. That's it.
Then go to Google Search Console (Webmaster Tools), open the Sitemaps section, and add a new sitemap file named atom.xml. Your blog can now be indexed by Google.
This is very important for good crawling.
And don't add an Allow rule for /search. Google does not want to crawl internal search result pages, and letting them get indexed can hurt your site's ranking in Google search. So leave /search disallowed.
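As a quick sanity check, you can verify these rules locally with Python's standard `urllib.robotparser` module before pasting them into Blogger. This is just a sketch: the `yourblog.blogspot.com` address below is a placeholder, and the rules are copied from the sample above.

```python
from urllib.robotparser import RobotFileParser

# Same rules as the sample above; "yourblog" is a placeholder subdomain.
ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

base = "http://yourblog.blogspot.com"

# Internal search pages are blocked for ordinary crawlers...
print(parser.can_fetch("Googlebot", base + "/search?q=test"))            # False
# ...but regular posts stay crawlable.
print(parser.can_fetch("Googlebot", base + "/2020/01/post.html"))        # True
# The AdSense crawler (Mediapartners-Google) may fetch everything.
print(parser.can_fetch("Mediapartners-Google", base + "/search?q=test")) # True
```

If the first call prints False and the other two print True, the file behaves as intended: posts are crawlable, internal search is not, and ad crawling still works.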