We all know that before Googlebot crawls your site, it accesses your robots.txt file to determine whether your site is blocking Google from crawling any pages or URLs. But Google's webmaster guidelines suggest that if you want search engines to index everything on your site, you don't need a robots.txt file. Still, lots of webmasters use robots.txt even when they want search engines to index everything on their site. Google has never suggested creating a robots.txt file for every website, but webmasters create one anyway.
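For reference, this is roughly what a minimal robots.txt that allows everything looks like; the empty Disallow line means nothing is blocked, which is the same as having no file at all:

User-agent: *
Disallow: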
A website does need this file; it allows the search engine's agent to check the site.
Where did you read this? "A website does need this file; it allows the search engine's agent to check the site."
I don't use robots.txt while creating websites. And you?
It is better if you have a robots.txt file. Only then will the search engine crawler visit regularly.
robots.txt is very necessary for every website. Without it, Google will not be able to crawl our website.
I never used it, and I think it's not necessary for small websites.
If you have a new website, you can use a robots.txt file to tell search engines how their robots or crawlers should index your website. Say you have a website about dogs, and you have put some content about cats on it. The search engine assumes the site does not focus on one particular theme; this is called theme bleeding, and the search engine will lower your page rank. To prevent theme bleeding on your site, you can use the robots.txt file to stop search engines from indexing pages that don't match the theme of your site, as shown below.
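For example, assuming the off-topic cat pages live under a hypothetical /cat-articles/ directory, the entry to keep crawlers out of them would look something like this:

User-agent: *
Disallow: /cat-articles/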
I believe that every webmaster should include a robots.txt file on their website. This file is really very helpful for a website's indexing and crawling. Search engine robots always read this file first.
Robots.txt files are most helpful when a webmaster does not want some pages of a site to be crawled and indexed by search engine crawlers.
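As a rough sketch, assuming a hypothetical /drafts/ folder you want to keep out of Google, the entry would look like this (keep in mind robots.txt only blocks crawling; a page can still show up in the index if other sites link to it):

User-agent: Googlebot
Disallow: /drafts/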