What is Disallow in a robots.txt file?

Started by PoolMaster, July 04, 2019, 03:30:44 AM


ANSH

Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The "User-agent: *" line means the section applies to all robots, and "Disallow: /" tells those robots not to visit any pages on the site.
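For example, a minimal robots.txt that blocks every robot from the whole site looks like this (example.com is just a placeholder; the file must sit at the site root, e.g. https://example.com/robots.txt):

    User-agent: *
    Disallow: /

Leaving the Disallow value empty does the opposite and allows everything:

    User-agent: *
    Disallow: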

HARSH_12

The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl and which pages not to crawl. A lone slash after "Disallow:" tells the robot not to visit any pages on the site, while a more specific path blocks only that part of the site.
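For instance, to let robots crawl most of the site but keep them out of specific areas, you list each path on its own Disallow line (the /private/ and /tmp/ paths here are made-up examples):

    User-agent: *
    Disallow: /private/
    Disallow: /tmp/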

jay_11

Disallow in robots.txt means the site owner does not allow bots to crawl certain pages of the website.
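As a sketch, rules can also target one robot by name while all other robots get a different rule; the BadBot name and /secret-page.html path below are hypothetical:

    # Block a single (hypothetical) crawler from everything
    User-agent: BadBot
    Disallow: /

    # All other robots may crawl everything except one page
    User-agent: *
    Disallow: /secret-page.html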