Webmaster Forums - Website and SEO Help

Search Engines and Marketing => Website Crawling and Indexing => Topic started by: PoolMaster on July 04, 2019, 03:30:44 AM

Title: What is disallow in robots.txt file?
Post by: PoolMaster on July 04, 2019, 03:30:44 AM
What is disallow in robots.txt file?
Title: Re: What is disallow in robots.txt file?
Post by: ANSH on July 04, 2019, 06:24:38 AM
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells those robots not to visit any page on the site.
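To illustrate the directives described above, here is a minimal robots.txt sketch (the /private/ path is just a made-up example): the first block keeps all bots out of one directory, while the second block blocks one specific crawler from the whole site.

```
# Applies to every crawler: stay out of /private/
User-agent: *
Disallow: /private/

# Applies only to a bot identifying itself as "BadBot": block everything
User-agent: BadBot
Disallow: /
```

Note that an empty "Disallow:" line (no path at all) is the opposite: it allows that user-agent to crawl everything.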
Title: Re: What is disallow in robots.txt file?
Post by: HARSH_12 on July 05, 2019, 07:17:53 AM
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl, and which pages not to crawl. A lone slash after "Disallow" (that is, "Disallow: /") tells the robot not to visit any page on the site.
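To see how a crawler interprets these rules, here is a small sketch using Python's standard urllib.robotparser module; the rules and URLs below are hypothetical, chosen just for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: all bots are barred from /private/
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved bot checks can_fetch() before requesting a URL
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

A polite crawler runs this check before every request; remember that robots.txt is only a convention, not an enforcement mechanism, so misbehaving bots can ignore it.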
Title: Re: What is disallow in robots.txt file?
Post by: jay_11 on July 06, 2019, 04:26:33 AM
Disallow in robots.txt means the site owner does not allow bots to crawl certain pages of the website.