Website and Webmaster Help Forums

Search Engines and Marketing => Website Crawling and Indexing => Topic started by: bangalorewebguru on June 08, 2017, 04:17:02 AM

Title: Use of robots.txt?
Post by: bangalorewebguru on June 08, 2017, 04:17:02 AM
What is the use of robots.txt?
Title: Re: Use of robots.txt?
Post by: RH-Calvin on June 12, 2017, 09:51:47 AM
Robots.txt is a plain text file, placed at the root of a website, that contains instructions for search engine robots. It lists which webpages or directories are allowed or disallowed for search engine crawling.
Title: Re: Use of robots.txt?
Post by: sirishasiri on December 06, 2017, 06:56:37 AM
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard
used by websites to communicate with web crawlers and other web robots. The standard specifies how to
inform the web robot about which areas of the website should not be processed or scanned.
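To see the standard in action, here is a minimal sketch using Python's standard-library robots.txt parser. The rules and the example.com URLs are hypothetical, chosen only to illustrate how a well-behaved crawler checks the file before fetching a page:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: all robots are barred from /private/
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The parser answers whether a given user agent may fetch a given URL.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Note that robots.txt is advisory: polite crawlers consult it voluntarily, but nothing technically prevents a robot from ignoring it.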
Title: Re: Use of robots.txt?
Post by: compressjpg on December 14, 2017, 09:44:29 AM
Thanks for sharing useful information...
Title: Re: Use of robots.txt?
Post by: hotmailserviceuk on December 20, 2017, 02:03:03 AM
Hello Guys
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means that the section applies to all robots.
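Putting the pieces from this thread together, a minimal robots.txt might look like the following. The directory names and sitemap URL are hypothetical examples, not recommendations:

```
User-agent: *        # this section applies to all robots
Disallow: /admin/    # keep crawlers out of this directory
Allow: /admin/help/  # ...except this subdirectory
Sitemap: https://example.com/sitemap.xml
```

Each "User-agent" line starts a section; more specific sections (e.g. "User-agent: Googlebot") override the "*" section for that robot.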