Website and Webmaster Help Forums

Search Engines and Marketing => Website Crawling and Indexing => Topic started by: bangalorewebguru on June 08, 2017, 04:17:02 AM

Title: Use of robot.txt ?
Post by: bangalorewebguru on June 08, 2017, 04:17:02 AM
Use of robot.txt ?
Title: Re: Use of robot.txt ?
Post by: RH-Calvin on June 12, 2017, 09:51:47 AM
Robots.txt is a text file containing instructions for search engine robots. It lists which webpages are allowed and which are disallowed for search engine crawling.
Title: Re: Use of robot.txt ?
Post by: sirishasiri on December 06, 2017, 06:56:37 AM
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard
used by websites to communicate with web crawlers and other web robots. The standard specifies how to
inform the web robot about which areas of the website should not be processed or scanned.
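To make this concrete, here is a small sketch (not from the original posts) of how a well-behaved crawler consults these rules, using Python's standard urllib.robotparser module. The rules and URLs are made up for illustration, and the rules are parsed from a string rather than fetched from a live site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks each URL before fetching it
print(parser.can_fetch("*", "https://example.com/index.html"))      # True
print(parser.can_fetch("*", "https://example.com/private/a.html"))  # False
```

Note that robots.txt is advisory: the crawler itself has to perform this check, nothing forces it to.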
Title: Re: Use of robot.txt ?
Post by: compressjpg on December 14, 2017, 09:44:29 AM
Thanks for sharing useful information...
Title: Re: Use of robot.txt ?
Post by: hotmailserviceuk on December 20, 2017, 02:03:03 AM
Hello Guys
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The "User-agent: *" line means the section applies to all robots.
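For example, a minimal robots.txt might look like this (the paths are hypothetical, chosen just for illustration):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

The "User-agent: *" line says the rules below apply to every robot, and each Disallow line names a path crawlers are asked not to visit.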
Title: Re: Use of robot.txt ?
Post by: salenaadam on February 09, 2018, 01:07:30 AM
Robots.txt is a text file located in the site's root directory that tells search engine crawlers and spiders which pages and files on your website you do or do not want them to visit. Robots.txt is by no means mandatory for search engines, but they generally obey what they are asked not to do.
Title: Re: Use of robot.txt ?
Post by: praveenitech1 on February 21, 2018, 02:51:14 AM
Robots.txt is a text file containing instructions for search engine robots: it lists which webpages are allowed and which are disallowed for crawling.
Title: Re: Use of robot.txt ?
Post by: vishnu priya on March 07, 2018, 07:26:29 AM
Robots.txt is a text file created to instruct search engine robots how to crawl the pages on a website.
Title: Re: Use of robot.txt ?
Post by: agustina on March 17, 2018, 02:25:43 AM
Website owners can use the robots.txt file to provide information about their site to web robots.
Title: Re: Use of robot.txt ?
Post by: grawhill on March 23, 2018, 06:05:07 AM
A robots.txt file gives instructions to web robots about the pages the website owner doesn’t wish to be ‘crawled’. For instance, if you didn’t want your images to be listed by Google and other search engines, you’d block them using your robots.txt file.
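As an example of blocking images, a site could disallow its image directory for Google's image crawler (the /images/ path is an assumption here, just for illustration):

```
User-agent: Googlebot-Image
Disallow: /images/
```

Googlebot-Image is the user agent Google uses for image crawling; other search engines use their own user-agent names.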
Title: Re: Use of robot.txt ?
Post by: digitalaimschool on March 27, 2018, 06:58:09 AM
Robots.txt is a text file that lists site pages; using the robots.txt file you can instruct search engine robots which webpages are allowed and which are disallowed for crawling.