What is a Robots.txt file?

Started by MoulanaRafi, June 08, 2018, 01:32:17 AM



manisthajain

Search engines come to this file first and read the instructions given by the site owner about which pages should be crawled and which should not.

RH-Calvin

Robots.txt is a text file containing instructions for search engine robots. It lists which webpages are allowed and which are disallowed from search engine crawling.
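For illustration, a minimal robots.txt might look something like this (the paths and sitemap URL below are only placeholder examples, not taken from any particular site):

User-agent: *
Disallow: /admin/
Allow: /public/
Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" addresses all crawlers, "Disallow" blocks a path from crawling, and "Allow" explicitly permits one.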

Mercy Michal

The robots.txt file implements the robots exclusion protocol. It helps indicate which webpages are allowed and which are disallowed from search engine crawling.

Christiana Robert

Search engines read the robots.txt file's instructions first. In this file, the website owner gives all the instructions, such as which pages are allowed to be crawled by Google and which pages are disallowed.

kiruthi_18

The website owner creates the robots.txt file to give instructions to search engines. Inside the file, the site owner specifies which URLs are allowed to be crawled by search engines and which URLs are disallowed.
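As a rough sketch of how a well-behaved crawler checks these rules, Python's standard urllib.robotparser module can read a site's robots.txt and answer whether a given URL may be fetched (the example.com URLs below are placeholders):

from urllib import robotparser

# Point the parser at the site's robots.txt and download it.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Ask whether a generic crawler ("*") may fetch specific URLs.
print(rp.can_fetch("*", "https://www.example.com/public/page.html"))  # True if allowed
print(rp.can_fetch("*", "https://www.example.com/admin/secret.html"))  # False if disallowed

A crawler would run a check like this before requesting each URL, skipping any page the site owner has disallowed.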