
Use of robots.txt?

Started by bangalorewebguru, June 08, 2017, 04:17:02 AM

bangalorewebguru


RH-Calvin

Robots.txt is a text file that contains instructions for search engine robots. The file lists which webpages are allowed and disallowed for search engine crawling.

sirishasiri

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard
used by websites to communicate with web crawlers and other web robots. The standard specifies how to
inform the web robot about which areas of the website should not be processed or scanned.

compressjpg

Thanks for sharing useful information...

hotmailserviceuk

Hello Guys
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section that follows applies to all robots.
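To illustrate, a minimal robots.txt placed at the site root might look like this (the /admin/ and /tmp/ paths here are just hypothetical examples, not required names):

```
# This section applies to all robots
User-agent: *
# Keep crawlers out of these hypothetical directories
Disallow: /admin/
Disallow: /tmp/
# Everything else may be crawled
Allow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.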

salenaadam

Robots.txt is a text file you put on your site to tell search robots which pages you would like them not to visit. It is by no means mandatory for search engines, but generally they obey what they are asked not to do. The file lives in the site's root directory and specifies for search engine crawlers which pages and files you do or don't want them to visit.

praveenitech1

Robots.txt is a text file that contains instructions for search engine robots. The file lists which webpages are allowed and disallowed for search engine crawling.

vishnu priya

Robots.txt is a text file created to instruct search engine robots how to crawl pages on a website.

agustina

Website owners use the robots.txt file to provide information about their site to web robots.

grawhill

A robots.txt file gives instructions to web robots about the pages the website owner doesn't wish to be 'crawled'.  For instance, if you didn't want your images to be listed by Google and other search engines, you'd block them using your robots.txt file.
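For instance, a rule like the following would ask Google's image crawler to skip an images directory (the /images/ path is a hypothetical example; Googlebot-Image is Google's image-crawling user agent):

```
# Ask Google's image crawler not to index this directory
User-agent: Googlebot-Image
Disallow: /images/
```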

digitalaimschool

Quote from: bangalorewebguru on June 08, 2017, 04:17:02 AM
Robots.txt is a text file that lists site pages; using it, you can instruct search engine robots which webpages are allowed and disallowed for crawling.