Webmaster Forums - Website and SEO Help

Search Engines and Marketing => Website Crawling and Indexing => Topic started by: Jessicad on December 18, 2018, 08:33:41 AM

Title: what are the uses of robots.txt
Post by: Jessicad on December 18, 2018, 08:33:41 AM
what are the uses of robots.txt
Title: Re: what are the uses of robots.txt
Post by: Netparticle on December 18, 2018, 11:16:11 PM
The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine crawlers) which parts of their website they may crawl. Note that it controls crawling, not indexing directly.
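For example, a minimal robots.txt placed at the root of the site (e.g. https://www.example.com/robots.txt - the domain and the /admin/ path here are just illustrative) might look like this:

User-agent: *
Disallow: /admin/

This tells every crawler not to fetch anything under /admin/ while leaving the rest of the site crawlable.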
Title: Re: what are the uses of robots.txt
Post by: rachana on December 18, 2018, 11:46:19 PM
Robots.txt can be used for the following purposes:
1. It tells search engine robots which pages of the website should be crawled and which should not.
2. It keeps crawlers away from pages you don't want search engines to show in the SERPs (though note that a blocked page can still be indexed if other sites link to it; a noindex directive is needed to keep it out of results entirely).
3. It shows search engines where the XML sitemap is located so they can discover new pages quickly (see the example below).
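For example, a robots.txt that blocks a private directory and points crawlers at the sitemap (the /private/ path and example.com domain are illustrative) would look like this:

User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml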
Title: Re: what are the uses of robots.txt
Post by: sinelogixtech on December 18, 2018, 11:59:31 PM
Robots.txt is useful for preventing search engines from crawling parts of a website that the owner does not want displayed in search results.
Title: Re: what are the uses of robots.txt
Post by: sinelogixtech on December 19, 2018, 11:35:51 PM
Using this file, a website owner can block robots from crawling some or all links on the website. For example:

User-agent: *
Disallow: /

The code above tells all robots ("*") not to visit any part of the website ("/").
Title: Re: what are the uses of robots.txt
Post by: RH-Calvin on December 20, 2018, 12:50:05 AM
Robots.txt is a text file that contains instructions for search engine robots. It lists which webpages are allowed and which are disallowed from search engine crawling.
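For example, Disallow and Allow can be combined to block a section while permitting one page inside it. Allow is an extension to the original protocol, but it is honored by major engines such as Google and Bing (the paths here are illustrative):

User-agent: *
Disallow: /downloads/
Allow: /downloads/free-guide.pdf

Here everything under /downloads/ is blocked from crawling except the one allowed file.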
Title: Re: what are the uses of robots.txt
Post by: andrewtiwsan on January 23, 2019, 05:37:34 AM
Thanks for this interesting topic. Nice information.