Webmaster Forums - Website and SEO Help

Search Engines and Marketing => Website Crawling and Indexing => Topic started by: lillianabe on October 14, 2011, 02:21:02 AM

Title: Different ways to stop Googlebot from crawling
Post by: lillianabe on October 14, 2011, 02:21:02 AM

Use the robots meta tag (noindex, nofollow) in the page head
Specify the pages to block in robots.txt
Create a CAPTCHA or a form that real users must fill in when the page loads

I know these three methods; if you know any other method, please post it here.
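
For the first method, a minimal sketch (assuming a page you want kept out of the index and whose links you don't want followed) is to place this tag inside the page's <head>:

<meta name="robots" content="noindex, nofollow">

If you only want to address Googlebot rather than all crawlers, the same tag works with name="googlebot" in place of name="robots".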
Title: Re: Different ways to stop Googlebot from crawling
Post by: anandiphone on October 14, 2011, 08:13:01 AM
Robots.txt is used to keep particular pages on a website from being crawled; beyond that, we need to stop any SEO activity for those pages if we want Googlebot to stop crawling them.
Title: Re: Different ways to stop Googlebot from crawling
Post by: zaboka on December 27, 2011, 11:02:22 PM
I also want to say that robots.txt is the best way for this purpose.
Title: Re: Different ways to stop Googlebot from crawling
Post by: tesori on January 04, 2012, 01:27:21 AM
Hi.
You can create a CAPTCHA for real users, or you can go with robots.txt, which is the best option for this.
Title: Re: Different ways to stop Googlebot from crawling
Post by: anandiphone on January 05, 2012, 06:39:25 AM
We need to stop the SEO activities and use robots.txt to stop crawling.
Title: Re: Different ways to stop Googlebot from crawling
Post by: sancon77 on February 25, 2012, 01:25:17 AM
Quote from: tesori on January 04, 2012, 01:27:21 AM
Hi.
You can create a CAPTCHA for real users, or you can go with robots.txt, which is the best option for this.
That is the best suggestion indeed.
Title: Re: Different ways to stop Googlebot from crawling
Post by: seenathkumar on February 25, 2012, 05:45:20 AM
Website owners use robots.txt to give instructions to robots about their website and to tell them which pages to ignore. The robots.txt file is used to stop a crawler from crawling pages that contain confidential content or anything the owner doesn't want shown. If you have sensitive data that you don't want surfacing in a search engine, robots.txt can play an important role in keeping the crawler away from those pages. Robots.txt is a plain text file that you put at the root of your website to tell crawlers which pages they should not visit.
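
To make that concrete, here is a minimal sketch of such a robots.txt file, using a hypothetical /confidential/ directory and a hypothetical page /reports/statement.html as the content to keep crawlers away from:

User-agent: *
Disallow: /confidential/
Disallow: /reports/statement.html

Keep in mind that robots.txt is a request rather than access control: well-behaved crawlers such as Googlebot honour it, but it does not hide or protect the content itself.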
Title: Re: Different ways to stop Googlebot from crawling
Post by: Goyum on October 11, 2019, 04:26:16 AM
Quote from: lillianabe on October 14, 2011, 02:21:02 AM

Use the robots meta tag (noindex, nofollow) in the page head
Specify the pages to block in robots.txt
Create a CAPTCHA or a form that real users must fill in when the page loads

I know these three methods; if you know any other method, please post it here.


I also follow these three methods.
Title: Re: Different ways to stop Googlebot from crawling
Post by: sunilrajval on January 08, 2020, 06:16:23 AM
Robots.txt is the main way to stop the website from being crawled.