WebHelpForums

Author Topic: Different ways to stop Googlebot from crawling  (Read 580 times)

lillianabe

  • Full Member
  • ***
  • Posts: 184
  • Karma: +0/-0
Different ways to stop Googlebot from crawling
« on: October 14, 2011, 12:21:02 AM »

Use meta robots tags (noindex, nofollow)
Disallow the pages in robots.txt
Add a CAPTCHA or a form that only real users can complete when the page loads

I know these three methods; if you know any other method, please post it here. (A sketch of the meta-tag approach follows below.)
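
A minimal sketch of the meta-tag method, assuming a page you want kept out of Google's index (the tag goes inside the page's <head>):

    <meta name="robots" content="noindex, nofollow">

The noindex value asks search engines not to index the page, and nofollow asks them not to follow its links. Googlebot has to be able to fetch the page to see this tag, so a page using it should not also be blocked in robots.txt.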

anandiphone

  • Full Member
  • ***
  • Posts: 203
  • Karma: +0/-0
Re: Different ways to stop Googlebot from crawling
« Reply #1 on: October 14, 2011, 06:13:01 AM »
Robots.txt is used to stop particular pages on a website from being crawled. To get Googlebot to stop crawling altogether, we first need to stop the SEO activities that attract it.
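
For example, a robots.txt entry that keeps Googlebot away from a couple of particular pages might look like the following (the paths here are only placeholders):

    User-agent: Googlebot
    Disallow: /checkout.html
    Disallow: /admin/

Each Disallow line names a path that Googlebot is asked not to crawl.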

zaboka

  • Newbie
  • *
  • Posts: 14
  • Karma: +0/-0
Re: Different ways to stop Googlebot from crawling
« Reply #2 on: December 27, 2011, 10:02:22 PM »
I also want to say that robots.txt is the best way for this purpose.

tesori

  • Jr. Member
  • **
  • Posts: 80
  • Karma: +0/-0
Re: Different ways to stop Googlebot from crawling
« Reply #3 on: January 04, 2012, 12:27:21 AM »
Hi.
You can create a CAPTCHA for real users, or you can go with robots.txt, which is the best option for this.

anandiphone

  • Full Member
  • ***
  • Posts: 203
  • Karma: +0/-0
Re: Different ways to stop Googlebot from crawling
« Reply #4 on: January 05, 2012, 05:39:25 AM »
We need to stop the SEO activities and use robots.txt to stop crawling.

sancon77

  • Newbie
  • *
  • Posts: 15
  • Karma: +0/-0
Re: Different ways to stop Googlebot from crawling
« Reply #5 on: February 25, 2012, 12:25:17 AM »
Hi.
As suggested above, you can create a CAPTCHA for real users or go with robots.txt, which is the best option for this. It is indeed the best suggestion.

seenathkumar

  • Newbie
  • *
  • Posts: 16
  • Karma: +0/-0
    • Business to Business Directory
Re: Different ways to stop Googlebot from crawling
« Reply #6 on: February 25, 2012, 04:45:20 AM »
Website owners use robots.txt to give instructions to crawlers about their site and about the pages the crawlers should ignore. The robots.txt file is used to stop a crawler from crawling pages that contain confidential content or that the site owner does not want shown. If you have sensitive data that you do not want appearing in search engines, robots.txt plays an important role in keeping crawlers away from those pages. Robots.txt is a plain text file you place on your website to tell crawlers which pages they should not visit.
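
As a sketch of that, a robots.txt file served from the site root (e.g. https://example.com/robots.txt, where the domain and path are placeholders) that keeps every crawler away from a confidential area could look like this:

    # applies to all crawlers
    User-agent: *
    # do not crawl anything under /confidential/
    Disallow: /confidential/

To stop Googlebot from crawling the whole site, the same file would instead use User-agent: Googlebot followed by Disallow: /.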





 
