Use meta tags: noindex, nofollow
Specify the pages in robots.txt
Create a captcha or a form that real users must fill in on page load
I know these three methods; if you know any other method, please post it here.
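For anyone unsure what the first method looks like, the robots meta tag goes in the page's <head>. A minimal sketch:

```html
<!-- Asks compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

This only works per page, and only for crawlers that honor the tag.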
Robots.txt is used to stop particular pages on a website from being crawled; it tells Googlebot which URLs it should not visit.
I also want to say that robots.txt is the best way for this purpose.
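As a sketch, a robots.txt file placed in the site root might look like this (the paths here are just hypothetical examples):

```text
User-agent: *
Disallow: /private/
Disallow: /checkout/
```

Each Disallow line asks all crawlers (User-agent: *) to skip URLs under that path.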
Hi.
You can create a captcha for real users, or you can go with robots.txt, which is the best option for this purpose.
Either way, robots.txt is what tells the crawler to stop crawling.
Quote from: tesori on January 04, 2012, 01:27:21 AM
Hi.
You can create a captcha for real users, or you can go with robots.txt, which is the best option for this purpose.
That is the best suggestion indeed.
Website owners use a robots.txt file to give instructions to robots about their website and to tell them which pages to ignore. It is used to stop a crawler from crawling pages that contain confidential content, or pages the owner simply doesn't want shown. If you have sensitive data that you don't want appearing in search engines, robots.txt plays an important role in keeping crawlers away from those pages. In short, robots.txt is a plain text file you put on your website to tell crawlers which pages they should not visit. (Note that it is only a request: well-behaved crawlers honor it, but it is not access control.)
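You can check how a crawler would interpret a given set of robots.txt rules with Python's standard library. A small sketch, using hypothetical example rules:

```python
import urllib.robotparser

# Parse a hypothetical set of robots.txt rules directly,
# instead of fetching a real file from a site.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A compliant crawler must skip the disallowed path...
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
# ...but may fetch everything else.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

This is the same logic polite crawlers apply before requesting a URL.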
Quote from: lillianabe on October 14, 2011, 02:21:02 AM
Use meta tags: noindex, nofollow
Specify the pages in robots.txt
Create a captcha or a form that real users must fill in on page load
I know these three methods; if you know any other method, please post it here.
I also follow these three methods.
Robots.txt is the main way to stop crawlers from crawling a website.