What is Google crawling?
Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider).
Web crawling is the process of a search engine spider reading through your webpage's source code. After a successful crawl, the crawler stores a cached copy of the page.
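To make the fetch-and-discover loop concrete, here is a minimal sketch of a crawler in Python. This is an illustration only, not how Googlebot actually works: the seed URL, the page limit, and the single-threaded breadth-first strategy are all simplifying assumptions.

```python
import urllib.request
import urllib.robotparser
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Toy breadth-first crawl starting from seed_url.
    Real crawlers like Googlebot are distributed and far more careful."""
    root = "{0.scheme}://{0.netloc}".format(urlparse(seed_url))
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url(root + "/robots.txt")
    robots.read()

    frontier = [seed_url]   # URLs waiting to be fetched
    seen = {seed_url}       # URLs already discovered
    while frontier and len(seen) <= max_pages:
        url = frontier.pop(0)
        if not robots.can_fetch("*", url):
            continue        # respect robots.txt, as well-behaved crawlers do
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue        # skip unreachable pages
        print("fetched:", url)
        # Parse the page source for links to new pages to crawl next
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith(root) and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)

crawl("https://example.com")
```

The loop captures the core idea from above: fetch a page, read its source, discover new URLs, and add them to the queue of pages still to be crawled.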