There are two ways a search engine can find a website to add to its database. The first and most direct method is to accept submissions from website owners or optimizers: most search engines have a "submit your URL" page inviting such submissions. The second method is that search engines send a spider (crawler) to follow every link on and off each website it visits, which in turn leads it to other websites. If other sites link to yours, your site will eventually be found this way.
A search engine is a program that works on a specific algorithm, and each search engine has its own. Its spiders crawl websites by following links and save copies of the pages in the search engine's database; this is how a search engine discovers and revisits your website.
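The link-following step a spider performs can be sketched in a few lines. This is a minimal illustration, not any real search engine's code: it extracts the anchor links from one HTML page and resolves them against a base URL (the page content and URLs below are made-up examples).

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved; absolute links pass through.
                    self.links.append(urljoin(self.base_url, value))

# A made-up page with one relative and one absolute link.
page = """<html><body>
<a href="/about.html">About</a>
<a href="https://example.org/">Partner site</a>
</body></html>"""

parser = LinkExtractor("https://example.com/")
parser.feed(page)
print(parser.links)
```

A real spider would then fetch each of these discovered URLs in turn, repeating the process, which is how following links leads it from site to site.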
A sitemap.xml file is also important for this: it lists all the URLs of your website in one place, so spiders can find pages directly instead of relying only on links.
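For reference, a sitemap.xml is a plain XML file in the standard sitemap protocol format; the URLs below are placeholders, not from the original text:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/about.html</loc>
  </url>
</urlset>
```

The file is usually placed at the site root (e.g. example.com/sitemap.xml) so search engines can fetch it and learn every URL you want indexed.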
Once the engines find these pages, they next decipher the code from them and store selected pieces in massive hard drives, to be recalled later when needed for a search query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engines have constructed datacenters all over the world.