Thursday, February 16, 2012

Google Crawlers

Not long ago, when my college website was affected by crawling issues, I came across the term "crawler."

"Crawler" is a generic term for any program (such as a robot or spider) used to automatically discover and scan websites by following links from one webpage to another. Google's main crawler is called Googlebot.

Googlebot is Google's web crawling bot (sometimes also called a "spider"). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses a huge set of computers to fetch (or "crawl") billions of pages on the web. Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

For more info, go to the link below:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1061943
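The link-following process described above can be sketched as a tiny breadth-first crawler. This is only a simplified illustration, not how Googlebot actually works: instead of fetching real pages over the network, it crawls a small made-up set of in-memory pages, so all the URLs and links below are hypothetical.

```python
from collections import deque
from html.parser import HTMLParser

# Hypothetical in-memory "web": page URL -> HTML content (made up for illustration).
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">A</a> <a href="http://example.com/b">B</a>',
    "http://example.com/a": '<a href="http://example.com/b">B</a>',
    "http://example.com/b": '<a href="http://example.com/">home</a>',
}

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed):
    """Breadth-first crawl: visit each reachable page exactly once,
    discovering new pages by following links from pages already seen."""
    seen = {seed}
    queue = deque([seed])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)
        parser = LinkExtractor()
        parser.feed(PAGES.get(url, ""))
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl("http://example.com/"))
# → ['http://example.com/', 'http://example.com/a', 'http://example.com/b']
```

A real crawler would also respect robots.txt, limit request rates, and decide how often to revisit pages, which is the "algorithmic process" Google describes.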
