For most sites it is a good sign when you see the Google crawler, Googlebot, visiting frequently. It usually means that when you publish new content, Google will find it and add it to the search index within a short period of time.
If Google doesn't frequent your web site, it may take much longer before your new page appears in Google search results. Generally, newer sites and those that change infrequently tend to take a bit longer than more popular and active sites.
Some sites are so active and popular that Googlebot practically lives there. Some have suggested that Google is responsible for nearly a third of all internet bandwidth consumption, though this most likely will not be apparent with new or less popular sites. For a while I was a little skeptical about this figure, but lately I am leaning towards believing it.
Google does frequent my site daily. Usually I can post something new and within 4 to 12 hours it will be indexed by Google.
Recently I noticed a huge jump in Googlebot activity. For the past week or so, Google has been crawling up to 17,000 pages per day and chewing up nearly 300 MB of bandwidth daily.
Looking at AWStats, I can see that Googlebot has made 89,531 hits using 1.56 GB of bandwidth. The only other crawler that is remotely close is Yahoo Slurp, with 9,152 hits and 100 MB. A huge difference; none of the others even come close.
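Out of curiosity, those AWStats totals can be turned into an average transfer per hit. This is just a back-of-the-envelope sketch using the figures quoted above; the helper function name is mine, not anything from AWStats:

```python
# Rough average payload per crawler hit, from the AWStats totals
# quoted in the post (hits and total bandwidth per crawler).

def avg_kb_per_hit(hits, bandwidth_mb):
    """Average kilobytes transferred per hit, given total MB used."""
    return bandwidth_mb * 1024 / hits

# Googlebot: 89,531 hits, 1.56 GB (expressed here in MB)
googlebot = avg_kb_per_hit(89_531, 1.56 * 1024)

# Yahoo Slurp: 9,152 hits, 100 MB
yahoo_slurp = avg_kb_per_hit(9_152, 100)

print(f"Googlebot:   {googlebot:.1f} KB/hit")
print(f"Yahoo Slurp: {yahoo_slurp:.1f} KB/hit")
```

Interestingly, both crawlers work out to a similar average page size; Googlebot's edge is purely in how many pages it fetches.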
Now, I don't have a problem with this extra activity from Googlebot, but I just hope it isn't putting too much extra strain on the server.
In Google Webmaster Tools there is an option to slow down or speed up the crawl rate. I usually let Google determine my crawl rate, but when I select the custom crawl rate option, it tells me Googlebot is currently making 0.2 requests per second, with 5 seconds between requests.
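Those numbers hang together nicely, and they also line up with the crawl volume I'm seeing. A quick sanity check (my own arithmetic, not anything from Webmaster Tools):

```python
# Sanity check on the crawl-rate figures: 0.2 requests per second is
# one request every 5 seconds, and sustained around the clock that
# lands very close to the ~17,000 pages per day from the stats.

rate = 0.2                      # requests per second (from Webmaster Tools)
interval = 1 / rate             # seconds between requests -> 5
pages_per_day = rate * 86_400   # 86,400 seconds in a day -> ~17,280

print(f"{interval:.0f} seconds between requests")
print(f"{pages_per_day:.0f} pages per day at this rate")
```

So the reported 0.2 requests per second, if sustained all day, accounts for almost exactly the 17,000 pages per day Googlebot has been pulling.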
I could slow this down a little, but notice that it states: "custom crawl rate will be valid for 90 days". This doesn't affect how often Googlebot actually visits, just how quickly it makes requests once it is there.
If I set it too slow, or not slow enough, the setting remains in effect for 90 days, so I really don't want to adjust it just yet. Being stuck for 90 days is a little too long should something not work out the way I intended.
For now, I will keep monitoring Googlebot's visits and bandwidth consumption over the next few weeks.
Bottom line: I can now see how Google could be responsible for nearly a third of all internet bandwidth, or close to it.