Google Sitemaps Webmaster Tools has been around for some time, providing some insight into how Google treats your website. The most interesting part so far was a statistic showing which queries your site shows up for and which queries bring it the most clicks.

Over the last few days Google has added a great new stats function to Sitemaps: under Diagnostics > Tools > Crawl Rate you can see how often the Googlebot has crawled your site, how much traffic was generated per day, and how long it took on average to download a page. Especially the last figure is a good indicator of speed problems with your host or server (I have attached a copy of my current speed stats, which show some definite speed problems my webhost has had in the past).

Finally, the new page in Sitemaps tells me that Googlebot is currently limiting the number of accesses to my website to avoid overloading the server. If I'm sure that the host can handle more connections (and I am for most of my sites ;-)) I can allow Googlebot to access the site more often. Unfortunately I don't keep stats about how often search engine bots access my websites, so I won't find out whether this option really changes anything.
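If you do want to track bot visits yourself, a rough way is to grep your web server's access log for the Googlebot user agent and tally hits per day. Here is a minimal sketch in Python, assuming the common/combined log format; the sample log lines and the function name are made up for illustration:

```python
import re
from collections import Counter

# Made-up sample lines in combined log format; in practice you would
# read your real access.log instead.
SAMPLE_LOG = """\
66.249.66.1 - - [15/Aug/2006:04:12:01 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [15/Aug/2006:04:12:05 +0000] "GET /about HTTP/1.1" 200 3100 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.7 - - [15/Aug/2006:09:30:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/4.0"
66.249.66.1 - - [16/Aug/2006:02:01:44 +0000] "GET /news HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

def googlebot_hits_per_day(log_text):
    """Count lines mentioning Googlebot, grouped by the log date."""
    hits = Counter()
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        # Pull the dd/Mon/yyyy part out of the [timestamp] field.
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if m:
            hits[m.group(1)] += 1
    return hits

print(dict(googlebot_hits_per_day(SAMPLE_LOG)))
# → {'15/Aug/2006': 2, '16/Aug/2006': 1}
```

Run against a real log over a few weeks, this would make it easy to compare crawl frequency before and after changing the crawl rate setting.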

Attached Files: