I run a New Zealand-specific Drupal 7 site for an NPO, and it is currently exceeding the international-traffic quota on our server. Most of the excess is due to spambots hitting the registration page (I'm in the process of dealing with that), but I noticed that Googlebot is racking up 1.2 GB/month in crawling. (Most other bots are under 100 MB.)
I'm not sure how many pages we have, but the number is growing steadily. I'm also not sure how often we're being crawled (our PageRank is 4).
Is there any way to reduce the amount of traffic from Googlebot without harming our ranking too much? Would a sitemap help?
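
For reference, this is the direction I've been considering: a robots.txt that keeps crawlers off non-content paths so the crawl budget goes to real pages. The paths below assume Drupal 7's defaults and `example.org` stands in for our domain; my understanding is that Googlebot ignores `Crawl-delay` (its crawl rate is set in Google Webmaster Tools instead), so that line would only slow down other bots. Is this roughly the right approach?

    # Sketch only -- paths assume Drupal 7 defaults; adjust to match the install.
    User-agent: *
    # Keep legitimate crawlers away from the registration/login pages
    # the spambots are hammering (spambots themselves will ignore this)
    Disallow: /user/register/
    Disallow: /user/login/
    # Non-content paths that waste crawl traffic
    Disallow: /admin/
    Disallow: /search/
    # Honoured by some bots; Googlebot reportedly ignores it, so its crawl
    # rate would need to be lowered in Google Webmaster Tools instead
    Crawl-delay: 10

    # Point crawlers at an XML sitemap (e.g. generated by Drupal's
    # XML sitemap module) -- example.org is a placeholder for our domain
    Sitemap: http://example.org/sitemap.xml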