learning_brain
I'm not sure where to put this query.
I've been checking my logs because I easily get through 20 GB of bandwidth per month, which takes me up to my limit prematurely and gets my service cut off about 7 days before the end of the month (this is not on X10hosting).
In the stats, Googlebot alone ate 16.10 GB last month before the service was terminated 8 days before month end - yes, that's right, 16.10 GB, not MB!!! In more detail, it consumes about 700,000 KB a day!
It hit pages 251,496 times in June alone.
OUCH!!!
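If anyone wants to reproduce that kind of check from raw logs rather than the stats pages, here's a rough Python sketch of tallying Googlebot's daily transfer. It assumes Apache-style combined-format access logs; the access.log path and the regex are just placeholders to adjust for whatever your host actually writes.

# Rough tally of bytes served to Googlebot per day, parsed from an
# Apache-style combined-format access log. Path and regex are assumptions.
import re
from collections import defaultdict

LOG_PATH = "access.log"  # placeholder path

# combined log format:
# host ident user [date] "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

daily_bytes = defaultdict(int)

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue
        size = match.group("bytes")
        if size != "-":
            daily_bytes[match.group("day")] += int(size)

for day, total in sorted(daily_bytes.items()):
    print(f"{day}: {total / 1024 / 1024:.1f} MB")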
OK - it's a biggish site, currently with 342,000-odd pages and images, all dynamically listed in the sitemap indexes.
In Google Webmaster Tools, I've now set the crawl rate to 200 seconds between requests, but is there any other advice anyone can give (other than disallowing bots in the robots.txt file)?
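As a sanity check on whether that setting should be enough, here's a quick back-of-envelope calculation using the June figures above. Treating the crawl-rate setting as roughly one request every 200 seconds is my assumption about how the slider behaves.

# Back-of-envelope check using the June figures from the post.
hits_june = 251_496          # Googlebot hits in June
gb_june = 16.10              # GB transferred to Googlebot in June

avg_kb_per_hit = gb_june * 1024 * 1024 / hits_june   # roughly 67 KB per hit
capped_hits_per_day = 86_400 / 200                   # 432 requests per day
projected_gb_per_month = capped_hits_per_day * 30 * avg_kb_per_hit / (1024 * 1024)

print(f"average transfer per hit: {avg_kb_per_hit:.0f} KB")
print(f"capped requests per day:  {capped_hits_per_day:.0f}")
print(f"projected monthly usage:  {projected_gb_per_month:.2f} GB")

If that assumption holds, it works out at well under 1 GB a month from Googlebot.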
Thoughts would be appreciated. I get pretty hacked off when I lose service for days on end.
Thanks
Rich