Google Webmaster Tools reporting robots.txt error

Status: Not open for further replies.

addozone

I received this error message from Google Webmaster Tools when testing a new sitemap.xml file. I'm using a free account. To the best of my knowledge, I'm not using a robots.txt file; I can't see one in my directory. Any ideas?

Network unreachable: robots.txt unreachable

We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.
 

addozone
Solved my own issue: I added a generic robots.txt file and all seems to be fine. It would seem that Google's error message was misleading.
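
For anyone else hitting this: I won't reproduce my exact file here, but a minimal "allow everything" robots.txt (roughly what I mean by generic) is just two lines:

    User-agent: *
    Disallow:

An empty Disallow value blocks nothing; the file exists purely so that crawlers requesting it get a 200 instead of an error.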

EDIT: Turns out there is still a problem! The Google Webmaster Tools robots.txt test tool works fine, but Google reports a 5xx error for robots.txt when googlebot actually tries to crawl the site.

Question: Does X10 block googlebot? Is there something I need to do to enable it? Is there another robots.txt file outside of my website root directory that affects the website?
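
One way to narrow this down (just a diagnostic sketch, not anything official from X10): request robots.txt once with a Googlebot user-agent string and once with a plain browser string, and compare the status codes. The URL below is a placeholder; substitute the real domain.

    import urllib.request
    import urllib.error

    URL = "http://example.com/robots.txt"  # placeholder; use your real domain

    USER_AGENTS = {
        "browser":   "Mozilla/5.0",
        "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    }

    for name, ua in USER_AGENTS.items():
        req = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                print(f"{name}: HTTP {resp.status}")
        except urllib.error.HTTPError as e:
            # A 5xx only for the googlebot UA would suggest the host
            # is filtering on user agent.
            print(f"{name}: HTTP {e.code}")
        except urllib.error.URLError as e:
            print(f"{name}: unreachable ({e.reason})")

If both requests return 200 but Google still sees a 5xx, the block may be on Google's crawler IP ranges rather than the user agent, which a test from your own machine can't reproduce.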

 