theuchoosepodcast
Hi x10 Support,
I'm hosting a website on your platform at uchoosepodcast.x10.network and I'm experiencing an issue where Facebook's sharing crawler is receiving a 403 Forbidden response when attempting to scrape my pages.
I've already taken the following steps to resolve this on my end:
- Added a robots.txt file to my public_html root explicitly allowing facebookexternalhit
- Confirmed cPanel's Hotlink Protection is disabled
- Verified my files have correct 644 permissions
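For reference, the robots.txt in my public_html contains rules along these lines (a sketch of what I added; note that robots.txt is advisory only, so it cannot itself cause a 403):

```
# Explicitly allow Facebook's link-preview crawler
User-agent: facebookexternalhit
Allow: /

User-agent: *
Allow: /
```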
Despite these steps, Facebook's Sharing Debugger tool continues to return a 403, which suggests the bot is being blocked at the server or firewall level.
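The 403 can also be reproduced outside the Sharing Debugger by requesting a page with Facebook's User-Agent string. Here is a rough standard-library sketch of the check I ran (the `check_status` helper is just for illustration):

```python
import urllib.error
import urllib.request

# User-Agent string Facebook's crawler sends, per Facebook's documentation.
FB_UA = "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"

def check_status(url: str, user_agent: str) -> int:
    """Fetch `url` with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        # urllib raises on 4xx/5xx responses; the status is on the exception.
        return exc.code

# Example against my site: compare an ordinary browser UA with Facebook's UA.
# check_status("http://uchoosepodcast.x10.network/", "Mozilla/5.0")  # expect 200
# check_status("http://uchoosepodcast.x10.network/", FB_UA)          # currently 403
```

If the browser UA gets a 200 while the Facebook UA gets a 403 on the same URL, the block is user-agent based rather than anything in my files.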
Could you please whitelist the following user agent on my domain uchoosepodcast.x10.network?
User-agent: facebookexternalhit/1.1
This is Facebook's official crawler used to generate link preview cards when URLs are shared on Facebook. Without this whitelist, my content cannot generate preview cards when shared on the platform.
Thank you for your help!