Robots.txt Fetch Errors


Viewing 6 posts - 1 through 6 (of 6 total)

    Last week I began to get crawl errors on my website, and I have had a 100% crawl error rate for the past five days (9/11-9/15). This problem occurred within 24 hours of updating the BulletProof Security plugin (I uninstalled the plugin, rolled it back to a prior version, deleted all files, reverted to the original .htaccess, etc., but none of that fixed the problem).

    The website is hosted with GoDaddy on a shared plan. The hosting account has 3 domains, all of which are WordPress sites that are configured exactly the same and have exactly the same plugins (the only difference is content). I am using WordPress 3.6.1, the latest version, with the latest versions of all plugins.

    The “primary” website is http: // (GoDaddy forces you to choose a primary website for a shared hosting plan), and the other 2 websites on the same plan are http: // and http: //

    I don’t have any special robots.txt information added, and the robots.txt file is visible in a browser at http: // Additionally, in Settings > Reading in WordPress, “Discourage search engines from indexing this site” is NOT checked.

    The crawl errors only apply to the primary site, http: // The other 2 sites are still being crawled ok.

    Additionally, I have tried “Fetch as Google” in Webmaster Tools, and it continues to return “unreachable robots.txt” error messages for both the homepage and other pages on the site. According to the “Blocked URLs” tool, the last time the robots.txt was downloaded was 9/10, the day before the problem began. For the other 2 websites (which are crawling just fine), it was downloaded on 9/14.

    I have tried to troubleshoot by disabling and uninstalling plugins, but to no avail. I have also tried changing permissions on my .htaccess file, which did not work either. Finally, I created a manual robots.txt file and uploaded it via FTP (in order to override the WordPress virtual robots.txt file), but that also did not work; Fetch as Google still returns errors.
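    For reference, since nothing on the site should be blocked from crawlers, a minimal allow-all robots.txt is all that is needed here (an empty Disallow directive means nothing is disallowed):

    ```
    # Minimal allow-all robots.txt — permits all user agents to crawl everything
    User-agent: *
    Disallow:
    ```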

    The only other cause I have been able to find in my research is a firewall on the hosting/GoDaddy side, but they say their system is working fine. Any suggestions? If it is GoDaddy, can someone tell me exactly what I need to ask them to fix?

    My URL is: http: //

    AITpro Admin

    BPS does not interfere with the robots.txt file or do anything related to the robots.txt file.  I guess look at your caching plugin, sitemap plugin or SEO plugin?  You have the Wordfence plugin installed on your site so double check what Wordfence is blocking.


    Thanks for the quick responses. I understand that BPS doesn’t affect robots.txt directly, but it does affect .htaccess, which, from what I understand, also controls access to the robots.txt file?

    I’m confused as to why the crawl errors started at the exact same time as the BPS plugin update I installed last week.

    In regards to Wordfence, I have also deleted the log files and plugin files, and that made no impact on the crawl errors on the problem site (there was no Wordfence plugin update last week). Additionally, the exact same plugins exist on the other 2 websites I referenced, and there are no crawl errors on those sites.

    I don’t have a caching plugin that I’m aware of.

    Do you have any other suggestions as to the cause of this problem if it isn’t triggered by BPS? Is it possible that GoDaddy is caching something from the old BPS plugin that is causing the crawl errors?

    Thanks for your help.

    AITpro Admin

    You would have to add the .htaccess code to do that.  BPS does not come with any .htaccess code that does anything with the robots.txt file.
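    As an illustration (this is hypothetical example code, not anything BPS ships), .htaccess rules along these lines are the kind of thing that would make robots.txt unreachable, and are worth searching for in the root .htaccess file:

    ```apache
    # Hypothetical example only — NOT BPS code. Rules like these in the
    # root .htaccess would block all requests for robots.txt, producing
    # exactly the "unreachable robots.txt" error (Apache 2.2 syntax):
    <FilesMatch "^robots\.txt$">
        Order allow,deny
        Deny from all
    </FilesMatch>
    ```

    BPS does not write any rule like this; if no such rule exists in your .htaccess, the block is happening somewhere else (e.g., at the server or host firewall level).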

    You stated that you removed BPS and the problem was still occurring so I guess it was just a coincidence.  If you want to completely rule out that BPS has anything to do with this issue then do these BPS general troubleshooting steps:

    1. On the Security Modes page, click the Root Folder BulletProof Mode Deactivate button. See Custom Code Note if doing this step works.
    2. On the Security Modes page, click the wp-admin Folder BulletProof Mode Deactivate button. See Custom Code Note if doing this step works.
    3. If an issue/problem is related to Login Security turn Off Login Security on the Login Security & Monitoring page.
    4. If an issue/problem is related to ISL or ACE see this forum topic:

    To completely uninstall BulletProof Security use the Uninstall Options link under the BPS plugin on the WordPress Plugins page.

    I really can’t make a general guess on this one since there are too many unknown factors that could be involved, sorry.


    Thanks again for the suggestions – after I did a full uninstall/reinstall of BPS, Googlebot was again able to crawl the site with no problems within 24 hours. Perhaps it was just a coincidence that the issue started at the same time as the BPS upgrade. Thanks for a great plugin.

    AITpro Admin

    Yep, must have just been a coincidence.  We have been using BPS Pro for years and are now ranked in the top 24,000 sites in the world, so obviously BPS and BPS Pro do not have any negative impact on world ranking, SEO, page rank, Google, etc.  I also do professional SEO work for a limited number of clients, all of whom use BPS Pro; all of those sites are ranked well, have never had a problem with Google crawling or indexing, and all of them are $1,000,000+ per year sales revenue generating sites. (Alexa World Ranking)
