WordPress provides a filter hook, robots_txt, that lets you modify the robots.txt output WordPress generates dynamically when the file does not exist. This filter allows developers to create a unique robots.txt for each site in a multisite network, since all the sites in the network share the same root directory (and therefore the same physical robots.txt file).
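Here is a rough sketch of what a per-site filter could look like. The blog ID, the disallowed path, and the sitemap line are placeholders you would swap for your own rules:

add_filter( 'robots_txt', function ( $output, $public ) {
    // $output already contains WordPress's default rules (the User-agent and
    // wp-admin lines); $public is the site's "Search engine visibility" option.
    if ( is_multisite() && 2 === get_current_blog_id() ) {
        // Hypothetical rule for one specific site on the network.
        $output .= "Disallow: /private-landing-pages/\n";
    }

    // Hypothetical sitemap line appended for every site.
    $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n";

    return $output;
}, 10, 2 );

On a multisite network this would normally live in a network-activated plugin or an mu-plugin so it runs for every site; because the callback sees each site's own URL and options, the output can differ per site.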
If this filter is not working for you, delete any robots.txt file that exists in the root directory. WordPress only serves the dynamically generated version when no physical file is present.
Robots.txt is a file that implements the Robots Exclusion Protocol for non-human website visitors, commonly called bots. Bots read robots.txt for guidance on how best to crawl the site.
Comments
I love the fact you're writing posts like this.
One quick note about the wording in your post: I believe robots.txt is dynamically generated for any WordPress install (multisite or not), and the phrasing could be interpreted as though this only applies to multisite. I look forward to further posts.
You're right; I guess I was conflating my procedure this week with the behavior of WP core. I've been updating the text file for all our single installs and using this filter for multisites.
WP sends all requests for files that do not exist to index.php, and then template-loader.php runs the do_robots action hook, so this behavior will occur on single installs as well as multisites.
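A simplified paraphrase of that flow (not verbatim core code):

// In wp-includes/template-loader.php (paraphrased): the request for the
// nonexistent robots.txt lands in index.php, and when the parsed query is
// for robots.txt, core fires the do_robots action and stops.
if ( is_robots() ) {
    do_action( 'do_robots' );
    exit;
}

// Core hooks its do_robots() function to that action; do_robots() builds the
// default output and passes it through apply_filters( 'robots_txt', $output, $public ),
// so callbacks added via add_filter( 'robots_txt', ... ) run on single installs too.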