How to fix the WordPress robots_txt filter when it's not working

WordPress provides a filter hook, robots_txt, that lets you modify the robots.txt output WordPress generates dynamically when no physical robots.txt file exists. Among other things, this filter allows developers to serve a unique robots.txt for each site in a multisite network, since all the sites in the network share the same root directory (and would otherwise share the same robots.txt file).
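
As a minimal sketch of how the hook is used, the callback below appends a rule to the generated output. The function name, the site ID check, and the /staging/ rule are hypothetical examples, not recommendations:

```php
<?php
// A sketch of hooking robots_txt; the callback receives the generated
// output and whether the site is set to be visible to search engines.
add_filter( 'robots_txt', 'example_robots_txt_rules', 10, 2 );

function example_robots_txt_rules( $output, $public ) {
	// $public is false when "Discourage search engines" is checked.
	if ( ! $public ) {
		return $output;
	}

	// On multisite, rules can vary per site (hypothetical example).
	if ( is_multisite() && 2 === get_current_blog_id() ) {
		$output .= "Disallow: /staging/\n";
	}

	return $output;
}
```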

If this filter is not working for you, check whether a physical robots.txt file exists in the site's root directory, and delete it if it does. WordPress only generates robots.txt dynamically (and runs the filter) when no physical file exists; if the file is actually on disk, the web server serves it directly and the filter never runs.
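
As a quick diagnostic, a sketch like the following (run from a plugin or theme; ABSPATH is WordPress's root-directory constant, which may differ from the web server's document root on some installs) reports whether a physical file is shadowing the dynamic output:

```php
<?php
// If this logs a path, a physical robots.txt is shadowing the dynamic
// output, and the robots_txt filter will never run for this site.
if ( file_exists( ABSPATH . 'robots.txt' ) ) {
	error_log( 'Physical robots.txt found at ' . ABSPATH . 'robots.txt' );
}
```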

robots.txt is a file that communicates the Robots Exclusion Protocol to non-human website visitors, commonly called bots or crawlers. Bots read robots.txt for guidance on which parts of the site they should and should not crawl.
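
For reference, the dynamically generated file that the robots_txt filter operates on looks roughly like this by default (the exact output varies by WordPress version and settings; recent versions also append a Sitemap line):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```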