Fix sitemap parsing in robots.txt: sitemap rules were ignored when they appeared after the allow/deny declarations for the crawler, because the parser terminated once the allow/deny rules had been found. The parser now reads robots.txt to the end so it also discovers sitemap rules at the end of the file.
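
A minimal sketch of the fixed behavior, assuming a simple line-based robots.txt parser (the function and variable names below are illustrative, not the project's actual code): instead of returning as soon as the allow/deny rules have been collected, the loop runs to the end of the file so trailing `Sitemap:` directives are also picked up.

```python
# Illustrative sketch only: read robots.txt to the end so Sitemap:
# directives placed after the Allow/Disallow block are not missed.
def parse_robots_txt(text: str):
    allows, denies, sitemaps = [], [], []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "allow":
            allows.append(value)
        elif field == "disallow":
            denies.append(value)
        elif field == "sitemap":
            # The buggy version returned after the allow/deny rules,
            # so sitemap entries at the end of the file were never seen.
            sitemaps.append(value)
    return allows, denies, sitemaps
```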