diff --git a/htroot/ConfigRobotsTxt_p.html b/htroot/ConfigRobotsTxt_p.html
index 91f2b5045..472e85d1a 100644
--- a/htroot/ConfigRobotsTxt_p.html
+++ b/htroot/ConfigRobotsTxt_p.html
@@ -17,7 +17,7 @@
 ::
 Unable to access the local file: #[msg]#
 ::Deletion of htroot/robots.txt failed
 #(/error)#
 #%env/templates/footer.template%#
diff --git a/htroot/CrawlURLFetchStack_p.html b/htroot/CrawlURLFetchStack_p.html
index 28c1f2cf0..152c1a683 100644
--- a/htroot/CrawlURLFetchStack_p.html
+++ b/htroot/CrawlURLFetchStack_p.html
@@ -12,16 +12,16 @@
 #(addedUrls)#::Added #[added]# URLs!#(/addedUrls)#
 #%env/templates/footer.template%#
diff --git a/htroot/CrawlURLFetch_p.html b/htroot/CrawlURLFetch_p.html
index 60f15f004..251d4ea9f 100644
--- a/htroot/CrawlURLFetch_p.html
+++ b/htroot/CrawlURLFetch_p.html
@@ -15,7 +15,7 @@
 The Re-Crawl option isn't used and the sites won't be stored in the Proxy Cache.
 Text and media types will be indexed. Since these URLs will be requested explicitely from another peer, they won't be distributed for remote indexing.
-|
+|
+|
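The templates above use YaCy-style servlet placeholders: `#[key]#` appears to be substituted with a value supplied by the servlet, and `#(key)#a::b#(/key)#` appears to select one of several `::`-separated alternatives by a numeric flag (as in the `#(error)#` / `#(addedUrls)#` blocks). A minimal sketch of that assumed expansion logic, not YaCy's actual `httpTemplate` implementation:

```python
import re

def render(template: str, values: dict) -> str:
    """Sketch of YaCy-style template expansion (assumed semantics).

    #[key]#              -> replaced by values[key]
    #(key)#a::b#(/key)#  -> keeps the values[key]-th alternative,
                            alternatives being separated by '::'
    """
    # Expand alternative blocks first, e.g. #(error)#...::...#(/error)#
    def pick(match: re.Match) -> str:
        key, body = match.group(1), match.group(2)
        alternatives = body.split("::")
        index = int(values.get(key, 0))
        return alternatives[index] if index < len(alternatives) else ""

    template = re.sub(r"#\((\w+)\)#(.*?)#\(/\1\)#", pick, template, flags=re.S)
    # Then expand simple value placeholders, e.g. #[msg]#
    return re.sub(r"#\[(\w+)\]#", lambda m: str(values.get(m.group(1), "")), template)

# Hypothetical usage mirroring the ConfigRobotsTxt_p.html error block:
tpl = "#(error)#ok::Unable to access the local file: #[msg]##(/error)#"
render(tpl, {"error": 1, "msg": "htroot/robots.txt"})
```

With `error` set to `1`, the second alternative is kept and `#[msg]#` is filled in, which matches how the error branches in the diff are structured.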