Make sure that you only download data from trustworthy sources. The new Skin file
might overwrite existing data if a file of the same name already exists.
With the "discover from index" button you can search the metadata of your local index (Web Structure Index) for systems that support the OpenSearch specification.
The task is started in the background. It may take a few minutes before new entries appear (after refreshing the page).
Alternatively, you may copy & paste an example config file located in defaults/heuristicopensearch.conf to the DATA/SETTINGS directory.
For the discover function, the web graph option of the web structure index must be enabled, and the fields target_rel_s, target_protocol_s and target_urlstub_s must be switched on in the webgraph Solr schema.
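As an illustration only — the exact schema file and its syntax depend on your Solr/YaCy installation — enabling the three webgraph fields named above in a classic Solr schema.xml would look roughly like this:

```xml
<!-- Hypothetical excerpt: enable the webgraph fields required by the discover function -->
<field name="target_rel_s"      type="string" indexed="true" stored="true"/>
<field name="target_protocol_s" type="string" indexed="true" stored="true"/>
<field name="target_urlstub_s"  type="string" indexed="true" stored="true"/>
```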
- #{osdsolrfieldswitch}##{/osdsolrfieldswitch}#
+ #{osdsolrfieldswitch}##{/osdsolrfieldswitch}#
Make sure that you only download data from trustworthy sources. The new language file
might overwrite existing data if a file of the same name already exists.
diff --git a/htroot/CrawlStartSite_p.html b/htroot/CrawlStartSite_p.html
index 375dc6beb..a9f47b7cc 100644
--- a/htroot/CrawlStartSite_p.html
+++ b/htroot/CrawlStartSite_p.html
@@ -90,7 +90,7 @@
diff --git a/htroot/HostBrowser.html b/htroot/HostBrowser.html
index 1a1366f61..41aee5290 100644
--- a/htroot/HostBrowser.html
+++ b/htroot/HostBrowser.html
@@ -80,10 +80,10 @@ function updatepage(str) {
@@ -35,7 +35,7 @@
No reference size limitation (this may cause high CPU load when searching for words that appear very often)
Limitation of number of references per word: (old references are deleted once this limit is reached)
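The per-word reference cap described above can be sketched as follows. `ReferenceCap` and its queue-based eviction are a hypothetical illustration of the behavior (oldest references are deleted once the limit is reached), not YaCy's actual implementation:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch: a per-word reference list with a hard size limit.
// When the limit is exceeded, the oldest references are deleted first.
public class ReferenceCap {
    private final int maxReferences;
    private final Deque<String> references = new ArrayDeque<>();

    public ReferenceCap(int maxReferences) {
        this.maxReferences = maxReferences;
    }

    public void add(String urlHash) {
        references.addLast(urlHash);
        while (references.size() > maxReferences) {
            references.removeFirst(); // delete the oldest reference
        }
    }

    public int size() { return references.size(); }

    public static void main(String[] args) {
        ReferenceCap cap = new ReferenceCap(3);
        for (int i = 0; i < 5; i++) cap.add("url-" + i);
        System.out.println(cap.size()); // 3 — the two oldest were evicted
    }
}
```

Without such a limit every occurrence is kept, which is what makes very frequent words expensive to search.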
@@ -109,7 +109,7 @@
1000
@@ -125,7 +125,7 @@
@@ -133,7 +133,7 @@
Transfer by Word-Hash:
to Peer:
select
@@ -255,8 +255,8 @@
diff --git a/htroot/IndexControlURLs_p.html b/htroot/IndexControlURLs_p.html
index d578c4296..66db4ba33 100644
--- a/htroot/IndexControlURLs_p.html
+++ b/htroot/IndexControlURLs_p.html
@@ -81,13 +81,13 @@ function updatepage(str) {
Retrieve by URL:
Retrieve by URL-Hash:
@@ -108,7 +108,7 @@ function updatepage(str) {
Stop Crawler and delete Crawl Queues
Delete robots.txt Cache
Delete cached snippet-fetching failures during search
@@ -121,7 +121,7 @@ function updatepage(str) {
One URL stub, a list of URL stubs, or a regular expression
Matching Method
sub-path of given URLs
matching with regular expression
#(urldelete-active)#::selected #[count]# documents for deletion::deleted #[count]# documents#(/urldelete-active)#
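The two matching methods named above — "sub-path of given URLs" and "matching with regular expression" — can be sketched as follows. `UrlMatch` and its helper names are hypothetical; this only illustrates the two selection modes:

```java
import java.util.regex.Pattern;

// Hypothetical sketch of the two URL deletion matching methods:
// a prefix match on a URL stub, and a full-string regular expression match.
public class UrlMatch {
    static boolean matchesSubPath(String url, String urlStub) {
        return url.startsWith(urlStub);
    }

    static boolean matchesRegex(String url, String expression) {
        return Pattern.matches(expression, url);
    }

    public static void main(String[] args) {
        System.out.println(matchesSubPath("http://example.org/docs/a.html", "http://example.org/docs/")); // true
        System.out.println(matchesRegex("http://example.org/docs/a.html", ".*\\.html")); // true
    }
}
```

Note that a regular expression must match the whole URL here, whereas a stub only needs to match its beginning.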
@@ -38,7 +38,7 @@
Time Period
All documents older than
Age Identification
- load date
- last-modified
+ load date
+ last-modified
#(timedelete-active)#::selected #[count]# documents for deletion::deleted #[count]# documents#(/timedelete-active)#
@@ -84,19 +84,19 @@
Delete all documents which are inside specific collections.
Not Assigned
Delete all documents which are not assigned to any collection
Delete all documents which are not assigned to any collection
Assigned
Delete all documents which are assigned to the following collection(s)
- #(collectiondelete-select)#, separated by ',' (comma) or '|' (vertical bar); or generate the collection list... ::
Delete all documents which are assigned to the following collection(s)
+ #(collectiondelete-select)#, separated by ',' (comma) or '|' (vertical bar); or generate the collection list... ::
- #{list}##{/list}#
+ #{list}##{/list}#
#(/collectiondelete-select)#
#(collectiondelete-active)#::selected #[count]# documents for deletion::deleted #[count]# documents#(/collectiondelete-active)#
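Parsing a collection list "separated by ',' (comma) or '|' (vertical bar)", as described above, can be sketched like this; `CollectionList.parse` is a hypothetical helper, not YaCy's actual parser:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: split a user-entered collection list on ',' or '|'
// and discard surrounding whitespace and empty entries.
public class CollectionList {
    static List<String> parse(String input) {
        return Arrays.stream(input.split("[,|]"))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .toList();
    }

    public static void main(String[] args) {
        System.out.println(parse("news, wiki|blog")); // [news, wiki, blog]
    }
}
```

Both separators are placed in one regex character class, so they can be mixed freely in a single input.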
@@ -110,10 +110,10 @@
q=
#(querydelete-active)#::selected #[count]# documents for deletion::deleted #[count]# documents#(/querydelete-active)#