diff --git a/htroot/AccessTracker_p.html b/htroot/AccessTracker_p.html index 6010163ff..d97fc011b 100644 --- a/htroot/AccessTracker_p.html +++ b/htroot/AccessTracker_p.html @@ -6,17 +6,7 @@ #%env/templates/header.template%# - + #%env/templates/submenuAccessTracker.template%# #(page)#

Server Access Overview

This is a list of requests to the local HTTP server within the last hour.
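The XML export changed later in this patch adds per-host counters (#[countSecond]#, #[countMinute]#, #[count10Minutes]#, #[countHour]#) to this access list. As a rough illustration of how such sliding-window counts can be derived from recorded request timestamps, here is a hedged sketch; class and method names are made up for this example and are not YaCy's actual tracker API:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Illustrative sketch only: record one timestamp per request and derive
    // countSecond / countMinute / count10Minutes / countHour per host.
    public class AccessCounterSketch {
        private final Map<String, List<Long>> accessPerHost = new HashMap<String, List<Long>>();

        public synchronized void track(String host) {
            List<Long> times = accessPerHost.get(host);
            if (times == null) {
                times = new ArrayList<Long>();
                accessPerHost.put(host, times);
            }
            times.add(Long.valueOf(System.currentTimeMillis()));
        }

        // counts the accesses of one host that happened within the last windowMillis
        public synchronized int count(String host, long windowMillis) {
            List<Long> times = accessPerHost.get(host);
            if (times == null) return 0;
            long threshold = System.currentTimeMillis() - windowMillis;
            int n = 0;
            for (Long t : times) if (t.longValue() >= threshold) n++;
            return n;
        }
    }

In this reading, #[countSecond]# corresponds to count(host, 1000L) and #[countHour]# to count(host, 3600000L); timestamps older than the hour shown on this page could be pruned whenever they are counted.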

@@ -61,7 +51,7 @@ #{/list}# :: -

Local Searches

+

Local Search Log

This is a list of searches that have been requested from this peer's search interface.

Showing #[num]# entries from a total of #[total]# requests.
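The #[num]#, #[total]# and #{list}# markers above are placeholders that the page's servlet fills through a serverObjects instance, in the same way CrawlResults.java further down fills its table_* properties with prop.put(). The self-contained sketch below only illustrates that naming scheme with a plain map; the list field names (host, date, path) are borrowed from the access list in AccessTracker_p.xml, and the exact key conventions of the template engine are assumptions here:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Stand-in for YaCy's serverObjects: the template engine replaces #[key]#
    // with the value stored under "key" and repeats a #{list}# block once per
    // numbered prefix list_0_*, list_1_*, ... (assumed convention).
    public class SearchLogSketch {
        public static void main(String[] args) {
            Map<String, String> prop = new LinkedHashMap<String, String>();
            String[][] entries = {                          // sample data, not real log content
                {"localhost",   "2005-03-09 12:00:00", "/yacysearch.html"},
                {"192.168.0.5", "2005-03-09 12:01:30", "/index.html"}
            };
            prop.put("num", Integer.toString(entries.length));   // fills #[num]#
            prop.put("total", "42");                             // fills #[total]# (hypothetical total)
            prop.put("list", Integer.toString(entries.length));  // number of #{list}# repetitions
            for (int i = 0; i < entries.length; i++) {
                prop.put("list_" + i + "_host", entries[i][0]);
                prop.put("list_" + i + "_date", entries[i][1]);
                prop.put("list_" + i + "_path", entries[i][2]);
            }
            System.out.println(prop);
        }
    }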

@@ -89,7 +79,7 @@ #{/list}#
:: -

Local Search Tracker

+

Local Search Host Tracker

This is a list of searches that have been requested from this peer's search interface.

Showing #[num]# entries from a total of #[total]# requests.

@@ -109,7 +99,7 @@ #{/list}#
:: -

Remote Searches

+

Remote Search Log

This is a list of searches that have been requested from remote peers' search interfaces.

Showing #[num]# entries from a total of #[total]# requests.

@@ -137,7 +127,7 @@ #{/list}#
:: -

Remote Search Tracker

+

Remote Search Host Tracker

This is a list of searches that have been requested from remote peers' search interfaces.

Showing #[num]# entries from a total of #[total]# requests.

diff --git a/htroot/AccessTracker_p.xml b/htroot/AccessTracker_p.xml index 4f08c99f9..0558e58bc 100644 --- a/htroot/AccessTracker_p.xml +++ b/htroot/AccessTracker_p.xml @@ -1,14 +1,23 @@ -#(page)# - #{list}# +#(page)# + #{list}# + + #[host]# + #[countSecond]# + #[countMinute]# + #[count10Minutes]# + #[countHour]# + + #{/list}# + :: + #{list}# #[host]# #[date]# #[path]# - #{/list}# - :: + #{/list}# :: #{list}# @@ -24,7 +33,7 @@ #{/list}# :: - #{list}# + #{list}# #[host]# #[count]# @@ -33,7 +42,7 @@ #[date]# #{/dates}# - #{/list}# + #{/list}# :: #{list}# @@ -50,7 +59,7 @@ #{/list}# :: - #{list}# + #{list}# #[host]# #[count]# @@ -59,7 +68,7 @@ #[date]# #{/dates}# - #{/list}# + #{/list}# #(/page)# \ No newline at end of file diff --git a/htroot/BlacklistCleaner_p.html b/htroot/BlacklistCleaner_p.html index 8befc1272..d099fd925 100644 --- a/htroot/BlacklistCleaner_p.html +++ b/htroot/BlacklistCleaner_p.html @@ -1,7 +1,7 @@ - YaCy '#[clientname]#': Blacklist Manager + YaCy '#[clientname]#': Blacklist Cleaner #%env/templates/metas.template%# diff --git a/htroot/Blacklist_p.html b/htroot/Blacklist_p.html index 0d8c4af34..cd5a605ab 100644 --- a/htroot/Blacklist_p.html +++ b/htroot/Blacklist_p.html @@ -1,14 +1,14 @@ - YaCy '#[clientname]#': Blacklist Manager + YaCy '#[clientname]#': Blacklist Administration #%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuBlacklist.template%# -

Blacklist

+

Blacklist Administration

Used Blacklist engine: #[blacklistEngine]#

This function provides a URL filter for the proxy; any blacklisted URL is blocked diff --git a/htroot/CacheAdmin_p.html b/htroot/CacheAdmin_p.html index eafa7464d..c04c5f092 100644 --- a/htroot/CacheAdmin_p.html +++ b/htroot/CacheAdmin_p.html @@ -1,12 +1,12 @@ - YaCy '#[clientname]#': Local Cache Management + YaCy '#[clientname]#': Web Cache #%env/templates/metas.template%# #%env/templates/header.template%# -

Local Cache

+

Web Cache

The current cache size is #[cachesize]# KB. The maximum cache size is #[cachemax]# KB.

cache/#{paths}# - YaCy '#[clientname]#': Connection Tracking + YaCy '#[clientname]#': Server Connection Tracking #%env/templates/metas.template%# - #%env/templates/header.template%# -

Connection Tracking

+ #%env/templates/header.template%# + #%env/templates/submenuAccessTracker.template%# +

Server Connection Tracking

Incoming Connections

Showing #[numActiveRunning]# active, #[numActivePending]# pending connections from a max. of #[numMax]# allowed incoming connections.
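"Active" versus "pending" is the usual distinction between connections that are currently being served and connections that have been accepted but are still waiting for a worker. A generic illustration of how the three figures relate, using a plain thread pool rather than YaCy's own session handler:

    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    // Generic illustration only: numMax is an assumed configuration value,
    // not a setting read from YaCy.
    public class ConnectionFiguresSketch {
        public static void main(String[] args) {
            int numMax = 50; // maximum allowed incoming connections (assumed)
            ThreadPoolExecutor sessions = new ThreadPoolExecutor(
                numMax, numMax, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());
            int numActiveRunning = sessions.getActiveCount();   // connections currently served
            int numActivePending = sessions.getQueue().size();  // accepted, waiting for a thread
            System.out.println(numActiveRunning + " active, " + numActivePending
                + " pending connections from a max. of " + numMax);
            sessions.shutdown();
        }
    }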

diff --git a/htroot/IndexMonitor.html b/htroot/CrawlResults.html similarity index 84% rename from htroot/IndexMonitor.html rename to htroot/CrawlResults.html index 9dd0e8a75..ffb2d1ab8 100644 --- a/htroot/IndexMonitor.html +++ b/htroot/CrawlResults.html @@ -1,26 +1,26 @@ - YaCy '#[clientname]#': Index Monitor + YaCy '#[clientname]#': Crawl Results #%env/templates/metas.template%# - + #%env/templates/header.template%# #(process)# -

Indexing Queues Monitor Overview

+

Crawl Results Overview

These are monitoring pages for the different indexing queues.

YaCy knows 5 different ways to acquire web indexes. The details of these processes (1-5) are described within the submenus listed above, which will also show you a table with indexing results so far. The information in these tables is considered private, @@ -33,7 +33,7 @@ Some processes occur twice to document the complex index migration structure.
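For orientation, the six result tables described below can be summed up as follows; the enum constants are descriptive labels for this overview only, not identifiers used in CrawlResults.java:

    // Descriptive summary of the result tables (1)-(6) described on this page.
    public enum CrawlResultTable {
        REMOTE_CRAWL_RECEIPTS,   // (1) pages this peer asked others to crawl; receipts are reported back
        SEARCH_RESULT_TRANSFER,  // (2) index entries received because of this peer's own search queries
        DHT_INDEX_TRANSFER,      // (3) entries stored here because this peer is closest in the DHT
        PROXY_INDEXING,          // (4) pages indexed as a side effect of using the built-in proxy
        LOCAL_CRAWLING,          // (5) pages crawled by this peer's own crawl tasks
        GLOBAL_CRAWLING          // (6) pages crawled here on behalf of a remote peer (mirror of 1)
    }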

:: -

(1) Index Monitor of Remote Crawl Receipts

+

(1) Results of Remote Crawl Receipts

This is the list of web pages that this peer initiated to crawl, but that were crawled by other peers. This is the 'mirror'-case of process (6). @@ -42,18 +42,18 @@ 'Do Remote Indexing'-flag. Every page that a remote peer indexes upon this peer's request is reported back and can be monitored here.

:: -

(2) Index Monitor for Result of Search Queries

+

(2) Results of Search Queries

This index transfer was initiated by your peer by doing a search query. The index was crawled and contributed by other peers.

Use Case: This list fills up if you do a search query on the 'Search Page'

:: -

(3) Index Monitor for Index Transfer.

+

(3) Results for Index Transfer

The URL fetch was initiated and executed by other peers. These links have been transmitted to you because your peer is the most appropriate one for storage, according to the logic of the Global Distributed Hash Table (DHT).

Use Case: This list may fill if you check the 'Index Receive'-flag on the 'Index Control' page
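Which peer is "most appropriate for storage" is decided by a distance measure between a word hash and a peer hash in the DHT. The sketch below shows one schematic prefix distance over a base64-like alphabet; it is illustrative only and not YaCy's actual DHT arithmetic:

    // Schematic DHT distance: compare two hashes character by character;
    // the smaller the result, the better the peer fits the word hash.
    public class DhtDistanceSketch {
        private static final String ALPHA =
            "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_";

        // returns a value in [0,1); 0 means the peer hash equals the word hash
        public static double distance(String wordHash, String peerHash) {
            double d = 0.0;
            double scale = 1.0;
            int len = Math.min(wordHash.length(), peerHash.length());
            for (int i = 0; i < len; i++) {
                scale /= ALPHA.length();
                int diff = Math.abs(ALPHA.indexOf(wordHash.charAt(i)) - ALPHA.indexOf(peerHash.charAt(i)));
                d += diff * scale;
            }
            return d;
        }

        public static void main(String[] args) {
            System.out.println(distance("QwErTzUiOpAs", "QwErTzUiOpAt")); // very close peer
            System.out.println(distance("QwErTzUiOpAs", "zzzzzzzzzzzz")); // distant peer
        }
    }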

:: -

(4) Index Monitor for Proxy Indexing

+

(4) Results for Proxy Indexing

These web pages had been indexed as a result of your proxy usage. No personal or protected page is indexed; such pages are detected by cookie use or POST parameters (either in the URL or in the HTTP request) @@ -62,11 +62,11 @@ Set the proxy settings of your browser to the same port as given on the 'Settings'-page in the 'Proxy and Administration Port' field.
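A minimal sketch of the privacy check described above: a request that carries cookies, uses POST, or has parameters embedded in the URL marks the page as private and excludes it from proxy indexing. Class and method names are illustrative, not YaCy's implementation:

    import java.util.Map;

    // Illustrative guard corresponding to the description above.
    public class ProxyIndexingGuardSketch {
        public static boolean isPrivate(String method, String url, Map<String, String> requestHeaders) {
            if (!"GET".equals(method)) return true;                // POST (or other) requests carry form data
            if (requestHeaders.containsKey("Cookie")) return true; // cookie use indicates a session
            if (url.indexOf('?') >= 0) return true;                // parameters embedded in the URL
            return false;                                          // plain stateless GET: may be indexed
        }
    }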

:: -

(5) Index Monitor for Local Crawling.

+

(5) Results for Local Crawling

These web pages had been crawled by your own crawl task.

Use Case: start a crawl by setting a crawl start point on the 'Index Create' page.

:: -

(6) Index Monitor for Global Crawling

+

(6) Results for Global Crawling

These pages had been indexed by your peer, but the crawl was initiated by a remote peer. This is the 'mirror'-case of process (1).

Use Case: This list may fill if you check the 'Accept remote crawling requests'-flag on the 'Index Create' page

diff --git a/htroot/IndexMonitor.java b/htroot/CrawlResults.java similarity index 85% rename from htroot/IndexMonitor.java rename to htroot/CrawlResults.java index 31196d51b..44590e374 100644 --- a/htroot/IndexMonitor.java +++ b/htroot/CrawlResults.java @@ -1,11 +1,15 @@ -// IndexMonitor.java -// ----------------------- -// part of the AnomicHTTPD caching proxy -// (C) by Michael Peter Christen; mc@anomic.de -// first published on http://www.anomic.de -// Frankfurt, Germany, 2004, 2005 -// last change: 09.03.2005 +// CrawlResults.java +// (C) 2005 by Michael Peter Christen; mc@yacy.net, Frankfurt a. M., Germany +// first published 09.03.2005 on http://yacy.net // +// This is a part of YaCy, a peer-to-peer based web search engine +// +// $LastChangedDate: 2006-04-02 22:40:07 +0200 (So, 02 Apr 2006) $ +// $LastChangedRevision: 1986 $ +// $LastChangedBy: orbiter $ +// +// LICENSE +// // This program is free software; you can redistribute it and/or modify // it under the terms of the GNU General Public License as published by // the Free Software Foundation; either version 2 of the License, or @@ -19,29 +23,6 @@ // You should have received a copy of the GNU General Public License // along with this program; if not, write to the Free Software // Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA -// -// Using this software in any meaning (reading, learning, copying, compiling, -// running) means that you agree that the Author(s) is (are) not responsible -// for cost, loss of data or any harm that may be caused directly or indirectly -// by usage of this softare or this documentation. The usage of this software -// is on your own risk. The installation and usage (starting/running) of this -// software may allow other people or application to access your computer and -// any attached devices and is highly dependent on the configuration of the -// software which must be done by the user of the software; the author(s) is -// (are) also not responsible for proper configuration and usage of the -// software, even if provoked by documentation provided together with -// the software. -// -// Any changes to this file according to the GPL as documented in the file -// gpl.txt aside this file in the shipment you received can be done to the -// lines that follows this copyright notice here, but changes must not be -// done inside the copyright notive above. A re-distribution must contain -// the intact and unchanged copyright notice. -// Contributions and changes to the program code must be marked as such. - -// You must compile this file with -// javac -classpath .:../Classes Settings_p.java -// if the shell's current path is HTROOT import java.text.SimpleDateFormat; import java.util.Date; @@ -59,7 +40,7 @@ import de.anomic.tools.nxTools; import de.anomic.yacy.yacyCore; import de.anomic.yacy.yacySeed; -public class IndexMonitor { +public class CrawlResults { public static serverObjects respond(httpHeader header, serverObjects post, serverSwitch env) { // return variable that accumulates replacements @@ -156,7 +137,7 @@ public class IndexMonitor { if (showControl) { prop.put("table_showControl", 1); - prop.put("table_showControl_feedbackpage", "IndexMonitor.html"); + prop.put("table_showControl_feedbackpage", "CrawlResults.html"); prop.put("table_showControl_tabletype", tabletype); } else prop.put("table_showControl", 0); @@ -197,7 +178,7 @@ public class IndexMonitor { prop.put("table_indexed_" + cnt + "_dark", (dark) ? 
1 : 0); if (showControl) { prop.put("table_indexed_" + cnt + "_showControl", 1); - prop.put("table_indexed_" + cnt + "_showControl_feedbackpage", "IndexMonitor.html"); + prop.put("table_indexed_" + cnt + "_showControl_feedbackpage", "CrawlResults.html"); prop.put("table_indexed_" + cnt + "_showControl_tabletype", tabletype); prop.put("table_indexed_" + cnt + "_showControl_urlhash", urlHash); } else diff --git a/htroot/DetailedSearch.html b/htroot/DetailedSearch.html index 78683e853..25746082e 100644 --- a/htroot/DetailedSearch.html +++ b/htroot/DetailedSearch.html @@ -1,7 +1,7 @@ - YaCy '#[clientname]#': Search Page + YaCy '#[clientname]#': Detailed Search #%env/templates/metas.template%# diff --git a/htroot/IndexCreateIndexingQueue_p.html b/htroot/IndexCreateIndexingQueue_p.html index c7abbc6fe..2e467b9f1 100644 --- a/htroot/IndexCreateIndexingQueue_p.html +++ b/htroot/IndexCreateIndexingQueue_p.html @@ -1,13 +1,13 @@ - YaCy '#[clientname]#': Index Creation/Indexing Queue + YaCy '#[clientname]#': Indexing Queue #%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuCrawler.template%# -

Index Creation: Indexing Queue

+

Indexing Queue

#(indexing-queue)# diff --git a/htroot/IndexCreateLoaderQueue_p.html b/htroot/IndexCreateLoaderQueue_p.html index aa67850b3..cd15f5c06 100644 --- a/htroot/IndexCreateLoaderQueue_p.html +++ b/htroot/IndexCreateLoaderQueue_p.html @@ -1,13 +1,13 @@ - YaCy '#[clientname]#': Index Creation / Loader Queue + YaCy '#[clientname]#': Loader Queue #%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuCrawler.template%# -

Index Creation: Loader Queue

+

Loader Queue

#(loader-set)# diff --git a/htroot/IndexCreateWWWGlobalQueue_p.html b/htroot/IndexCreateWWWGlobalQueue_p.html index df7624dc1..d7234864b 100644 --- a/htroot/IndexCreateWWWGlobalQueue_p.html +++ b/htroot/IndexCreateWWWGlobalQueue_p.html @@ -1,13 +1,13 @@ - YaCy '#[clientname]#': Index Creation / WWW Global Crawl Queue + YaCy '#[clientname]#': Global Crawl Queue #%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuCrawler.template%# -

Index Creation: WWW Global Crawl Queue

+

Global Crawl Queue

This queue stores the URLs that shall be sent to other peers to perform a remote crawl. If no peer is available for remote crawling, the links are crawled locally. diff --git a/htroot/IndexCreateWWWLocalQueue_p.html b/htroot/IndexCreateWWWLocalQueue_p.html index 9acc906f5..ebefb3bb3 100644 --- a/htroot/IndexCreateWWWLocalQueue_p.html +++ b/htroot/IndexCreateWWWLocalQueue_p.html @@ -1,13 +1,13 @@ - YaCy '#[clientname]#': Index Creation / WWW Local Crawl Queue + YaCy '#[clientname]#': Local Crawl Queue #%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuCrawler.template%# -

Index Creation: WWW Local Crawl Queue

+

Local Crawl Queue

This queue stores the URLs that shall be crawled locally by this peer. It may also contain URLs that are computed by the proxy prefetch. diff --git a/htroot/IndexCreateWWWRemoteQueue_p.html b/htroot/IndexCreateWWWRemoteQueue_p.html index 94c1eaf31..cbe2a95c9 100644 --- a/htroot/IndexCreateWWWRemoteQueue_p.html +++ b/htroot/IndexCreateWWWRemoteQueue_p.html @@ -1,13 +1,13 @@ - YaCy '#[clientname]#': Index Creation / WWW Remote Crawl Queue + YaCy '#[clientname]#': Remote Crawl Queue #%env/templates/metas.template%# #%env/templates/header.template%# #%env/templates/submenuCrawler.template%# -

Index Creation: WWW Remote Crawl Queue

+

Remote Crawl Queue

This queue stores the URLs that other peers have sent to you so that this peer performs a remote crawl for them.
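Conceptually, each entry in this queue carries at least the initiating peer, the URL to fetch, and the requested crawl depth. The record below is purely illustrative; the field names are not taken from YaCy's code:

    // Illustrative shape of a remote-crawl request waiting in this queue.
    public class RemoteCrawlEntrySketch {
        public final String initiatorPeerHash; // hash of the peer that asked for the crawl
        public final String url;               // URL to be fetched and indexed for that peer
        public final int depth;                // remaining crawl depth requested by the initiator

        public RemoteCrawlEntrySketch(String initiatorPeerHash, String url, int depth) {
            this.initiatorPeerHash = initiatorPeerHash;
            this.url = url;
            this.depth = depth;
        }

        public static void main(String[] args) {
            RemoteCrawlEntrySketch e = new RemoteCrawlEntrySketch("AbCdEfGhIjKl", "http://example.org/", 2);
            System.out.println(e.initiatorPeerHash + " -> " + e.url + " (depth " + e.depth + ")");
        }
    }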

diff --git a/htroot/Network.html b/htroot/Network.html index d1f6df506..86e34b7a2 100644 --- a/htroot/Network.html +++ b/htroot/Network.html @@ -1,7 +1,7 @@ - YaCy '#[clientname]#': Network Overview + YaCy '#[clientname]#': YaCy Network #%env/templates/metas.template%#