- moved all index generation servlets to their own main menu item, including proxy indexing
- removed the external index import because this operation is no longer recommended. Joining an index can simply be done by moving the index files from one peer to the other; they will be merged automatically
- fix to prevent endless loops when disconnecting http sessions
- fix to prevent application of bad blacklist entries that can cause a 'Dangling meta character' exception
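The 'Dangling meta character' message comes from java.util.regex when an entry such as "*.gif" is compiled as a pattern. A minimal sketch of the kind of validation meant here, with an invented class name and no connection to YaCy's actual blacklist code:

    import java.util.regex.Pattern;
    import java.util.regex.PatternSyntaxException;

    public class BlacklistEntryValidator {

        /**
         * Returns true if the path part of a blacklist entry compiles as a
         * regular expression. An entry like "*.gif" starts with a quantifier
         * and triggers "Dangling meta character '*'", so it should be rejected
         * (or rewritten to ".*\\.gif") before it is stored.
         */
        public static boolean isValid(final String pathPattern) {
            try {
                Pattern.compile(pathPattern);
                return true;
            } catch (final PatternSyntaxException e) {
                return false;
            }
        }

        public static void main(final String[] args) {
            System.out.println(isValid(".*\\.gif")); // true
            System.out.println(isValid("*.gif"));    // false: dangling meta character
        }
    }

Rejecting such entries when they are added keeps one malformed line from breaking every later blacklist check.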
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@6558 6c8d7289-2bf4-0310-a012-ef5d649a1542
- removed the web structure picture from the indexing menu and grouped it together with the htcache monitor
- added a database for terminated crawls; when a crawl is finished, it is automatically moved to the new database (a sketch of this hand-over follows below this list)
- extended the crawl profile edit servlet; it now also shows terminated crawls
- the option that was used to delete profiles has been redesigned into a function that moves the current crawl to the terminated crawls and removes all of its URLs from the current queues!
- fixed various problems with the indexing queues
- enhanced indexing speed by changing cache flush sizes
- changed the behaviour of the crawl result servlet: the list of crawled URLs is shown if there is one, otherwise the overview window is shown
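A minimal sketch of the active/terminated hand-over described above, using plain maps instead of YaCy's kelondro databases; all class, method and field names are illustrative:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    /** Illustrative only: keeps finished crawl profiles separate from active ones. */
    public class CrawlProfileStore {

        private final Map<String, String> activeProfiles = new ConcurrentHashMap<>();
        private final Map<String, String> terminatedProfiles = new ConcurrentHashMap<>();

        public void startCrawl(final String handle, final String profile) {
            activeProfiles.put(handle, profile);
        }

        /** Called when a crawl has no pending URLs left: move its profile over. */
        public void terminateCrawl(final String handle) {
            final String profile = activeProfiles.remove(handle);
            if (profile != null) {
                terminatedProfiles.put(handle, profile);
            }
            // the redesigned "delete" option does the same move and additionally
            // clears all URLs of this profile from the crawl queues
        }

        public Map<String, String> terminated() {
            return terminatedProfiles;
        }
    }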
attention: the new profile databases are not compatible with the old one. current crawls will be lost! the web index is not touched.
next steps: the database of terminated crawls can be used to start a new crawl with them. This is useful if one wants to re-crawl specific pages using an old crawl profile.
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@4113 6c8d7289-2bf4-0310-a012-ef5d649a1542
- caught a possible NPE in CacheAdmin_p and added more error cases
- sped up deletion of entries in the local crawl queue by crawl profile (it has often been noted that this deletion is slow); see the sketch below this list
- added a bit of javadoc
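The following is a generic illustration of removing all queued URLs of one crawl profile in a single synchronized sweep; the Entry type and its fields are invented for the example, and the actual YaCy queue code may work differently:

    import java.util.Iterator;
    import java.util.LinkedList;
    import java.util.List;

    public class LocalCrawlQueue {

        /** Invented for the example: a queued URL together with its profile handle. */
        public static final class Entry {
            final String url;
            final String profileHandle;
            Entry(final String url, final String profileHandle) {
                this.url = url;
                this.profileHandle = profileHandle;
            }
        }

        private final List<Entry> queue = new LinkedList<>();

        public synchronized void push(final Entry e) {
            queue.add(e);
        }

        /** Removes every entry of the given profile in one pass over the queue. */
        public synchronized int removeByProfile(final String profileHandle) {
            int removed = 0;
            for (final Iterator<Entry> i = queue.iterator(); i.hasNext();) {
                if (i.next().profileHandle.equals(profileHandle)) {
                    i.remove();
                    removed++;
                }
            }
            return removed;
        }
    }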
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3868 6c8d7289-2bf4-0310-a012-ef5d649a1542
If you don't use the default skin, the style will be broken or at least incomplete.
YaCy now has two CSS files: base.css in htroot/env and the skin. base.css defines the layout and the black/white text-formatting rules; colors are only defined in the skin.
The skin is now very easy to read and to change. If you want to change more than the colors you see in the default skin, feel free to use the full power of CSS, but be warned: the code is still not final and may change, although we try to avoid changes that affect anything in the default style.
Translations will be broken too, because the language files contain HTML code which has changed.
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@2547 6c8d7289-2bf4-0310-a012-ef5d649a1542
- updated de.lng with a translation for simple_search.html and updated the translation for IndexCreateWWWLocalQueue_p.html
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1174 6c8d7289-2bf4-0310-a012-ef5d649a1542
Crawler StartURLs will now also be added to the errorURL-DB if an error occurs on such a URL
*) kelondroStack.java, plasmaSwitchboardQueue.java
Added a method which returns a list of all entries in the queue. This list is used by IndexCreate_p.java
instead of an iterator to display the indexing list.
Advantages: concurrent modification of the list while it is displayed is avoided, and it is faster
because only one synchronized function has to be accessed instead of one per entry.
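A minimal sketch of such a snapshot method, with illustrative names rather than the actual kelondroStack/plasmaSwitchboardQueue code:

    import java.util.ArrayList;
    import java.util.LinkedList;
    import java.util.List;

    public class IndexingQueue<E> {

        private final LinkedList<E> entries = new LinkedList<>();

        public synchronized void push(final E entry) {
            entries.add(entry);
        }

        /**
         * Returns a copy of all queued entries. The servlet renders this copy,
         * so the queue is locked only once instead of once per entry, and
         * concurrent pushes/pops cannot break the iteration.
         */
        public synchronized List<E> list() {
            return new ArrayList<>(entries);
        }
    }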
*) IndexCreateIndexingQueue_p.java
Now uses the new list() function of plasmaSwitchboardQueue
*) httpdFileHandler.java
If a servlet returns the special value "LOCATION", the httpdFileHandler redirects the browser to
the URL specified by the servlet. This can e.g. be used when an HTTP GET request is used
instead of a POST request, but a refresh should not be allowed.
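How such a redirect can look, sketched with the JDK's built-in com.sun.net.httpserver classes rather than the actual httpdFileHandler code; the runServlet() helper and the /demo path are invented for the example:

    import com.sun.net.httpserver.HttpExchange;
    import com.sun.net.httpserver.HttpServer;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;
    import java.util.Map;

    public class LocationRedirectSketch {

        /** Stand-in for a servlet result; a real servlet would compute this. */
        static Map<String, String> runServlet() {
            return Map.of("LOCATION", "/IndexCreateIndexingQueue_p.html");
        }

        public static void main(final String[] args) throws IOException {
            final HttpServer server = HttpServer.create(new InetSocketAddress(8090), 0);
            server.createContext("/demo", (final HttpExchange ex) -> {
                final String location = runServlet().get("LOCATION");
                if (location != null) {
                    // special value present: redirect the browser instead of rendering
                    ex.getResponseHeaders().set("Location", location);
                    ex.sendResponseHeaders(302, -1);
                } else {
                    final byte[] body = "rendered page".getBytes(StandardCharsets.UTF_8);
                    ex.sendResponseHeaders(200, body.length);
                    try (OutputStream out = ex.getResponseBody()) {
                        out.write(body);
                    }
                }
                ex.close();
            });
            server.start();
            // GET http://localhost:8090/demo now answers 302 with the Location header
        }
    }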
*) IndexCreateWWWLocalQueue_p.html
Now it's possible to delete single entries of the local crawler queue
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@626 6c8d7289-2bf4-0310-a012-ef5d649a1542