The old Windows-only method has not been removed, so Windows users without the updater are still able to restart without it
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3752 6c8d7289-2bf4-0310-a012-ef5d649a1542
- modified ok icon (was 64x62, now 64x64)
- replaced warning icon
- added size tags to warning image in configBasic
- replaced the bad icon for hints on the status page with new icons
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3677 6c8d7289-2bf4-0310-a012-ef5d649a1542
- cluster definitions can now additionally contain local IP addresses
- cluster-cluster communication uses the local IP address instead of the global address, if one is given
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3624 6c8d7289-2bf4-0310-a012-ef5d649a1542
the option is only presented if a more recent version is known to be
available somewhere
- removed some old code
- disabled the online-mode presentation. the code is not removed yet until a decision about
its future usability is made. (not all countries have DSL for everyone)
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3610 6c8d7289-2bf4-0310-a012-ef5d649a1542
automatically acquire release information from download archives
web pages from latest.yacy-forum.net and yacy.net are retrieved and parsed,
the links within are analysed and sorted, and the most recent developer and main
releases are provided as direct download links on the status page, if it was
discovered that a more recent version than the current one is available.
This process is done only once during the run-time of a peer, to protect our
download archives from DoS by YaCy peers.
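A minimal sketch of this discovery step, assuming release archives are named like yacy_v<version>_<date>_<build>.tar.gz; the class, method and single-shot flag below are illustrative, not the actual YaCy code:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.util.TreeMap;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Fetch a download listing once per run-time, extract release archive
    // links and pick the one with the highest build number.
    public class ReleaseDiscoverySketch {

        // assumed naming scheme: yacy_v<version>_<date>_<svn-build>.tar.gz
        private static final Pattern RELEASE =
            Pattern.compile("yacy_v([0-9.]+)_[0-9]+_([0-9]+)\\.tar\\.gz");

        private static boolean alreadyChecked = false; // check only once per run-time

        public static String findLatestRelease(String listingUrl) throws Exception {
            if (alreadyChecked) return null;
            alreadyChecked = true;

            TreeMap<Integer, String> builds = new TreeMap<>(); // svn build -> file name
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(new URL(listingUrl).openStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    Matcher m = RELEASE.matcher(line);
                    while (m.find()) builds.put(Integer.valueOf(m.group(2)), m.group());
                }
            }
            return builds.isEmpty() ? null : builds.lastEntry().getValue();
        }
    }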
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3606 6c8d7289-2bf4-0310-a012-ef5d649a1542
- removed restart-button on non-Windows platforms (until start-script is adjusted)
- margin for <dt>s with more than one line
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3209 6c8d7289-2bf4-0310-a012-ef5d649a1542
- remote proxy configuration can now be "really" changed on the fly and takes effect immediately
- adding the possibility to disable remote proxy usage for yacy->yacy communication
- adding the possibility to disable remote proxy usage for SSL
- restructuring the proxy configuration so that it is now stored in a single place
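A rough sketch of what "stored in a single place" could look like, with volatile fields so that a reload takes effect immediately; the property names and the class are assumptions, not confirmed YaCy configuration keys:

    import java.util.Properties;

    // Keep all remote-proxy settings in one holder that can be reloaded on the fly.
    public class ProxySettingsSketch {

        public volatile boolean use;          // remote proxy enabled at all
        public volatile boolean useForYacy;   // use it for yacy->yacy communication
        public volatile boolean useForSSL;    // use it for SSL requests
        public volatile String  host;
        public volatile int     port;

        // called on startup and whenever the configuration page is submitted
        public void reload(Properties config) {
            this.use        = Boolean.parseBoolean(config.getProperty("remoteProxyUse", "false"));
            this.useForYacy = Boolean.parseBoolean(config.getProperty("remoteProxyUse4Yacy", "true"));
            this.useForSSL  = Boolean.parseBoolean(config.getProperty("remoteProxyUse4SSL", "true"));
            this.host       = config.getProperty("remoteProxyHost", "");
            this.port       = Integer.parseInt(config.getProperty("remoteProxyPort", "3128"));
        }

        public boolean useProxy(boolean isYacyRequest, boolean isSSL) {
            if (!use) return false;
            if (isYacyRequest && !useForYacy) return false;
            if (isSSL && !useForSSL) return false;
            return true;
        }
    }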
*) Adding the possibility to import a foreign word DB (or even several of them in parallel)
at runtime into the peer's DB
- this can be done by calling IndexImport_p.html
- ATTENTION: please note that at the moment this thread must be aborted via the GUI
before a normal server shutdown is done.
- TODO: integrating the IndexImport thread into the normal server shutdown
- TODO: adding the possibility to import crawl queues, etc. from foreign peers
- TODO: removing the old import function from yacy.java and calling the new routines instead
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@968 6c8d7289-2bf4-0310-a012-ef5d649a1542
- adding automatic refresh
- accepts the new parameter nameLookup which can be used to deactivate the
yacy-peer name lookup (because we have problems with this on large seed DBs)
*) ViewFile
New page that can be used to view
- original content
- plain text content
- parsed content
- parsed sentences
of a webpage specified by its URL hash
Mainly for debugging purposes at the moment
*) Robots.txt
Bugfix for if-modified-since usage
TODO: synchronization of downloads to avoid loading the same robots-file
multiple times in parallel by different threads
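One possible way to implement the synchronization mentioned in the TODO is a per-host lock, so that concurrent threads wait for an ongoing download instead of starting their own; the class name and the cache/download helpers below are placeholders:

    import java.util.concurrent.ConcurrentHashMap;

    // Serialize robots.txt downloads per host so that concurrent threads
    // do not fetch the same file twice.
    public class RobotsDownloadGuard {

        private final ConcurrentHashMap<String, Object> hostLocks = new ConcurrentHashMap<>();

        public byte[] fetchRobots(String host) throws Exception {
            Object lock = hostLocks.computeIfAbsent(host, h -> new Object());
            synchronized (lock) {
                // re-check the cache inside the lock: another thread may have
                // downloaded the file while we were waiting
                byte[] cached = lookupCache(host);
                if (cached != null) return cached;
                byte[] fresh = download(host);   // hypothetical download helper
                storeCache(host, fresh);         // hypothetical cache helper
                return fresh;
            }
        }

        private byte[] lookupCache(String host) { return null; }                    // placeholder
        private byte[] download(String host) throws Exception { return new byte[0]; } // placeholder
        private void storeCache(String host, byte[] data) { }                       // placeholder
    }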
*) Shutdown
Better abort handling of transferRWI and transferURL sessions on server shutdown
*) Status Page
Adding icon to start/stop crawling via status page
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@950 6c8d7289-2bf4-0310-a012-ef5d649a1542
various checks like the blacklist check or the robots.txt disallow check are now
done by a separate thread to unburden the indexer thread(s)
TODO: maybe we have to introduce a thread pool here if it turns out that this single
thread is a bottleneck because of the time-consuming robots.txt downloads
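A minimal sketch of the hand-off described above, assuming a blocking queue between the indexer and a single checker thread; the class and helper names are illustrative, not the actual YaCy classes:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // The indexer only enqueues entries; a dedicated checker thread performs
    // the (slow) blacklist / robots.txt disallow tests.
    public class PreIndexCheckerSketch {

        interface Entry { String url(); }

        private final BlockingQueue<Entry> queue = new LinkedBlockingQueue<>();

        // called from the indexer thread(s): cheap, non-blocking hand-off
        public void enqueue(Entry e) { queue.offer(e); }

        // single worker thread; could become a thread pool if the slow
        // robots.txt downloads make this thread the bottleneck
        public Thread startChecker() {
            Thread t = new Thread(() -> {
                try {
                    while (true) {
                        Entry e = queue.take();
                        if (isBlacklisted(e.url()) || isDisallowedByRobots(e.url())) continue;
                        passToIndexer(e); // hypothetical: hand the entry back for indexing
                    }
                } catch (InterruptedException ignored) { }
            }, "pre-index-checker");
            t.setDaemon(true);
            t.start();
            return t;
        }

        private boolean isBlacklisted(String url) { return false; }        // placeholder
        private boolean isDisallowedByRobots(String url) { return false; } // placeholder
        private void passToIndexer(Entry e) { }                            // placeholder
    }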
*) improved index transfer
The index selection and transmission is done in parallel now to improve index
transfer performance.
TODO: maybe we could improve performance by using multiple transmission threads in
parallel instead of only a single one.
*) gzip encoded post requests
it is now configurable whether a gzip-encoded POST request should be sent on
index transfer/distribution
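A small sketch of such a gzip-encoded POST using plain HttpURLConnection; only the compression and the Content-Encoding header are the point here, the target URL and flag are illustrative:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.zip.GZIPOutputStream;

    // Optionally gzip the POST body and mark it with Content-Encoding: gzip.
    public class GzipPostSketch {

        public static int post(String targetUrl, byte[] body, boolean gzipEnabled) throws IOException {
            byte[] payload = body;
            if (gzipEnabled) {
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
                    gz.write(body);
                }
                payload = buf.toByteArray();
            }

            HttpURLConnection con = (HttpURLConnection) new URL(targetUrl).openConnection();
            con.setRequestMethod("POST");
            con.setDoOutput(true);
            if (gzipEnabled) con.setRequestProperty("Content-Encoding", "gzip");
            con.setFixedLengthStreamingMode(payload.length);
            try (OutputStream out = con.getOutputStream()) {
                out.write(payload);
            }
            return con.getResponseCode();
        }
    }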
*) Storage Peer (very experimental and not optimized yet)
Now it's possible to send the result of the yacy indexer thread to a remote peer
instead of storing the indexed words locally.
This can be done by setting the property "storagePeerHash" in the yacy config file
- Please note that if the index transfer fails, the index is stored locally.
- TODO: currently this index transfer is done by the indexer thread.
To speed up the indexer,
a) this transmission should be done in parallel and
b) multiple chunks should be bundled and transferred together
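Sketch of the fallback behaviour described above; the configuration key "storagePeerHash" is taken from the description, while the transfer and storage helpers are hypothetical placeholders:

    import java.util.Properties;

    // Try to ship the freshly indexed words to the configured storage peer
    // and keep them in the local index if that fails.
    public class StoragePeerSketch {

        public void storeIndexChunk(Properties config, Object indexChunk) {
            String storagePeerHash = config.getProperty("storagePeerHash", "").trim();
            boolean stored = false;
            if (storagePeerHash.length() > 0) {
                stored = transferToPeer(storagePeerHash, indexChunk); // hypothetical RWI transfer
            }
            if (!stored) {
                storeLocally(indexChunk); // fallback: keep the words in the local index
            }
        }

        private boolean transferToPeer(String peerHash, Object chunk) { return false; } // placeholder
        private void storeLocally(Object chunk) { }                                     // placeholder
    }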
*) general performance improvements
- better memory cleanup after http request processing has finished
- replacing some string concatenations with StringBuffers
- replacing BufferedInputStreams with serverByteBuffer
- replacing Vectors with ArrayLists wherever possible
- replacing Hashtables with HashMaps wherever possible
This was done because calls to Vector or Hashtable methods
take three times longer than calls to ArrayList or HashMap methods.
TODO: we should take a look at the class serverObject, which is derived from HashMap.
Do we really need synchronization for this class?
TODO: replace ArrayLists with LinkedLists where random access to the list elements is not needed
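For illustration, the kind of replacement meant above, plus the synchronization question; this is generic Java, not the actual YaCy code:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Vector and Hashtable synchronize every call, ArrayList and HashMap do not.
    // Where a structure is only touched by one thread, the unsynchronized
    // variant is the cheaper choice.
    public class CollectionSwapExample {

        // before: private final Vector<String> urls = new Vector<String>();
        private final List<String> urls = new ArrayList<>();

        // before: private final Hashtable<String, String> props = new Hashtable<String, String>();
        private final Map<String, String> props = new HashMap<>();

        // if a structure really is shared between threads, synchronization can be
        // added back explicitly instead of paying for it on every single call:
        private final Map<String, String> sharedProps =
            Collections.synchronizedMap(new HashMap<>());
    }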
*) Robots Parser supports if-modified-since downloads now
If the downloaded robots.txt file is older than 7 days, the robots parser tries to
re-download the robots.txt with the If-Modified-Since header to avoid an unnecessary
download if the file has not changed. Additionally, the ETag header is used to detect changes.
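A sketch of such a conditional re-download using If-Modified-Since and an ETag, assuming a cached load time and ETag are available; the class and parameter names are illustrative:

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Ask the server whether the cached robots.txt changed at all before
    // re-parsing it.
    public class RobotsRefreshSketch {

        static final long MAX_AGE = 7L * 24 * 60 * 60 * 1000; // 7 days in ms

        public static boolean needsRefresh(String robotsUrl, long loadedTime, String cachedETag)
                throws IOException {
            if (System.currentTimeMillis() - loadedTime < MAX_AGE) return false; // still fresh

            HttpURLConnection con = (HttpURLConnection) new URL(robotsUrl).openConnection();
            con.setRequestMethod("GET");
            con.setIfModifiedSince(loadedTime);                      // If-Modified-Since header
            if (cachedETag != null) con.setRequestProperty("If-None-Match", cachedETag);

            int status = con.getResponseCode();
            con.disconnect();
            // 304 Not Modified: keep the cached copy, otherwise re-parse the new file
            return status != HttpURLConnection.HTTP_NOT_MODIFIED;
        }
    }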
*) Crawler: better handling of unsupported MIME types + file extensions
*) Bugfix: plasmaWordIndexEntity was not closed correctly in
- query.java
- plasmaswitchboard.java
*) function minimizeUrlDB added to yacy.java
this function tests the current urlHashDB for unused URLs
ATTENTION: please don't use this function at the moment because
it causes the wordIndexDB to flush all words into the
word directory!
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@853 6c8d7289-2bf4-0310-a012-ef5d649a1542