cutting off comments at the line end
*) Adding a thread pool for the stackCrawl thread to speed up robots.txt downloads
and double-URL checks
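A minimal sketch of the idea using java.util.concurrent; the class and method
names below are illustrative only and do not appear in the YaCy sources:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class StackCrawlCheckPool {
        // hypothetical: a small fixed pool that runs robots.txt and
        // double-URL checks off the main stackCrawl thread
        private final ExecutorService pool = Executors.newFixedThreadPool(4);

        public void checkAsync(final String url) {
            pool.submit(new Runnable() {
                public void run() {
                    // placeholder calls; the real checks live elsewhere in YaCy
                    boolean allowed = robotsAllows(url);   // may download robots.txt
                    boolean known   = alreadyCrawled(url); // double-URL check
                    if (allowed && !known) enqueueForCrawling(url);
                }
            });
        }

        private boolean robotsAllows(String url)    { return true;  } // stub
        private boolean alreadyCrawled(String url)  { return false; } // stub
        private void enqueueForCrawling(String url) { /* stub */ }
    }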
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@882 6c8d7289-2bf4-0310-a012-ef5d649a1542
various checks like the blacklist check or the robots.txt disallow check are now
done by a separate thread to unburden the indexer thread(s)
TODO: maybe we have to introduce a thread pool here if it turns out that this single
thread is a bottleneck because of the time-consuming robots.txt downloads
*) improved index transfer
The index selection and transmission are now done in parallel to improve index
transfer performance (see the sketch below).
TODO: maybe we could speed up performance by using multiple transmission threads in
parallel instead of only a single one.
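A rough sketch of the parallel selection/transmission idea with a small bounded
queue; the names and types are illustrative and not the actual index transfer code:

    import java.util.List;
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class IndexTransferSketch {
        // hypothetical container for a selected batch of index entries
        static class Chunk { List<String> wordHashes; }

        private final BlockingQueue<Chunk> queue = new ArrayBlockingQueue<Chunk>(2);

        // selection thread: picks the next chunk while the previous one is transmitted
        Runnable selector = new Runnable() {
            public void run() {
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        queue.put(selectNextChunk());
                    }
                } catch (InterruptedException e) { /* shut down */ }
            }
        };

        // transmission thread: sends chunks to the remote peer
        Runnable transmitter = new Runnable() {
            public void run() {
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        transmit(queue.take());
                    }
                } catch (InterruptedException e) { /* shut down */ }
            }
        };

        Chunk selectNextChunk() { return new Chunk(); } // stub
        void transmit(Chunk c)  { /* stub: send chunk to the remote peer */ }
    }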
*) gzip encoded post requests
It is now configurable whether a gzip-encoded post request should be sent on
index transfer/distribution.
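For illustration, gzip-encoding a post body in plain Java looks roughly like this;
the real httpc implementation differs, and the request must also carry the header
Content-Encoding: gzip:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPOutputStream;

    public class GzipPostBody {
        // compress the post body before it is written to the request
        static byte[] gzip(byte[] body) throws IOException {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            GZIPOutputStream gz = new GZIPOutputStream(buf);
            gz.write(body);
            gz.finish();
            gz.close();
            return buf.toByteArray();
        }
    }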
*) storage peer (very experimental and not optimized yet)
Now it is possible to send the result of the yacy indexer thread to a remote peer
instead of storing the indexed words locally.
This can be done by setting the property "storagePeerHash" in the yacy config file
(see the example below).
- Please note that if the index transfer fails, the index is stored locally.
- TODO: currently this index transfer is done by the indexer thread.
To speed up the indexer
a) this transmission should be done in parallel and
b) multiple chunks should be bundled and transferred together
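For illustration, the setting could look like this in the config file; the exact
value format is an assumption and shown only as a placeholder:

    # hypothetical entry in the yacy config file
    storagePeerHash = <hash of the remote storage peer>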
*) general performance improvements
- better memory cleanup after http request processing has finished
- replacing some string concatenations with StringBuffers
- replacing BufferedInputStreams with serverByteBuffer
- replacing Vectors with ArrayLists wherever possible
- replacing Hashtables with HashMaps wherever possible
This was done because calls to Vector or Hashtable methods take about three times
longer than calls to ArrayList or HashMap methods, since the former are synchronized
(a small illustration follows below).
TODO: we should take a look at the class serverObject, which is inherited from HashMap.
Do we really need synchronization for this class?
TODO: replace ArrayLists with LinkedLists if random access to the list elements is not needed
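A small illustration of the replacements meant above (ArrayList and HashMap are the
unsynchronized counterparts of Vector and Hashtable, and StringBuffer avoids creating
a new String object on every concatenation):

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class ReplacementExamples {
        void before() {
            java.util.Vector v = new java.util.Vector();       // synchronized on every call
            java.util.Hashtable h = new java.util.Hashtable(); // synchronized on every call
            String s = "";
            for (int i = 0; i < 100; i++) s = s + i;            // new String on each pass
        }

        void after() {
            List l = new ArrayList();      // unsynchronized, cheaper method calls
            Map m = new HashMap();         // unsynchronized, cheaper method calls
            StringBuffer sb = new StringBuffer();
            for (int i = 0; i < 100; i++) sb.append(i);
            String s = sb.toString();
        }
    }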
*) Robots Parser supports if-modified-since downloads now
If the downloaded robots.txt file is older than 7 days, the robots parser tries to
download the robots.txt with an If-Modified-Since header to avoid an unnecessary
download if the file has not changed. Additionally, the ETag header is used to detect changes.
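The same idea expressed with plain java.net classes; the robots parser uses YaCy's
own httpc client, so this is only an illustrative sketch:

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RobotsConditionalFetch {
        // lastModified and eTag come from the previously stored robots.txt entry
        static HttpURLConnection openConditional(String robotsUrl, long lastModified, String eTag)
                throws IOException {
            HttpURLConnection con = (HttpURLConnection) new URL(robotsUrl).openConnection();
            if (lastModified > 0) con.setIfModifiedSince(lastModified);     // If-Modified-Since
            if (eTag != null)     con.setRequestProperty("If-None-Match", eTag);
            // a 304 response means the stored copy is still valid
            return con;
        }
    }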
*) Crawler: better handling of unsupported mime types and file extensions
*) Bugfix: plasmaWordIndexEntity was not closed correctly in
- query.java
- plasmaSwitchboard.java
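The fix presumably follows the usual close-in-finally pattern; a generic illustration
with stand-in types (not the actual plasmaWordIndexEntity code):

    import java.io.Closeable;
    import java.io.IOException;

    public class CloseInFinallySketch {
        // make sure the index entity is closed even when an exception
        // occurs while reading from it
        static void useAndClose(Closeable entity) throws IOException {
            try {
                // ... read index entries from the entity ...
            } finally {
                entity.close();
            }
        }
    }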
*) function minimizeUrlDB added to yacy.java
this function tests the current urlHashDB for unused urls
ATTENTION: please don't use this function at the moment because
it causes the wordIndexDB to flush all words into the
word directory!
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@853 6c8d7289-2bf4-0310-a012-ef5d649a1542
- Adding a function to manually force a peer ping to a remote yacy peer
See: Network.html?page=4
- for debugging purposes only!
*) serverAbstractThread.java:
- Adding the possibility to notify a server thread via a synchronization object
- this is needed e.g. by the port forwarding feature to send a notification
to the peerPing thread to redo the peer ping with the new ip/port settings (Settings_p.html)
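A minimal sketch of the wait/notify idea; the real serverAbstractThread implementation
differs in structure and naming:

    public abstract class NotifiableThreadSketch extends Thread {
        private final Object sync = new Object();

        // called from outside, e.g. after the port forwarding settings changed
        public void notifyThread() {
            synchronized (sync) { sync.notifyAll(); }
        }

        public void run() {
            while (!isInterrupted()) {
                doJob();                       // e.g. publish a peer ping
                synchronized (sync) {
                    try { sync.wait(60000); }  // sleep, but wake up early when notified
                    catch (InterruptedException e) { return; }
                }
            }
        }

        protected abstract void doJob();
    }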
*) Port Forwarding Feature (it should work now)
- adding a serverThread which is responsible for detecting broken port forwarding
connections and reconnecting if needed (a rough sketch follows after this list)
- serverCore.java: moving the port forwarding initialization into a separate function
- adding the possibility to configure the ssh port
- moving the configuration section on the gui into a separate fieldset
- hello.java: only trying a second connect to the clientIP address during the
peer handshake if either remote port forwarding is not enabled locally or
the clientIP is not equal to any local ip
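A rough sketch of such a monitoring thread; the PortForwarding interface below is
hypothetical and only stands in for the actual ssh forwarding code:

    public class PortForwardingMonitorSketch extends Thread {
        private final PortForwarding forwarding;

        public PortForwardingMonitorSketch(PortForwarding forwarding) {
            this.forwarding = forwarding;
        }

        public void run() {
            while (!isInterrupted()) {
                try {
                    if (!forwarding.isConnected()) {   // detect a broken tunnel
                        forwarding.reconnect();        // and re-establish it
                    }
                    Thread.sleep(30000);               // poll interval: an assumption
                } catch (InterruptedException e) {
                    return;
                } catch (Exception e) {
                    // keep monitoring even if a reconnect attempt fails
                }
            }
        }

        // hypothetical interface; the real code talks to the ssh library directly
        interface PortForwarding {
            boolean isConnected();
            void reconnect() throws Exception;
        }
    }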
*) httpdFileHandler.java:
- printing out a more verbose error message
*) httpc.java
- allowing content encoding to be deactivated from outside
*) plasmaCrawlWorker.java
- the crawler worker now tries to refetch the content of a website without
gzip content encoding if a gzip error occurred
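A sketch of the retry logic; the fetch() helper below is an assumption and not the
real plasmaCrawlWorker/httpc API:

    import java.io.IOException;

    public class RetryWithoutGzipSketch {
        // if decoding the gzip response fails, the same URL is fetched again
        // with content encoding disabled (see the httpc change above)
        byte[] load(String url) throws IOException {
            try {
                return fetch(url, true);    // first try: gzip content encoding allowed
            } catch (IOException gzipError) {
                return fetch(url, false);   // retry: no content encoding
            }
        }

        // hypothetical helper; the real crawler uses httpc directly
        byte[] fetch(String url, boolean allowContentEncoding) throws IOException {
            return new byte[0]; // stub
        }
    }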
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@368 6c8d7289-2bf4-0310-a012-ef5d649a1542
via a command-line email program (e.g. sendmail) to a configured email address
- the configuration dialog is reachable via Settings_p.html#messageForwarding
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@332 6c8d7289-2bf4-0310-a012-ef5d649a1542
See: http://www.yacy-forum.de/viewtopic.php?t=516
- removing NIO from server/serverCore.java because of massive problems
with socket close issues
*) Adding support for remote port forwarding via ssh
@Orbiter: Please take a look into
- hello.java
- server/serverCore.java.publicIP()
- yacy/yacyClient.java.publishMySeed(...)
*) Making startup loading of additional content parsers more failsafe
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@281 6c8d7289-2bf4-0310-a012-ef5d649a1542
*) serverLog.java logging functions now also accept exceptions as
additional parameters.
The stack trace of these exceptions will then be appended to the
logging message and can e.g. be viewed on the gui logging page.
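A minimal sketch of the idea using java.util.logging; the actual serverLog API may
differ:

    import java.io.PrintWriter;
    import java.io.StringWriter;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class ExceptionLoggingSketch {
        // append the stack trace of the exception to the log message,
        // so it shows up on the gui logging page as plain text
        static void logSevere(Logger log, String message, Throwable e) {
            StringWriter sw = new StringWriter();
            e.printStackTrace(new PrintWriter(sw));
            log.log(Level.SEVERE, message + "\n" + sw.toString());
        }
    }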
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@265 6c8d7289-2bf4-0310-a012-ef5d649a1542
optional content parsers, thread pool configuration ...
Please help me test whether everything works correctly.
*) Migration of yacy seedUpload functionality
See: http://www.yacy-forum.de/viewtopic.php?t=256
- new uploaders can now be easily introduced thanks to a new modular uploader system
(a rough interface sketch follows after this list)
- default uploaders are: none, file, ftp
- adding optional uploader for scp
- each uploader provides its own configuration file that will be
included into the settings page using the new template include feature
- Each uploader can define its libx dependencies. If not all needed libs are
available, the uploader is deactivated automatically.
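A hypothetical sketch of what such a pluggable uploader interface could look like;
the names and signatures are assumptions, not the actual YaCy interface:

    import java.io.File;

    // hypothetical shape of a pluggable seed uploader
    public interface SeedUploaderSketch {
        String getName();                    // e.g. "ftp", "scp", "file"
        String[] getLibxDependencies();      // jars from libx/ this uploader needs
        String getConfigurationTemplate();   // template included into the settings page
        void uploadSeedFile(File seedFile) throws Exception;
    }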
*) Migration of optional parsers
See: http://www.yacy-forum.de/viewtopic.php?t=198
- Parsers can now also define their libx dependencies
- adding parser for bzip compressed content
- adding parser for gzip compressed content
- adding parser for zip files
- adding parser for tar files
- adding a parser to detect the mime type of a file;
this is needed by the bzip/gzip Parser.java
- adding parser for rtf files
- removing the extra configuration file yacy.parser;
the list of enabled parsers is now stored in the main config file
*) Adding configuration options in the performance dialog to configure
See: http://www.yacy-forum.de/viewtopic.php?t=267
- maxActive / maxIdle / minIdle values for httpd-session-threadpool
- maxActive / maxIdle / minIdle values for crawler-threadpool
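If the pools are backed by Apache commons-pool's GenericObjectPool (an assumption,
suggested by the libx additions but not confirmed here), the three values map to its
standard setters:

    import org.apache.commons.pool.impl.GenericObjectPool;

    public class SessionPoolConfigSketch {
        static void configure(GenericObjectPool pool, int maxActive, int maxIdle, int minIdle) {
            pool.setMaxActive(maxActive); // hard upper bound on concurrently active objects
            pool.setMaxIdle(maxIdle);     // idle objects above this limit are destroyed
            pool.setMinIdle(minIdle);     // idle objects kept ready for new work
        }
    }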
*) Changing Crawling Filter behaviour
See: http://www.yacy-forum.de/viewtopic.php?p=2631
*) Replacing some hardcoded strings with the proper constants of the httpHeader class
*) Adding new libs to the libx directory. These libs are
- needed by new content parsers
- needed by new optional seed uploader
- needed by SOAP API (which will be committed later)
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@126 6c8d7289-2bf4-0310-a012-ef5d649a1542
- each additional parser must be in a subpackage
of plasma.parser
- each parser must have its own ant build file (which will
be called automatically from the main build file)
- Calling the main build file results in building a separate
zip file for each optional parser. This zip file includes:
+ sources of the Parser.java
+ compiled classes of the Parser.java
+ needed additional libs (libx)
- To install an additional parser the user simply needs to
extract the zip file listed above into his/her yacy directory.
- The configuration (enabling/disabling) of a parser can be done
via the web interface (currently the settings dialog) and is
done "on-the-fly". The installation cannot be done "on-the-fly"
at the moment because of classpath issues.
- The classpath of the linux startup/stop scripts is generated
automatically now (including all libraries from lib and libx).
*) Bugfix: the file extension was not calculated correctly by the crawler,
e.g. the file extension was accidentally ".php?param=value".
Corrected.
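A sketch of the corrected extraction, stripping the query string and fragment before
looking for the extension (illustrative only, not the actual crawler code):

    public class FileExtensionSketch {
        // strip query string and fragment before extracting the extension,
        // so "page.php?param=value" yields "php" instead of "php?param=value"
        static String getFileExtension(String path) {
            int cut = path.indexOf('?');
            if (cut >= 0) path = path.substring(0, cut);
            cut = path.indexOf('#');
            if (cut >= 0) path = path.substring(0, cut);
            int dot = path.lastIndexOf('.');
            int slash = path.lastIndexOf('/');
            return (dot > slash) ? path.substring(dot + 1).toLowerCase() : "";
        }
    }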
*) Adding additional parser for parsing of rss/atom feeds
- added needed libs to do this.
TODO:
- automatically building the classpath for the Windows startup scripts
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@78 6c8d7289-2bf4-0310-a012-ef5d649a1542