*) First version of a sitemap parser added
- currently only autodetection of sitemap files is supported
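A minimal sketch of one possible autodetection approach, assuming the sitemap
URL is announced via a "Sitemap:" line in the host's robots.txt (this mechanism
and the helper name are assumptions, not part of this change description):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;

    // hypothetical helper: scan robots.txt for a "Sitemap:" directive
    public static String detectSitemapURL(String host) throws IOException {
        URL robots = new URL("http://" + host + "/robots.txt");
        BufferedReader reader = new BufferedReader(new InputStreamReader(robots.openStream()));
        String sitemap = null;
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.toLowerCase().startsWith("sitemap:")) {
                sitemap = line.substring("sitemap:".length()).trim();
                break;
            }
        }
        reader.close();
        return sitemap; // null if the host does not announce a sitemap
    }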
*) DB-Import restructured
- pause/resume should work again now
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3666 6c8d7289-2bf4-0310-a012-ef5d649a1542
This shall be used to store a fragment of the index on another physical device,
to split IO load and enhance access speed. The index is split in such a way
that the LURLs are stored at the secondary location and the RWIs at the primary
location. This is especially useful for environments where symbolic links are
not possible, or where they would cause IO access even if there is no write
access to the device which hosts the symbolic link.
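A purely illustrative sketch of the split described above; apart from
DATA/INDEX/PUBLIC/TEXT, the directory names and the secondary mount point are
invented for illustration and are not the actual YaCy layout:

    import java.io.File;

    // primary root stays on the fast device and keeps the RWIs (word references)
    File primaryRoot   = new File("DATA/INDEX/PUBLIC/TEXT");
    // secondary root sits on another physical device and keeps the LURLs (URL data)
    File secondaryRoot = new File("/media/seconddisk/yacy-index");

    File rwiLocation  = new File(primaryRoot,   "RWI");
    File lurlLocation = new File(secondaryRoot, "LURL");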
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3519 6c8d7289-2bf4-0310-a012-ef5d649a1542
- added an assortment importer. The old database structures can
be imported with
java -classpath classes yacy -migrateassortments
- modified the word migration. The indexes from WORDS are now imported
into the collection database. The call is
java -classpath classes yacy -migratewords
(as before)
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3044 6c8d7289-2bf4-0310-a012-ef5d649a1542
for indexing, the plasmaWordIndex.
The new data structure is ready to use, but currently disabled.
It can be activated by setting the static
plasmaWordIndex.useCollectionIndex
to true. This should only be done for testing purposes.
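A minimal sketch of the activation step, assuming useCollectionIndex is a
public static boolean field as the description suggests:

    // for testing only: switch indexing over to the new collection index
    plasmaWordIndex.useCollectionIndex = true;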
The new index is stored in
DATA/INDEX/PUBLIC/TEXT
In the future, the directory PLASMA shall be used only for the crawler.
Attention: during testing the data structure in INDEX may change,
and indexes created with the new data structure may become unusable.
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@2348 6c8d7289-2bf4-0310-a012-ef5d649a1542
@Martin: please take a look at why the variable assignments
in plasmaCrawlNURLImporter were there. As they were, they were superfluous.
That is surely not what you intended.
Should they perhaps be global variables?
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1529 6c8d7289-2bf4-0310-a012-ef5d649a1542
- restructuring to allow different import tasks to be controlled via one GUI
- adding the possibility to import a single assortment file
- adding the possibility to set the cache size that should be used
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1504 6c8d7289-2bf4-0310-a012-ef5d649a1542
- dbImporter threads are now shut down by the switchboard on server shutdown
- adding the possibility to pause an importer thread via the GUI (see the sketch below)
- Bugfix for the abort function
See: http://www.yacy-forum.de/viewtopic.php?p=13363#13363
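A minimal sketch of a pause/resume pattern for such an importer thread; class
and method names are illustrative, not the actual dbImporter code:

    // hypothetical pausable worker that pauses between units of import work
    public abstract class PausableImporter extends Thread {
        private final Object pauseLock = new Object();
        private boolean paused = false;
        private volatile boolean stopped = false;

        public void pauseIt()  { synchronized (pauseLock) { paused = true; } }
        public void resumeIt() { synchronized (pauseLock) { paused = false; pauseLock.notifyAll(); } }
        public void stopIt()   { stopped = true; resumeIt(); this.interrupt(); }

        public void run() {
            try {
                while (!stopped) {
                    synchronized (pauseLock) {
                        while (paused) pauseLock.wait();   // block here while paused
                    }
                    importNextChunk();                     // one unit of import work
                }
            } catch (InterruptedException e) {
                // shutdown was requested, e.g. by the switchboard
            }
        }

        protected abstract void importNextChunk() throws InterruptedException;
    }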
*) Modification of the content parser configuration
- it is now possible to configure separately which parsers are enabled for the
proxy, crawler, ICAP, etc. (see the sketch below)
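A purely illustrative sketch of per-usage parser enablement; the keys and data
structures below are invented for illustration and are not the actual
configuration format:

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    // hypothetical mapping: which MIME types each component may hand to a parser
    Map<String, Set<String>> enabledParsers = new HashMap<String, Set<String>>();
    enabledParsers.put("PROXY",   new HashSet<String>(Arrays.asList("text/html", "application/pdf")));
    enabledParsers.put("CRAWLER", new HashSet<String>(Arrays.asList("text/html")));
    enabledParsers.put("ICAP",    new HashSet<String>(Arrays.asList("text/html")));

    // a component checks its own set before invoking a parser
    boolean parseIt = enabledParsers.get("CRAWLER").contains("application/pdf"); // false here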
*) htmlFilterContentScraper.java
- adding a regular expression to normalize URLs containing /../ and /./ parts (see the sketch below)
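A rough sketch of what such a normalization can look like; the helper below is
hypothetical (not the actual scraper code) and does not guard against paths
that try to climb above the root:

    // collapse "/./" and "segment/../" parts of a URL path
    public static String normalizePath(String path) {
        String previous;
        do {
            previous = path;
            path = path.replaceAll("/\\./", "/");            // drop "/./" segments
            path = path.replaceAll("/[^/]+/\\.\\./", "/");   // collapse "x/../" segments
        } while (!path.equals(previous));
        return path;
    }

    // example: normalizePath("/a/b/../c/./d") returns "/a/c/d"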
*) httpc.java
- adding functionality to unzip gzipped content (see the sketch below)
- requested by Roland: this should later be used to allow gzipped seed lists
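A minimal sketch of the decompression step using the standard java.util.zip
classes; the helper name and its integration into httpc.java are assumptions:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.zip.GZIPInputStream;

    // hypothetical helper: inflate a gzip-compressed response body
    public static byte[] gunzip(InputStream compressedBody) throws IOException {
        GZIPInputStream gin = new GZIPInputStream(compressedBody);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int count;
        while ((count = gin.read(buffer)) != -1) {
            out.write(buffer, 0, count);
        }
        gin.close();
        return out.toByteArray();
    }

Such a helper would only be applied when the response announces
Content-Encoding: gzip.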
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@1170 6c8d7289-2bf4-0310-a012-ef5d649a1542
- the remote proxy configuration can now be "really" changed on the fly and takes effect immediately
- adding the possibility to disable remote proxy usage for YaCy-to-YaCy communication
- adding the possibility to disable remote proxy usage for SSL
- restructuring the proxy configuration so that it is now stored in a single place
*) Adding the possibility to import a foreign word DB (or even several of them in parallel)
at runtime into the peer's DB
- this can be done by calling IndexImport_p.html
- ATTENTION: please note that at the moment this thread must be aborted via the GUI
before a normal server shutdown is done.
- TODO: integrating the IndexImport thread into the normal server shutdown
- TODO: adding the possibility to import crawl queues, etc. from foreign peers
- TODO: removing the old import function from yacy.java and calling the new routines instead
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@968 6c8d7289-2bf4-0310-a012-ef5d649a1542