- all web page parsing operations now update a web structure file
- the file is computed in memory and dumped at shutdown time to PLASMASB/webStructure.map in readable form (not a database)
- the file can be used externally to analyse the link structure of the crawled pages
- the web structure can also be retrieved using an XML interface at http://localhost:8080/xml/webstructure.xml (see the sketch below)
- the short-term purpose is the computation of a link-graph image (before LinuxTag!)
- a long-term purpose could be a decentralized computation of the citation rank
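A minimal sketch of how the XML interface could be consumed externally, for example to feed a link-graph tool. The element and attribute names below are assumptions about the schema, not taken from the source:

    import java.io.InputStream;
    import java.net.URL;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    // fetch the web structure from a running peer and print one line per domain
    public class WebStructureDump {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://localhost:8080/xml/webstructure.xml");
            try (InputStream in = url.openStream()) {
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder().parse(in);
                // "domain" and "host" are hypothetical schema names
                NodeList domains = doc.getElementsByTagName("domain");
                for (int i = 0; i < domains.getLength(); i++) {
                    Element domain = (Element) domains.item(i);
                    System.out.println(domain.getAttribute("host"));
                }
            }
        }
    }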
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3746 6c8d7289-2bf4-0310-a012-ef5d649a1542
- added 7zip parser
- added 'text/sgml' to realtime parseable mimetypes (sometimes returned by the mime type parser)
- added a new cached output stream class, well suited for parsers because it keeps memory usage limited
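A minimal sketch of the idea behind such a class (not the actual implementation): buffer in memory up to a threshold, then spill to a temporary file, so that parsing large documents does not exhaust the heap.

    import java.io.*;

    // buffers output in memory up to a threshold, then spills to a temp file
    public class CachedOutputStream extends OutputStream {
        private final int threshold;
        private ByteArrayOutputStream memory = new ByteArrayOutputStream();
        private OutputStream file;  // non-null once spilled to disk
        private File tempFile;

        public CachedOutputStream(int threshold) {
            this.threshold = threshold;
        }

        @Override
        public void write(int b) throws IOException {
            if (file == null && memory.size() >= threshold) spill();
            if (file == null) memory.write(b); else file.write(b);
        }

        // copy the in-memory buffer to a temp file and continue writing there
        private void spill() throws IOException {
            tempFile = File.createTempFile("parser", ".tmp");
            file = new BufferedOutputStream(new FileOutputStream(tempFile));
            memory.writeTo(file);
            memory = null;
        }

        // hand the collected bytes back to the parser as an input stream
        public InputStream asInputStream() throws IOException {
            if (file == null) return new ByteArrayInputStream(memory.toByteArray());
            file.close();
            return new BufferedInputStream(new FileInputStream(tempFile));
        }
    }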
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3740 6c8d7289-2bf4-0310-a012-ef5d649a1542
*) Changed "Lost Handle" error to a warning (masses of them appear when deleting a crawl profile)
*) Removed unnecessary code from Windows script
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3708 6c8d7289-2bf4-0310-a012-ef5d649a1542
*) First version of a sitemap parser added
- currently only autodetection of sitemap files is supported
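A sketch of one common autodetection rule, the "Sitemap:" line in robots.txt; whether the parser added here uses exactly this mechanism is an assumption:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;

    // look for a "Sitemap: <url>" announcement in a host's robots.txt
    public class SitemapDetector {
        public static String findSitemap(String host) throws IOException {
            URL robots = new URL("http://" + host + "/robots.txt");
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(robots.openStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    if (line.toLowerCase().startsWith("sitemap:")) {
                        return line.substring("sitemap:".length()).trim();
                    }
                }
            }
            return null; // no sitemap announced
        }
    }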
*) DB-Import restructured
- pause/resume should work again now
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3666 6c8d7289-2bf4-0310-a012-ef5d649a1542
- cluster definitions can now contain an optional local IP address in addition to the global one
- cluster-cluster communication uses the local IP address instead of the global address, if one is given
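The selection rule as a minimal sketch; the parameters are illustrative, not YaCy's real seed fields:

    // prefer the local address for traffic inside the same cluster
    public class ClusterAddress {
        public static String choose(String globalAddress, String localAddress,
                                     boolean sameCluster) {
            if (sameCluster && localAddress != null && !localAddress.isEmpty()) {
                return localAddress; // a local address is given, use it
            }
            return globalAddress;
        }
    }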
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3624 6c8d7289-2bf4-0310-a012-ef5d649a1542
automatically acquire release information from download archives
web pages from latest.yacy-forum.net and yacy.net are retrieved and parsed,
the links within are analysed and sorted, and the most recent developer and main
releases are provided as direct download links on the status page, if a
version more recent than the currently running one was discovered.
This process is done only once during the runtime of a peer, to protect our
download archives from DoS by YaCy peers.
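A sketch of the sorting step, assuming release archives carry a two-part version number in the file name; the pattern is illustrative, not the real naming scheme of the archives:

    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // pick the link with the highest version number from a list of links
    public class ReleaseScanner {
        // hypothetical file-name pattern, e.g. "yacy_v0.52.tar.gz"
        private static final Pattern RELEASE =
                Pattern.compile("yacy.*_v(\\d+\\.\\d+).*\\.tar\\.gz");

        public static String newestRelease(List<String> links) {
            String best = null;
            double bestVersion = -1;
            for (String link : links) {
                Matcher m = RELEASE.matcher(link);
                if (m.find()) {
                    // parsing as a decimal only works for two-part versions
                    double v = Double.parseDouble(m.group(1));
                    if (v > bestVersion) { bestVersion = v; best = link; }
                }
            }
            return best; // null if no release link was found
        }
    }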
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3606 6c8d7289-2bf4-0310-a012-ef5d649a1542
- the network configuration page shows a new option: robinson clusters
- when a global search is made, all robinson peers are excluded, but:
- robinson peers/clusters that provide peer tags are included in the global
search if the search words match those tags. This way, robinson peers/clusters
support the global YaCy network with their indexes without doing a DHT exchange
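The inclusion rule as a minimal sketch; the parameters are illustrative, not the real seed API:

    import java.util.Set;

    // decide whether a peer takes part in a global search
    public class RobinsonFilter {
        public static boolean include(boolean isRobinson,
                Set<String> peerTags, Set<String> searchWords) {
            if (!isRobinson) return true;      // DHT peers always take part
            for (String word : searchWords) {
                if (peerTags.contains(word)) return true; // tag matches
            }
            return false;                      // robinson peer stays excluded
        }
    }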
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3598 6c8d7289-2bf4-0310-a012-ef5d649a1542
*) Marked two deprecated source points
*) Added possibility to dump words from indexing to file. Should not affect performance in the current form.
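A sketch of how such a dump can stay cheap: append through a buffered writer, so the cost per word is a memory copy rather than a disk seek. File name and hook point are assumptions:

    import java.io.*;

    // append words seen during indexing to a dump file with little overhead
    public class WordDumper {
        private final PrintWriter out;

        public WordDumper(File target) throws IOException {
            out = new PrintWriter(new BufferedWriter(new FileWriter(target, true)));
        }

        public void dump(String word) {
            out.println(word); // buffered; written out on close
        }

        public void close() {
            out.close();
        }
    }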
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3592 6c8d7289-2bf4-0310-a012-ef5d649a1542
- new cluster functions will be available in this menu, but they are currently
not enabled because the corresponding interface methods are not ready yet
- shifted remote crawl settings to the new network configuration menu
- shifted DHT distribution/receive to the new network configuration menu
- adapted some string constants
- added cluster configuration settings to yacy.init
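The exact keys in yacy.init may differ; an illustrative form of such settings:

    # cluster mode (illustrative values, check yacy.init itself)
    cluster.mode=publiccluster
    # peers belonging to the cluster, as hypothetical example values
    cluster.peers.yacydomain=peer1.yacyh,peer2.yacyh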
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3589 6c8d7289-2bf4-0310-a012-ef5d649a1542
http://www.yacy-forum.de/viewtopic.php?t=3854
This is a serious problem caused by the database bug between 0.511 and 0.513,
which produced a large number of double entries in the RWI index. The uniq()
method tries to fix this, but it does not terminate when the index is large and
the number of double occurrences is also large. This patch simply implements a
time-controlled termination, which does not heal the inconsistency problem. The
uniq() method itself is correct and does not need a bugfix; the non-termination
is simply caused by the large amount of data that is shifted during the process.
It was possible to reproduce this behaviour in a test environment.
A real fix would need to:
- enhance the uniq() method by using a recursive, binary segmentation of the array to be fixed
- uniq() must report which entries are doubles
- the double entries must be deleted from the collection index (from the index and from the collections) to heal the problem
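A toy version of the proposed enhancement: segment the sorted array recursively and report the doubles instead of shifting the array in place, so that deletion becomes a separate, bounded step. Illustrative only, not the kelondro code:

    import java.util.List;

    // collect duplicate keys from a sorted array by binary segmentation
    public class Uniq {
        // from is inclusive, to is exclusive; doubles collects surplus entries
        public static void uniq(String[] sorted, int from, int to,
                                List<String> doubles) {
            if (to - from < 2) return;
            int mid = (from + to) / 2;
            uniq(sorted, from, mid, doubles);
            uniq(sorted, mid, to, doubles);
            // a duplicate pair can also sit exactly on the segment border
            if (sorted[mid - 1].equals(sorted[mid])) doubles.add(sorted[mid]);
        }
    }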
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3583 6c8d7289-2bf4-0310-a012-ef5d649a1542
- some bugs related to wrong removal operations may have been fixed
- removed temporary storage of remove positions and replaced it with direct deletions
- changed synchronization
- added many asserts
- modified dbtest to also test removal during a threaded stress test
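A toy version of such a stress test, mixing puts and removes from several threads; the real test runs against the kelondro database classes, not a ConcurrentHashMap:

    import java.util.Random;
    import java.util.concurrent.ConcurrentHashMap;

    // hammer a table with concurrent inserts and removals
    public class RemoveStressTest {
        public static void main(String[] args) throws InterruptedException {
            final ConcurrentHashMap<Integer, String> table = new ConcurrentHashMap<>();
            Thread[] workers = new Thread[4];
            for (int t = 0; t < workers.length; t++) {
                workers[t] = new Thread(() -> {
                    Random rnd = new Random();
                    for (int i = 0; i < 100000; i++) {
                        int key = rnd.nextInt(1000);
                        if (rnd.nextBoolean()) table.put(key, "entry" + key);
                        else table.remove(key); // removal under concurrency
                    }
                });
                workers[t].start();
            }
            for (Thread w : workers) w.join();
            System.out.println("surviving entries: " + table.size());
        }
    }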
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3576 6c8d7289-2bf4-0310-a012-ef5d649a1542
* fixed a divide-by-zero bug that occurred when 20_dhtdistribution_busysleep is 0 (a guard is sketched below, after the TODO list)
* replaced a German comment with a broken charset in source/de/anomic/plasma/plasmaCrawlBalancer.java with an English one
* replaced the table fix for text floating behind snippet images with a br with clear
* removed unnecessary old xhtml files (they were not in use; they had been created for testing before we had XHTML)
* new layout for image search results: replaced the old one, which used spans and nested tables (not valid), with divs; now each image snippet container has the same size
TODO:
* the ids of the snippetLoading divs aren't valid because ids must start with an alphabetic character or an underscore; they have to be prefixed
* the returned snippet XML contains an unresolved pattern for status (the status is only set for text snippets)
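The guard for the busysleep fix, as a minimal sketch; names are illustrative, the real code reads 20_dhtdistribution_busysleep from the peer configuration:

    import java.util.Map;

    // avoid dividing by a busysleep value of 0
    public class BusySleepGuard {
        public static long cyclesPerMinute(Map<String, String> config) {
            long busysleep = Long.parseLong(
                    config.getOrDefault("20_dhtdistribution_busysleep", "100"));
            // a busysleep of 0 would divide by zero; treat it as "no limit"
            return busysleep == 0 ? Long.MAX_VALUE : 60000L / busysleep;
        }
    }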
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@3566 6c8d7289-2bf4-0310-a012-ef5d649a1542