- balancer writes the cause (robots.txt) to the log file for crawl delay
- removed log output for forced GC
- smaller RAM flush for the RWI cache; should cause more cache usage and faster crawling
git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@5228 6c8d7289-2bf4-0310-a012-ef5d649a1542
// in the best case, this should never happen if the balancer works properly;
// this is only a protection against the worst case, in which the crawler could
// behave in a DoS-manner
serverLog.logInfo("BALANCER","forcing fetch delay of " +sleeptime+" millisecond for " +crawlEntry.url().getHost());
serverLog.logInfo("BALANCER","forcing crawl-delay of " +sleeptime+" milliseconds for " +crawlEntry.url().getHost()+((sleeptime>Math.max(minimumLocalDelta,minimumGlobalDelta))?" (caused by robots.txt)":""));
System.out.println("^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ if you see this many times please report to forum");
//System.out.println("^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ if you see this many times please report to forum");
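For context, a minimal, self-contained sketch of the delay decision behind the new log line; the field names and millisecond values below are assumptions for illustration, not the actual Balancer implementation. It only shows why a sleeptime above the configured minimum delta can be attributed to robots.txt.

// Hypothetical sketch (assumed names/values, not the actual Balancer code):
// the fetch delay is the larger of the configured minimum delta and the
// crawl-delay requested by the host's robots.txt; if it exceeds the minimum
// delta, robots.txt must be the cause, which is what the new log line reports.
public class CrawlDelaySketch {
    public static void main(String[] args) {
        long minimumLocalDelta = 500;   // assumed: ms between requests to local hosts
        long minimumGlobalDelta = 250;  // assumed: ms between requests to remote hosts
        long robotsCrawlDelay = 2000;   // assumed: ms from the host's robots.txt Crawl-delay

        long minimumDelta = Math.max(minimumLocalDelta, minimumGlobalDelta);
        long sleeptime = Math.max(minimumDelta, robotsCrawlDelay);

        // mirrors the condition in the changed log statement above
        String cause = (sleeptime > minimumDelta) ? " (caused by robots.txt)" : "";
        System.out.println("forcing crawl-delay of " + sleeptime + " milliseconds" + cause);
    }
}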