- removed the file-upload function from the crawl start page and replaced it with an input field for a server-side file path from which the crawl start file is loaded. This was necessary to support API steering for file crawl starts, for two reasons:

1) if the file is changed for a re-crawl, the change is not reflected in the steering, because the steering would re-use the previously uploaded crawl start file
2) for security reasons, browsers do not submit the full path of the selected file, even though that path is shown in the input field; there is no work-around or hack that makes submission of the full path possible

- fixed the deletion of crawl start-point URLs in the crawl stack and in the balancer double-check
- fixed a problem with the steering self-call (localhost is no longer resolved to an IP address)
- added more crawler logging to make it possible to supervise why crawl URLs are not taken by the loader
- added a JavaScript onclick function that selects the domain restriction whenever a crawl is started from a file or from a link list (sitelist)
- fixed the restrict-to-domain pattern computation, added a 'www.'-prefix alternative, and extended this functionality to crawl starts from a file (a sketch follows this list)
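
As an illustration of the last point, here is a minimal, self-contained sketch of the domain-restriction pattern computation. It mirrors the siteFilter() helper added to Crawler_p in the diff below, but uses plain java.net.URI instead of YaCy's MultiProtocolURI; the class and method names here are illustrative only:

    import java.net.URI;
    import java.util.LinkedHashSet;
    import java.util.Set;

    public final class SiteFilterSketch {
        // Build a must-match regex that restricts a crawl to the hosts of the
        // given start URLs; a 'www.'-prefixed alternative is added for every
        // host that does not already carry the prefix.
        static String siteFilter(final Set<URI> uris) {
            final StringBuilder filter = new StringBuilder();
            final Set<String> filterSet = new LinkedHashSet<String>();
            for (final URI uri : uris) {
                filterSet.add(uri.getScheme() + "://" + uri.getHost() + ".*");
                if (!uri.getHost().startsWith("www.")) {
                    filterSet.add(uri.getScheme() + "://www." + uri.getHost() + ".*");
                }
            }
            for (final String element : filterSet) filter.append('|').append(element);
            return filter.length() > 0 ? filter.substring(1) : "";
        }

        public static void main(final String[] args) {
            final Set<URI> start = new LinkedHashSet<URI>();
            start.add(URI.create("http://example.org/links.html"));
            // prints: http://example.org.*|http://www.example.org.*
            System.out.println(siteFilter(start));
        }
    }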

git-svn-id: https://svn.berlios.de/svnroot/repos/yacy/trunk@7574 6c8d7289-2bf4-0310-a012-ef5d649a1542
Author: orbiter
Parent: 96bb33ed9b
Commit: 7962d35425

@@ -26,7 +26,7 @@
 You can define URLs as start points for Web page crawling and start crawling here. "Crawling" means that YaCy will download the given website, extract all links in it and then download the content behind these links. This is repeated as long as specified under "Crawling Depth".
 </p>
-<form id="Crawler" action="Crawler_p.html" method="post" enctype="multipart/form-data">
+<form id="Crawler" action="Crawler_p.html" method="post" enctype="multipart/form-data" accept-charset="UTF-8">
 <table border="0" cellpadding="5" cellspacing="1">
 <tr class="TableHeader">
 <td><strong>Attribute</strong></td>

@@ -46,7 +46,7 @@
 </tr>
 <tr>
 <td><label for="url"><span class="nobr">From Link-List of URL</span></label>:</td>
-<td><input type="radio" name="crawlingMode" id="sitelist" value="sitelist" disabled="disabled"/></td>
+<td><input type="radio" name="crawlingMode" id="sitelist" value="sitelist" disabled="disabled" onclick="document.getElementById('Crawler').rangeDomain.checked = true;"/></td>
 <td>
 <div id="sitelistURLs"></div>
 </td>

@@ -60,8 +60,8 @@
 </tr>
 <tr>
 <td><label for="file"><span class="nobr">From File</span></label>:</td>
-<td><input type="radio" name="crawlingMode" id="file" value="file" /></td>
-<td><input type="file" name="crawlingFile" size="18" onfocus="check('file')" /></td>
+<td><input type="radio" name="crawlingMode" id="file" value="file" onclick="document.getElementById('Crawler').rangeDomain.checked = true;"/></td>
+<td><input type="text" name="crawlingFile" size="41" onfocus="check('file')"/><!--<input type="file" name="crawlingFile" size="18" onfocus="check('file')"/>--></td>
 </tr>
 <tr>
 <td colspan="3" class="commit">

@@ -138,10 +138,10 @@
 <tr valign="top" class="TableCellLight">
 <td><label for="mustmatch">Must-Match Filter</label>:</td>
 <td>
-<input type="radio" name="range" value="wide" checked="checked" />Use filter&nbsp;&nbsp;
+<input type="radio" name="range" id="rangeWide" value="wide" checked="checked" />Use filter&nbsp;&nbsp;
 <input name="mustmatch" id="mustmatch" type="text" size="60" maxlength="100" value="#[mustmatch]#" /><br />
-<input type="radio" name="range" value="domain" />Restrict to start domain<br />
-<input type="radio" name="range" value="subpath" />Restrict to sub-path
+<input type="radio" name="range" id="rangeDomain" value="domain" />Restrict to start domain<br />
+<input type="radio" name="range" id="rangeSubpath" value="subpath" />Restrict to sub-path
 </td>
 <td>
 The filter is a <a href="http://java.sun.com/j2se/1.5.0/docs/api/java/util/regex/Pattern.html">regular expression</a>

@@ -26,6 +26,7 @@
 // Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 import java.io.File;
+import java.io.FileInputStream;
 import java.io.Writer;
 import java.net.MalformedURLException;
 import java.util.Date;

@@ -192,6 +193,13 @@ public class Crawler_p {
 long crawlingIfOlder = recrawlIfOlderC(crawlingIfOlderCheck, crawlingIfOlderNumber, crawlingIfOlderUnit);
 env.setConfig("crawlingIfOlder", crawlingIfOlder);
+
+// remove crawlingFileContent before we record the call
+final String crawlingFileName = post.get("crawlingFile");
+final File crawlingFile = (crawlingFileName != null && crawlingFileName.length() > 0) ? new File(crawlingFileName) : null;
+if (crawlingFile != null && crawlingFile.exists()) {
+    post.remove("crawlingFile$file");
+}
+
 // store this call as api call
 if (repeat_time > 0) {
 // store as scheduled api call

@@ -255,7 +263,7 @@ public class Crawler_p {
 sb.crawler.putActive(profile.handle().getBytes(), profile);
 sb.pauseCrawlJob(SwitchboardConstants.CRAWLJOB_LOCAL_CRAWL);
 final DigestURI url = crawlingStartURL;
-sb.crawlStacker.enqueueEntries(sb.peers.mySeed().hash.getBytes(), profile.handle(), "ftp", url.getHost(), url.getPort(), false);
+sb.crawlStacker.enqueueEntriesFTP(sb.peers.mySeed().hash.getBytes(), profile.handle(), url.getHost(), url.getPort(), false);
 } catch (final PatternSyntaxException e) {
 prop.put("info", "4"); // crawlfilter does not match url
 prop.putHTML("info_newcrawlingfilter", newcrawlingMustMatch);

@@ -392,20 +400,26 @@ public class Crawler_p {
 } else if ("file".equals(crawlingMode)) {
 if (post.containsKey("crawlingFile")) {
-final String fileName = post.get("crawlingFile");
+final String crawlingFileContent = post.get("crawlingFile$file", "");
 try {
 // check if the crawl filter works correctly
 Pattern.compile(newcrawlingMustMatch);
-final File file = new File(fileName);
-final String fileString = post.get("crawlingFile$file");
-final ContentScraper scraper = new ContentScraper(new DigestURI(file));
+final ContentScraper scraper = new ContentScraper(new DigestURI(crawlingFile));
 final Writer writer = new TransformerWriter(null, null, scraper, null, false);
-FileUtils.copy(fileString, writer);
+if (crawlingFile != null && crawlingFile.exists()) {
+    FileUtils.copy(new FileInputStream(crawlingFile), writer);
+} else {
+    FileUtils.copy(crawlingFileContent, writer);
+}
 writer.close();
+
+// get links and generate filter
 final Map<MultiProtocolURI, String> hyperlinks = scraper.getAnchors();
-final DigestURI crawlURL = new DigestURI("file://" + file.toString());
+if (fullDomain) newcrawlingMustMatch = siteFilter(hyperlinks.keySet());
+
+final DigestURI crawlURL = new DigestURI("file://" + crawlingFile.toString());
 final CrawlProfile profile = new CrawlProfile(
-fileName,
+crawlingFileName,
 crawlURL,
 newcrawlingMustMatch,
 CrawlProfile.MATCH_NEVER,

@@ -431,7 +445,7 @@ public class Crawler_p {
 } catch (final Exception e) {
 // mist
 prop.put("info", "7"); // Error with file
-prop.putHTML("info_crawlingStart", fileName);
+prop.putHTML("info_crawlingStart", crawlingFileName);
 prop.putHTML("info_error", e.getMessage());
 Log.logException(e);
 }

@@ -478,16 +492,8 @@ public class Crawler_p {
 // String description = scraper.getDescription();

 // get links and generate filter
-final StringBuilder filter = new StringBuilder();
 final Map<MultiProtocolURI, String> hyperlinks = scraper.getAnchors();
-final Set<String> filterSet = new HashSet<String>();
-for (final MultiProtocolURI uri: hyperlinks.keySet()) {
-    filterSet.add(new StringBuilder().append(uri.getProtocol()).append("://").append(uri.getHost()).append(".*").toString());
-}
-for (final String element : filterSet) {
-    filter.append('|').append(element);
-}
-newcrawlingMustMatch = filter.length() > 0 ? filter.substring(1) : "";
+if (fullDomain) newcrawlingMustMatch = siteFilter(hyperlinks.keySet());

 // put links onto crawl queue
 final CrawlProfile profile = new CrawlProfile(

@@ -581,4 +587,18 @@ public class Crawler_p {
 sb.setPerformance(wantedPPM);
 }

+private static String siteFilter(Set<MultiProtocolURI> uris) {
+    final StringBuilder filter = new StringBuilder();
+    final Set<String> filterSet = new HashSet<String>();
+    for (final MultiProtocolURI uri: uris) {
+        filterSet.add(new StringBuilder().append(uri.getProtocol()).append("://").append(uri.getHost()).append(".*").toString());
+        if (!uri.getHost().startsWith("www.")) {
+            filterSet.add(new StringBuilder().append(uri.getProtocol()).append("://www.").append(uri.getHost()).append(".*").toString());
+        }
+    }
+    for (final String element : filterSet) {
+        filter.append('|').append(element);
+    }
+    return filter.length() > 0 ? filter.substring(1) : "";
+}
 }
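
The file-crawl branch above now prefers reading the crawl start file from the submitted server-side path and only falls back to inline form content. A minimal sketch of that decision, with hypothetical helper and parameter names:

    import java.io.ByteArrayInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Illustrative helper: if the submitted path points at an existing file on
    // the server, stream it from disk (so a re-crawl triggered by API steering
    // always reads the current file); otherwise fall back to the inline form
    // content that a browser may still have submitted as crawlingFile$file.
    final class CrawlStartSource {
        static InputStream open(final String crawlingFileName, final String crawlingFileContent) throws IOException {
            final File crawlingFile = (crawlingFileName != null && crawlingFileName.length() > 0)
                    ? new File(crawlingFileName) : null;
            if (crawlingFile != null && crawlingFile.exists()) {
                return new FileInputStream(crawlingFile);
            }
            return new ByteArrayInputStream(crawlingFileContent.getBytes("UTF-8"));
        }
    }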

@@ -188,6 +188,10 @@ public class Balancer {
 for (final byte[] urlhash: urlHashes) {
 final Row.Entry entry = urlFileIndex.remove(urlhash);
 if (entry != null) removedCounter++;
+
+// remove from double-check caches
+ddc.remove(urlhash);
+double_push_check.remove(urlhash);
 }
 if (removedCounter == 0) return 0;
 assert urlFileIndex.size() + removedCounter == s : "urlFileIndex.size() = " + urlFileIndex.size() + ", s = " + s;

@@ -216,8 +220,8 @@ public class Balancer {
 for (final byte[] handle: urlHashes) stack.remove(handle);
 if (stack.isEmpty()) q.remove();
 }

 return removedCounter;
 }

 public boolean has(final byte[] urlhashb) {

@@ -248,15 +252,22 @@ public class Balancer {
 return false;
 }

-public void push(final Request entry) throws IOException, RowSpaceExceededException {
+/**
+ * push a crawl request on the balancer stack
+ * @param entry
+ * @return null if this was successful or a String explaining what went wrong in case of an error
+ * @throws IOException
+ * @throws RowSpaceExceededException
+ */
+public String push(final Request entry) throws IOException, RowSpaceExceededException {
 assert entry != null;
 final byte[] hash = entry.url().hash();
 synchronized (this) {
 // double-check
-if (this.double_push_check.has(hash) || this.ddc.has(hash) || this.urlFileIndex.has(hash)) {
-    //Log.logSevere("Balancer", "double push: " + UTF8.String(hash));
-    return;
-}
+if (this.double_push_check.has(hash)) return "double occurrence in double_push_check";
+if (this.ddc.has(hash)) return "double occurrence in ddc";
+if (this.urlFileIndex.has(hash)) return "double occurrence in urlFileIndex";

 if (this.double_push_check.size() > 10000) this.double_push_check.clear();
 this.double_push_check.put(hash);

@@ -268,6 +279,7 @@ public class Balancer {
 // add the hash to a queue
 pushHashToDomainStacks(entry.url().getHost(), entry.url().hash());
+return null;
 }
 }

@@ -65,7 +65,6 @@ public final class CrawlStacker {
 private final Log log = new Log("STACKCRAWL");
 private final WorkflowProcessor<Request> fastQueue, slowQueue;
-//private long dnsHit;
 private long dnsMiss;
 private final CrawlQueues nextQueue;
 private final CrawlSwitchboard crawler;

@@ -242,13 +241,24 @@ public final class CrawlStacker {
 final byte[] urlhash = url.hash();
 if (replace) {
 indexSegment.urlMetadata().remove(urlhash);
-this.nextQueue.noticeURL.removeByURLHash(urlhash);
-this.nextQueue.errorURL.remove(urlhash);
+this.nextQueue.urlRemove(urlhash);
+String u = url.toNormalform(true, true);
+if (u.endsWith("/")) {
+    u = u + "index.html";
+} else if (!u.contains(".")) {
+    u = u + "/index.html";
+}
+try {
+    byte[] uh = new DigestURI(u, null).hash();
+    indexSegment.urlMetadata().remove(uh);
+    this.nextQueue.noticeURL.removeByURLHash(uh);
+    this.nextQueue.errorURL.remove(uh);
+} catch (MalformedURLException e1) {}
 }

 if (url.getProtocol().equals("ftp")) {
 // put the whole ftp site on the crawl stack
-enqueueEntries(initiator, profileHandle, "ftp", url.getHost(), url.getPort(), replace);
+enqueueEntriesFTP(initiator, profileHandle, url.getHost(), url.getPort(), replace);
 } else {
 // put entry on crawl stack
 enqueueEntry(new Request(

@@ -267,7 +277,7 @@ public final class CrawlStacker {
 }
 }

-public void enqueueEntries(final byte[] initiator, final String profileHandle, final String protocol, final String host, final int port, final boolean replace) {
+public void enqueueEntriesFTP(final byte[] initiator, final String profileHandle, final String host, final int port, final boolean replace) {
 final CrawlQueues cq = this.nextQueue;
 new Thread() {
 public void run() {

@@ -280,10 +290,7 @@ public final class CrawlStacker {
 // delete old entry, if exists to force a re-load of the url (thats wanted here)
 DigestURI url = null;
 try {
-if (protocol.equals("ftp")) url = new DigestURI("ftp://" + host + (port == 21 ? "" : ":" + port) + MultiProtocolURI.escape(entry.name));
-else if (protocol.equals("smb")) url = new DigestURI("smb://" + host + MultiProtocolURI.escape(entry.name));
-else if (protocol.equals("http")) url = new DigestURI("http://" + host + (port == 80 ? "" : ":" + port) + MultiProtocolURI.escape(entry.name));
-else if (protocol.equals("https")) url = new DigestURI("https://" + host + (port == 443 ? "" : ":" + port) + MultiProtocolURI.escape(entry.name));
+url = new DigestURI("ftp://" + host + (port == 21 ? "" : ":" + port) + MultiProtocolURI.escape(entry.name));
 } catch (MalformedURLException e) {
 continue;
 }

@@ -359,10 +366,12 @@ public final class CrawlStacker {
 }

 // check availability of parser and maxfilesize
+String warning = null;
 if (entry.size() > maxFileSize ||
     (entry.url().getFileExtension().length() > 0 && TextParser.supports(entry.url(), null) != null)
    ) {
-nextQueue.noticeURL.push(NoticedURL.StackType.NOLOAD, entry);
+warning = nextQueue.noticeURL.push(NoticedURL.StackType.NOLOAD, entry);
+if (warning != null) this.log.logWarning("CrawlStacker.stackCrawl of URL " + entry.url().toNormalform(true, false) + " - not pushed: " + warning);
 return null;
 }

@@ -377,29 +386,18 @@ public final class CrawlStacker {
 // it may be possible that global == true and local == true, so do not check an error case against it
 if (proxy) this.log.logWarning("URL '" + entry.url().toString() + "' has conflicting initiator properties: global = true, proxy = true, initiator = proxy" + ", profile.handle = " + profile.handle());
 if (remote) this.log.logWarning("URL '" + entry.url().toString() + "' has conflicting initiator properties: global = true, remote = true, initiator = " + UTF8.String(entry.initiator()) + ", profile.handle = " + profile.handle());
-//int b = nextQueue.noticeURL.stackSize(NoticedURL.StackType.LIMIT);
-nextQueue.noticeURL.push(NoticedURL.StackType.LIMIT, entry);
-//assert b < nextQueue.noticeURL.stackSize(NoticedURL.StackType.LIMIT);
-//this.log.logInfo("stacked/global: " + entry.url().toString() + ", stacksize = " + nextQueue.noticeURL.stackSize(NoticedURL.StackType.LIMIT));
+warning = nextQueue.noticeURL.push(NoticedURL.StackType.LIMIT, entry);
 } else if (local) {
 if (proxy) this.log.logWarning("URL '" + entry.url().toString() + "' has conflicting initiator properties: local = true, proxy = true, initiator = proxy" + ", profile.handle = " + profile.handle());
 if (remote) this.log.logWarning("URL '" + entry.url().toString() + "' has conflicting initiator properties: local = true, remote = true, initiator = " + UTF8.String(entry.initiator()) + ", profile.handle = " + profile.handle());
-//int b = nextQueue.noticeURL.stackSize(NoticedURL.StackType.CORE);
-nextQueue.noticeURL.push(NoticedURL.StackType.CORE, entry);
-//assert b < nextQueue.noticeURL.stackSize(NoticedURL.StackType.CORE);
-//this.log.logInfo("stacked/local: " + entry.url().toString() + ", stacksize = " + nextQueue.noticeURL.stackSize(NoticedURL.StackType.CORE));
+warning = nextQueue.noticeURL.push(NoticedURL.StackType.CORE, entry);
 } else if (proxy) {
 if (remote) this.log.logWarning("URL '" + entry.url().toString() + "' has conflicting initiator properties: proxy = true, remote = true, initiator = " + UTF8.String(entry.initiator()) + ", profile.handle = " + profile.handle());
-//int b = nextQueue.noticeURL.stackSize(NoticedURL.StackType.CORE);
-nextQueue.noticeURL.push(NoticedURL.StackType.CORE, entry);
-//assert b < nextQueue.noticeURL.stackSize(NoticedURL.StackType.CORE);
-//this.log.logInfo("stacked/proxy: " + entry.url().toString() + ", stacksize = " + nextQueue.noticeURL.stackSize(NoticedURL.StackType.CORE));
+warning = nextQueue.noticeURL.push(NoticedURL.StackType.CORE, entry);
 } else if (remote) {
-//int b = nextQueue.noticeURL.stackSize(NoticedURL.STACK_TYPE_REMOTE);
-nextQueue.noticeURL.push(NoticedURL.StackType.REMOTE, entry);
-//assert b < nextQueue.noticeURL.stackSize(NoticedURL.STACK_TYPE_REMOTE);
-//this.log.logInfo("stacked/remote: " + entry.url().toString() + ", stacksize = " + nextQueue.noticeURL.stackSize(NoticedURL.STACK_TYPE_REMOTE));
+warning = nextQueue.noticeURL.push(NoticedURL.StackType.REMOTE, entry);
 }
+if (warning != null) this.log.logWarning("CrawlStacker.stackCrawl of URL " + entry.url().toNormalform(true, false) + " - not pushed: " + warning);

 return null;
 }
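
The replace branch above now deletes not only the start URL itself but also its index.html variant, because a start point recorded as http://host/path/ may sit in the queues under http://host/path/index.html. A sketch of the variant computation, mirroring the diff (the class and helper names are hypothetical):

    public final class StartPointVariants {
        // Mirrors the normalization in CrawlStacker: directory-style URLs are
        // additionally looked up with an appended index.html. Note the second
        // branch matches the diff literally: any dot in the URL string
        // (including one in the host name) skips it.
        static String indexVariant(final String u) {
            if (u.endsWith("/")) return u + "index.html";
            if (!u.contains(".")) return u + "/index.html";
            return u;
        }

        public static void main(final String[] args) {
            // prints: http://example.org/blog/index.html
            System.out.println(indexVariant("http://example.org/blog/"));
        }
    }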

@@ -161,25 +161,29 @@ public class NoticedURL {
 noloadStack.has(urlhashb);
 }

-public void push(final StackType stackType, final Request entry) {
+/**
+ * push a crawl request on one of the different crawl stacks
+ * @param stackType
+ * @param entry
+ * @return null if this was successful or a String explaining what went wrong in case of an error
+ */
+public String push(final StackType stackType, final Request entry) {
 try {
 switch (stackType) {
 case CORE:
-coreStack.push(entry);
-break;
+return coreStack.push(entry);
 case LIMIT:
-limitStack.push(entry);
-break;
+return limitStack.push(entry);
 case REMOTE:
-remoteStack.push(entry);
-break;
+return remoteStack.push(entry);
 case NOLOAD:
-noloadStack.push(entry);
-break;
-default: break;
+return noloadStack.push(entry);
+default:
+return "stack type unknown";
 }
 } catch (final Exception er) {
 Log.logException(er);
+return "error pushing onto the crawl stack: " + er.getMessage();
 }
 }

@@ -246,7 +250,12 @@ public class NoticedURL {
 public void shift(final StackType fromStack, final StackType toStack, CrawlSwitchboard cs) {
 try {
 final Request entry = pop(fromStack, false, cs);
-if (entry != null) push(toStack, entry);
+if (entry != null) {
+    String warning = push(toStack, entry);
+    if (warning != null) {
+        Log.logWarning("NoticedURL", "shift from " + fromStack + " to " + toStack + ": " + warning);
+    }
+}
 } catch (final IOException e) {
 return;
 }
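
Balancer.push and NoticedURL.push now share a convention: null means success, any other String is a human-readable reason that callers can log, which is what produces the new "why was this URL not taken" crawler messages. A self-contained illustration of the convention (types and names are hypothetical):

    import java.util.HashSet;
    import java.util.Set;

    final class PushConventionSketch {
        private final Set<String> doublePushCheck = new HashSet<String>();

        // Returns null on success, or an explanation when the hash is rejected,
        // in the style of Balancer.push / NoticedURL.push.
        String push(final String urlHash) {
            if (doublePushCheck.contains(urlHash)) return "double occurrence in double_push_check";
            doublePushCheck.add(urlHash);
            return null;
        }

        public static void main(final String[] args) {
            final PushConventionSketch stack = new PushConventionSketch();
            stack.push("AAAAAAAAAAAA");
            final String warning = stack.push("AAAAAAAAAAAA");
            // the rejection reason is logged instead of being silently dropped
            if (warning != null) System.out.println("not pushed: " + warning);
        }
    }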

@@ -234,6 +234,7 @@ public class WorkTables extends Tables {
 if (row == null) continue;
 String url = "http://" + host + ":" + port + UTF8.String(row.get(WorkTables.TABLE_API_COL_URL));
 url += "&" + WorkTables.TABLE_API_COL_APICALL_PK + "=" + UTF8.String(row.getPK());
+Log.logInfo("WorkTables", "executing url: " + url);
 try {
 client.GETbytes(url);
 l.put(url, client.getStatusCode());

@@ -265,8 +265,9 @@ public class HTTPClient {
 */
 public byte[] GETbytes(final String uri, long maxBytes) throws IOException {
 final MultiProtocolURI url = new MultiProtocolURI(uri);
-final HttpGet httpGet = new HttpGet(url.toNormalform(true, false, true, false));
-setHost(url.getHost()); // overwrite resolved IP, needed for shared web hosting DO NOT REMOVE, see http://en.wikipedia.org/wiki/Shared_web_hosting_service
+boolean localhost = url.getHost().equals("localhost");
+final HttpGet httpGet = new HttpGet(url.toNormalform(true, false, !localhost, false));
+if (!localhost) setHost(url.getHost()); // overwrite resolved IP, needed for shared web hosting DO NOT REMOVE, see http://en.wikipedia.org/wiki/Shared_web_hosting_service
 return getContentBytes(httpGet, maxBytes);
 }

@@ -488,7 +489,6 @@ public class HTTPClient {
 httpResponse = httpClient.execute(httpUriRequest, httpContext);
 } catch (Exception e) {
-//e.printStackTrace();
 ConnectionInfo.removeConnection(httpUriRequest.hashCode());
 httpUriRequest.abort();
 throw new IOException("Client can't execute: " + e.getMessage());
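
The GETbytes change above is the steering self-call fix: YaCy normally sends a request to the resolved IP address while the original name travels in the Host header (required for shared web hosting), and that rewrite is now skipped for localhost. A hypothetical illustration of the guard:

    import java.net.InetAddress;
    import java.net.UnknownHostException;

    public final class SelfCallSketch {
        // For ordinary hosts the request goes to the resolved address while the
        // Host header keeps the name; a self-call to localhost skips resolution
        // so the steering request reliably reaches the local peer.
        static String requestTarget(final String host) throws UnknownHostException {
            final boolean localhost = host.equals("localhost");
            return localhost ? host : InetAddress.getByName(host).getHostAddress();
        }

        public static void main(final String[] args) throws UnknownHostException {
            System.out.println(requestTarget("localhost"));   // printed as-is, unresolved
            System.out.println(requestTarget("example.org")); // a resolved IP address (performs a DNS lookup)
        }
    }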

@@ -165,45 +165,43 @@ public class RasterPlotter {
 public void plot(final int x, final int y, final int intensity) {
 if ((x < 0) || (x >= width)) return;
 if ((y < 0) || (y >= height)) return;
-synchronized (cc) {
-if (this.defaultMode == DrawMode.MODE_REPLACE) {
-if (intensity == 100) {
+if (this.defaultMode == DrawMode.MODE_REPLACE) {
+if (intensity == 100) synchronized (cc) {
 cc[0] = defaultColR;
 cc[1] = defaultColG;
 cc[2] = defaultColB;
 grid.setPixel(x, y, cc);
-} else {
+} else synchronized (cc) {
 final int[] c = grid.getPixel(x, y, cc);
 c[0] = (intensity * defaultColR + (100 - intensity) * c[0]) / 100;
 c[1] = (intensity * defaultColG + (100 - intensity) * c[1]) / 100;
 c[2] = (intensity * defaultColB + (100 - intensity) * c[2]) / 100;
 grid.setPixel(x, y, c);
 }
-} else if (this.defaultMode == DrawMode.MODE_ADD) {
+} else if (this.defaultMode == DrawMode.MODE_ADD) synchronized (cc) {
 final int[] c = grid.getPixel(x, y, cc);
 if (intensity == 100) {
 c[0] = (0xff & c[0]) + defaultColR; if (cc[0] > 255) cc[0] = 255;
 c[1] = (0xff & c[1]) + defaultColG; if (cc[1] > 255) cc[1] = 255;
 c[2] = (0xff & c[2]) + defaultColB; if (cc[2] > 255) cc[2] = 255;
 } else {
 c[0] = (0xff & c[0]) + (intensity * defaultColR / 100); if (cc[0] > 255) cc[0] = 255;
 c[1] = (0xff & c[1]) + (intensity * defaultColG / 100); if (cc[1] > 255) cc[1] = 255;
 c[2] = (0xff & c[2]) + (intensity * defaultColB / 100); if (cc[2] > 255) cc[2] = 255;
 }
 grid.setPixel(x, y, c);
-} else if (this.defaultMode == DrawMode.MODE_SUB) {
+} else if (this.defaultMode == DrawMode.MODE_SUB) synchronized (cc) {
 final int[] c = grid.getPixel(x, y, cc);
 if (intensity == 100) {
 c[0] = (0xff & c[0]) - defaultColR; if (cc[0] < 0) cc[0] = 0;
 c[1] = (0xff & c[1]) - defaultColG; if (cc[1] < 0) cc[1] = 0;
 c[2] = (0xff & c[2]) - defaultColB; if (cc[2] < 0) cc[2] = 0;
 } else {
 c[0] = (0xff & c[0]) - (intensity * defaultColR / 100); if (cc[0] < 0) cc[0] = 0;
 c[1] = (0xff & c[1]) - (intensity * defaultColG / 100); if (cc[1] < 0) cc[1] = 0;
 c[2] = (0xff & c[2]) - (intensity * defaultColB / 100); if (cc[2] < 0) cc[2] = 0;
 }
 grid.setPixel(x, y, c);
 }
-}
 }
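
The plot() rewrite above is a lock-scope refactoring: instead of one synchronized (cc) block around the whole mode dispatch, each drawing branch takes the lock only while it touches the shared color buffer. A minimal sketch of the pattern (the class is illustrative):

    public final class LockScopeSketch {
        private final int[] cc = new int[3];

        // The branch decision needs no lock; only the read-modify-write of the
        // shared buffer is guarded, so the lock is held for a shorter time and
        // concurrent painters contend less.
        void plotReplace(final int r, final int g, final int b) {
            synchronized (cc) {
                cc[0] = r;
                cc[1] = g;
                cc[2] = b;
            }
        }
    }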
