diff --git a/htroot/EditProfile_p.html b/htroot/EditProfile_p.html
index 4c3803e9e..d1d959435 100644
--- a/htroot/EditProfile_p.html
+++ b/htroot/EditProfile_p.html
@@ -18,19 +18,19 @@ You do not need to provide any personal data here, but if you want to distribute
 Name
 Nick Name
 Homepage
-EMail
+eMail
@@ -43,15 +43,15 @@ You do not need to provide any personal data here, but if you want to distribute
 Jabber
 Yahoo!
 MSN
diff --git a/htroot/IndexCreate_p.html b/htroot/IndexCreate_p.html
index d5fe020b9..d3e8f3b86 100644
--- a/htroot/IndexCreate_p.html
+++ b/htroot/IndexCreate_p.html
@@ -28,15 +28,15 @@ You can define url's as start points for Web page crawling and start that crawli
 A minimum of 1 is recommended. Be careful with the prefetch number. Consider a branching factor of average 20;
-A prefect-depth of 8 would index 25.600.000.000 pages, maybe the whole WWW.
+A prefetch-depth of 8 would index 25.600.000.000 pages, maybe the whole WWW.
 Crawling Filter:
-This is an emacs-like regular expression that must match with the crawled url.
-Use this i.e. to crawl a single domain. If you set this filter is would make sense to increase
+This is an emacs-like regular expression that must match with the crawled URL.
+Use this e.g. to crawl a single domain. If you set this filter it would make sense to increase
 the crawl depth.
@@ -45,7 +45,7 @@ You can define url's as start points for Web page crawling and start that crawli
 URL's pointing to dynamic content should usually not be crawled. However, there are sometimes web pages with static content that
-is accessed with URL's containing question marks. If you are unshure, do not check this to avoid crawl loops.
+is accessed with URL's containing question marks. If you are unsure, do not check this to avoid crawl loops.
@@ -71,7 +71,7 @@ You can define url's as start points for Web page crawling and start that crawli
 If checked, the crawl will try to assign the leaf nodes of the search tree to remote peers. If you need your crawling results locally, you must switch this off.
-Only senior and principal peer's can initiate or receive remote crawls.
+Only senior and principal peers can initiate or receive remote crawls.
@@ -104,8 +104,8 @@ You can define url's as start points for Web page crawling and start that crawli
 Start Point:
-Existing start url's are re-crawled.
-Other already visited url's are sorted out as 'double'.
+Existing start URL's are re-crawled.
+Other already visited URL's are sorted out as 'double'.
 A complete re-crawl will be available soon.
diff --git a/htroot/MessageSend_p.html b/htroot/MessageSend_p.html
index c79ee39f9..a55d9e1d4 100644
--- a/htroot/MessageSend_p.html
+++ b/htroot/MessageSend_p.html
@@ -1,13 +1,13 @@
-YaCy: Send Message
+YaCy: Send message
 #[metas]#
 #[header]#
-Send Message
+Send message
 #[body]#
diff --git a/htroot/Network.html b/htroot/Network.html
index 824c7ccc3..596864ef5 100644
--- a/htroot/Network.html
+++ b/htroot/Network.html
@@ -38,7 +38,7 @@
-
+
 #(complete)#::
 Profile
 Message
 Name*
 Name
 Address
 Hash
diff --git a/htroot/env/templates/metas.template b/htroot/env/templates/metas.template
index 9356e37ad..db1fa47ae 100644
--- a/htroot/env/templates/metas.template
+++ b/htroot/env/templates/metas.template
@@ -1,7 +1,7 @@
-
+
diff --git a/htroot/proxymsg/error.html b/htroot/proxymsg/error.html
index 9239617c2..b8124ecbc 100644
--- a/htroot/proxymsg/error.html
+++ b/htroot/proxymsg/error.html
@@ -38,7 +38,7 @@
-Error with url '#[url]#':
+Error with URL '#[url]#':
 #[httperror]#
 #(errormessage)#
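
Two of the IndexCreate_p.html strings touched above carry technical claims worth a quick check: the prefetch-depth figure and the single-domain crawl filter. The sketch below is a minimal illustration, not YaCy code; the example.org pattern is hypothetical, and YaCy's "emacs-like" filter syntax may differ from Python's `re`. It confirms the 20^8 arithmetic behind the 25.600.000.000 figure and shows the kind of filter the help text describes.

```python
import re

# Claim in IndexCreate_p.html: with an average branching factor of 20,
# a prefetch depth of 8 reaches 20^8 pages.
branching_factor = 20
prefetch_depth = 8
pages = branching_factor ** prefetch_depth
print(f"{pages:,}")  # 25,600,000,000 -- the 25.600.000.000 cited in the help text

# Hypothetical single-domain crawl filter, approximated with Python's re
# syntax; YaCy's "emacs-like" regular expressions may differ slightly.
crawl_filter = re.compile(r"http://(www\.)?example\.org/.*")
print(bool(crawl_filter.match("http://www.example.org/index.html")))  # True
print(bool(crawl_filter.match("http://other.net/page.html")))         # False
```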