diff --git a/htroot/ConfigBasic.html b/htroot/ConfigBasic.html
index f56353e13..c6e823dee 100644
--- a/htroot/ConfigBasic.html
+++ b/htroot/ConfigBasic.html
@@ -114,7 +114,7 @@
Your basic configuration is complete! You can now (for example):
::
diff --git a/htroot/CrawlStartSite.html b/htroot/CrawlStartSite.html
index 3142a51d8..257d80da5 100644
--- a/htroot/CrawlStartSite.html
+++ b/htroot/CrawlStartSite.html
@@ -102,7 +102,7 @@
Target Balancer
A second crawl for a different host increases the throughput to a maximum of 240 documents per minute since the crawler balances the load over all hosts.
High Speed Crawling
A 'shallow crawl', which is not limited to a single host (or site),
can raise the pages per minute (ppm) rate to an effectively unlimited number of documents per minute when the number of target hosts is high.
- This can be done using the Expert Crawl Start servlet.
+ This can be done using the Expert Crawl Start servlet.
Scheduler Steering
The scheduler on crawls can be changed or removed using the API Steering.
#%env/templates/footer.template%#
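The template text in this hunk points to the Expert Crawl Start servlet and to API Steering for scheduled crawls. As a rough illustration only (not part of this patch), the sketch below triggers a crawl on a local peer over HTTP. The servlet name Crawler_p.html, the parameter names crawlingURL and crawlingDepth, the default port 8090, and admin access for localhost requests are assumptions about typical YaCy defaults, not something this diff confirms.

// Hypothetical sketch: start a crawl on a local YaCy peer via HTTP.
// Assumptions (not taken from this patch): default port 8090, servlet
// Crawler_p.html, parameter names crawlingURL/crawlingDepth, and that
// requests from localhost are granted admin access by the default config.
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class CrawlStartSketch {
    public static void main(String[] args) throws Exception {
        String startUrl = URLEncoder.encode("http://example.org/", StandardCharsets.UTF_8);
        URI uri = URI.create("http://localhost:8090/Crawler_p.html"
                + "?crawlingURL=" + startUrl
                + "&crawlingDepth=2");          // assumed parameter names

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // 200 means the servlet accepted the request; a 401 would indicate
        // that admin authentication is required on this peer.
        System.out.println("HTTP " + response.statusCode());
    }
}

A crawl started this way would then appear in the API steering table mentioned above, where its schedule can be changed or removed.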
diff --git a/htroot/Status.html b/htroot/Status.html
index 18a14e694..ae8ba87ba 100644
--- a/htroot/Status.html
+++ b/htroot/Status.html
@@ -160,7 +160,7 @@
#(hintCrawlStart)#::
- Your Web Page Indexer is idle. You can start your own web crawl here.
+ Your Web Page Indexer is idle. You can start your own web crawl here.
#(/hintCrawlStart)#
diff --git a/htroot/index.html b/htroot/index.html
index bc7eb9c55..9eb0b6370 100644
--- a/htroot/index.html
+++ b/htroot/index.html
@@ -147,7 +147,7 @@
/http
only resources from http or https servers
/ftp
- only resources from ftp servers (they are rare, crawl them yourself)
+ only resources from ftp servers (they are rare, crawl them yourself)
/smb
only resources from smb servers (Intranet Indexing must be selected)
/file
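The index.html hunk above lists the resource filters (/http, /ftp, /smb, /file) that restrict a search to a given protocol. As a hedged illustration, the sketch below issues a search limited to ftp resources by appending the /ftp modifier to the query term. The servlet name yacysearch.html, the query parameter name, and port 8090 are assumptions based on common YaCy defaults rather than anything stated in this diff.

// Hypothetical sketch: run a search restricted to ftp resources by
// appending the /ftp modifier to the query term.
// Assumptions (not from this patch): default port 8090, search servlet
// yacysearch.html, and query parameter name "query".
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class SearchModifierSketch {
    public static void main(String[] args) throws Exception {
        // "/ftp" limits results to resources from ftp servers, matching the
        // option described in the template text above.
        String query = URLEncoder.encode("yacy /ftp", StandardCharsets.UTF_8);
        URI uri = URI.create("http://localhost:8090/yacysearch.html?query=" + query);

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(uri).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
    }
}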
diff --git a/htroot/terminal_p.html b/htroot/terminal_p.html
index f8630e123..467bcad6e 100644
--- a/htroot/terminal_p.html
+++ b/htroot/terminal_p.html
@@ -42,7 +42,7 @@ function init() {