Arthur de Jong

Open Source / Free Software developer

author    Arthur de Jong <arthur@arthurdejong.org>  2006-06-04 23:28:26 +0200
committer Arthur de Jong <arthur@arthurdejong.org>  2006-06-04 23:28:26 +0200
commit    4cdffb4fdb0061e4730c53acc2af6e9a4914f968 (patch)
tree      bb81abd35a039af5b16048b02b15a2df5a378a56
parent    f5c80543c74dc0d031fa29bfba7fc911afa59475 (diff)
fix typos and fix example explanation
git-svn-id: http://arthurdejong.org/svn/webcheck/webcheck@289 86f53f14-5ff3-0310-afe5-9b438ce3f40c
-rw-r--r--  webcheck.1 | 7
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/webcheck.1 b/webcheck.1
index 6aa81e6..e67e2bb 100644
--- a/webcheck.1
+++ b/webcheck.1
@@ -108,7 +108,7 @@ as the previous run.
.br
Note that this option is experimental and it's semantics may change
with coming releases (especially in relation to other options).
-Also note that the stored files are not quaranteed to be compatible
+Also note that the stored files are not guaranteed to be compatible
between releases.
.TP
@@ -160,14 +160,15 @@ URLs of unsupported schemes are also considered yanked.
.SH "EXAMPLES"
-Check the site www.example.com but exclude any path with "/webcheck" in it.
+Check the site www.example.com but consider any path with "/webcheck" in it
+to be external.
.ft B
webcheck http://www.example.com/ \-x /webcheck
.ft R
.SH "NOTES"
-When checking internal URLs webcheck honours the robots.txt file, identifying
+When checking internal URLs webcheck honors the robots.txt file, identifying
itself as user-agent webcheck. Disallowed links will not be checked at all as
if the \-y option was specified for that URL. To allow webcheck to crawl parts
of a site that other robots are disallowed, use something like:
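The robots.txt stanza the man page alludes to is truncated here, but the mechanism it describes is the standard Robots Exclusion Protocol: a site can disallow generic crawlers while granting a specific user-agent (here, webcheck) full access. A sketch of such a configuration might look like the following; the exact text in the man page may differ, and the paths shown are illustrative only:

```
# /robots.txt (illustrative sketch, not the man page's actual example)

# Allow webcheck to crawl everything: an empty Disallow
# means no path is off-limits for this user-agent.
User-agent: webcheck
Disallow:

# Keep other robots out of the same area.
User-agent: *
Disallow: /
```

Per the protocol, a crawler applies the most specific matching `User-agent` group, so webcheck would ignore the blanket `Disallow: /` aimed at other robots.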