author     Arthur de Jong <arthur@arthurdejong.org>  2006-06-04 23:28:26 +0200
committer  Arthur de Jong <arthur@arthurdejong.org>  2006-06-04 23:28:26 +0200
commit     4cdffb4fdb0061e4730c53acc2af6e9a4914f968 (patch)
tree       bb81abd35a039af5b16048b02b15a2df5a378a56
parent     f5c80543c74dc0d031fa29bfba7fc911afa59475 (diff)
fix typos and fix example explanation
git-svn-id: http://arthurdejong.org/svn/webcheck/webcheck@289 86f53f14-5ff3-0310-afe5-9b438ce3f40c
-rw-r--r--  webcheck.1  7
1 file changed, 4 insertions, 3 deletions
@@ -108,7 +108,7 @@ as the previous run.
 .br
 Note that this option is experimental and it's semantics may change
 with coming releases (especially in relation to other options).
-Also note that the stored files are not quaranteed to be compatible
+Also note that the stored files are not guaranteed to be compatible
 between releases.
 .TP
@@ -160,14 +160,15 @@ URLs of unsupported schemes are also considered yanked.
 .SH "EXAMPLES"
-Check the site www.example.com but exclude any path with "/webcheck" in it.
+Check the site www.example.com but consider any path with "/webcheck" in it
+to be external.
 .ft B
 webcheck http://www.example.com/ \-x /webcheck
 .ft R
 .SH "NOTES"
-When checking internal URLs webcheck honours the robots.txt file, identifying
+When checking internal URLs webcheck honors the robots.txt file, identifying
 itself as user-agent webcheck. Disallowed links will not be checked at all
 as if the \-y option was specified for that URL. To allow webcheck to crawl
 parts of a site that other robots are disallowed, use something like:
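The NOTES hunk ends at "use something like:" with the actual example lying
outside the hunk boundary, so it is not shown here. A sketch of the kind of
robots.txt fragment conventionally used to allow a single crawler through
while other robots stay restricted (hypothetical; the real text in webcheck.1
may differ):

```
# Allow the user-agent "webcheck" everywhere; an empty Disallow
# record means nothing is disallowed for this agent.
User-agent: webcheck
Disallow:

# Other robots fall through to the generic rules, e.g.:
User-agent: *
Disallow: /private/
```

A crawler matches the most specific User-agent record that applies to it, so
the webcheck record takes precedence over the `*` record for webcheck itself.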