From 4cdffb4fdb0061e4730c53acc2af6e9a4914f968 Mon Sep 17 00:00:00 2001
From: Arthur de Jong
Date: Sun, 4 Jun 2006 21:28:26 +0000
Subject: fix typos and fix example explanation

git-svn-id: http://arthurdejong.org/svn/webcheck/webcheck@289 86f53f14-5ff3-0310-afe5-9b438ce3f40c
---
 webcheck.1 | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/webcheck.1 b/webcheck.1
index 6aa81e6..e67e2bb 100644
--- a/webcheck.1
+++ b/webcheck.1
@@ -108,7 +108,7 @@ as the previous run.
 .br
 Note that this option is experimental and it's semantics may change
 with coming releases (especially in relation to other options).
-Also note that the stored files are not quaranteed to be compatible
+Also note that the stored files are not guaranteed to be compatible
 between releases.
 
 .TP
@@ -160,14 +160,15 @@ URLs of unsupported schemes are also considered yanked.
 
 .SH "EXAMPLES"
-Check the site www.example.com but exclude any path with "/webcheck" in it.
+Check the site www.example.com but consider any path with "/webcheck" in it
+to be external.
 
 .ft B
 webcheck http://www.example.com/ \-x /webcheck
 .ft R
 
 .SH "NOTES"
-When checking internal URLs webcheck honours the robots.txt file, identifying
+When checking internal URLs webcheck honors the robots.txt file, identifying
 itself as user-agent webcheck. Disallowed links will not be checked at all
 as if the \-y option was specified for that URL.
 To allow webcheck to crawl parts of a site that other robots are disallowed,
 use something like:
-- 
cgit v1.2.3
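
The NOTES hunk ends at "use something like:", so the robots.txt sample that
follows in the man page lies outside the diff context. A conventional
robots.txt stanza matching what the text describes, letting webcheck crawl
while other robots remain disallowed, would look like the sketch below; the
/private/ path is an illustrative placeholder, not taken from the man page:

    # Allow webcheck everywhere: an empty Disallow value disallows nothing.
    User-agent: webcheck
    Disallow:

    # Keep all other robots out of the listed paths.
    User-agent: *
    Disallow: /private/

Robots that honor robots.txt obey the group whose User-agent token best
matches their own name, so webcheck follows the first stanza and ignores the
wildcard block.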