diff --git a/upstream/debian-unstable/man1/suck.1 b/upstream/debian-unstable/man1/suck.1
new file mode 100644
index 00000000..e0dc8c0f
--- /dev/null
+++ b/upstream/debian-unstable/man1/suck.1
@@ -0,0 +1,1380 @@
+.\" $Revision: 4.2.0 $
+.TH SUCK 1
+.SH NAME
+suck - Pull a small newsfeed from an NNTP server, avoiding the NEWNEWS command.
+.SH SYNOPSIS
+.I suck
+[
+.BI
+hostname
+]
+[
+.BI @filename
+]
+[
+.BI \-V
+]
+[
+.BI \-K
+]
+[
+.BI \-L[SL]
+]
+[
+.BI \-LF
+filename
+]
+[
+.BI \-H
+]
+[
+.BI \-HF
+filename
+]
+[
+.BI \-d[tmd]
+dirname
+]
+[
+.BI \-s\ |\ \-S
+filename
+]
+[
+.BI \-e\ |\ \-E
+filename
+]
+[
+.BI \-a
+]
+[
+.BI \-m
+]
+[
+.BI \-b[irlf]
+batchfile
+]
+[
+.BI \-r
+filesize
+]
+[
+.BI \-p
+extension
+]
+[
+.BI \-U
+userid
+]
+[
+.BI \-P
+password
+]
+[
+.BI \-Q
+]
+[
+.BI \-c
+]
+[
+.BI \-M
+]
+[
+.BI \-N
+port_number
+]
+[
+.BI \-W
+pause_time pause_nr_msgs
+]
+[
+.BI \-w
+pause_time pause_nr_msgs
+]
+[
+.BI \-l
+phrase_file
+]
+[
+.BI \-D
+]
+[
+.BI \-R
+]
+[
+.BI \-q
+]
+[
+.BI \-C
+count
+]
+[
+.BI \-k
+]
+[
+.BI \-A
+]
+[
+.BI \-AL
+activefile
+]
+[
+.BI \-hl
+localhost
+]
+[
+.BI \-bp
+]
+[
+.BI \-T
+timeout
+]
+[
+.BI \-n
+]
+[
+.BI \-u
+]
+[
+.BI \-z
+]
+[
+.BI \-x
+]
+[
+.BI \-B
+]
+[
+.BI \-O
+]
+[
+.BI \-G
+]
+[
+.BI \-X
+]
+[
+.BI \-f
+]
+[
+.BI \-y
+post_filter
+]
+[
+.BI \-F
+]
+[
+.BI \-g
+]
+[
+.BI \-i
+number_to_read
+]
+[
+.BI \-Z
+]
+[
+.BI \-rc
+]
+[
+.BI \-lr
+]
+[
+.BI \-sg
+]
+[
+.BI \-ssl
+]
+[
+.BI \-SSL
+]
+
+Options valid in all modes
+hostname
+
+The hostname may optionally include the port number, in the form
+.BI Host:Port.
+If this option is used, any port number specified
+via the -N option is ignored.
+
+\@filename
+
+This option tells suck to read other options from a file in addition to the
+commandline.
+
+\-a
+
+This option forces suck to always batch up any downloaded articles,
+even if suck aborts for any reason. Without this option, suck will
+only batch up articles if it finishes successfully or is cancelled by
+a signal (see below).
+
+\-A
+
+This option tells suck to scan the localhost (specified with the \-hl option) and use its active file
+to build and update the sucknewsrc. If you add a group to your local server, suck will add it to
+sucknewsrc and download articles. Or, if you delete a group from your local server, it will be deleted
+from sucknewsrc. If posting is not allowed to a particular group, then the line in sucknewsrc is
+just commented out. With this option, you should never have to edit your sucknewsrc. In case you have
+newsgroups (like control and junk) that you don't want downloaded, you can put these newsgroups in a
+file "active-ignore", one per line, and suck will ignore these newsgroups when it scans the localhost.
+If your system supports regex(), you may use regular expressions in the active-ignore file to skip multiple groups, eg: fred\.*.
+If you use the -p (postfix) option, suck will check for the existence of an active-ignore file with the
+postfix. If that doesn't exist, then suck will check for the existence of the file without the postfix.
+
+NOTE: If the localhost is on a non-standard port, the port number may be specified as part of the hostname,
+in the form
+.BI Host:Port.
+
+NOTE: If you use regular expressions, suck will silently add a "^" to the beginning of the group name,
+and a "$" to the end of the group name if they aren't already present, so that if you have "comp.os.linux",
+it won't match "comp.os.linux.answers" or if you have "alt.test" it doesn't match "comp.alt.test".
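The anchoring rule above can be sketched in Python (a hypothetical helper for illustration only; suck itself is written in C):

```python
import re

def anchor(pattern):
    # Mimic suck's silent anchoring of active-ignore entries:
    # add "^" and "$" only if they aren't already present.
    if not pattern.startswith("^"):
        pattern = "^" + pattern
    if not pattern.endswith("$"):
        pattern = pattern + "$"
    return pattern

# "comp.os.linux" now matches only itself, not comp.os.linux.answers:
rx = re.compile(anchor(r"comp\.os\.linux"))
```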
+
+\-AL activefile
+
+This option is identical to the -A option, except it reads the active file from the local file specified instead of
+reading it from the localhost. All the caveats from the -A option apply to this option as well. If both
+options are used on the command line, suck first tries to use the -A option, then if that fails it uses
+this option.
+
+\-B
+
+This option tells suck to attempt to batch up any articles in its directory
+BEFORE starting to download messages. This can be useful if you have a
+problem with the previous download. This option will only work if you specify
+a batch option (see below). If there are no messages to batch up, some
+of the batch options may produce warning messages. They may be safely ignored.
+Also, in inn-batch mode, if the batch file still exists at the end of the run, it
+will be overwritten, since the new batch file will contain all messages. In
+rnews mode, if the batch file exists, suck will abort and not batch up any messages.
+
+\-c
+
+If this option is specified, suck will clean up after itself. This includes:
+.RS
+1. Moving sucknewsrc to sucknewsrc.old
+.RE
+.RS
+2. Moving suck.newrc to sucknewsrc
+.RE
+.RS
+3. rm suck.sorted and suckothermsgs.
+.RE
+
+\-C count
+
+This option tells suck to drop the connection and reopen it every count number of articles.
+This is designed to battle INN's LIKE_PULLERS=DONT option, that some folks compile in. With
+LIKE_PULLERS=DONT, after 100 messages INN will pause between every message, dramatically
+reducing your download speed. I don't recommend the use of this, but if you have no other choice....
+
+\-dd dirname
+
+\-dm dirname
+
+\-dt dirname
+
+Specify the location of the various files used by suck.
+
+\-dd dirname = directory of data files used by suck (sucknewsrc suckkillfile suckothermsgs active-ignore sucknodownload)
+
+\-dm dirname = directory for storage of articles created in Multifile mode
+or batch mode. DO NOT make this the same as the directories used for the
+\-dt or \-dd options, or you will lose all your configuration files.
+
+\-dt dirname = directory of temp files created by suck (suck.newrc, suck.sort, suck.restart, suck.killlog, suck.post).
+
+\-D
+
+This option tells suck to log various debugging messages to "debug.suck", primarily
+for use by the maintainer.
+
+\-e | \-E filename
+
+These options will send all error messages (normally displayed on stderr), to
+an alternate file. The lower case version, -e, will send the error messages
+to the compiled-in default defined in suck_config.h. The default is suck.errlog.
+The upper case version, -E, requires the filename parameter. All error messages
+will then be sent to this file.
+
+\-f
+
+This option tells suck to reconnect after deduping, and before downloading the articles. This is in case
+long dedupe times cause timeouts on the remote end.
+
+\-F
+
+This option tells suck to reconnect after reading the local active file, and before downloading the Msg-IDs.
+This is in case of a large active file, which causes timeouts on the remote end.
+
+\-g
+
+This option causes suck to only download the headers of any selected articles.
+As a result of this, any batching of articles is skipped. This option does
+work with killfiles, however, killfile options such as BODYSIZE> will be
+ignored, since the body of the article will never be downloaded.
+
+\-G
+
+This option causes suck to display the message count and BPS status lines in a slightly different format,
+more suitable for use by a filter program (such as a GUI).
+
+\-H
+
+This option will cause suck to bypass the history check.
+
+\-HF history_file_name
+
+This option tells suck the location of the history file. The default is /var/lib/news/history.
+
+\-hl localhost
+
+This option specifies the localhost name. This option is required with both the \-A and the \-bp option.
+
+\-i number_to_read
+
+This option tells suck the number of articles to download if you are using the -A
+or -AL option, and a new group is added. The default is defined in suck_config.h (ACTIVE_DEFAULT_LASTREAD, currently -100). NOTE: This must be a negative
+number (eg -100, -50), or 0, to download all articles currently available in
+the group.
+
+\-k
+
+This option tells suck to NOT attach the postfix from the \-p option to the names of the killfiles,
+both the master killfile and any group files. This allows you to maintain one set of killfiles for
+multiple servers.
+
+\-K
+
+This option will cause suck to bypass checking the killfile(s).
+
+\-l phrase_file
+
+This option tells suck to load in an alternate phrase file, instead of using
+the built-in messages. This allows you to have suck print phrases in another
+language, or to allow you to customize the messages without re-building suck.
+See below.
+
+\-lr
+
+This option is used in conjunction with the highest article option in the sucknewsrc, to
+download the oldest articles, vice the newest. See that section for more details.
+
+\-L
+
+This option tells suck to NOT log killed articles to suck.killlog.
+
+\-LF filename
+
+This option allows you to override the built-in default of "suck.killlog" for the
+file which contains the log entries for killed articles.
+
+\-LL
+
+This option tells suck to create long log entries for each killed article. The long
+entry contains the short log entry and the header for the killed message.
+
+\-LS
+
+This option tells suck to create short log entries for each killed article. The short
+entry contains which group and which pattern was matched, as well as the MsgID of the
+killed article.
+
+\-M
+
+This option tells suck to send the "mode reader" command to the remote
+server. If you get an invalid command message immediately
+after the welcome announcement, then try this option.
+
+\-n
+
+This option tells suck to use the article number vice the MsgId to retrieve the articles. This
+option is supposedly less harsh on the remote server. It can also eliminate problems if your
+ISP ages off articles quickly and you frequently get "article not found" errors.
+Also, if your ISP uses DNEWS, you might need this option so that it knows you're reading articles in a group.
+
+\-N port_number
+
+This option tells suck to use an alternate NNRP port number when connecting
+to the host, instead of the default, 119.
+
+\-O
+
+This option tells suck to skip the first article upon restart. Use it whenever
+there is a problem with an article on the remote server. For some reason, some
+NNTP servers time out when they have a problem with a particular article.
+Yet, when you restart, you're back on the same article, and you time out again.
+Skipping the first article upon restart lets you
+get the rest of the articles.
+
+\-p extension
+
+This extension is added to all files so that you can have multiple site feeds.
+For example, if you specify -p .dummy, then suck looks for sucknewsrc.dummy, suckkillfile.dummy,
+etc, and creates its temp files with the same extension. This will allow you to keep
+multiple sucknewsrc files, one for each site.
+
+\-q
+
+This option tells suck to not display the BPS and article count messages during download.
+Handy when running suck unattended, such as from a crontab.
+
+\-R
+
+This option tells suck to skip a rescan of the remote newserver upon a restart. The
+default is to rescan the newserver for any new articles whenever suck runs, including
+restarts.
+
+\-rc
+
+This option tells suck to change its behavior when the remote server resets its article
+counters. The default behavior is to reset the lastread in sucknewsrc to the current
+high article counter. With this option, suck resets the lastread in sucknewsrc to the
+current low article counter, causing it to suck all articles in the group, and using
+the historydb routines to dedupe existing articles.
+
+\-s | \-S filename
+
+These options will send all status messages (normally displayed on stdout), to
+an alternate file. The lower case version, -s, will send the status messages
+to the compiled-in default defined in suck_config.h. The default is /dev/null,
+so no status messages will be displayed. The upper case version, -S, requires
+the filename parameter. All status messages will then be sent to this file.
+
+\-sg
+
+This option tells suck to add the name of the current group being downloaded, if known,
+to the BPS display. Typically the only time suck doesn't know the group name is if
+an article is downloaded via the suckothermsgs file.
+
+\-ssl
+
+This option tells suck to use SSL to talk to the remote server, if suck was compiled with
+SSL support.
+
+\-SSL
+
+This option tells suck to use SSL to talk to the local server, if suck was compiled with
+SSL support.
+
+\-T timeout
+
+This option overrides the compiled-in TIMEOUT value. This is how long suck waits for data from the
+remote host before timing out and aborting. The timeout value is in seconds.
+
+\-u
+
+This option tells suck to send the AUTHINFO USER command immediately upon connect to the
+remote server, rather than wait for a request for authorization. You must supply the
+\-U and \-P options when you use this option.
+
+\-U userid
+
+\-P password
+
+These two options let you specify a userid and password, if your NNTP server
+requires them.
+
+\-Q
+
+This option tells suck to get the userid and password for NNTP authentication from
+the environment variables "NNTP_USER" and "NNTP_PASS" vice the \-U and \-P options.
+This prevents a potential security problem where someone doing a ps command can
+see your userid and password.
+
+\-V
+
+This option will cause suck to print out the version number and then exit.
+
+\-w pause_time pause_nr_msgs
+
+This option allows you to slow down suck while pulling articles. If you
+send suck a predefined signal (default SIGUSR1, see suck_config.h),
+suck will swap the default pause options (if specified by the -W option),
+with the values from this option. For example, you run suck with -w 2 2,
+and you send suck a SIGUSR1 (using kill), suck will then pause 2 seconds
+between every other message, allowing the server to "catch its breath."
+If you send suck another SIGUSR1, then suck will put back the default
+pause options. If no pause options were specified on the command line
+(you omitted -W), then suck will return to the default full speed pull.
+
+\-W pause_time pause_nr_msgs
+
+This option tells suck to pause between the download of articles. You need
+to specify how long to pause (in seconds), and how often to pause (every X nr
+of articles). Ex: \-W 10 100 would cause suck to pause for 10 seconds every
+100 articles. Why would you want to do this? Suck can cause heavy loads on
+a remote server, and this pause allows the server to "catch its breath."
+
+\-x
+
+This option tells suck to not check the Message-IDs for the ending > character. This option
+is for brain dead NNTP servers that truncate the XHDR information at 72 characters.
+
+\-X
+
+This option tells suck to bypass the XOVER killfiles.
+
+\-y post_filter
+
+This option is only valid when using any of the batch modes. It allows you to edit any or all of
+the articles downloaded before posting to the local host. See below for more details.
+
+\-z
+
+This option tells suck to bypass the normal deduping process. This is primarily for
+slow machines where the deduping takes longer than the download of messages would. Not
+recommended.
+
+\-Z
+
+This option tells suck to use the XOVER command vice the XHDR command to retrieve the
+information needed to download articles. Use this if your remote news server doesn't
+support the XHDR command.
+
+.SH LONG OPTION EQUIVALENTS
+.RS
+\-a \-\-always_batch
+.RE
+.RS
+\-bi \-\-batch-inn
+.RE
+.RS
+\-br \-\-batch_rnews
+.RE
+.RS
+\-bl \-\-batch_lmove
+.RE
+.RS
+\-bf \-\-batch_innfeed
+.RE
+.RS
+\-bp \-\-batch_post
+.RE
+.RS
+\-c \-\-cleanup
+.RE
+.RS
+\-dt \-\-dir_temp
+.RE
+.RS
+\-dd \-\-dir_data
+.RE
+.RS
+\-dm \-\-dir_msgs
+.RE
+.RS
+\-e \-\-def_error_log
+.RE
+.RS
+\-f \-\-reconnect_dedupe
+.RE
+.RS
+\-g \-\-header_only
+.RE
+.RS
+\-h \-\-host
+.RE
+.RS
+\-hl \-\-localhost
+.RE
+.RS
+\-k \-\-kill_no_postfix
+.RE
+.RS
+\-l \-\-language_file
+.RE
+.RS
+\-lr \-\-low_read
+.RE
+.RS
+\-m \-\-multifile
+.RE
+.RS
+\-n \-\-number_mode
+.RE
+.RS
+\-p \-\-postfix
+.RE
+.RS
+\-q \-\-quiet
+.RE
+.RS
+\-r \-\-rnews_size
+.RE
+.RS
+\-rc \-\-resetcounter
+.RE
+.RS
+\-s \-\-def_status_log
+.RE
+.RS
+\-sg \-\-show_group
+.RE
+.RS
+\-ssl \-\-use_ssl
+.RE
+.RS
+\-w \-\-wait_signal
+.RE
+.RS
+\-x \-\-no_chk_msgid
+.RE
+.RS
+\-y \-\-post_filter
+.RE
+.RS
+\-z \-\-no_dedupe
+.RE
+.RS
+\-A \-\-active
+.RE
+.RS
+\-AL \-\-read_active
+.RE
+.RS
+\-B \-\-pre-batch
+.RE
+.RS
+\-C \-\-reconnect
+.RE
+.RS
+\-D \-\-debug
+.RE
+.RS
+\-E \-\-error_log
+.RE
+.RS
+\-G \-\-use_gui
+.RE
+.RS
+\-H \-\-no_history
+.RE
+.RS
+\-HF \-\-history_file
+.RE
+.RS
+\-K \-\-killfile
+.RE
+.RS
+\-L \-\-kill_log_none
+.RE
+.RS
+\-LS \-\-kill_log_short
+.RE
+.RS
+\-LL \-\-kill_log_long
+.RE
+.RS
+\-M \-\-mode_reader
+.RE
+.RS
+\-N \-\-portnr
+.RE
+.RS
+\-O \-\-skip_on_restart
+.RE
+.RS
+\-P \-\-password
+.RE
+.RS
+\-Q \-\-password_env
+.RE
+.RS
+\-R \-\-no_rescan
+.RE
+.RS
+\-S \-\-status_log
+.RE
+.RS
+\-SSL \-\-local_use_ssl
+.RE
+.RS
+\-T \-\-timeout
+.RE
+.RS
+\-U \-\-userid
+.RE
+.RS
+\-V \-\-version
+.RE
+.RS
+\-W \-\-wait
+.RE
+.RS
+\-X \-\-no_xover
+.RE
+.RS
+\-Z \-\-use_xover
+.RE
+
+.SH DESCRIPTION
+
+.SH MODE 1 \- stdout mode
+.RS
+%suck
+.RE
+.RS
+%suck myhost.com
+.RE
+.PP
+Suck grabs news from an NNTP server and sends the articles to
+stdout. Suck accepts as argument the name of an NNTP server or
+if you don't give an argument it will take the environment variable
+NNTPSERVER. You can redirect the articles to a file or compress them
+on the fly like "suck server.domain | gzip \-9 > output.gz".
+Now it's up to you what you do with the articles. Maybe
+the output is already on your local machine because you
+used a SLIP line, or maybe you still have to transfer it to your
+local machine.
+.SH MODE 2 \- Multifile mode
+.RS
+%suck \-m
+.RE
+.RS
+%suck myhost.com \-m
+.RE
+.PP
+Suck grabs news from an NNTP server and stores each article in a
+separate file. They are stored in the directory specified in suck_config.h or
+by the \-dm command line option.
+.SH MODE 3 \- Batch mode
+.RS
+%suck myhost.com \-b[irlf] batchfile
+.RE
+.RS
+or %suck myhost.com \-bp -hl localhost
+.RE
+.RS
+or %suck myhost.com \-bP NR -hl localhost
+.RE
+.PP
+Suck will grab news articles from an NNTP server and store them
+into files, one for each article (Multifile mode). The location of the files
+is based on the defines in suck_config.h and the command line \-dm.
+Once suck is done downloading the articles, it will build a batch file
+which can be processed by either innxmit or rnews, or it will call lmove
+to put the files directly into the news/group/number format.
+
+\-bi \- build batch file for innxmit. The articles are left intact,
+and a batchfile is built with a one\-up listing of the full path of each article.
+Then innxmit can be called:
+
+.RS
+%innxmit localhost batchfile
+.RE
+
+\-bl \- suck will call lmove to put the articles into
+news/group/number format. You must provide the name of the
+configuration file on the command line. The following arguments from suck
+are passed to lmove:
+
+.RS
+The configuration file name (the batchfile name provided with this option)
+.RE
+.RS
+The directory specified for articles (-dm or built-in default).
+.RE
+.RS
+The errorlog to log errors to (-e or -E), if provided on the command line.
+.RE
+.RS
+The phrases file (-l), if provided on the command line.
+.RE
+.RS
+The Debug option, if provided on the command line.
+.RE
+
+\-br \- build batch file for rnews. The articles are
+concatenated together, with the #!rnews size
+article separator. This can then be fed to rnews:
+
+.RS
+%rnews \-S localhost batchfile
+.RE
+
+\-r filesize specify maximum batch file size for rnews. This option
+allows you to specify the maximum size of a batch file to be fed to rnews.
+When this limit is reached, a new batch file is created AFTER I finish
+writing the current article to the old batch file. The second and
+successive batch files get a 1\-up sequence number attached to the
+file name specified with the \-br option. Note that since I have to finish
+writing out the current article after reaching the limit, the
+max file size is only approximate.
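The \-br separator format and the approximate \-r size limit can be sketched as follows (a hypothetical Python illustration; suck itself is written in C, and the batch naming here only mirrors the "1 up sequence number" description above):

```python
def write_rnews_batches(articles, basename, max_size):
    """Concatenate articles with "#!rnews <size>" separators, starting a
    new batch (basename, basename1, basename2, ...) only AFTER the article
    that pushed the current batch past max_size is fully written, so the
    limit is approximate, as the man page notes."""
    batches = []
    current = []
    written = 0
    seq = 0

    def flush():
        nonlocal current, written, seq
        name = basename if seq == 0 else "%s%d" % (basename, seq)
        batches.append((name, "".join(current)))
        seq += 1
        current = []
        written = 0

    for art in articles:
        current.append("#!rnews %d\n%s" % (len(art), art))
        written += len(art)
        if written >= max_size:   # checked only after the article is written
            flush()
    if current:
        flush()
    return batches
```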
+
+\-bf \- build a batch file for innfeed. This batchfile contains the
+MsgID and full path of each article. The main difference between this
+and the innxmit option is that the innfeed file is built as the articles
+are downloaded, so that innfeed can be posting the articles, even while
+more articles are downloaded.
+
+\-bp \- This option tells suck to build a batch file, and post the articles
+in that batchfile to the localhost (specified with the \-hl option). This option
+uses the IHAVE command to post all downloaded articles to the local host.
+The batch file is called suck.post, and is put in the temporary directory (-dt).
+It is deleted upon completion, as are the successfully posted articles.
+If the article is not wanted by the server (usually because it already exists on
+the server, or it is too old), the article is also deleted. If other errors
+occur, the article is NOT deleted.
+With the following command line, you can download and post articles without
+worrying if you are using INND or CNEWS.
+
+.RS
+%suck news.server.com -bp -hl localhost -A -c
+.RE
+
+\-bP NR \- This option works identically to \-bp above, except instead of
+waiting until all articles are downloaded, it will post them to the local
+server after downloading NR of articles.
+
+.RS
+%suck news.server.com -bP 100 -hl localhost -A -c
+.RE
+
+.SH SUCK ARGUMENT FILE
+.PP
+If you specify @filename on the command line, suck will read from filename and
+parse it for any arguments that you wish to pass to suck. You specify the
+same arguments in this file as you do on the command line. The arguments
+can be on one line, or spread out among more than one line. You may also
+use comments. Comments begin with '#' and go to the end of a line. All
+command line arguments override arguments in the file.
+
+.RS
+# Sample Argument file
+.RE
+.RS
+-bi batch # batch file option
+.RE
+.RS
+-M # use mode reader option
+.RE
+
+.SH SUCKNEWSRC
+.PP
+Suck looks for a file
+.I sucknewsrc
+to see what articles you want and
+which you already received. The format of sucknewsrc is very simple. It
+consists of one line for each newsgroup. The line contains two or
+three fields.
+
+The first field is the name of the group.
+
+The second field is the highest article number that was in the group
+when that group was last downloaded.
+
+The third field, which is optional, limits the number of articles which
+can be downloaded at any given time. If there are more articles than this
+number, only the newest are downloaded. If the third field is 0, then
+no new messages are downloaded. If the command line option \-lr is specified,
+instead of downloading the newest articles, suck will download the oldest
+articles instead.
+
+The fields are separated by a space.
+
+.RS
+comp.os.linux.announce 1 [ 100 ]
+.RE
+.PP
+When suck is finished, it creates the file suck.newrc which contains the
+new sucknewsrc with the updated article numbers.
+.PP
+To add a new newsgroup, just stick it in sucknewsrc, with a
+highest article number of \-1 (or any number less than 0).
+Suck will then get the newest X number of messages for that newsgroup.
+For example, a -100 would cause suck to download the newest 100
+articles for that newsgroup.
+.PP
+To tell suck to skip a newsgroup, put a # as the first
+character of a line.
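The two- or three-field format above could be parsed like this (a hypothetical Python sketch for illustration; suck's own parser is C):

```python
def parse_sucknewsrc_line(line):
    """Parse one sucknewsrc line into (group, lastread, limit).
    Commented-out groups (leading '#') and blank lines yield None.
    The optional third field caps how many articles are downloaded;
    a negative lastread asks for the newest N articles of a new group."""
    line = line.strip()
    if not line or line.startswith("#"):
        return None
    fields = line.split()
    group = fields[0]
    lastread = int(fields[1])
    limit = int(fields[2]) if len(fields) > 2 else None
    return (group, lastread, limit)
```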
+
+.SH SUCKKILLFILE and SUCKXOVER
+There are two types of killfiles supported in suck. The first, via
+the file suckkillfile, kills articles based on information in the
+actual article header or body. The second, via the file suckxover,
+kills articles based on the information retrieved via the NNTP command
+XOVER. They are implemented in two fundamentally different ways. The
+suckkillfile killing is done as the articles are downloaded, one at a
+time. The XOVER killing is done while suck is getting the list of articles
+to download, and before a single article is downloaded. You may use
+either, neither, or both types of killfiles.
+
+.SH SUCKKILLFILE and GROUP KEEP/KILLFILES
+If
+.I suckkillfile
+exists, the headers of all articles will be scanned and the article downloaded or not,
+based on the parameters in the files. If no logging option is specified (see the -L options
+above), then the long logging option is used.
+.PP
+Comment lines are allowed in the killfiles. A comment line has a "#" in the first position.
+Everything on a comment line is ignored.
+.PP
+Here's how the whole keep/delete package works. All articles are checked against the
+master kill file (suckkillfile). If an article is not killed by the master kill file,
+then its group line is parsed. If a group file exists for one of the groups then the
+article is checked against that group file. If it matches a keep file, then it is
+kept, otherwise it is flagged for deletion. If it matches a delete file, then it is
+flagged for deletion, otherwise it is kept. This is done for every group on the group line.
+.PP
+NOTES: With the exception of the USE_EXTENDED_REGEX parameter, none of these parameters are
+passed from the master killfile to the individual group file. Each killfile is separate
+and independent. Also, each search is case-insensitive unless specifically specified by starting the
+search string with the QUOTE character (see below). However, the parameter part of the
+search expression (the LOWLINE=, HILINE= part) is case sensitive.
+.SH
+PARAMETERS
+.RS
+LOWLINES=#######
+.RE
+.RS
+HILINES=#######
+.RE
+.RS
+NRGRPS=####
+.RE
+.RS
+NRXREF=####
+.RE
+.RS
+QUOTE=c
+.RE
+.RS
+NON_REGEX=c
+.RE
+.RS
+GROUP=keep groupname filename OR
+GROUP=delete groupname filename
+.RE
+.RS
+PROGRAM=pathname
+.RE
+.RS
+PERL=pathname
+.RE
+.RS
+TIEBREAKER_DELETE
+.RE
+.RS
+GROUP_OVERRIDE_MASTER
+.RE
+.RS
+USE_EXTENDED_REGEX
+.RE
+.RS
+XOVER_LOG_LONG
+.RE
+.RS
+HEADER:
+.RE
+.RS
+Any Valid Header Line:
+.RE
+.RS
+BODY:
+.RE
+.RS
+BODYSIZE>
+.RE
+.RS
+BODYSIZE<
+.RE
+
+.PP
+All parameters are valid in both the master kill file and the group files, with the
+exception of GROUP, PROGRAM, PERL, TIEBREAKER_DELETE, and GROUP_OVERRIDE_MASTER.
+These are only valid in the master kill file.
+
+.SH KILL/KEEP Files Parameters
+.PP
+.I HILINES=
+Match any article longer than the number of lines specified.
+.PP
+.I LOWLINES=
+Match any article shorter than the number of lines specified.
+.PP
+.I NRGRPS=
+This line will match any article which has more groups than the number specified
+on the Newsgroups: line.
+Typically this is used in a killfile to prevent spammed articles.
+(A spammed article is one that is posted to many, many groups, such
+as get-rich-quick schemes, etc.)
+.PP
+.I NRXREF=
+This line will match any article that has more groups than the number specified
+on the Xref: line. This is another spam stopper. WARNING: the Xref: line is not
+as accurate as the Newsgroups: line, as it only contains groups known to the news
+server. This option is most useful in an xover killfile, as xoverviews don't
+typically provide the Newsgroups: line, but do provide the Xref: line.
+.PP
+.I HEADER:
+.I Any Valid Header Line:
+Suck allows you to scan any single header line for a particular pattern/string, or
+you may scan the entire article header. To scan an individual line, just specify
+it, for example to scan the From line for boby@pixi.com, you would put
+
+.RS
+From:boby@pixi.com
+.RE
+
+Note that the header line EXACTLY matches what is contained in the article. To scan
+the Followup-To: line, simply put \"Followup-To:\" as the parameter.
+To search the same header line for multiple search items, then each search
+item must be on a separate line, eg:
+.RS
+From:boby@xxx
+.RE
+.RS
+From:nerd@yyy
+.RE
+.RS
+Subject:suck
+.RE
+.RS
+Subject:help
+.RE
+The parameter HEADER: is a special case of the above. If you use the HEADER: parameter,
+then the entire header is searched for the item. You are allowed multiple HEADER: lines
+in each killfile.
+.PP
+When suck searches for the pattern, it only searches for what follows
+the :, and spaces following the : are significant. With the above example "Subject:suck",
+we will search the Subject header line for the string "suck". If the example had read "Subject: suck",
+suck would have searched for the string " suck". Note the extra space.
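That colon rule can be sketched as (a hypothetical helper, for illustration only):

```python
def split_search(param):
    """Split a killfile search line into (header name, search pattern).
    Everything after the first ':' is the pattern -- spaces included."""
    name, _, pattern = param.partition(":")
    return name, pattern
```

Note how `"Subject: suck"` yields a pattern with a leading space, exactly the behavior described above.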
+.PP
+If your system has regex() routines on it, then the items searched for can be POSIX
+regular expressions, instead of just strings. Note that the QUOTE= option is still
+applied, even to regular expressions.
+.PP
+.I BODY:
+This parameter allows you to search the body of an article for text. Again,
+if your system has regex(), you can use regular expressions, and the QUOTE= option is
+also applied. You are allowed multiple BODY: lines in each killfile.
+WARNING: Certain regex combinations, especially with .* at the beginning,
+(eg BODY:.*jpg), in combination with large articles, can cause the regex code
+to eat massive amounts of CPU, and suck will seem like it is doing nothing.
+.PP
+.I BODYSIZE>
+This parameter will match an article if the size of its body (not including the
+header) is greater than this parameter. The size is specified in bytes.
+.PP
+.I BODYSIZE<
+This parameter will match an article if the size of its body, is less than this parameter.
+The size is specified in bytes.
+.PP
+.I QUOTE=
+This item specifies the character that defines a quoted string. The default
+for this is a ". If an item starts with the QUOTE character, then the item is
+checked as-is (case significant). If an item does not start with the QUOTE character,
+then the item is checked without regard to case.
+.PP
+.I NON_REGEX=
+This item specifies the character that defines a non-regex string. The default
+for this is a %. If an item starts with the NON_REGEX character, then the item
+is never checked for regular expressions. If the item doesn't start with the NON_REGEX
+character, then suck tries to determine if it is a regular expression, and if it
+is, uses regex() on it. This item exists so that you can tell suck to treat strings
+like "$$$$ MONEY $$$$" as non-regex items. IF YOU USE BOTH QUOTE and NON_REGEX
+characters on a string, the NON_REGEX character MUST appear first.
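The interaction of the two prefix characters can be sketched like this (a hypothetical Python illustration of the rules above, using the default " and % characters; not suck's actual code):

```python
def classify_item(item, quote='"', non_regex='%'):
    """Return (pattern, case_sensitive, may_be_regex) for a killfile
    search item. When both prefixes are used, NON_REGEX must come first,
    as the man page requires."""
    may_be_regex = True
    if item.startswith(non_regex):
        may_be_regex = False
        item = item[1:]
    case_sensitive = False
    if item.startswith(quote):
        case_sensitive = True
        item = item[1:]
    return item, case_sensitive, may_be_regex
```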
+.PP
+.I GROUP=
+This line allows you to specify either keep or delete parameters on a group
+by group basis. There are three parts to this line. Each part of this line
+must be separated by exactly one space. The first part is either
+"keep" or "delete". If it is keep, then only articles in that group which match
+the parameters in the group file are downloaded. If it is delete, articles in that
+group which match the parameters are not downloaded. The second part, the group name
+is the full group name for articles to check against the group file. The group name
+may contain an * as the last character, to match multiple groups, eg: "comp.os.linux.*"
+would match comp.os.linux.announce, comp.os.linux.answers, etc.. The third part
+specifies the group file which contains the parameters to check the articles against.
+Note, that if you specified a postfix with the \-p option, then this postfix is attached
+to the name of the file when suck looks for it, UNLESS you use the \-k option above.
+.PP
+.I GROUP_OVERRIDE_MASTER
+This allows you to override the default behavior of the master kill file. If this
+option is in the master kill file, then even if an article is flagged for deletion
+by the master kill file, it is checked against the group files. If the group files
+says to not delete it, then the article is kept.
+.PP
+.I TIEBREAKER_DELETE
+This option allows you to override the built-in tie-breaker default. The potential
+exists for a message to be flagged as kept by one group file and as killed by another.
+The built-in default is to keep the message. The TIEBREAKER_DELETE
+option overrides that, and causes the article to be deleted instead.
+.PP
+.I USE_EXTENDED_REGEX
+This option tells suck to use extended regular expressions instead of standard
+regular expressions. It may be used in the master killfile, in which case it applies
+to all killfiles, or in an individual killfile, where it only applies to the
+parameters that follow it in that killfile.
+.PP
+.I XOVER_LOG_LONG
+This option tells suck to format the log entries generated from an Xover killfile so that
+they look like article headers. The normal output is to just print the Xover line
+from the server.
+.PP
+.I PROGRAM=
+This line allows suck to call an external program to check each article.
+You may specify any arguments in addition to the program name on this line.
+If this line is in your suckkillfile, all other lines are ignored. Instead, the
+headers are passed to the external program, and the external program determines
+whether or not to download the article. Here's how it works: suck forks
+your program, with stdin and stdout redirected. Suck feeds the headers
+to your program through stdin, and expects a reply back through stdout. Here's the
+data flow for each article:
+
+.RS
+1. suck will write an 8-byte-long string to stdin of the external program,
+representing the length of the header record. The length is in ASCII,
+is left-aligned, and ends in a newline (example: "1234 \\n").
+.RE
+.RS
+2. suck will then write the header on stdin of the external program.
+.RE
+.RS
+3. suck will wait for a 2-character response code on stdout. This response code is
+either "0\\n" or "1\\n" (ASCII ZERO OR ONE, NOT BINARY ZERO OR ONE). If the return
+code is zero, suck will download the article; if it is one, suck won't.
+.RE
+.RS
+4. When there are no more articles, the length written (as in step 1) will be zero
+(again in ASCII: "0 \\n"). Suck will then wait for the external program to
+exit before continuing on. The external program can do any clean up it needs,
+then exit. Note: suck will not continue processing until the external program exits.
+.RE
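The four steps above can be sketched as a small external filter. This is a minimal illustration (any language works, since suck only sees stdin/stdout); the cross-posting rule in keep_article() is purely hypothetical:

```python
import sys

def parse_length(field: bytes) -> int:
    # The 8-byte length field is ASCII, left-aligned, and
    # newline-terminated, e.g. b"1234   \n" -> 1234.
    return int(field.decode("ascii").strip())

def keep_article(header: bytes) -> bool:
    # Hypothetical rule: skip anything cross-posted to many groups.
    for line in header.splitlines():
        if line.lower().startswith(b"newsgroups:"):
            return line.count(b",") < 5
    return True

def main() -> None:
    stdin, stdout = sys.stdin.buffer, sys.stdout.buffer
    while True:
        raw = stdin.read(8)                     # step 1: length field
        length = parse_length(raw) if raw else 0
        if length == 0:                         # step 4: all done
            break
        header = stdin.read(length)             # step 2: the header
        # Step 3: "0\n" tells suck to download, "1\n" to skip.
        stdout.write(b"0\n" if keep_article(header) else b"1\n")
        stdout.flush()

if __name__ == "__main__":
    main()
```

The PROGRAM= line would then name this script (with an interpreter line or wrapper, as for any executable).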
+
+.PP
+.I PERL=
+This line allows suck to call a perl subroutine to check each article. In order
+to use this option, you must edit the Makefile, specifically the PERL* options.
+If the PERL=
+line is in your suckkillfile, all other lines are ignored. Instead, the header
+is sent to your perl subroutine, and your subroutine determines if the article
+is downloaded or not. The parameter on the PERL= line specifies the file name
+of the perl routine eg:
+
+.RS
+PERL=perl_kill.pl
+.RE
+
+.PP
+See sample/perl_kill.pl for a sample perl subroutine. There are a couple of
+key points in this sample. The "package Embed::Persistant;" must be in the perl
+file. This is so that any variable names you create will not conflict with variable
+names in suck. In addition, the subroutine you define must be "perl_kill", unless
+you change the PERL_PACKAGE_SUB define in suck_config.h. Also, your subroutine must
+return exactly one value, an integer, either 0 or 1. If the subroutine returns
+0, then the article is downloaded, otherwise, the article is not downloaded.
+
+.PP
+NOTES: The perl file is only compiled once, before any articles are downloaded.
+This is to prevent lengthy delays between articles while the perl routine
+is re-compiled. Also, you must use Perl 5.003 or newer. In addition, you
+are advised to run 'perl -wc filter' BEFORE using your filter, in order
+to check for syntax errors and avoid problems.
+
+.SH SUCKXOVER
+If the file
+.I suckxover
+exists, then suck uses the XOVER command to get information
+on the articles and decide whether or not to download each article.
+Xover files use the same syntax as suckkillfiles, but support only a subset
+of the commands.
+.PP
+The following killfile commands are not supported in suckxover files:
+.RS
+NRGROUPS:
+.RE
+.RS
+HEADER:
+.RE
+.RS
+BODY:
+.RE
+.RS
+TIEBREAKER_DELETE:
+.RE
+.PP
+Only the following header lines will be checked:
+.RS
+Subject:
+.RE
+.RS
+From:
+.RE
+.RS
+Message-ID:
+.RE
+.RS
+References:
+.RE
+.PP
+The size commands (
+.I BODYSIZE>, BODYSIZE<, HILINES, and LOWLINES
+) refer to the total size of the article (not just the body), in
+bytes or lines, respectively.
+.PP
+All other parameters are allowed. However, if you use an invalid parameter,
+it is silently ignored.
+.SH SUCKXOVER and PROGRAM= or PERL= parameters
+These parameters are supported in a suckxover file, however they work slightly
+differently than described above. The key difference is that prior to sending
+each individual xoverview line to your program, suck will send you the
+overview.fmt listing that it retrieves from the server. This overview.fmt
+is a tab-separated line describing the fields in each xoverview line.
+.PP
+For the PROGRAM= parameter, suck will first send your program an 8-byte-long
+string, which is the length of the overview.fmt. This length is formatted
+like the lengths above (see step 1 under PROGRAM=). Suck will then send the overview.fmt.
+After that, the flow is as described above. See sample/killxover_child.c for
+an example.
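As a sketch of what a filter does with these two pieces of data, the helpers below split the tab-separated overview.fmt into field names and pair them with each xoverview line (the trailing ":full" handling and the leading article number are assumptions about typical server output, not guarantees from suck):

```python
def parse_overview_fmt(fmt: str) -> list[str]:
    # "Subject:\tFrom:\tMessage-ID:" -> ["Subject", "From", "Message-ID"].
    # Some servers append flags such as "Xref:full"; splitting on the
    # first ":" keeps just the header name.
    return [field.split(":")[0] for field in fmt.rstrip("\n").split("\t")]

def xover_fields(names: list[str], line: str) -> dict[str, str]:
    # An XOVER line starts with the article number, followed by the
    # overview.fmt fields, all tab-separated.
    parts = line.rstrip("\n").split("\t")
    return dict(zip(names, parts[1:]))
```

A filter can then test, say, the "Subject" entry of the resulting dictionary instead of re-parsing a full header block.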
+.PP
+For the PERL= parameter, your program must have two subroutines. The first
+is perl_overview, which will receive the overview.fmt and return nothing.
+The second subroutine is perl_xover, which will receive the xoverview line,
+and return 0 or 1, as described under PERL= above. See sample/perl_xover.pl
+for an example.
+
+.SH SUCKOTHERMSGS
+If
+.I suckothermsgs
+exists, it must contain lines formatted in one of three ways. The first way
+is a line containing a Message-ID, with the <> included, eg:
+
+.RS
+ <12345@somehost.com>
+.RE
+
+This will cause the article with that Message-ID to be retrieved.
+.PP
+The second way is to put a group name and article number on a line starting
+with an !, eg:
+.RS
+ !comp.os.linux.announce 1
+.RE
+
+This will cause that specific article to be downloaded.
+.PP
+You can also get a group of articles from a group by using the following syntax:
+.RS
+ !comp.os.linux.announce 1-10
+.RE
+.PP
+Whichever method you use, if the article specified exists, it will be downloaded,
+in addition to any articles retrieved via the
+.I sucknewsrc.
+These ways can be used to get a specific article in other groups,
+or to download an article that was killed. These articles
+.B ARE NOT
+processed through the kill articles routines.
+
+.SH SUCKNODOWNLOAD
+If
+.I sucknodownload
+exists, it must consist of lines containing a Message-ID, with the <> included, eg:
+
+.RS
+ <12345@somehost.com>
+.RE
+
+This will cause the article with that Message-ID to NEVER be downloaded. The
+Message-ID must begin in the first column of the line (no leading spaces). This
+file overrides
+.I suckothermsgs
+so if an article is in both, it will not be downloaded.
+
+.SH POST FILTER
+If the
+.BI "-y post_filter"
+option is specified on the command line in conjunction with any of the batch modes,
+then suck will call the post filter specified, after downloading the articles, and
+before batching/posting the articles.
+The filter is passed the directory where the articles are stored (the -dm option).
+The filter program is responsible for parsing the contents of the directory. See
+sample/post_filter.pl for a sample post filter. This option was designed to
+allow you to add your own host name to the Path: header, but if you need to
+do anything else to the messages, you can.
+
+.SH FOREIGN LANGUAGE PHRASES
+If the
+.BI "-l phrases"
+option is specified or the file /usr/local/lib/suck.phrases (defined in suck_config.h)
+exists, then suck will load an alternate language phrase file, and use
+it for all status & error messages, instead of the built-in defaults. If both are
+present, the command line overrides the built-in default.
+The phrase file contains all messages used by suck, rpost, testhost,
+and lmove, each on a separate line and enclosed in quotes. To generate
+a sample phrase file, run
+.BI "make phrases"
+from the command line. This will create "phrases.engl", which is a list
+of the default phrases. Simply edit this file, changing the English
+phrases to the language of your choosing, being sure to keep the phrases
+within the quotes. These phrases may contain variables to print items
+provided by the program, such as hostname. Variables are designated
+by %vN% where N is a one-up sequence per phrase. These variables may
+exist in any order on the phrase line, for example,
+.RS
+"Hello, %v1%, welcome to %v2%" or
+.RE
+.RS
+"Welcome to %v2%, %v1%"
+.RE
+are both valid phrases. Phrases may contain \\n, \\r, or \\t to print a newline, carriage return,
+or tab, respectively. Note that the first line of the phrase file is the current version
+number. This is checked against the version of suck running, to be sure that the phrases
+file is the correct version.
+
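The %vN% substitution described above amounts to a simple positional template; a sketch of the idea (not suck's actual implementation) is:

```python
import re

def render(phrase: str, *values: str) -> str:
    # Substitute each %vN% marker with the N-th value (1-based).
    # Markers may appear in any order within the phrase.
    return re.sub(r"%v(\d+)%",
                  lambda m: str(values[int(m.group(1)) - 1]),
                  phrase)
```

Both example phrases from the text render the same two values, just in a different order.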
+If you modify any of the source code, and add in new phrases, you will need to regenerate
+phrases.h, so that everything works correctly. To recreate, just run
+.BI "make phrases.h"
+from the command line.
+.SH SIGNAL HANDLING
+Suck accepts two signals, defined in
+.I suck_config.h.
+The first signal (default SIGTERM) will cause Suck to finish downloading the
+current article, batch up whatever articles were downloaded, and
+exit, without an error.
+
+The second signal (default SIGUSR1) will cause suck to use the pause values defined with
+the -w option (see above).
+
+.SH EXIT CODES
+Suck will exit with the following return codes:
+.RS
+0 = success
+.RE
+.RS
+1 = no articles available for download.
+.RE
+.RS
+2 = suck got an unexpected answer to a command it issued to the remote server.
+.RE
+.RS
+3 = the -V option was used.
+.RE
+.RS
+4 = suck was unable to perform NNTP authorization with the remote server.
+.RE
+.RS
+-1 = general error.
+.RE
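A wrapper script can branch on these codes. One point worth noting: a C exit(-1) is seen by the calling shell as 255. A minimal sketch of the mapping (the labels are paraphrased from the list above):

```python
# Human-readable labels for suck's documented exit codes.
SUCK_STATUS = {
    0: "success",
    1: "no articles available for download",
    2: "unexpected answer from the remote server",
    3: "-V option was used",
    4: "NNTP authorization failed",
    255: "general error",  # exit(-1) wraps to 255 in the shell
}

def describe_status(code: int) -> str:
    return SUCK_STATUS.get(code, "unknown status %d" % code)
```

A cron job might, for instance, treat status 1 as routine and anything above 1 as worth a mail to the admin.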
+.SH HISTORY
+.RS
+Original Author - Tim Smith (unknown address)
+.RE
+.RS
+Maintainers -
+.RE
+.RS
+March 1995 - Sven Goldt (goldt@math.tu-berlin.de)
+.RE
+.RS
+July 1995 - Robert A. Yetman (boby@pixi.com)
+.RE
+.de R$
+Revision \\$3, \\$4
+..
+.SH "SEE ALSO"
+testhost(1), rpost(1), lpost(1).