Re: [exim] Limit concurrent deliveries

From: Peter Barker
Date:  
To: exim-users
Subject: Re: [exim] Limit concurrent deliveries
On Mon, 16 Jun 2008 07:20:31 am Heiko Schlittermann wrote:
> Peter Barker (Sun 15 Jun 2008 04:38:13 CEST):
> > I am running exim4 on a server with very limited resources, and am using
> > a local pipe transport for spam filtering. Is there any way to limit the
> > number of processes run when doing local deliveries? At present a
> > transport process is run for every queued message whenever the queue is
> > scanned. I have tried
> >     queue_run_max = 1
> >     smtp_accept_queue = 2
> > but these do not limit the concurrent transports run.
>
> According to the spec, the following should ensure that there are no more
> than 5 parallel deliveries at a time, iff your queue runners are started
> by the daemon (and not by cron):
>
>     queue_run_max = 5
>     remote_max_parallel = 1
>     queue_smtp_domains = 1
>
> The queue_smtp_domains setting is especially important; otherwise each
> received message will be delivered immediately, even if there are already
> other deliveries in progress.
>
>     Best regards from Dresden
>     Heiko Schlittermann
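
For reference, the three options quoted above belong in the main section of
the Exim configuration; a sketch, with comments giving my reading of what
each one does:

    queue_run_max = 5        # at most 5 queue-runner processes at once
    remote_max_parallel = 1  # each runner makes one delivery at a time
    queue_smtp_domains = 1   # queue incoming mail rather than delivering it
                             # at once (the spec expects a domain list here,
                             # e.g. * to match all domains)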


I tried the above, and also queue_smtp_domains = *, but I still get one
spam-filtering (spambayes) process for each queued message, all running
concurrently. I don't think the problem is the number of queue runners -
there only seems to be one of these, but it does not finish delivering one
message before starting the next one.

Looking at the Exim specification, section 3.13 "Delivery in detail", it
seems all messages are passed to all the routers. Then, to quote: "When all
the routing has been done, addresses that have been successfully handled are
passed to their assigned transports". This seems to hand off all messages
simultaneously. My spambayes transport is a pipe transport, which passes the
message back to exim for delivery after filtering (command = exim -oMr
spam-scanned -bS).
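
The transport definition looks roughly like this (the transport name is
illustrative and the path to the exim binary may differ; only the command
options are as quoted above):

    spambayes_pipe:
      driver = pipe
      # re-inject the filtered message as batched SMTP (-bS); -oMr marks
      # the received protocol as "spam-scanned" so a router can tell
      # already-filtered mail from newly arrived mail
      command = /usr/sbin/exim -oMr spam-scanned -bS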

I also tried queue_only = true, which does process only one message at a
time, but not immediately. I am using fetchmail to collect mail for the
server, and it operates as follows:
1. fetchmail collects all messages, which are immediately placed in the queue
2. the next time the queue is run, each message is filtered, one at a time,
and placed back in the queue
3. when the queue is run again, the messages are delivered to the local users
So the mail is eventually delivered, but only after 2 extra queue runs. With
the default setup, fetchmail collected the messages, which were immediately
queued and then processed in parallel.

Does anyone have any suggestions to make this processing happen one message
at a time? If not, I will just run the queue multiple times.
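
If it comes to that, the extra queue runs could simply be scheduled from
cron; a sketch, assuming a standard /usr/sbin/exim path and a 5-minute
interval:

    # run the queue every 5 minutes, so the re-queued (filtered) messages
    # from one pass are picked up by the next
    */5 * * * *   /usr/sbin/exim -q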

Thanks,
Peter Barker