On Wed, 29 Jul 1998, Jim Tittsler wrote:
> Is there a cookbook way to have an Exim filter keep a database of the last
> 'n' seen message IDs and discard duplicate messages? (Similar to the
> feature in procmail's formail.)
Not that I know of. It is also a highly dangerous thing to do, if you
are talking about the system filter. You may well lose messages.
Consider: A message is sent to two users; one is on your host, the other
is somewhere else. Two copies of the message, with the same message ID,
are sent out. One gets to your host. You save the ID. The other reaches
another host, but the user there has forwarded it on to your host, so
some time later the same message arrives at your host carrying the message
ID you have already saved, and the filter discards a message that should
have been delivered...
If you are talking about a user's private filter, then maybe this would
work.
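Something along these lines might do it (untested, and only a minimal
sketch; the cache file name $home/.msgid-cache is my invention). Note that
it keys on the Message-ID: header rather than on $message_id, because
Exim's own ID is different for every copy it receives:

  # Exim filter
  # Cache of recently seen Message-IDs, one per line.
  logfile $home/.msgid-cache

  if "${lookup{$h_message-id:}lsearch{$home/.msgid-cache}{found}{}}" is "found"
  then
    # Already seen: mark the message as handled and stop, discarding it.
    seen finish
  endif

  # Not seen before: append the ID to the cache. logwrite is not a
  # significant delivery action, so normal delivery still takes place.
  logwrite "$h_message-id:\n"

The lsearch is a linear scan, so the cache file has to be kept small,
which is the problem you raise next.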
> I can probably logwrite the $message_id and lookup/lsearch that file...
> but the only way I know to keep it from growing without bound would be to
> have a separate cron process. Is there a better way? And/or is there a way
> to insert the IDs into a database instead of a text file (and keep the
> database size limited)?
You could use a cron job to build a database from the linear file and
empty it, and search both files. You would have to keep timestamps and
arrange for periodic tidying. In other words, it's all DIY.
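For what it's worth, the cron job could be a small script along these
lines (a rough sketch only: the file names, the one-week expiry, and the
use of Python's dbm module are all assumptions, and the resulting file is
only useful to Exim's dbm lookup if both are built against the same DBM
library):

  #!/usr/bin/env python3
  import dbm
  import os
  import time

  LINEAR = os.path.expanduser("~/.msgid-cache")  # appended to by the filter
  DB = os.path.expanduser("~/.msgid-db")         # searched with a dbm lookup
  MAX_AGE = 7 * 24 * 3600                        # forget IDs after a week

  now = time.time()
  db = dbm.open(DB, "c")
  try:
      # Fold the IDs from the linear file into the database, stamping
      # each one with the time it was moved, then empty the linear file.
      try:
          with open(LINEAR) as f:
              for line in f:
                  msgid = line.strip()
                  if msgid:
                      db[msgid] = str(now)
          open(LINEAR, "w").close()
      except FileNotFoundError:
          pass

      # Periodic tidying: drop entries older than MAX_AGE so the
      # database stays bounded.
      for key in list(db.keys()):
          if now - float(db[key]) > MAX_AGE:
              del db[key]
  finally:
      db.close()

The filter then has to check both the DBM file (with a dbm lookup) and the
still-growing linear file, and there is a small window in which an ID can
be lost between the filter appending it and the cron job emptying the file.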
--
Philip Hazel University of Cambridge Computing Service,
ph10@??? Cambridge, England. Phone: +44 1223 334714.
--
*** Exim information can be found at http://www.exim.org/ ***