On Wed, 11 May 2005, Tony Finch wrote:
| On Wed, 11 May 2005, Chris Edwards wrote:
|
| > For detecting local users sending out spam, the OP might like to check out
| > Richard Clayton's paper:
| > http://www.cl.cam.ac.uk/~rnc1/extrusion.pdf
| > which suggests watching failure rates makes a better metric than simple
| > submission rate.
|
| This is true if you have a very heterogeneous user population, as is the
| case with Demon (where Richard works). I'm expecting this will be less of
| a problem for us, since our user-base is more uniform and we have
| separated message submission for MUAs from outgoing relays for MTAs. We
| will still need to classify senders according to their typical sending
| rate, but this should be fairly manageable.

At our uni, the smarthosts have a very crude log-watcher script that
alarms on higher-than-normal submission rates from any given client. In
18 months it's caught one real spam-zombie and loads of false positives,
mostly due to folk who for some reason like to expand mailing lists on
their desktop PC.
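
For what it's worth, the rate-counting half is only a few lines. Here's a
minimal sketch in Python, fed from something like 'tail -F' on the
submission log; the log format is entirely made up (client IP assumed to
be the second field), so the parsing would need adjusting for a real MTA:

    #!/usr/bin/env python
    # Minimal sketch of the rate alarm: count recent submissions per
    # client and shout when one exceeds a fixed threshold.  Field
    # positions and thresholds are invented for illustration.
    import sys
    import time
    from collections import defaultdict

    WINDOW = 3600       # seconds to count over
    THRESHOLD = 200     # messages per window before alarming

    recent = defaultdict(list)      # client -> submission timestamps

    for line in sys.stdin:          # e.g. piped from tail -F on the log
        fields = line.split()
        if len(fields) < 2:
            continue
        client = fields[1]          # hypothetical: client IP in field 2
        now = time.time()
        # keep only the timestamps still inside the window
        recent[client] = [t for t in recent[client] if now - t < WINDOW]
        recent[client].append(now)
        if len(recent[client]) > THRESHOLD:
            print("ALARM: %s: %d messages in %ds"
                  % (client, len(recent[client]), WINDOW),
                  file=sys.stderr)

The tweaking we never got round to is really just picking WINDOW and
THRESHOLD per class of sender, which is exactly the classification work
Richard's scheme avoids.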
There'd probably be fewer false positives if we spent more time tweaking.
But Richard's plan seems elegant in that it needs less tweaking and
per-user classification, with the compromised boxes standing out like
sore thumbs.
But we've not had the time to code this - unlike a simple rate counter,
you have to keep track of message IDs so that delivery failures can be
matched back to the submitting client...
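
Something along these lines is roughly what the failure-rate version
would need; a sketch only, with invented regexes standing in for
whatever a real MTA logs at submission time and at delivery failure:

    #!/usr/bin/env python
    # Sketch of the failure-rate metric: remember which client submitted
    # each queue ID, then count delivery failures back against that
    # client.  Both regexes are made up; real MTA log lines differ.
    import re
    import sys
    from collections import defaultdict

    submit_re = re.compile(r'queued as (\S+) from client (\S+)')
    fail_re = re.compile(r'(\S+): delivery failed')

    owner = {}                    # queue ID -> submitting client
    sent = defaultdict(int)       # client -> messages submitted
    failed = defaultdict(int)     # client -> failures charged back

    for line in sys.stdin:
        m = submit_re.search(line)
        if m:
            qid, client = m.groups()
            owner[qid] = client
            sent[client] += 1
            continue
        m = fail_re.search(line)
        if m and m.group(1) in owner:
            failed[owner[m.group(1)]] += 1

    for client, n in sent.items():
        if n >= 20 and failed[client] / n > 0.5:
            print("suspect: %s (%d of %d failed)"
                  % (client, failed[client], n))

The attraction is that a zombie's failure ratio is high regardless of its
absolute sending rate, so no per-user baselines are needed - the mailing
list expanders wouldn't trip it.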