[Snort-users] Feedback on rule testing

James Dickenson jdickenson at ...11827...
Fri Dec 20 15:58:22 EST 2013


Thanks for the feedback!  That sounds like a pretty solid process to me; I
like keeping score of the false positive rate as well, as I'm a big fan of
metrics.  Verifying the signatures with a pcap, either read from file or
fired using tcpreplay on a lab network, would be awesome, though I'm not
sure how much work it would be to implement something like that.
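For the read-from-file variant, a minimal Python sketch might look like the
following: run Snort in offline mode against a known-malicious pcap, then
check the fast-alert output for the rule's SID. The config path, log
directory, and helper names here are placeholders, not an established
interface:

```python
# Sketch: confirm a candidate rule fires when Snort reads a pcap from
# file (no tcpreplay or lab network needed). Paths are placeholders.
import re
import subprocess

def sids_in_fast_alerts(alert_text):
    """Extract the set of SIDs from Snort 'fast' alert output.

    Fast alerts carry a [gid:sid:rev] tag, e.g. [1:1000001:2].
    """
    return {int(m.group(1))
            for m in re.finditer(r"\[\d+:(\d+):\d+\]", alert_text)}

def signature_fires(config, pcap, sid, logdir="/tmp/snort-test"):
    """Run Snort offline against a pcap and report whether SID fired."""
    subprocess.run(
        ["snort", "-q", "-c", config, "-r", pcap, "-A", "fast",
         "-l", logdir],
        check=True)
    with open(logdir + "/alert") as f:
        return sid in sids_in_fast_alerts(f.read())
```

The nice part of reading the pcap from file is that it's deterministic and
needs no sensor; replaying with tcpreplay additionally exercises the
capture path, at the cost of a dedicated lab segment.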

As far as review goes, I've played around with the idea of giving rules a
lifespan and having an automatic process notify the maintainers of the
ruleset that rules X, Y, and Z have expired and need to be reviewed.  I'm
not sure whether that would be effective or not; regardless, I think, like
a lot of things, it's really just a matter of discipline.
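To make the expiry idea concrete, here's a rough Python sketch under the
assumption that each rule carries a review date in its metadata option; the
`reviewed_at` key and the six-month interval are my own illustrative
choices, not an existing convention:

```python
# Sketch of the "rules have a lifespan" idea: scan rules text for a
# review date embedded in each rule's metadata and flag anything older
# than the review interval, so maintainers can be notified.
import re
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=182)  # roughly six months

def expired_sids(rules_text, today=None):
    """Return SIDs whose last review date is past the interval.

    Expects rules carrying e.g.:
        sid:1000001; metadata:reviewed_at 2013_06_01;
    """
    today = today or date.today()
    expired = []
    pattern = (r"sid:\s*(\d+).*?"
               r"metadata:[^;]*reviewed_at (\d{4})_(\d{2})_(\d{2})")
    for m in re.finditer(pattern, rules_text):
        reviewed = date(*map(int, m.groups()[1:]))
        if today - reviewed > REVIEW_INTERVAL:
            expired.append(int(m.group(1)))
    return expired
```

A cron job could run something like this over the ruleset and mail the
resulting SID list to the maintainers.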

Well again, thanks for the input!


On Fri, Dec 20, 2013 at 12:04 PM, Rob MacGregor <rob.macgregor at ...11827...> wrote:

> On Fri, Dec 20, 2013 at 5:12 PM, James Dickenson <jdickenson at ...11827...>
> wrote:
> > Hey snort users,
> >
> > I've been talking with some co-workers recently about our in-house rule
> > development and about ways we could possibly improve it.  I was
> > wondering if any of you on the snort-users list could share your
> > experience with the rule-creation process you use where you work, or
> > for rules you submit to ET or VRT.  How do you sanity-check the rules
> > before you push them to your sensors?  Do you have a formal lifecycle
> > process, and what does that entail?  Do you automate the process
> > somewhat with scripting or software, and if so, how?
> >
> > Your suggestions and comments are much appreciated,
>
> We run things through 3 automatic steps before we deploy them:
> 1) Syntax checking (dumbpig and similar)
> 2) Run through snort with -T to ensure it compiles
> 3) Deploy to a testing sensor (with live traffic) for 5 minutes and
> check the volume of alerts - anything above a defined volume is
> automatically rejected, and either way the submitter is provided with
> the flows that hit, if any did (this can be overridden by an admin if
> it turns out they're all true positives and our network is hosed)
> We're looking at the option of providing a pcap of known malicious
> traffic to confirm the signature fires on the traffic - haven't got
> there yet though.
> After a signature has been deployed we track the true/false positive
> ratio (according to the analyst interface); anything above a certain FP
> ratio or volume gets flagged automatically for attention, and there are
> other limits for simply removing the signature. Every 6 months they
> have to be reviewed to confirm they should remain deployed (ok,
> there's an assumption it's actually reviewed and that the author
> hasn't just claimed they have) - that's still a manual process though.
> This has, overall, kept our in house signatures to a fairly high
> standard. There are still issues, but mandatory training, having
> experienced staff check others' signatures and using the ban-hammer on
> repeat offenders means that those are minimised these days. Nobody
> wants to be the one person in the team who isn't allowed to write
> signatures ;)
> --
>                  Please keep list traffic on the list.
> Rob MacGregor
>       Whoever fights monsters should see to it that in the process he
>         doesn't become a monster.                  Friedrich Nietzsche
