[Snort-users] Rules Downloads and Scalability

Mike Guiterman mike.guiterman at ...1935...
Mon Sep 18 10:28:57 EDT 2006

Hi everyone,

In order to prevent excess resource consumption, Sourcefire announced a new
download policy in August: each file may be downloaded once every 15 minutes.

As rule updates are not released on a daily basis, the 15-minute policy
should not impact anyone's ability to maintain up-to-date coverage.

Subscribers to VRT rule updates are currently exempt from this policy.

Thanks to Jason for suggesting an alternative for managing the resource
issue. The Snort Team is always thankful for feedback.

If you have questions about the policy or any other feedback, please don't
hesitate to contact us at snort-site at ...3990...

Keep Snorting!

Mike Guiterman
Snort Community Manager
Sourcefire, Inc.


-------- Original Message --------
Subject: Re: [Snort-users] rules downloads and scalability
Date: Sun, 17 Sep 2006 21:59:27 -0500
From: Eric Hines <eric.hines at ...8860...>
To: Jason Haar <Jason.Haar at ...294...>
CC: snort-users at lists.sourceforge.net
References: <450E05B0.5020002 at ...294...>


It's not rate limiting specific to Oinkmaster. Applied Watch began seeing this
a few weeks ago through regular rule downloads with our Command Center
using a specific Oink Code. Sourcefire seems to be limiting each user-specific
Oink Code to downloading rules only once a day.

Eric Hines, GCIA, CISSP
CEO, President
Applied Watch Technologies, LLC
1095 Pingree Road
Suite 221
Crystal Lake, IL 60014
Tel: (877) 262-7593
Web: http://www.appliedwatch.com

Jason Haar wrote:
> I notice the "www.snort.org/pub-bin/oinkmaster.cgi" script has some form
> of download-limiting component (to stop people like me repeatedly
> downloading the same live data while editing/updating local scripts - ahem).
> Anyway, such scaling issues happen. I'd like to suggest that Sourcefire
> look at ClamAV to see how they handled people hammering their servers
> looking for updates that didn't exist (i.e. they were already up to
> date). Their rules basically have a serial number, which they publish in
> a DNS record, and their freshclam update daemon checks that DNS
> record before deciding to actually make an HTTP connection to download an
> update. That, plus some time-of-day randomization and load sharing, should
> go a long way on the scalability side...
> Just an idea.
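For illustration, the check-DNS-before-downloading idea described above can be sketched roughly as follows. This is a minimal sketch only: the colon-separated TXT record layout and field positions are assumptions loosely modeled on ClamAV's scheme, not anything Snort actually publishes.

```python
import random

def parse_serial(txt_record: str) -> int:
    """Pull a ruleset serial out of a colon-separated DNS TXT record.

    Assumed (illustrative) layout: 'version:main_serial:daily_serial:timestamp',
    with the frequently updated serial in the third field.
    """
    return int(txt_record.split(":")[2])

def needs_download(txt_record: str, local_serial: int) -> bool:
    """Only make an HTTP request when DNS says a newer serial exists.

    A cheap DNS lookup replaces most of the HTTP polling traffic for
    clients that are already up to date.
    """
    return parse_serial(txt_record) > local_serial

def polling_delay(base_seconds: int = 3600) -> float:
    """Time-of-day randomization: jitter each client's polling interval
    so the whole user base doesn't hit the mirrors at the same instant."""
    return base_seconds + random.uniform(0, base_seconds)
```

A client would fetch the TXT record (e.g. with a DNS library), call `needs_download()` against its stored serial, and sleep for `polling_delay()` between checks; only a `True` result triggers the actual HTTP download.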
