[Snort-users] http content host matching rule optimization

Greg j.greg.k at ...11827...
Mon Dec 7 11:22:01 EST 2009

I am curious whether I can optimize this rule any further. I have a
Perl script that runs once every few days; it takes a manual download
from MalwareURL.com and converts the data into a file that I include
in the Snort config.

Since the file is long (around 3k entries), I am trying to minimize
alarm volume and overhead. I figure that by matching only the
http_header instead of the entire payload I gain some efficiency, and
I use $HTTP_PORTS as defined in snort.conf rather than any. I did have
to create unique SIDs for each URL, though, so I could use destination
tracking to suppress extra hits. I only need Snort to tell me that the
access occurred; I then go to a tshark capture device I built and
replay the events to see the details.
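For a hypothetical entry such as badsite.example.com, the generated
rule would look roughly like this (the hostname and SID here are
illustrative, not from the real data file):

  alert tcp $HOME_NET any -> $EXTERNAL_NET $HTTP_PORTS (msg:"MalURL badsite.example.com"; flow:from_client; content:"badsite.example.com"; http_header; nocase; threshold: type limit, track by_dst, seconds 3600, count 1; sid:1000001; rev:1;)

The threshold option means each destination fires at most once per
hour per rule, which is why every URL needs its own SID.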

Below is the script segment that generates all the rules from the data
file. Is this the most efficient? Is there a better way?


my $sid = 1000001;    # starting SID; local rules should use SIDs >= 1000000
while (<IN>) {
  chomp;
  # Keep the whole rule on one line -- Snort treats a bare newline as
  # the end of the rule.
  print "alert tcp \$HOME_NET any -> \$EXTERNAL_NET \$HTTP_PORTS "
      . "(msg:\"MalURL $_\"; flow:from_client; content:\"$_\"; http_header; "
      . "nocase; threshold: type limit, track by_dst, seconds 3600, count 1; "
      . "sid:$sid; rev:1;)\n";
  $sid++;
}
close (IN);
