[Snort-users] problem with snort 2.01 and disabled rules

Michael Scheidell scheidell at ...5171...
Sat Aug 2 02:55:02 EDT 2003

Thanks for fixing that libnet problem with broken libnet-config programs.
Compiling on a FBSD system with flexresp enabled went fine.

Both on FBSD 4.8 and on a legacy 3.51 box, specifying --with-libraries and
--with-includes worked.

I have a problem that showed up on snort 1.9 and 2.0, and it involves snort
processing disabled rules.

Specifically, it's in the processing of the disabled robots.txt rule.

On snort 1.9 and 2.0 I modified all my disabled rules from:

#alert yada yada yada

to:

# alert yada yada yada

and it SEEMED to work (it stopped processing disabled rules).
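(For what it's worth, I'd expect either form to read as disabled, on the assumption that the comment test is just "first non-blank character is #". A toy sketch of that assumption, not snort's actual parser:)

```python
def is_disabled(rule_line):
    """Assumption: a rule line is a comment if its first non-blank
    character is '#', with or without a space after it."""
    return rule_line.lstrip().startswith('#')

print(is_disabled('#alert tcp any any -> any 80 (sid:1852;)'))   # True
print(is_disabled('# alert tcp any any -> any 80 (sid:1852;)'))  # True
print(is_disabled('alert tcp any any -> any 80 (sid:1852;)'))    # False
```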

however, last night I upgraded to the newly compiled snort 2.01, and
shortly after got these alerts: 

08/02-06:52:49 GMT TCP 64.6882.45:18968 --> 208.237.xxx.xxx:80
[1:1852:3] WEB-MISC robots.txt access

08/02-04:49:06 GMT TCP --> 208.237.xxx.xxx:80
[1:1852:3] WEB-MISC robots.txt access

08/02-03:18:08 GMT TCP --> 208.237.xxx.xxx:80
[1:1852:3] WEB-MISC robots.txt access

rule file shows it disabled: (same rule file I used in 2.0)

grep robots web-misc.rules

robots.txt access"; flow:to_server,established; uricontent:"/robots.txt";
nocase; reference:nessus,10302; classtype:web-application-activity;
sid:1852; rev:3;)

A grep of the access log shows robots.txt accesses prior to upgrading to snort
2.01 (with no alerts in snort); after the upgrade, every robots.txt access is
logged in snort.

(all times adjusted to GMT)

ls -l /usr/local/bin/snort
-rwxr-xr-x  1 root  wheel  1336633 Aug  1 22:05 /usr/local/bin/snort

 - - [01/Aug/2003:20:16:09 -0400] "GET /robots.txt HTTP/1.1"
200 140 "-" "Scooter/3.2"

(prior to upgrade, no snort alert)

after upgrade, three robots.txt accesses, three alerts.

 - - [02/Aug/2003:03:18:07 -0400] "GET /robots.txt HTTP/1.1"
200 140 "-" "NPBot (http://www.nameprotect.com/botinfo.html)"

 - - [02/Aug/2003:04:49:06 -0400] "GET /robots.txt HTTP/1.0"
200 140 "-" "http://www.almaden.ibm.com/cs/crawler   [c01]"

 - - [02/Aug/2003:06:52:49 -0400] "GET /robots.txt HTTP/1.0"
200 140 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"

compiled with ./configure --enable-flexresp 

preprocessor frag2

preprocessor stream4: noinspect, disable_evasion_alerts, ttl_limit 0

preprocessor stream4_reassemble: noalerts

preprocessor http_decode: 80 unicode iis_alt_unicode double_encode \
 iis_flip_slash full_whitespace

preprocessor telnet_decode

snort started thus:

echo "snort_wan"
/usr/local/bin/snort -doDI -m 022 -z \
-c /etc/snort/snort_wan.conf -i $wan -l /var/log/snort_wan \
-F /etc/snort/snort_wan.bpf 2>&1

system is FBSD 4.8, 768 MB RAM, IBM x300 1.0 GHz PIII

A grep of /usr/local/src/snort/rules (new rules) shows that '# alert' seems
to be the right way to disable a rule, so what am I doing wrong?

# alert tcp $EXTERNAL_NET any -> $HOME_NET $HTTP_PORTS (msg:"WEB-MISC
Lotus Notes .csp script source download attempt";
flow:to_server,established; uricontent:".csp.";
classtype:web-application-attack; sid:2065; rev:1;)
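Since the rule keeps firing despite the leading '#', one thing I plan to check is whether some other included rules file (or a stale duplicate) still carries an active copy of sid:1852. A minimal sketch of such a scan (the rule lines below are made-up examples, not snort code):

```python
import re

def active_rules_with_sid(lines, sid):
    """Return rule lines carrying the given sid that are NOT commented out."""
    pattern = re.compile(r'\bsid:%d\b' % sid)
    return [line.rstrip() for line in lines
            if pattern.search(line) and not line.lstrip().startswith('#')]

# Made-up example contents: one properly disabled copy, one stale active one.
rules = [
    '# alert tcp any any -> any 80 (msg:"WEB-MISC robots.txt access"; sid:1852; rev:3;)',
    'alert tcp any any -> any 80 (msg:"stale active copy"; sid:1852; rev:3;)',
]
for hit in active_rules_with_sid(rules, 1852):
    print(hit)
```

In practice this would be run over every file that snort_wan.conf includes, not just web-misc.rules.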

Michael Scheidell,
Main: 561-368-9561 / www.secnap.net
