[Snort-sigs] (no subject)

Adam Hogan hoga4008 at ...2957...
Fri Jan 14 13:09:01 EST 2005


Alex,

I did test the rules and they have been working for me.  I
also tested the method myself, and sure enough I was able to
use GET http://foo to have the server fetch another site's
page.  Judging from the traffic spike in my logs, my server
must be on a proxy list somewhere.  I don't know how prevalent
this kind of misconfiguration is, or whether this rule would
be of much use to anybody else, but it sure has been helping
me out.
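
Roughly speaking, a rule along these lines is the idea.  Treat
it as a sketch only -- the message text, classtype, and sid
here are placeholders rather than the exact rule I'm running,
and you would want to tune the variables for your own HTTP
servers:

    alert tcp $EXTERNAL_NET any -> $HTTP_SERVERS $HTTP_PORTS \
        (msg:"WEB-MISC absolute URI in GET - possible open proxy abuse"; \
        flow:to_server,established; \
        content:"GET http|3A|//"; depth:11; \
        classtype:misc-activity; sid:1000001; rev:1;)

The depth of 11 pins the match to the very start of the
request line, so it should only fire when a client sends the
full http:// form instead of a normal relative path.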

Here are a few examples of what's in my logs.  If you need
more, I have hundreds of thousands of these, but I won't post
them because a very large number of the URLs are pretty
explicit.  I'll see what I can do about a pcap.  Since I shut
down the proxy the requests are diminishing, but I'm guessing
there will be a few I can capture and sanitize later tonight.

xxx.xxx.xxx.xxx - - [14/Jan/2005:16:08:57 -0500] "GET
http://hpcgi1.nifty.com/trino/ProxyJ/prxjdg.cgi HTTP/1.1" 404
429 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"
xxx.xxx.xxx.xxx - - [14/Jan/2005:16:08:39 -0500] "HEAD
http://www.sexyclaire.com/members.shtml HTTP/1.0" 404 -
"http://www.sexyclaire.com/members.shtml" "Mozilla/5.0 (
compatible; MSIE 5.0; Windows XP; NetCaptor )"
xxx.xxx.xxx.xxx - - [14/Jan/2005:16:09:24 -0500] "GET
http://ar.atwola.com/html/93159197/852786655/aol?SNM=HIDBF&width=468&height=60&target=_blank&TZ=-60&WM=window&CT=I
HTTP/1.0" 404 433 "-" "Mozilla/4.0 (compatible; MSIE 5.5;
Windows NT 5.0)"
xxx.xxx.xxx.xxx - - [14/Jan/2005:16:09:42 -0500] "GET
http://cn.edit.cnb.yahoo.com/config/login?.redir_from=PROFILES?&.tries=1&.src=jpg&.last=&promo=&.intl=us&.bypass=&.partner=&.chkP=Y&.done=http://jpager.yahoo.com/jpager/pager2.shtml&login=JesusReaper&passwd=gemini
HTTP/1.0" 404 418 "-" "-"
xxx.xxx.xxx.xxx - - [14/Jan/2005:16:10:40 -0500] "HEAD
http://www.fitdina.com/members/index.html HTTP/1.0" 404 -
"http://www.fitdina.com/members/index.html" "Mozilla/4.7 (
compatible; MSIE 5.5; Windows NT4.0; Compaq )"
xxx.xxx.xxx.xxx - - [14/Jan/2005:16:11:59 -0500] "POST
http://who.blackplanet.com/who_main.html HTTP/1.1" 404 419
"http://who.blackplanet.com/who_main.html" "Mozilla/5.0
(Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107
Firefox/1.0"
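
As the entries above show, the scanners use HEAD and POST with
absolute URIs as well as GET, so a broader variant using pcre
could cover those methods too.  Again, this is only a sketch I
haven't vetted -- the regex, message, and sid are placeholders:

    alert tcp $EXTERNAL_NET any -> $HTTP_SERVERS $HTTP_PORTS \
        (msg:"WEB-MISC absolute URI in request - possible open proxy abuse"; \
        flow:to_server,established; \
        pcre:"/^(GET|HEAD|POST) https?:\/\//"; \
        classtype:misc-activity; sid:1000002; rev:1;)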


-----Original Message-----
From: snort-sigs-admin at lists.sourceforge.net
[mailto:snort-sigs-admin at lists.sourceforge.net] On Behalf Of
Alex Kirk
Sent: Friday, January 14, 2005 10:50 AM
To: Chris Kronberg
Cc: snort-sigs at lists.sourceforge.net
Subject: Re: [Snort-sigs] Apache Proxy


Chris,

>
>   Hiho,
>
>
>> Actually, these rules are never going to fire -- at least not
>> under normal circumstances where clients are following RFC 2616
>> (http://www.faqs.org/rfcs/rfc2616.html). GET requests look like
>> this when they come across the wire:
>>
>> GET /path/to/file.html HTTP/1.1
>>
>> Clients never need to transmit the http://<host> portion of a
>> URL: http:// is obvious because the packet is being received by
>> an HTTP server, and <host> is either implied from wherever the
>> packet is being sent, or stated explicitly with a directive of
>>
>> Host: www.example.com
>>
>> somewhere following the GET request (most likely immediately
>> after it).
>
>
>   That's true for all normal requests. Yet I believe that Adam
>   wants to trigger on attempts like the following:
>
>   GET http://www.ebay.com/ HTTP/1.1" 200 2986
>
>   I see that every now and then in the logfiles of the servers
>   I manage. Don't be confused by the "200" -- it's the homepage
>   that is delivered.
>
Hmmm...I'd never seen such a request. Is there any chance you
might be able to forward along an example or two from your
logfiles? I'd be very curious to see such a thing occurring in
the wild. Also, Adam, if you happen to have collected any
PCAPs with requests like this, I'd *love* to see them.

>> Even if they did fire, though, I'm a bit confused as to how
>> these rules differentiate between a misconfigured proxy server
>> and a standard web server.
>
>
>   In the example above you get www.ebay.com instead of the
>   originally requested webserver if that webserver is indeed
>   misconfigured. A nice way to circumvent URL scanners.
>
What servers allow this? Again, since I've never observed such
a thing in the wild, more information would be greatly
appreciated -- from anyone who might have it.

>> In theory, these would fire on every single request that comes
>> into one of your web servers -- and if you run even a
>> moderately busy server, you're
>
>
>   Not even in theory. You wrote yourself that a legitimate
>   request is
>   GET /path/to/file.html
>   There are only a small number of "GET http://blabla..."
>   requests. None of them was ever legitimate. Maybe other
>   people have different experiences?
>
Actually, I'd meant that a rule that looked for RFC-correct
requests would fire like crazy. I could have been clearer
there.



