[Snort-sigs] Apache Proxy

Alex Kirk alex.kirk at ...435...
Fri Jan 14 07:50:23 EST 2005


>   Hiho,
>> Actually, these rules are never going to fire -- at least not under 
>> normal circumstances where clients are following RFC 2616 
>> (http://www.faqs.org/rfcs/rfc2616.html). GET requests look like this 
>> when they come across the wire:
>> GET /path/to/file.html HTTP/1.1
>> Clients never need to transmit the http://<host> portion of a URL: 
>> http:// is obvious because the packet is being received by an HTTP 
>> server, and <host> is either implied from wherever the packet is 
>> being sent, or stated explicitly with a directive of
>> Host: www.example.com
>> somewhere following the GET request (most likely immediately after it).
>   That's true for all normal requests. Yet I believe that Adam wants
>   to trigger for the following attempts:
>   GET http://www.ebay.com/ HTTP/1.1" 200 2986
>   I see that every now and then in the logfiles of the servers I manage.
>   Don't be confused by the "200" - it's the homepage that is delivered.
Hmmm...I'd never seen such a request. Is there any chance you might be 
able to forward along an example or two from your logfiles? I'd be very 
curious to see such a thing occurring in the wild. Also, Adam, if you 
happen to have collected any PCAPs with requests like this, I'd *love* 
to see them.
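For anyone who wants to check their own logfiles for these proxy-style requests, a minimal sketch of the distinction being discussed (the log format and regex here are my assumptions -- Apache combined/common log format -- not something from this thread):

```python
import re

# Matches the quoted request portion of an Apache common-log-format line,
# e.g. ... "GET http://www.ebay.com/ HTTP/1.1" 200 2986
REQUEST_RE = re.compile(r'"(?P<method>[A-Z]+) (?P<uri>\S+) HTTP/[\d.]+"')

def is_proxy_style(log_line: str) -> bool:
    """True if the request used an absolute URI (proxy-style) rather than
    the origin-form path that RFC 2616 clients normally send."""
    m = REQUEST_RE.search(log_line)
    if not m:
        return False
    return m.group("uri").lower().startswith(("http://", "https://"))

# Hypothetical log lines illustrating both forms:
normal = '1.2.3.4 - - [14/Jan/2005:07:50:23 -0500] "GET /path/to/file.html HTTP/1.1" 200 512'
proxy  = '1.2.3.4 - - [14/Jan/2005:07:50:23 -0500] "GET http://www.ebay.com/ HTTP/1.1" 200 2986'
```

Running `is_proxy_style` over each line of an access log would surface exactly the "GET http://..." entries described above.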

>> Even if they did fire, though, I'm a bit confused as to how these 
>> rules differentiate between a misconfigured proxy server and a 
>> standard web server.
>   In the example above you get www.ebay.com instead of the originally
>   requested webserver if that webserver is indeed misconfigured.
>   A nice way to circumvent URL scanners.
What servers allow this? Again, never having observed such a thing in 
the wild, more information would be greatly appreciated -- from anyone 
who might have it.
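One way to gather that information yourself is to send such a request by hand and see whether a server honors the absolute-URI form. A sketch, assuming Python sockets (the target URL and host names are placeholders, not endpoints from this thread):

```python
import socket

def build_proxy_request(target_url: str, host_header: str) -> bytes:
    """Build a raw HTTP/1.1 request using the absolute-URI form.
    An RFC-conforming origin server should serve its own content anyway;
    a misconfigured server/proxy may fetch target_url instead."""
    return (
        f"GET {target_url} HTTP/1.1\r\n"
        f"Host: {host_header}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")

def probe(server: str, port: int = 80) -> bytes:
    """Send the absolute-URI request to `server` and return the raw response."""
    req = build_proxy_request("http://www.ebay.com/", server)
    chunks = []
    with socket.create_connection((server, port), timeout=5) as s:
        s.sendall(req)
        while True:
            chunk = s.recv(4096)
            if not chunk:
                break
            chunks.append(chunk)
    return b"".join(chunks)
```

If the response body belongs to the third-party site rather than the probed server, the server is acting as an open proxy of the kind described above.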

>> In theory, these would fire on every single request that comes into 
>> one of your web servers -- and if you run even a moderately busy 
>> server, you're
>   Not even in theory. You wrote yourself that a legitimate request is
>   GET /path/to/file.html
>   There is only a small number of "GET http://blabla..." requests. None
>   of them was ever legitimate. Maybe other people have different
>   experiences?
Actually, I'd meant that a rule that looked for RFC-correct requests 
would fire like crazy. I could have been clearer there.

More information about the Snort-sigs mailing list