[Snort-sigs] Apache Proxy

Joe Patterson jpatterson at ...2901...
Fri Jan 14 12:10:11 EST 2005


A simple way to generate your own capture files of this:  set up Apache as a
proxy server (or Squid, or any other sort of HTTP proxy...), open up a
browser, configure it to use the proxy server, and start surfing.  Capture
the traffic from the client to the proxy server.  You'll see exactly what
he's talking about.
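
If you'd rather script it than click around, a quick Python sketch along
these lines does the same thing (the proxy address and the URL are just
placeholders for wherever your Apache/Squid proxy actually listens), and
you can grab the client-to-proxy traffic with tcpdump while it runs:

    # Send a request *through* an HTTP proxy so that the proxy-style
    # request line ("GET http://host/path HTTP/1.1") shows up on the wire
    # between this client and the proxy.
    import urllib.request

    proxy = "http://192.0.2.10:3128"   # placeholder: your proxy's address
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy})
    )
    print(opener.open("http://www.example.com/").getcode())

    # Capture it on the client with something like:
    #   tcpdump -s 0 -w proxy.pcap host 192.0.2.10 and tcp port 3128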

-Joe

> -----Original Message-----
> From: snort-sigs-admin at lists.sourceforge.net
> [mailto:snort-sigs-admin at lists.sourceforge.net]On Behalf Of Alex Kirk
> Sent: Friday, January 14, 2005 10:50 AM
> To: Chris Kronberg
> Cc: snort-sigs at lists.sourceforge.net
> Subject: Re: [Snort-sigs] Apache Proxy
>
>
> Chris,
>
> >
> >   Hiho,
> >
> >
> >> Actually, these rules are never going to fire -- at least not under
> >> normal circumstances where clients are following RFC 2616
> >> (http://www.faqs.org/rfcs/rfc2616.html). GET requests look like this
> >> when they come across the wire:
> >>
> >> GET /path/to/file.html HTTP/1.1
> >>
> >> Clients never need to transmit the http://<host> portion of a URL:
> >> http:// is obvious because the packet is being received by an HTTP
> >> server, and <host> is either implied from wherever the packet is
> >> being sent, or stated explicitly with a header of
> >>
> >> Host: www.example.com
> >>
> >> somewhere following the GET request (most likely immediately after it).
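
For comparison, that origin-form request is literally what crosses the
wire. A minimal Python sketch (the host name is a placeholder, and the
made-up path will just return a 404; the point is only the shape of the
request):

    # Plain origin-form request: only the path on the request line,
    # the host carried in the Host: header.
    import socket

    host = "www.example.com"          # placeholder host
    req = ("GET /path/to/file.html HTTP/1.1\r\n"
           "Host: " + host + "\r\n"
           "Connection: close\r\n"
           "\r\n")
    s = socket.create_connection((host, 80))
    s.sendall(req.encode("ascii"))
    print(s.recv(200).decode("ascii", "replace"))
    s.close()
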
> >
> >
> >   That's true for all normal requests. Yet I believe that Adam wants
> >   to trigger on the following attempts:
> >
> >   GET http://www.ebay.com/ HTTP/1.1" 200 2986
> >
> >   I see that every now and then in the logfiles of the servers I manage.
> >   Don't be confused by the "200" - it's the homepage that is delivered.
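
If anyone wants to check their own servers for those, here's a rough
Python sketch that picks such lines out of an access log (assumes
common/combined log format; the log path is a placeholder):

    # Flag proxy-style request lines ("GET http://..." and friends)
    # in an Apache access log.
    import re

    pat = re.compile(r'"(?:GET|POST|HEAD) [a-z]+://', re.I)
    with open("/var/log/apache2/access.log") as log:   # placeholder path
        for line in log:
            if pat.search(line):
                print(line.rstrip())
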
> >
> Hmmm...I'd never seen such a request. Is there any chance you might be
> able to forward along an example or two from your logfiles? I'd be very
> curious to see such a thing occurring in the wild. Also, Adam, if you
> happen to have collected any PCAPs with requests like this, I'd *love*
> to see them.
>
> >> Even if they did fire, though, I'm a bit confused as to how these
> >> rules differentiate between a misconfigured proxy server and a
> >> standard web server.
> >
> >
> >   In the example above you get www.ebay.com instead of the originally
> >   requested webserver if that webserver is indeed misconfigured.
> >   A nice way to circumvent URL scanners.
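
An easy way to see whether a given server really behaves that way is to
hand it an absolute-URI request and look at whose content comes back.
A rough Python sketch (the target address is a placeholder; only point
this at servers you're responsible for):

    # Send a proxy-style (absolute URI) GET straight to a web server.
    # A misconfigured server / open proxy answers with the third-party
    # page instead of its own content or an error.
    import socket

    target = "192.0.2.20"             # placeholder: server under test
    req = ("GET http://www.example.com/ HTTP/1.1\r\n"
           "Host: www.example.com\r\n"
           "Connection: close\r\n"
           "\r\n")
    s = socket.create_connection((target, 80))
    s.sendall(req.encode("ascii"))
    reply = s.recv(4096).decode("ascii", "replace")
    s.close()
    print(reply.splitlines()[0] if reply else "no response")
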
> >
> What servers allow this? Again, never having observed such a thing in
> the wild, more information would be greatly appreciated -- from anyone
> who might have it.
>
> >> In theory, these would fire on every single request that comes into
> >> one of your web servers -- and if you run even a moderately busy
> >> server, you're
> >
> >
> >   Not even in theory. You wrote yourself that a legitimate request is
> >   GET /path/to/file.html
> >   There is only a small number of "GET http://blabla..." requests. None
> >   of them was ever legitimate. Maybe other people have different
> >   experiences?
> >
> Actually, I'd meant that a rule that looked for RFC-correct requests
> would fire like crazy. I could have been clearer there.