[Snort-users] HTTP robot detection?

Sheahan, Paul (PCLN-NW) Paul.Sheahan at ...2218...
Thu Jan 24 13:47:05 EST 2002


Actually, I mean a robot that you download: a utility you can use to
download an entire website to your local drive so you can browse it later,
etc. These types of robot tools grab all URLs from a website at a very fast
pace. I'm wondering if this can be detected somehow. Thanks.
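
Not something Snort does out of the box as far as I know, but here is a
rough sketch of the kind of per-client threshold check meant here, run
against a web access log rather than inside Snort. The
"timestamp client_ip url" input format, the field names and the threshold
numbers below are all placeholders, not anything Snort-specific:

import re
import sys
from collections import defaultdict, deque

URL_THRESHOLD = 50      # assumed: distinct URLs that trigger an alert
WINDOW_SECONDS = 10     # assumed: sliding time window in seconds

# Assumed input: one "timestamp client_ip url" triple per line, e.g.
#   1011907200 10.0.0.5 /index.html
LINE_RE = re.compile(r"^(\d+)\s+(\S+)\s+(\S+)$")

def detect(lines):
    """Yield (timestamp, ip) whenever a client exceeds the URL threshold."""
    history = defaultdict(deque)    # ip -> deque of (timestamp, url)
    for line in lines:
        m = LINE_RE.match(line.strip())
        if not m:
            continue
        ts, ip, url = int(m.group(1)), m.group(2), m.group(3)
        window = history[ip]
        window.append((ts, url))
        # Drop requests that have fallen out of the time window.
        while window and ts - window[0][0] > WINDOW_SECONDS:
            window.popleft()
        if len({u for _, u in window}) >= URL_THRESHOLD:
            yield ts, ip
            window.clear()          # avoid re-alerting on the same burst

if __name__ == "__main__":
    for ts, ip in detect(sys.stdin):
        print("possible robot: %s exceeded %d URLs in %ds at %d"
              % (ip, URL_THRESHOLD, WINDOW_SECONDS, ts))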

-----Original Message-----
From: Hyoung-Kee Choi [mailto:hkchoi at ...4677...]
Sent: Thursday, January 24, 2002 4:42 PM
To: Sheahan, Paul (PCLN-NW)
Subject: RE: [Snort-users] HTTP robot detection?


If one does not want a robot to copy one's site, one can specify that in
"robots.txt". Hence, robots are supposed to look for "robots.txt" before
crawling.

> -----Original Message-----
> From: snort-users-admin at lists.sourceforge.net
> [mailto:snort-users-admin at lists.sourceforge.net]On Behalf Of
> Sheahan, Paul (PCLN-NW)
> Sent: Thursday, January 24, 2002 3:55 PM
> To: Snort List (E-mail)
> Subject: [Snort-users] HTTP robot detection?
>
>
>
> Anyone have any ideas on this one?
>
> I was wondering if there is a way to make Snort detect someone running an
> automated script or robot against a website, the way it checks for
> portscans. For example, Snort flags traffic as a portscan when there are
> connections to a certain number of ports on one host within a certain time
> period. Is there a way to do this with URLs? For example, so many URLs
> accessed from one IP address within a certain time period would be flagged
> as some sort of automated tool or robot scanning a site?
>
> Thanks!
>
>



