Re: Checking Log files

Cees Hek (hekc@phoenix.cis.mcmaster.ca)
Mon, 18 Dec 1995 10:42:39 -0600 (CST)


Now that things have calmed down a bit on this list..... :-)

What I was looking for was something that actually did some analysis of
the log file, like a log statistics package, but one geared toward robots.

It would check whether robots are actually following the Standard for
Robot Exclusion. It could also check whether a robot makes multiple
accesses to the server and how far apart they are, how many times a month
the robot returns, and how often the robots.txt file is accessed, to name
a few possibilities.
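
To make that concrete, here is a rough sketch of the kind of per-robot
bookkeeping I have in mind. It is in Python, it assumes the server writes
NCSA Common Log Format, and it lumps requests together by client hostname,
which is a simplification (the common format has no User-Agent field), so
treat it as a starting point rather than a finished tool:

    import re
    import sys
    from collections import defaultdict
    from datetime import datetime

    # NCSA Common Log Format: host ident authuser [date] "request" status bytes
    LOG_LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

    def parse_time(stamp):
        # stamp looks like 18/Dec/1995:10:42:39 -0600; the zone offset is
        # dropped, which is fine for measuring gaps within a single log
        return datetime.strptime(stamp.split()[0], "%d/%b/%Y:%H:%M:%S")

    times_by_host = defaultdict(list)   # host -> request timestamps
    robots_hits = defaultdict(int)      # host -> number of /robots.txt fetches

    for line in open(sys.argv[1]):
        match = LOG_LINE.match(line)
        if not match:
            continue
        host, stamp, method, path, status = match.groups()
        times_by_host[host].append(parse_time(stamp))
        if path == "/robots.txt":
            robots_hits[host] += 1

    for host, times in sorted(times_by_host.items()):
        times.sort()
        gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
        fastest = min(gaps) if gaps else None
        note = "" if robots_hits[host] else "  <- never asked for robots.txt"
        print(f"{host}: {len(times)} requests, "
              f"{robots_hits[host]} robots.txt fetches, "
              f"fastest gap {fastest}s{note}")

Monthly return counts would just mean bucketing those timestamps by month,
and a real compliance check would parse the site's own robots.txt and
compare the disallowed paths against what each robot actually fetched.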

If nothing like this has been written, I may just write it myself (if I
can find some free time). I would welcome any suggestions as to what a
program like this should contain.

For now, though, I guess I will have to live with grepping the log file....
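
Until then, Mark's trick (quoted below) of watching who asks for the file
at least falls out of a one-liner. A minimal sketch, again assuming Common
Log Format and counting GET requests only (the access_log name in the
comment is just a placeholder):

    import sys
    from collections import Counter

    # Tally /robots.txt fetches per client host: the moral equivalent of
    #   grep '/robots.txt' access_log | cut -d' ' -f1 | sort | uniq -c
    counts = Counter(line.split()[0]
                     for line in open(sys.argv[1])
                     if '"GET /robots.txt' in line)

    for host, n in counts.most_common():
        print(f"{n:6d}  {host}")

As Mark notes, this misses robots that never ask for the file; the longer
sketch above flags those by their absence.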

Cees Hek
Computing & Information Services
McMaster University
Hamilton, Ontario, Canada
Email: hekc@mcmaster.ca

On Fri, 15 Dec 1995, Mark Schrimsher wrote:

> You can make a robots.txt file that permits all accesses, and then check
> the log for requests for that file. But it won't catch robots that don't
> check for robots.txt.
>
> --Mark