The above sounds like a good idea. Volunteers to set it up?
I'd also be interested in making a list of sites that you could index as part
of the testing process. With more and more robot developers, it could be a
great resource to have a page of URLs of people with servers they don't mind
getting hit with a lot of robot traffic, or which have special traps or other
devices set up on them for testing purposes.
Issac
>
> I am using the data to do clustering and some economic models of the web.
> I'll send mail to this list when I have my query engine up.
>
> Sorry for any problems,
> -Larry
>
> >FYI, the following robot
> >
> >huron.stanford.edu backrub@pcd.stanford.edu:BackRub/0.5
> > and
> >grand.stanford.edu backrub@pcd.stanford.edu:BackRub/0.5
> >
> >is hitting a site once a second and isn't waiting for responses
> >before firing off new requests. The owner has been notified.
> >