A new robot -- ask for advice

Hrvoje Niksic (hniksic@srce.hr)
05 Nov 1996 00:36:54 +0100


About a year ago I started working on a free utility to retrieve
files over slow connections (with the intention that it be very
stubborn, and keep getting a document until the whole of it has
arrived).
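(For the curious: the core of that stubbornness is just a retry loop
that resumes where the previous attempt broke off. The following is
only an illustrative sketch in C, not the program's actual code --
fetch_from() is a stand-in that simulates a flaky transfer.)

    /* Sketch of "stubborn" retrieval: keep re-trying until the whole
       document has arrived, resuming from the point where the
       previous attempt stopped.  fetch_from() merely simulates an
       unreliable connection. */

    #include <stdio.h>
    #include <stdlib.h>

    #define DOC_SIZE 1000L      /* pretend size of the document */

    /* Simulated transfer: receive some bytes starting at OFFSET;
       may (and usually will) stop short of the full document. */
    static long fetch_from(long offset)
    {
        long got = rand() % 300;
        if (offset + got > DOC_SIZE)
            got = DOC_SIZE - offset;
        return got;
    }

    int main(void)
    {
        long have = 0;
        int tries = 0, max_tries = 20;

        while (have < DOC_SIZE && tries < max_tries) {
            long got = fetch_from(have);
            ++tries;
            have += got;
            printf("try %d: +%ld bytes, %ld/%ld total\n",
                   tries, got, have, DOC_SIZE);
        }
        puts(have == DOC_SIZE ? "Document complete." : "Giving up.");
        return 0;
    }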

I have written it and added various functionality along the way --
support for FTP, and automated traversal of HTML pages. With support
for time-stamping and the ability to retrieve whole FTP trees as
well, I imagine it as a universal tool for non-interactive retrieval
of things on the Web.

However, since it has the option of automatic "recursive" retrieval
of HTML pages, my program counts as one of the robots, so I have
informed myself somewhat about robots, read the guidelines for new
robot writers, and implemented support for "/robots.txt". Still, I
feel this is not sufficient.

In a few days or weeks I intend to announce a new version of the
program on the net (and register it in the list of robots), but
before doing so I would like to hear the opinions of people on this
mailing list regarding the behaviour of this robot, in the sense of
whether it is ethical towards servers and users.

The URL of the test version is
<URL:ftp://gnjilux.cc.fer.hr/pub/unix/util/wget/wget-1.4.0-test1.tar.gz>.

I will greatly appreciate any advice, opinions and/or criticism.

-- 
Hrvoje Niksic <hniksic@srce.hr> | Student at FER Zagreb, Croatia
--------------------------------+--------------------------------
main(){printf(&unix["\021%six\012\0"],(unix)["have"]+"fun"-0x60);}