ErrorDocument 404 /cgi-bin/error.cgi
or something like it to be added to the srm.conf file. This could prove
especially confusing, because error.cgi could do just about anything with
$REDIRECT_URL, such as attempt to guess the correct page, or convert the
URL to all one case, as AOL reportedly does.
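
For illustration, a minimal sketch of what such an error.cgi could do
(in Python, purely as an assumption about one possible implementation;
Apache puts the originally requested path in the REDIRECT_URL environment
variable for ErrorDocument handlers, and the lower-casing "guess" below is
just a hypothetical policy):

    #!/usr/bin/env python3
    # error.cgi -- hypothetical 404 handler invoked via
    #   ErrorDocument 404 /cgi-bin/error.cgi
    import os
    import sys

    original = os.environ.get("REDIRECT_URL", "")
    host = os.environ.get("HTTP_HOST", "localhost")
    guess = original.lower()  # one possible "guess": fold the path to lower case

    if guess and guess != original:
        # Send the client to the guessed URL instead of a plain 404.
        sys.stdout.write("Status: 302 Found\r\n")
        sys.stdout.write("Location: http://%s%s\r\n\r\n" % (host, guess))
    else:
        # No better guess; return an ordinary 404 response.
        sys.stdout.write("Status: 404 Not Found\r\n")
        sys.stdout.write("Content-Type: text/plain\r\n\r\n")
        sys.stdout.write("Not found: %s\n" % original)

A robot fetching /robots.txt from such a server would see whatever this
script decides to send, which is exactly why the behavior can be confusing.
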
Jeff Drost
On Tue, 21 Jan 1997, Theo Van Dinter wrote:
> On Tue, 21 Jan 1997, Sigfrid Lundberg wrote:
>
> > gungner:~$ HEAD -S http://www.sb.gov.se/robots.txt
> > HEAD http://www.sb.gov.se/robots.txt --> 302 Moved Temporarily
> > HEAD http://www.sb.gov.se/ --> 200 OK
> > Date: Tue, 21 Jan 1997 09:22:01 GMT
> > Server: Netscape-Enterprise/2.0a
> > Content-Type: text/html
> > Client-Date: Tue, 21 Jan 1997 09:22:42 GMT
>
> By spec, robots.txt can be a redirect, but to the main page, hmmmm ...
> <raising left eyebrow> Reminds me of www.isn.com. If you request
> anything which actually doesn't exist (like /robots.txt), it'll just send
> you (w/out redirect) the main page.
>
> Different ...
>
>
> --
> ---------------------------------------------------------------------------
> Theo Van Dinter www: http://www.kluge.net/~felicity/
> (Vice)President WPI Lens and Lights Active Member in SocComm Films
> Member of WPI ACM AME for the Masque C-Term Show
>
> "Remember: no matter where you go, there you are."
> ---------------------------------------------------------------------------
>
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html