Re: robots.txt buffer question.

Brent Boghosian (brentb@opentext.com)
Tue, 22 Oct 1996 21:29:08 -0400 (EDT)


On Tue, 22 Oct 1996, Jeremy Sigmon wrote:

>
> Do most people read in only a limited amount of the robots.txt file?
> I was just wondering: if someone linked /dev/urandom to robots.txt and
> you requested the file, would your robot crash?
>
> ======================================================================
> Jeremy Sigmon B.S. ChE |
> Web Developer of the Robert C. Byrd Health | Use
> Sciences Center of West Virginia University | FreeBSD
> WWW.HSC.WVU.EDU | Now
> Graduate Student in Computer Science |
> Office : 293-1060 |
>

Well, our robot, the Open Text Livelink Spider,
would be fine parsing the file, no matter the format.
However, we currently limit the parsed 'Disallow' entries
for a specific User-Agent on each site:port to 8KB.
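The defensive pattern above (cap how much of robots.txt you read and parse, so
a pathological file such as /dev/urandom linked to robots.txt cannot exhaust
memory) can be sketched as follows. This is a hypothetical illustration, not
the Livelink Spider's actual code; the function names and the minimal parser
are my own, and the 8KB cap mirrors the figure quoted above:

```python
import io

# Assumed cap, mirroring the 8KB limit mentioned in the post.
MAX_ROBOTS_BYTES = 8 * 1024

def read_robots_capped(stream, cap=MAX_ROBOTS_BYTES):
    """Read at most `cap` bytes from a file-like object and decode them.
    Anything beyond the cap is silently ignored, so an endlessly large
    or random robots.txt cannot blow up the robot."""
    data = stream.read(cap)
    return data.decode("latin-1", errors="replace")

def parse_disallows(text, agent="*"):
    """Collect Disallow paths that apply to `agent` from robots.txt text.
    A minimal, tolerant parser: comments and unknown directives are
    skipped rather than treated as errors."""
    disallows, active = [], False
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments
        if ":" not in line:
            continue                            # ignore junk lines
        field, value = (p.strip() for p in line.split(":", 1))
        if field.lower() == "user-agent":
            active = value == agent or value == "*"
        elif field.lower() == "disallow" and active:
            disallows.append(value)
    return disallows

if __name__ == "__main__":
    # Normal file: parsed as usual.
    text = read_robots_capped(io.BytesIO(b"User-agent: *\nDisallow: /cgi-bin/\n"))
    print(parse_disallows(text))
    # Pathological file: only the first 8KB are ever read.
    garbage = read_robots_capped(io.BytesIO(b"\xff" * 100000))
    print(len(garbage))
```

Because the cap is applied at read time, random binary input simply yields no
valid directives instead of crashing the parser.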

Hope that helps.
******************************************************
Brent Boghosian | Email: BrentB@OpenText.com
|
Open Text Corp. | Phone: (519)888-7111 Ext.279
180 Columbia St. West | FAX: (519)888-0677
Waterloo, ON N2L 3L3 | http://www.opentext.com
******************************************************