specialized searches

Mike Fresener (mfres@usscreen.com)
Thu, 6 Feb 1997 09:53:50 -0700 (MST)


I have a UNIX box running Apache with Perl 4 and 5.
I want to keep a file with a list of addresses that a search engine
will traverse, say 3 or 4 levels deep, once a week, index it all, and make
it searchable from a web page.
Basically a regular search engine, but I want to control the sites searched.
What, in anyone's opinion, would be the best route to accomplish this? Best
program? Easiest setup?
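In case it helps frame the question, here is a minimal sketch of the crawl-and-index logic being described (the same idea would translate to Perl with LWP). The URLs and the in-memory "site" are made up for illustration; a real version would fetch over HTTP and respect robots.txt:

```python
import re
from collections import deque

def crawl(seeds, fetch, max_depth=3):
    """Breadth-first crawl from the seed URLs, visiting each page once
    and stopping max_depth link-levels below the seeds. `fetch` returns
    a page body for a URL, or None on failure."""
    seen = set(seeds)
    queue = deque((url, 0) for url in seeds)
    pages = {}
    while queue:
        url, depth = queue.popleft()
        body = fetch(url)
        if body is None:
            continue
        pages[url] = body
        if depth >= max_depth:
            continue
        for link in re.findall(r'href="([^"]+)"', body):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return pages

def build_index(pages):
    """Build an inverted index: each lowercase word maps to the set
    of URLs whose page text contains it."""
    index = {}
    for url, body in pages.items():
        text = re.sub(r"<[^>]+>", " ", body)  # strip markup crudely
        for word in re.findall(r"\w+", text.lower()):
            index.setdefault(word, set()).add(url)
    return index

# Toy link graph standing in for real HTTP fetches (hypothetical URLs).
SITE = {
    "http://a.example/": '<a href="http://b.example/">screen printing</a>',
    "http://b.example/": "<p>mailing list archives</p>",
}

pages = crawl(["http://a.example/"], SITE.get, max_depth=3)
index = build_index(pages)
```

Run weekly from cron, dump `index` to disk, and a small CGI script can answer queries against it from a web page.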

Am I asking to do the impossible?

At this point ANY help would be GREATLY appreciated!

May your life flourish!
Many Thanks!
Mike
_____________________
A day in the life of today!
Don't breathe the air. Don't drink the water. Don't eat fruit, vegetables,
meat, fish, sugar, salt, fat, caffeine, anything artificial, dairy or grain.
Don't have sex ever again!
Have a nice day!;)

Rev. M. Fresener
mfres@usscreen.com

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html