Otis
==========================================================================
POPULUS People Locator - The Intelligent White Pages - http://POPULUS.net/
==========================================================================
On Wed, 22 Jan 1997, Neil Cotty wrote:
> Date: Wed, 22 Jan 1997 11:59:30 +1100
> From: Neil Cotty <neilc@tradesrv.com.au>
> To: robots@webcrawler.com
> Subject: Crawling & DNS issues
>
> Hi there,
>
> I'm new to this list and I've been hanging around trying to gauge
> whether this is the correct place to ask questions. Currently it's the
> only one I've found that comes close, so please excuse me if I'm asking
> the wrong thing in the wrong place. If I am, I'd appreciate someone
> referring me to another list.
>
> I'm in the process of planning a spider, to be written for NT in
> Delphi or C++. My primary aim is to collate a list of majordomo
> services running on servers in a small area of the net. No, this is
> not for some spamming exercise, but to provide a free index for people
> to browse online via the Web. (Yes, I know there are a few lists
> around already, but hey, I like to reinvent the wheel! :)
>
> I have figured out pretty much how I can achieve this, but I'm stuck
> on one crucial aspect, as I'm not a UNIX or DNS guru (yet!). Is it
> possible to gather a list of domains? Using nslookup I can query
> individual domains, but seemingly only if I already know the name of
> the domain. (Is this right?) I'm only interested in locating MX
> records. For instance, how could I compile a list of all .org.au
> domains? Any advice on this area, or pointers to reference sites
> around the net, would be greatly appreciated.
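
A minimal sketch of that per-domain query, assuming Python with the
third-party dnspython library (an editor's choice of tooling, not
anything from the original message); it does the same job as running
nslookup with "set type=MX" against one already-known domain name:

    import dns.resolver  # third-party: pip install dnspython

    def mx_records(domain):
        """Look up the MX records for one domain known by name."""
        try:
            answer = dns.resolver.resolve(domain, "MX")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return []  # no such domain, or it publishes no MX records
        return [(r.preference, str(r.exchange)) for r in answer]

    # Example: print the mail exchangers for one known domain.
    print(mx_records("webcrawler.com"))

Note that this covers only the easy half of the question: DNS will
describe a domain you name, but it will not enumerate domains for you.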
>
> I gather Web robots require the same information, but I have read
> somewhere that a crawl sometimes starts out by pointing the robot at
> an existing list of sites.
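
Under the same assumption (dnspython), a sketch of that seed-list
approach, with the seed file name made up for illustration: feed the
program an existing list of candidate domains and keep the ones that
actually publish MX records:

    import dns.exception
    import dns.resolver  # third-party: pip install dnspython

    def domains_with_mail(seed_file):
        """Filter a seed list down to domains that publish MX records."""
        found = []
        with open(seed_file) as f:
            for line in f:
                domain = line.strip()
                if not domain:
                    continue
                try:
                    dns.resolver.resolve(domain, "MX")
                except dns.exception.DNSException:
                    continue  # unknown name, no MX, or lookup failure
                found.append(domain)
        return found

    # "org_au_seeds.txt" is hypothetical: one candidate domain per line.
    print(domains_with_mail("org_au_seeds.txt"))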
>
> Thank you for your help,
>
> Kind Regards,
>
> Neil Cotty
>
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html