We have a list of about 60,000 URLs that we need to index for search
purposes.
We are looking to subcontract a programmer who will:
1*) Write a program to verify those links and build a database (for
search purposes) containing each URL, its title, and the first 200 words
of the page, with the HTML markup stripped (see the first sketch after
this list).
2) Build a web search interface and install it on our Unix server (see
the second sketch below).
3) Update the database once a month.
*Note: we prefer that you run the spider on your own computers.
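
To give a rough idea of the scale of item 1, here is a minimal sketch,
assuming Python with the widely used requests and BeautifulSoup packages
and an SQLite database. The file names urls.txt and links.db, the pages
table, and the function names are illustrative assumptions, not a spec:

    import sqlite3
    import requests
    from bs4 import BeautifulSoup

    def index_url(conn, url):
        """Fetch one URL; if it is alive, store its title and first 200 words."""
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
        except requests.RequestException:
            return False  # dead or unreachable link; skip it
        soup = BeautifulSoup(resp.text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        # Strip the HTML tags and keep only the first 200 words of visible text.
        words = soup.get_text(separator=" ").split()
        excerpt = " ".join(words[:200])
        conn.execute(
            "INSERT OR REPLACE INTO pages (url, title, excerpt) VALUES (?, ?, ?)",
            (url, title, excerpt),
        )
        return True

    def build_index(url_file, db_path="links.db"):
        """Read one URL per line and index every link that responds."""
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS pages "
            "(url TEXT PRIMARY KEY, title TEXT, excerpt TEXT)"
        )
        with open(url_file) as f:
            for line in f:
                url = line.strip()
                if url:
                    index_url(conn, url)
        conn.commit()
        conn.close()

    if __name__ == "__main__":
        build_index("urls.txt")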
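
And a correspondingly minimal sketch of the search side of item 2,
reusing the assumed pages table from the sketch above. A real
installation would wrap this in a CGI script or small web app on the
server, and at 60,000 rows SQLite's full-text (FTS5) tables would be a
better fit than the simple LIKE matching shown here:

    import sqlite3
    import sys

    def search(query, db_path="links.db", limit=20):
        """Return (url, title, excerpt) rows whose title or excerpt matches."""
        conn = sqlite3.connect(db_path)
        pattern = f"%{query}%"
        rows = conn.execute(
            "SELECT url, title, excerpt FROM pages "
            "WHERE title LIKE ? OR excerpt LIKE ? LIMIT ?",
            (pattern, pattern, limit),
        ).fetchall()
        conn.close()
        return rows

    if __name__ == "__main__":
        # Example: python search.py unix servers
        for url, title, excerpt in search(" ".join(sys.argv[1:])):
            print(f"{title}\n  {url}\n  {excerpt[:120]}...\n")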
Pretty easy, I am sure ;-)
Let me know if you are interested or have any further questions.
Richard