> At 08:07 PM 3/25/96 -0700, you wrote:
> >At 3:14 PM 3/25/96, Jared Williams <williams@earthlink.net> wrote:
> If you want go ahead and check out my sight and reply me with e-mail telling
> me if you think it deserves to be higher on the list.
...then you're someone trying to make money from it..
> Thanks...
To save you all the bother: it's just like the
<http://metasearch.com/> Metasearch page, except it charges you $130 for the
service instead of being free.. and I think they possibly also sell a web
authoring tool, but the page wasn't sufficiently clear about what exactly
they were doing..
Then, at the end of his page he had the following keywords repeated:
California
Northwest
Animation
Promotion Web Site
Development Web Site
Knitter Web
Money
about 50 times each..
Okay, so the action to take in his case is obvious (deindex any siGHt(e)
belonging to him), but what is the general way to stop this? It's very
difficult as far as I can see. It's deliberate worthless junk, trying to
get in at the same level as people who are providing worthwhile information
about California, web sites, etc.
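As a rough sketch of how an indexer might spot this mechanically (the
function name and threshold below are my own inventions, not anything the
catalogues actually use), in Python:

    from collections import Counter

    def stuffing_score(page_text, repeat_threshold=20):
        """Return lines that repeat suspiciously often on a page.

        repeat_threshold is an illustrative cutoff, not a value any
        real engine is known to use.
        """
        lines = [l.strip().lower()
                 for l in page_text.splitlines() if l.strip()]
        counts = Counter(lines)
        # Lines like "Knitter Web" pasted ~50 times are the give-away.
        return {line: n for line, n in counts.items()
                if n >= repeat_threshold}

An indexer could simply refuse to rank any page where that comes back
non-empty.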
Seems quite similar to the problem of spammed email/netnews to me. In
that case the answer was for responsible sites to ban users who did the
spamming, and for irresponsible sites to be cut off by all their
neighbours, disconnecting them from the network.. Doesn't really work
here, because this is too trivial a matter (he's not doing any active
damage) for that kind of response, but what he is doing is messing up
the catalogues.
I suggest that the major catalogue brokers might want to start removing
sites/domains that don't protect people from this. Then the peer pressure
from neighbouring sites would become strong enough to keep it under control.
Obviously anything disallowed in a robots.txt would be something that
shouldn't be complained about?
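For anyone who hasn't met the exclusion convention: a robots.txt at the
site root looks something like this (the paths are invented for
illustration)..

    User-agent: *
    Disallow: /private/
    Disallow: /drafts/

..and a well-behaved indexer checks it before fetching anything. A sketch
using Python's standard robotparser (example.com is a placeholder):

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()
    # Anything the file disallows never enters the catalogue,
    # so there's nothing there to complain about either.
    allowed = rp.can_fetch("*", "http://example.com/private/page.html")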
Okay, it's not a big problem now, but it's much easier to write hundreds
of junk pages than one properly researched web page, so the swamp is
coming upon us pretty fast.
Actually cutting off a few domains (as they are noticed or complained
about) might be worthwhile..
What about each search results page coming up with a rating button next
to each page listed (relevance to the search/quality).. Pages which are
consistently marked down would eventually get examined somehow and
downrated?? Do censorship issues come in here, or would that just be
stopped from being a problem by the level of competition?
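As a rough sketch of how the voting might work (the names and thresholds
here are mine, not any engine's actual scheme), in Python:

    from collections import defaultdict

    votes = defaultdict(list)  # page URL -> ratings, 1 (junk) .. 5 (good)

    def record_vote(url, rating):
        votes[url].append(rating)

    def pages_to_review(min_votes=25, cutoff=2.0):
        """Queue consistently marked-down pages for a human look.

        min_votes guards against one grudge-holder sinking a page;
        both numbers are illustrative.
        """
        return [url for url, ratings in votes.items()
                if len(ratings) >= min_votes
                and sum(ratings) / len(ratings) <= cutoff]

Keeping a human look in the loop before anything is actually downrated
would go some way to answering the censorship worry.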
<http://www.tardis.ed.ac.uk/~mikedlr/biography.html>
Scottish Climbing Archive: <http://www.tardis.ed.ac.uk/~mikedlr/climbing/>
Linux/Unix clone@ftp://src.doc.ic.ac.uk/packages/linux/sunsite.unc-mirror/docs/