
Re: Why Comment Spam Can Never Be Stopped

[John had comment spam]

__/ [Karim] on Saturday 03 September 2005 20:29 \__

On Sat, 03 Sep 2005 08:19:42 +0100, Roy Schestowitz wrote:

>> Join the club, John. I tried changing default fields (are you using
>> 'in-house' forms by the way?), added CAPTCHA, added many filters
>> including Spaminator. I still struggle to purge about 50-100 messages per
>> day in 2 sites. Only 10% will bubble through, but it is enough to upset a
>> webmaster. If not erased quickly enough, it only encourages more.
>> 
>> Save yourself the trouble, John. Changes will only serve as a temporary
>> solution. Consider closing comments altogether. I know I do.
> 
> The best thing to do, other than closing comments, is to use a series of
> filters:
> - Have users log in before posting comments
> - Use CAPTCHA
> - Reject all comments with URLs
> - Use something like http://www.angrypets.com/tools/rdos/
> 
> and after all these filters, you should be left with comments from
> persistent human posters, hopefully very few or none; then do full
> moderation. Only useful comments will be posted then.
> 
> I prefer to read blogs with good comments to blogs with no comments.
> 
> Karim

Thanks Karim.

- Logging in before posting is a process tedious enough (from the user's
perspective) to result in no comments at all.

- CAPTCHA likewise, but it also appears ineffective. Have a look at the
following:

   http://sam.zoy.org/pwntcha/

- IP blocking is not a possibility. You may end up blocking too many benign
visitors (see Dougal Campbell on spam). Spammers have gathered and
capitalised on many unique addresses by now.

-"All comments blocked if URI is contained within" - well, what about the
URI of the commenter's homepage? I currently have a limit set to at most
one URI. URI (or URL if you prefer) is the motive for most when leaving a
comment in the first place.

- Moderation - people dislike being put in a moderation queue and it still
involves filtering work by the webmaster. I currently add to the moderation
queue anything that matches a sensitive word (see the second sketch after
this list). The spammers are now hitting with different encodings, which
requires yet another 'upgrade'. It simply isn't worth the investment (time).
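
For illustration, here is a minimal sketch of such a one-URI limit in
Python; the pattern and the limit are assumptions for the sake of the
example, not the exact filter my blog runs:

   import re

   # Hypothetical check: flag a comment body that carries more than one
   # URI; the commenter's homepage field would be handled separately.
   URI_PATTERN = re.compile(r'https?://\S+', re.IGNORECASE)

   def too_many_uris(comment_body, limit=1):
       """Return True if the body exceeds the permitted URI count."""
       return len(URI_PATTERN.findall(comment_body)) > limit

A body containing two links trips the check and the comment is rejected,
while a single homepage link passes.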
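
And a sketch of normalising the different encodings before the
sensitive-word match, again in Python; the word list here is only a
placeholder:

   import html
   import unicodedata

   SENSITIVE_WORDS = {'casino', 'mortgage'}   # placeholder word list

   def normalise(text):
       # Decode HTML entities (&#99;asino -> casino), fold compatibility
       # forms such as fullwidth letters, and strip combining accents so
       # that obfuscated spellings match the plain word list.
       text = html.unescape(text)
       text = unicodedata.normalize('NFKD', text)
       text = ''.join(c for c in text if not unicodedata.combining(c))
       return text.lower()

   def needs_moderation(comment_body):
       return any(word in normalise(comment_body)
                  for word in SENSITIVE_WORDS)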

I sometimes think to myself: how long will it take to develop a 'serum' to
x? If I tolerate x, how much effort will be spent 'tolerating' it over the
timespan y? I have learned from experience that for any x, there will soon
emerge x_1 and x_2 and x_3, where each is yet another hack that the
spammers find (e.g. trackback spam, encodings, proxies, long intervals
between posts)...

Roy

-- 
Roy S. Schestowitz      | Useless fact: Women blink twice as much as men
http://Schestowitz.com  |    SuSE Linux    |     PGP-Key: 74572E8E
  6:20am  up 10 days 18:31,  3 users,  load average: 0.42, 0.43, 0.41
