I have a request that the user engine and the wiki engine should
incorporate some sort of spam protection, ideally using images.

You should know that CAPTCHAs, as they’re called, are controversial,
because (a) they’re very unfriendly to disabled users, and (b) they’re
actually easy to automate. Assuming you can’t OCR them yourself (which
is a bad assumption these days), you just put up a porn site, display
any CAPTCHAs your spambots see to its visitors, and let THEM figure it
out for you.

It’s not terribly difficult to parse domains out of posted URLs and
check them against SURBL (http://www.surbl.org). I’ve written
proof-of-concept code to do this in Perl, and there’s Net_DNSBL for
PHP; I can’t imagine it’s that difficult to port either of these to Ruby.
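To make the idea concrete, here’s a rough Ruby sketch of the same
approach. All the names here are made up for illustration, and the
two-label domain heuristic is deliberately naive — real code needs TLD
tables like the SpamAssassin ones in the PS to handle ccTLDs such as
example.co.uk:

```ruby
require "uri"
require "resolv"

# Pull candidate registered domains out of posted text.
# NOTE: taking the last two labels is a naive stand-in for proper
# registrar-boundary detection.
def extract_domains(text)
  text.scan(%r{https?://[^\s"'<>)]+}).filter_map do |url|
    host = URI.parse(url).host
    host && host.split(".").last(2).join(".")
  end.uniq
end

# Query SURBL's combined list: an A record in 127.0.0.0/8 means the
# domain is listed; NXDOMAIN means it isn't.
def listed_in_surbl?(domain)
  Resolv.getaddress("#{domain}.multi.surbl.org").start_with?("127.")
rescue Resolv::ResolvError
  false
end
```

A comment form would then be rejected (or held for moderation) when any
extracted domain comes back listed.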

– Bob

PS: I used the following to extract the TLD recognition regexes from
Mail::SpamAssassin. With a PCRE engine and a little adjustment (add
‘(^.*\.)?’ at the front and ‘$’ at the back), the regexes are fairly
portable.

#!/usr/bin/perl
# Dump SpamAssassin's TLD-recognition tables as regexes.
use strict;
use warnings;
use Mail::SpamAssassin::Util::RegistrarBoundaries;
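For what it’s worth, the adjustment described in the PS looks like this
in Ruby. The fragment here is a made-up stand-in for one of the
extracted alternations (the real tables are far larger):

```ruby
# Hypothetical fragment standing in for an extracted SpamAssassin
# two-level ccTLD alternation.
fragment = "(co|org|ac)\\.uk"

# The adjustment from the PS: (^.*\.)? at the front, $ at the back, so
# the regex matches a bare domain or any host beneath it.
re = Regexp.new("(^.*\\.)?#{fragment}$")

re.match?("spammer.co.uk")      # matches
re.match?("www.spammer.co.uk")  # matches
re.match?("example.com")        # no match
```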