Manual talk:$wgSpamRegex


 * The Large Example, as rendered in the article, shows:

"\< \s*a\s*href|". # This blocks all href links entirely, forcing wiki syntax

whereas in the wikitext source it is:

"\<\s*a\s*href|". # This blocks all href links entirely, forcing wiki syntax

So is this a parser issue? The first will not work because the "/" delimiter ends the regex; it fails with the error "Unknown modifier 'p'".

--Martin
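For what it's worth, the difference between the two renderings is easy to demonstrate in any PCRE-compatible engine. A quick sketch in Python (whose re module treats these particular patterns the same way as PCRE; the sample HTML is invented):

```python
import re

# Hypothetical snippet of the kind of markup the pattern targets.
html = '<a href="http://spam.example/">spam</a>'

# Pattern as it appears in the wikitext source: matches '<a href'.
assert re.search(r'\<\s*a\s*href', html) is not None

# Pattern as rendered in the article, with a space after \<: the literal
# space must match before \s* gets a chance, so '<a href' is missed.
assert re.search(r'\< \s*a\s*href', html) is None
```

So the rendered version silently stops blocking href links, quite apart from any delimiter question.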


 * Are there other categories which this could/should go into? ex. security or spam protection?

Sy Ali 17:48, 19 April 2006 (UTC)


 * On my MediaWiki, using the "Large Example," spam is getting through the regex for "overflow" by dropping the closing semi-colon. So I deleted the semi-colon, and that seems to be working (for now). It might be useful to others to remove it, since it's not necessary. I can't; I tried (the spam protection used here won't let me save). Latrippi 02:28, 22 July 2006 (UTC)
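The behaviour described above is easy to reproduce in any PCRE-style engine. A sketch in Python (its re module behaves identically for this pattern), assuming a pattern variant that still carried the trailing semi-colon and some made-up spam markup:

```python
import re

# Hypothetical hidden-spam markup that omits the closing semi-colon.
spam = '<div style="overflow: auto">hidden spam</div>'

# With a trailing semi-colon the pattern misses spam that omits it:
assert re.search(r'overflow\s*:\s*auto;', spam) is None

# Without the semi-colon (as in the current Large Example) it matches:
assert re.search(r'overflow\s*:\s*auto', spam) is not None
```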

blocking lots of links
Most wikispam I've encountered has taken the form [http://url keyword keyword] [http://url2 keyword2 keyword2] etc. So, what about using this to block many links in a row? I'm thinking something like...

$wgSpamRegex = '/(\[http:\/\/[a-z0-9\.\/\%_-]+(\s+[a-z0-9\.-]+)+\]\s+){10}/i';

or

$wgSpamRegex = '/(\[http:\/\/[^\s]+\s+[^\[]+\s*?){10}/i';

Comments? (Handy PHP regex tester...)

--Alxndr 03:18, 26 November 2006 (UTC)
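The second pattern can be sanity-checked in Python, whose re module handles these constructs the same way as PCRE (the URLs and keywords below are made up, and the PHP "\/" escapes are dropped since Python needs no delimiter):

```python
import re

# Second variant from above: ten bracketed external links in a row.
pattern = r'(\[http://\S+\s+[^\[]+\s*?){10}'

spammy = '[http://spam.example/x keyword keyword] ' * 10
normal = '[http://spam.example/x keyword keyword] ' * 3

# Ten consecutive links trip the filter; three do not.
assert re.search(pattern, spammy, re.IGNORECASE) is not None
assert re.search(pattern, normal, re.IGNORECASE) is None
```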

How does one stop 'MediaWiki:Spamprotectiontext' from telling the spammer which words were banned, which lets them reword their spam to get past it?

I'd love to know.

--Quatermass 20:43, 9 May 2007 (UTC)
 * You can change that message in Special:Allmessages. Jonathan3 18:09, 8 September 2007 (UTC)


 * I read that you can delete the "$1" on MediaWiki:Spamprotectionmatch in order to achieve that. w:User:JanCK 10:52, 18 November 2007 (UTC)

log
Is there a log that shows how often my MediaWiki denies edits? w:User:JanCK 00:56, 18 November 2007 (UTC)

$wgSpamRegex is not working in my wiki
Maybe someone can help me. I have configured the $wgSpamRegex variable, but if I try to test the filter with spam words, nothing happens. Is there something else to do? The MediaWiki version is 1.13.0. Thx! --88.65.198.156 18:24, 5 October 2008 (UTC)
 * You can try Extension:SpamRegex. i Alex  18:35, 5 October 2008 (UTC)
 * Thx - now it's working, but the only problem is that I get a PHP warning when the SpamRegex filter triggers. Here is the output from the HTML:

Warning: preg_match [function.preg-match]: Delimiter must not be alphanumeric or backslash in /../htdocs/includes/EditPage.php on line 747

What can I do about this output? Thx again!
 * Is there nobody who has the same problem? --82.113.113.161 15:24, 13 March 2009 (UTC)

blocking by number of links
I have tried to add a limit for number of links to 15 as mentioned in the article, but am still able to add articles with more than 15 links. This is my regex in its entirety:

$wgSpamRegex = "/".        # The "/" is the opening wrapper
"s-e-x|zoofilia|sexyongpin|grusskarte|geburtstagskarten|animalsex|".
"sex-with|dogsex|adultchat|adultlive|camsex|sexcam|livesex|sexchat|".
"chatsex|onlinesex|adultporn|adultvideo|adultweb.|hardcoresex|hardcoreporn|".
"teenporn|xxxporn|lesbiansex|livegirl|livenude|livesex|livevideo|camgirl|".
"spycam|voyeursex|casino-online|online-casino|kontaktlinsen|cheapest-phone|".
"laser-eye|eye-laser|fuelcellmarket|lasikclinic|cragrats|parishilton|".
"paris-hilton|paris-tape|fuel-dispenser|fueling-dispenser|".
"jinxinghj|telematicsone|telematiksone|a-mortgage|diamondabrasives|".
"reuterbrook|sex-plugin|sex-zone|lazy-stars|eblja|liuhecai|".
"buy-viagra|-cialis|-levitra|boy-and-girl-kissing|".  # These match spammy words
"dirare\.com|".            # This matches dirare.com, a spammer's domain name
"overflow\s*:\s*auto|".    # This matches overflow:auto (regardless of whitespace on either side of the colon)
"height\s*:\s*[0-4]px|".   # This matches height:0px through height:4px (most CSS hidden spam)
"(http:.*){16}|".          # ***** Limit total number of external links allowed per page to 15 - DOESN'T WORK!
"display\s*:\s*none".      # This matches display:none (regardless of whitespace on either side of the colon)
"/i";                      # The "/" ends the regular expression and the "i" switch which follows makes the test case-insensitive

It does block the other expressions, but I can still save articles with more than 15 links! I don't see what I'm doing wrong, Please help...


 * MediaWiki: 1.11.0
 * PHP: 5.2.6 (cgi-fcgi)
 * MySQL: 5.0.45-community-log

Thanks, Nathanael Bar-Aur L. 17:22, 7 October 2008 (UTC)
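For what it's worth, the "(http:.*){16}" part likely fails because "." does not match newlines by default in PCRE, so all sixteen links would have to sit on a single line. A sketch in Python, whose re module shares this default (the URLs are invented), illustrates the difference:

```python
import re

# Sixteen external links, one per line, as they would appear in wikitext.
text = "\n".join(f"http://spam.example/{i}" for i in range(16))

# Without DOTALL, '.' stops at each newline, so the repetition never
# reaches 16 and the page sails through the filter.
assert re.search(r'(http:.*){16}', text) is None

# With DOTALL (the PCRE /s modifier), '.' crosses newlines and the
# sixteenth link trips the pattern.
assert re.search(r'(http:.*){16}', text, re.DOTALL) is not None
```

In PHP terms that would mean adding the s modifier, i.e. ending the expression with "/is" instead of "/i", though note that the count then applies to the whole page rather than to links on one line.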

Not working for me
I simply put the following line in my settings:

$wgSpamRegex = "/suyash jain/i";

but it is not working.

Any help?
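For what it's worth, the expression itself is well-formed. A quick check in Python, whose re module treats this simple pattern exactly like PCRE (the sample text is made up), confirms it matches case-insensitively, so the problem is probably in how or where the variable is set rather than in the pattern:

```python
import re

# Same pattern body as the LocalSettings.php line above; IGNORECASE
# plays the role of the trailing /i modifier.
assert re.search(r'suyash jain', 'Page edited by Suyash Jain', re.IGNORECASE) is not None
```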

Profanity
Hey, anyone got any regex profanity checks out there?
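There is no canonical list, but a minimal sketch of the usual approach (shown in Python; the word list is purely illustrative, and real deployments generally borrow a maintained blacklist) might look like:

```python
import re

# Hypothetical profanity list; extend with whatever terms you need.
BAD_WORDS = ['damn', 'crap']

# \b word boundaries keep the pattern from firing on innocent longer
# words that merely contain a listed term (the "Scunthorpe problem").
profanity = re.compile(
    r'\b(?:' + '|'.join(map(re.escape, BAD_WORDS)) + r')\b',
    re.IGNORECASE,
)

assert profanity.search('Well, damn.') is not None
assert profanity.search('A crappy example') is None  # substring alone does not match
```

The equivalent $wgSpamRegex line would be along the lines of '/\b(?:damn|crap)\b/i'.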