Extension:SpamBlacklist

The SpamBlacklist extension prevents edits that contain URLs whose domains match regular expression patterns defined in specified files or wiki pages. When someone tries to save a page, SpamBlacklist checks the text against a (potentially very large) list of illegal host names. If there is a match, the extension displays an error message to the user and refuses to save the page.

Installation and setup
SpamBlacklist works with MediaWiki version 1.6.0 or greater. Note, however, that SpamBlacklist requires PHP 5.3 or higher, which means it may not be compatible with an otherwise supported installation of MediaWiki.

Installation

 * 1) Save the SpamBlacklist files to a subdirectory called SpamBlacklist in your extensions directory. You should have at least the following three files in that SpamBlacklist directory:
   * SpamBlacklist/SpamBlacklist.php
   * SpamBlacklist/SpamBlacklist_body.php
   * SpamBlacklist/SpamBlacklist.i18n.php
 * 2) Add the following line to LocalSettings.php in your MediaWiki root directory:
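That line is the require_once call referenced throughout the rest of this page:

  require_once( "$IP/extensions/SpamBlacklist/SpamBlacklist.php" );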

Setting the blacklist
The default source for SpamBlacklist's list of forbidden URLs is the Wikimedia spam blacklist on Meta-Wiki, at Spam blacklist. By default, the extension uses this list and reloads it once every 10-15 minutes. For many wikis, using this list will be enough to block most spamming attempts. However, since the Wikimedia blacklist is used by a diverse group of large wikis with hundreds of thousands of external links, it is comparatively conservative in the links it blocks.

The Wikimedia spam blacklist can only be edited by administrators, but you can suggest modifications to it at m:Talk:Spam blacklist.

You can also add bad URLs of your own. List them in the global variable $wgSpamBlacklistFiles in LocalSettings.php, AFTER the require_once( "$IP/extensions/SpamBlacklist/SpamBlacklist.php" ) line; see the examples below.

$wgSpamBlacklistFiles is an array, with each value containing either a URL, a filename or a database location.

If you set $wgSpamBlacklistFiles in LocalSettings.php, the default value (the Meta-Wiki "Spam blacklist" page) will no longer be used; if you want that blacklist to be accessed, you will have to add it manually, as in the examples below.

The local pages MediaWiki:Spam-blacklist and MediaWiki:Spam-whitelist will always be used, whatever additional files are listed.

Specifying a database location allows you to draw the blacklist from a page on your wiki.

The format of the database location specifier is "DB: [db name] [title]". [db name] should exactly match the value of $wgDBname in LocalSettings.php. You should create the required page name [title] in the default namespace of your wiki. If you do this, it is strongly recommended that you protect the page from general editing. Besides the obvious danger that someone may add a regex that matches everything, please note that an attacker with the ability to input arbitrary regular expressions may be able to generate segfaults in the PCRE library.
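For instance, a minimal sketch, assuming $wgDBname is "mywiki" and the blacklist page is called "My spam blacklist" (both names are placeholders):

  $wgSpamBlacklistFiles = array(
     "DB: mywiki My_spam_blacklist"
  );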

Examples
If you want to, for instance, use the English-language Wikipedia's spam blacklist in addition to the standard Meta-Wiki one, you could add the following to LocalSettings.php, AFTER the require_once( "$IP/extensions/SpamBlacklist/SpamBlacklist.php" ) line:
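A sketch of such a configuration, using the raw-export URLs of the two lists (the exact URLs, including the sb_ver parameter, are an assumption and should be checked against the current lists):

  $wgSpamBlacklistFiles = array(
     "https://meta.wikimedia.org/w/index.php?title=Spam_blacklist&action=raw&sb_ver=1",
     "https://en.wikipedia.org/w/index.php?title=MediaWiki:Spam-blacklist&action=raw&sb_ver=1"
  );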

...or this, which functions the same:
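The original alternative snippet is not preserved here; one equivalent way to build the same array is to append the entries one at a time:

  $wgSpamBlacklistFiles = array();
  $wgSpamBlacklistFiles[] = "https://meta.wikimedia.org/w/index.php?title=Spam_blacklist&action=raw&sb_ver=1";
  $wgSpamBlacklistFiles[] = "https://en.wikipedia.org/w/index.php?title=MediaWiki:Spam-blacklist&action=raw&sb_ver=1";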

Here's an example of an entirely local set of blacklists: the administrator is using the update script to generate a local file called "wikimedia_blacklist" that holds a copy of the Meta-Wiki blacklist, and has an additional blacklist on the wiki page "My spam blacklist":
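A sketch of such a configuration (the file path and database name are placeholders; adjust them to your installation):

  $wgSpamBlacklistFiles = array(
     "$IP/extensions/SpamBlacklist/wikimedia_blacklist",  // local copy fetched by the update script
     "DB: mywiki My_spam_blacklist"                        // the wiki page "My spam blacklist"
  );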

Issues
Because the blacklist may be long, the following line may need to be added to LocalSettings.php, probably BEFORE the require_once( "$IP/extensions/SpamBlacklist/SpamBlacklist.php" ) line:
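The exact line is not preserved on this page. A plausible reconstruction (an assumption, not confirmed here) is raising PHP's memory limit so the long blacklist and the regex built from it can be processed:

  ini_set( 'memory_limit', '64M' );  // assumption: the setting and value may differ from the original advice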

Whitelist
A corresponding whitelist can be maintained by editing the MediaWiki:Spam-whitelist page. This is useful if you would like to override certain entries from another wiki's blacklist that you are using.
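Whitelist entries use the same regex-fragment syntax as the blacklist. For example, adding a single line such as the following (a made-up domain) to MediaWiki:Spam-whitelist re-allows a host that an upstream blacklist would otherwise block:

  goodsite\.example\.org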

Author and license
SpamBlacklist was written by Tim Starling and is (deliberately) ambiguously licensed.

Blacklist syntax
If you would like to create a blacklist of your own, or modify an existing one, here is the syntax:

Everything on a line after a '#' character is ignored (for comments). All other strings are regex fragments which will only match inside URLs.
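A small sketch of what a blacklist page or file might contain (the patterns are made up for illustration):

  # Comment lines like this one are ignored
  online-?casino          # matches "onlinecasino" or "online-casino" anywhere inside a URL
  \bcheap-pills-example\.net\b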


Notes:
 * Do not add "http://"; an entry written that way would fail, since the regex only matches after "http://" (or "https://") inside URLs.
 * "www" is likewise unneeded, since the regex matches any subdomain. By giving "www\." explicitly you can match a specific subdomain.
 * The '^' and '$' anchors match the beginning and end of the domain name, not the beginning and end of the URL.
 * Slashes don't need to be escaped by backslashes; the script does this automatically.

The following line will block all URLs that contain the string "example.com", except where it is immediately preceded or followed by a word character (a letter, digit, or underscore):

  \bexample\.com\b

These are blocked:
 * http://www.example.com
 * http://www.this-example.com
 * http://www.google.de/search?q=example.com

These are not blocked:
 * http://www.goodexample.com
 * http://www.google.de/search?q=example.commodity

Performance
The extension combines the entire blacklist into a single regular expression, essentially one large alternation of all the blacklist lines (a rough illustration is sketched below; any slashes within the lines are escaped automatically). It saves this in a small "loader" file to avoid loading all the code on every page view. Page view performance will not be affected even if you're not using a bytecode cache like eAccelerator, although using a cache is strongly recommended for any MediaWiki installation.
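For illustration only (not the exact pattern the extension generates), the combined regex is roughly of this shape:

  /https?:\/\/[a-z0-9\-.]*(line 1|line 2|line 3|...)/Si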

The regex match itself generally adds an insignificant overhead to page saves (on the order of 100ms in our experience). However, loading the spam file from disk or the database, and constructing the regex, may take a significant amount of time depending on your hardware. If you find that enabling this extension slows down saves excessively, try installing a supported bytecode cache. The SpamBlacklist extension will cache the constructed regex if such a system is present.

If you're sharing a server and cache with several wikis, you may improve your cache performance by modifying getSharedBlacklists and clearCache in SpamBlacklist_body.php to use $wgSharedUploadDBname (or a specific DB if you do not have a shared upload DB) rather than $wgDBname. Be sure to get all references! The regexes from the separate MediaWiki:Spam-blacklist and MediaWiki:Spam-whitelist pages on each wiki will still be applied.
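A hypothetical sketch of the substitution being described (the actual code in SpamBlacklist_body.php is organized differently; this only illustrates keying the cache on the shared database name):

  // Use the shared DB name for the cache key when one is configured,
  // so all wikis on the server share one cached blacklist regex.
  $cacheDbName = ( $wgSharedUploadDBname !== false ) ? $wgSharedUploadDBname : $wgDBname;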

External blacklist servers (RBLs)
In its standard form, this extension requires that the blacklist be constructed manually. While regular expression wildcards are permitted, and a blacklist originated on one wiki may be re-used by many others, there is still some effort required to add new patterns in response to spam or remove patterns which generate false-positives.

Much of this effort may be reduced by supplementing the spam regex with lists of known domains advertised in spam e-mail. The regex will catch common patterns (like "casino-" or "-viagra") while the external blacklist server will automatically update with names of specific sites being promoted through spam.

In the filter function in SpamBlacklist_body.php, approximately halfway between the file start and end, is the section that performs the actual regex test on the extracted links.

Directly above that section, one could add additional code to check external RBL servers, along the lines of the sketch below:
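A minimal sketch of such a check, assuming the extracted links are available in an array at that point; the helper name and the RBL zone are illustrative, not part of the extension:

  // Hypothetical helper: returns the first host found on the DNS blacklist, or false.
  function wfCheckLinksAgainstRbl( array $links, $rblZone = 'multi.surbl.org' ) {
      foreach ( $links as $link ) {
          $host = parse_url( $link, PHP_URL_HOST );
          if ( !$host ) {
              continue;
          }
          // A listed domain resolves inside the RBL zone; an unlisted one does not.
          // Note: some RBLs expect the registered domain rather than the full host name.
          if ( checkdnsrr( $host . '.' . $rblZone, 'A' ) ) {
              return $host;
          }
      }
      return false;
  }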

This ensures that, if an edit contains URLs from already-blacklisted spam domains, an error is returned to the user indicating which link cannot be saved due to its appearance on an external spam blacklist. If nothing is found, the remaining regex tests are allowed to run normally, so that any manually-specified 'suspicious pattern' in the URL may be identified and blocked.

Note that the RBL servers list just the base domain names - not the full URL path - so http://example.com/casino-viagra-lottery.html will trigger RBL only if "example.com" itself were blacklisted by name by the external server. The regex, however, would be able to block on any of the text in the URL and path, from "example" to "lottery" and everything in between. Both approaches carry some risk of false-positives - the regex because of the use of wildcard expressions, and the external RBL as these servers are often created for other purposes - such as control of abusive spam e-mail - and may include domains which are not engaged in forum, wiki, blog or guestbook comment spam per se.

Other spam-fighting tools
There are various helpful manuals on mediawiki.org on combating spam and other vandalism:
 * Anti-spam features - includes link to the built-in $wgSpamRegex anti-spam mechanism.
 * Combating spam
 * Combating vandalism

Other anti-spam, anti-vandalism extensions include:
 * ConfirmEdit
 * Bad Behavior

This page also includes a rather thorough listing of the different tools available.

