Extension:SpamBlacklist/fr

The SpamBlacklist extension prevents edits that contain URLs whose domains match regular expressions defined in specified files or pages, and blocks the creation of user accounts that use specified email addresses.

When a contributor tries to save a page, the extension checks the text against a (potentially very long) list of forbidden host names. If a match is found, the extension displays an error message to the user and refuses to save the page.



Installation


Defining the block list
The following local pages are always used, regardless of any additional sources listed:
 * MediaWiki:Spam-blacklist
 * MediaWiki:Spam-whitelist
 * MediaWiki:Email-blacklist
 * MediaWiki:Email-whitelist

The default additional source for the URL block list is Meta-Wiki's Spam blacklist. By default the extension uses this list, reloading it every 10 to 15 minutes. For many wikis, this list will be enough to block most spamming attempts. However, since the Wikimedia block list is used by a diverse group of large wikis with hundreds of thousands of external links, it is comparatively conservative in the links it blocks.

The Wikimedia spam block list can only be edited by administrators, but you can suggest modifications to the block list at m:Talk:Spam blacklist.

You can block additional URLs on your own wiki by listing them in the extension's block list settings variable in LocalSettings.php. See the examples below.

This setting is a two-level array. The top-level key selects the block list type (URL or email). Each entry takes an array whose values contain either a URL, a filename, or a database location.

If you set this variable in LocalSettings.php, the default Meta-Wiki Spam blacklist source will no longer be used; if you want that block list to be consulted as well, you will have to add it back manually (see the examples below).

Specifying a database location allows you to draw the block list from a page on your wiki.

The format of the database location specifier is "DB: [db name] [title]". [db name] should exactly match your wiki's database name as configured in LocalSettings.php. You should create the page [title] in the default namespace of your wiki. If you do this, it is strongly recommended that you protect the page from general editing. Besides the obvious danger that someone may add a regex that matches everything, note that an attacker who can input arbitrary regular expressions may be able to trigger segfaults in the PCRE library.
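As a sketch, a DB-backed source could be configured as follows. The `$wgBlacklistSettings` variable name and array layout are taken from current versions of the extension (verify against your version); "wikidb" and the page title are example names:

```php
// LocalSettings.php: read an extra URL block list from the wiki page
// [[Spam_Blacklist_Page]] in the database "wikidb" (both names are examples).
$wgBlacklistSettings = [
    'spam' => [
        'files' => [
            'DB: wikidb Spam_Blacklist_Page',
        ],
    ],
];
```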

Examples
If you want, for instance, to use the English-language Wikipedia's spam block list in addition to the standard Meta-Wiki one, you could add the following to LocalSettings.php, after the call that loads the extension:
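A sketch of that configuration, assuming the extension's `$wgBlacklistSettings` variable (check your version's documentation for the exact name); both raw-page URLs are the standard export addresses of the respective block list pages:

```php
// LocalSettings.php, after wfLoadExtension( 'SpamBlacklist' );
// Keep the default Meta-Wiki list and add the English Wikipedia's list.
$wgBlacklistSettings = [
    'spam' => [
        'files' => [
            'https://meta.wikimedia.org/w/index.php?title=Spam_blacklist&action=raw&sb_ver=1',
            'https://en.wikipedia.org/w/index.php?title=MediaWiki:Spam-blacklist&action=raw&sb_ver=1',
        ],
    ],
];
```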

Here is an example of an entirely local set of block lists: the administrator periodically downloads a copy of the Meta-Wiki block list into a local file called "wikimedia_blacklist", and keeps an additional block list on the wiki page "My spam block list":
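A sketch of that setup, under the same assumption about the `$wgBlacklistSettings` variable name; `$IP` is MediaWiki's standard installation-path global, and "wikidb" is an example database name:

```php
// LocalSettings.php: no remote fetches at save time. One source is a locally
// mirrored copy of the Meta-Wiki list; the other is the on-wiki page
// [[My spam block list]] in the database "wikidb".
$wgBlacklistSettings = [
    'spam' => [
        'files' => [
            "$IP/extensions/SpamBlacklist/wikimedia_blacklist",
            'DB: wikidb My_spam_block_list',
        ],
    ],
];
```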

Logging
By default, the extension does not record hits in the spam blacklist log. To enable logging, set the corresponding configuration variable. A dedicated user right can be used to control access to the logs; by default, every signed-in user can view them.
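A one-line LocalSettings.php fragment should suffice; the `$wgLogSpamBlacklistHits` name is taken from current versions of the extension, so verify it against the version you run:

```php
// LocalSettings.php: record spam blacklist hits in the spamblacklist log.
$wgLogSpamBlacklistHits = true;
```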

Issues


Backtrack limits
If you encounter issues with the block list, you may want to increase the PCRE backtrack limit. Be aware, however, that this can reduce your protection against denial-of-service attacks, since the backtrack limit is a performance safeguard.
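A minimal fragment for LocalSettings.php; `pcre.backtrack_limit` is a standard PHP ini setting (default 1,000,000), and the value shown is only an illustrative choice:

```php
// LocalSettings.php: raise the PCRE backtrack limit so very long block
// lists can be matched. Higher values trade away some DoS resistance.
ini_set( 'pcre.backtrack_limit', '10000000' );
```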



Hardened wikis
SpamBlacklist will not allow editing if the wiki is hardened. Hardening includes restricting PHP so that the cURL extension is unavailable, and disabling the allow_url_fopen ini setting.

In the hardened case, SpamBlacklist causes an exception when Guzzle attempts to make a network request. The Guzzle exception message is: "GuzzleHttp requires cURL, the allow_url_fopen ini setting, or a custom HTTP handler."



Safe list
A corresponding safe list can be maintained by editing the page MediaWiki:Spam-whitelist. This is useful if you would like to override certain entries from another wiki's block list that you are using. Wikimedia wikis, for instance, sometimes use the spam block list for purposes other than combating spam.

It is questionable how effective the Wikimedia spam block lists are at keeping spam off third-party wikis. Some spam is targeted only at Wikimedia wikis, or only at third-party wikis, which makes Wikimedia's block list of little help to third-party wikis in those cases. Also, some third-party wikis may prefer to let users cite sources that Wikipedia does not allow. What one wiki considers useless spam, another may consider useful.

Users may not always realize that, when a link is rejected as spammy, it does not necessarily mean that the individual wiki they are editing has specifically chosen to ban that URL. Wiki system administrators may therefore want to edit the messages at MediaWiki:Spamprotectiontext and/or MediaWiki:Spamprotectionmatch on their wiki to invite users to make suggestions at MediaWiki talk:Spam-whitelist for pages that should be added to the safe list by a sysop. For example, MediaWiki:Spamprotectiontext could read:


 * The text you wanted to save was blocked by the spam filter. This is probably caused by a link to a blacklisted external site. This wiki maintains its own block list; however, most blocking is done by means of Meta-Wiki's block list, so this block should not necessarily be construed as an indication that this wiki made a decision to block this particular text (or URL). If you would like this text (or URL) to be added to the local spam safe list, so that users here will not be blocked from adding it to pages, please make a request at MediaWiki talk:Spam-whitelist. A sysop will then respond on that page with a decision as to whether it should be listed as safe.

Syntax
If you would like to create a block list of your own, or modify an existing one, here is the syntax:

Everything following a '#' character on a line is treated as a comment. Every other line is a regular-expression fragment that is matched only against URLs.


 * Notes:
 * An anchor character at the start of the regular expression serves no purpose.
 * Do not add "http://"; this would fail, since the regex is matched after "http://" (or "https://") inside URLs.
 * A leading "www" is likewise unnecessary, since the regex covers all subdomains. By giving "www\." explicitly, you can match specific subdomains.
 * The ^ and $ anchors match the beginning and end of the domain name, not the beginning and end of the URL.
 * Slashes do not need to be escaped with backslashes; the script does this automatically.
 * The spam blacklist runs before abuse filters, so blacklisted domains will not appear in the abuse filter log (Special:AbuseLog), only in Special:Log/spamblacklist.


 * Example:

The following line will block all URLs that contain the string "example.com", except where it is immediately preceded or followed by a letter or a number.

\bexample\.com\b

These are blocked:


 * http://www.example.com
 * http://www.this-example.com
 * http://www.google.de/search?q=example.com

These are not blocked:


 * http://www.goodexample.com
 * http://www.google.de/search?q=example.commodity

Performance
The extension combines all lines into a single regex statement (slashes within the lines are escaped automatically). It saves this in a small "loader" file to avoid loading all the code on every page view. Page view performance will not be affected even if you are not using a bytecode cache, although using one is strongly recommended for any MediaWiki installation.

Matching against the regular expressions adds a small amount of time to page saves (on the order of 100 ms in our tests). However, loading the spam file from disk or the database and constructing the regexes can take some time depending on your hardware. If you find that enabling this extension slows down saving noticeably, try installing a compatible bytecode cache; the extension will cache the constructed regexes if such a system is present.

If you are sharing a server and cache with several wikis, you may improve cache performance by modifying getSharedBlacklists and clearCache in SpamBlacklist_body.php to use the shared database (or a specific DB if you do not have a shared upload DB) rather than the local one. Be sure to get all references! The regexes from the separate MediaWiki:Spam-blacklist and MediaWiki:Spam-whitelist pages on each wiki will still be applied.



External block list (RBL) servers
In its standard form, this extension requires that the block list be constructed manually. While regular expression wildcards are permitted, and a block list originated on one wiki may be re-used by many others, there is still some effort required to add new patterns in response to spam or remove patterns which generate false-positives.

Much of this effort may be reduced by supplementing the spam regex with lists of known domains advertised in spam email. The regex will catch common patterns (like "casino-" or "-viagra") while the external block list server will automatically update with names of specific sites being promoted through spam.

In the filter function in includes/SpamBlacklist.php, approximately halfway through the file, is the section that performs the actual regex test on the extracted links.

Directly above this section, additional code could be added to check the external RBL servers.
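A hypothetical sketch of such a check, not part of the extension's actual code: it queries SURBL's public "multi" zone (a real RBL; substitute the list you use) for each extracted link's host. The helper name and the `$urls` parameter are illustrative inventions:

```php
// Hypothetical helper: return the first host among $urls that the RBL
// reports as listed, or null if none are. RBLs are queried via DNS by
// looking up "<domain>.<rbl zone>"; an A record in the answer means
// the domain is listed.
function findRblListedHost( array $urls ): ?string {
    foreach ( $urls as $url ) {
        $host = parse_url( $url, PHP_URL_HOST );
        if ( !is_string( $host ) || $host === '' ) {
            continue;
        }
        if ( checkdnsrr( $host . '.multi.surbl.org.', 'A' ) ) {
            return $host;
        }
    }
    return null;
}
```

A call placed just before the regex test could then abort the save and report the listed host to the user; note the lookup is network-dependent, so results vary with DNS availability.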

This ensures that, if an edit contains URLs from already blocked spam domains, an error is returned to the user indicating which link cannot be saved due to its appearance on an external spam block list. If nothing is found, the remaining regex tests are allowed to run normally, so that any manually-specified 'suspicious pattern' in the URL may be identified and blocked.

Note that the RBL servers list only base domain names, not full URL paths, so http://example.com/casino-viagra-lottery.html will trigger the RBL only if "example.com" itself is blocked by name by the external server. The regex, however, can block on any of the text in the URL and path, from "example" to "lottery" and everything in between. Both approaches carry some risk of false positives: the regex because of its wildcard expressions, and the external RBL because these servers are often created for other purposes, such as control of abusive spam email, and may include domains that are not engaged in forum, wiki, blog or guestbook comment spam per se.



Other tools to combat spam
You will find various useful manuals on mediawiki.org to help you combat spam and other kinds of vandalism:


 * - includes a link to the built-in anti-spam mechanism.

Other anti-spam and anti-vandalism extensions include:


 * System messages