Extension:SpamBlacklist

This extension comes with MediaWiki 1.21 and above. Thus you do not have to download it again. However, you still need to follow the other instructions provided.
MediaWiki extensions manual
SpamBlacklist
Release status: stable
Implementation: Page action
Description: Provides a regex-based spam filter
Author(s): Tim Starling (talk)
Latest version: continuous updates
Compatibility policy: release branches
MediaWiki: 1.26+
PHP: 5.4+
Database changes: No
License: GNU General Public License 2.0 or later
Download
README
Added rights:
  • spamblacklistlog
Hooks used:
  • EditFilterMergedContent
  • APIEditBeforeSave
  • EditFilter
  • ArticleSaveComplete
  • UserCanSendEmail
  • AbortNewAccount
Check usage and version matrix.
Issues: Open tasks · Report a bug

The SpamBlacklist extension prevents edits that contain URLs whose domains match regular expression patterns defined in specified files or wiki pages, and it prevents account registration by users with specified email addresses.

When someone tries to save a page, SpamBlacklist checks the text against a (potentially very large) list of illegal host names. If there is a match, the extension displays an error message and refuses to save the page.

Installation and configuration

Installation

  • Download and extract the files into a directory called "SpamBlacklist" inside your existing extensions/ directory.
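
Then enable the extension by adding the following line to LocalSettings.php; this is the wfLoadExtension() call referred to in the note below (MediaWiki 1.25 and later):

wfLoadExtension( 'SpamBlacklist' );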

For users of MediaWiki 1.24 or earlier:

These instructions describe the new way of installing extensions using wfLoadExtension(). If you need to install this extension on earlier versions (MediaWiki 1.24 and earlier), you must use the following instead of wfLoadExtension( 'SpamBlacklist' );:

require_once "$IP/extensions/SpamBlacklist/SpamBlacklist.php";

Setting the blacklist

The local pages MediaWiki:Spam-blacklist, MediaWiki:Spam-whitelist, MediaWiki:Email-blacklist and MediaWiki:Email-whitelist are always used, whatever additional sources are listed.

The default additional source for SpamBlacklist's list of forbidden URLs is the Wikimedia spam blacklist on Meta-Wiki, at m:Spam blacklist. By default, the extension uses this list and reloads it once every 10-15 minutes. For many wikis, using this list will be enough to block most spamming attempts. However, since the Wikimedia blacklist is used by a diverse group of large wikis with hundreds of thousands of external links, it is comparatively conservative in the links it blocks.

The Wikimedia spam blacklist can only be edited by administrators, but you can suggest modifications to it at m:Talk:Spam blacklist.

You can add other bad URLs on your own wiki. List them in the global variable $wgSpamBlacklistFiles in LocalSettings.php, AFTER the require_once "$IP/extensions/SpamBlacklist/SpamBlacklist.php"; or wfLoadExtension( 'SpamBlacklist' ); call. See the examples below.

$wgSpamBlacklistFiles is an array, with each value containing either a URL, a filename or a database location.

If you use $wgSpamBlacklistFiles in LocalSettings.php, the default value of "[[m:Spam blacklist]]" will no longer be used; if you want that blacklist to be consulted, you will have to add it manually (see the examples below).

Specifying a database location allows you to draw the blacklist from a page on your wiki.

The format of the database location specifier is "DB: [db name] [title]". [db name] should exactly match the value of $wgDBname in LocalSettings.php. You should create the required page name [title] in the default namespace of your wiki. If you do this, it is strongly recommended that you protect the page from general editing. Besides the obvious danger that someone may add a regex that matches everything, please note that an attacker with the ability to input arbitrary regular expressions may be able to generate segfaults in the PCRE library.
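
For instance, assuming $wgDBname is "wikidb" and the list is kept on a protected page called "My_spam_blacklist" (the same pairing used in the fully local example further below), the entry would look like this:

$wgSpamBlacklistFiles = array(
   "DB: wikidb My_spam_blacklist"
);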

Examples

If you want to, for instance, use the English-language Wikipedia's spam blacklist in addition to the standard Meta-Wiki one, you could add the following to LocalSettings.php, AFTER the require_once "$IP/extensions/SpamBlacklist/SpamBlacklist.php" or wfLoadExtension( 'SpamBlacklist' ) call:

$wgSpamBlacklistFiles = array(
   "[[m:Spam blacklist]]",
   "https://en.wikipedia.org/wiki/MediaWiki:Spam-blacklist"
);

...or this, which functions the same:

$wgBlacklistSettings = [
	'spam' => [
		'files' => [
			"https://meta.wikimedia.org/w/index.php?title=Spam_blacklist&action=raw&sb_ver=1",
			"https://en.wikipedia.org/w/index.php?title=MediaWiki:Spam-blacklist&action=raw&sb_ver=1"
		],
	],
];

Here's an example of an entirely local set of blacklists: the administrator is using the update script to generate a local file called "wikimedia_blacklist" that holds a copy of the Meta-Wiki blacklist, and has an additional blacklist on the wiki page "My spam blacklist":

require_once "$IP/extensions/SpamBlacklist/SpamBlacklist.php";
$wgBlacklistSettings = [
	'spam' => [
		'files' => [
			"$IP/extensions/SpamBlacklist/wikimedia_blacklist", // Wikimedia's list
			   //  database      title
			"DB: wikidb My_spam_blacklist",    
		],
	],
];

Troubleshooting

If you encounter problems with the blacklist, you may want to increase the PCRE backtrack limit. Bear in mind, however, that this can reduce your protection against denial-of-service (DoS) attacks, since the backtrack limit exists as a performance safeguard:

// Bump the Perl Compatible Regular Expressions backtrack memory limit                                                                                  
// (PHP 5.3.x default, 1000K, is too low for SpamBlacklist)                                                                                              
ini_set( 'pcre.backtrack_limit', '8M' );

Whitelist

A corresponding whitelist can be maintained by editing the page MediaWiki:Spam-whitelist. This is useful if you want to override some entries on another wiki's blacklist that you are using. Wikimedia wikis, for instance, sometimes use the spam blacklist for purposes other than combating spam.
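
Whitelist entries use the same syntax as blacklist entries (see "Blacklist syntax" below). For example, a hypothetical entry that re-allows links to example.org despite a broader pattern on a shared blacklist:

example\.org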

It is questionable how effective the Wikimedia spam blacklists are at keeping spam off of third-party wikis. Some spam might be targeted only at Wikimedia wikis, or only at third-party wikis, which would make Wikimedia's blacklist of little help to said third-party wikis in those cases. Also, some third-party wikis might prefer that users be allowed to cite sources that are not considered reliable on Wikipedia, or that Wikipedia has considered so ideologically offensive as to warrant blacklisting. Sometimes what one wiki considers useless spam, another wiki might consider useful.

Users may not always realize that, when a link is rejected as spammy, it does not necessarily mean that the individual wiki they are editing has specifically chosen to ban that URL. Therefore, wiki system administrators may want to edit the system messages at MediaWiki:Spamprotectiontext and/or MediaWiki:Spamprotectionmatch on their wiki to invite users to make suggestions at MediaWiki talk:Spam-whitelist for pages that a sysop should add to the whitelist. For example, you could put the following in MediaWiki:Spamprotectiontext:

The text you wanted to save was blocked by the spam filter. This is probably caused by a link to a blacklisted external site. {{SITENAME}} maintains [[MediaWiki:Spam-blacklist|its own blacklist]]; however, most blacklisting is done by means of [[metawikimedia:Spam-blacklist|Meta-Wiki's blacklist]], so this block should not necessarily be construed as an indication that {{SITENAME}} made a decision to block this particular text (or URL). If you would like this text (or URL) to be added to [[MediaWiki:Spam-whitelist|the local spam whitelist]], so that {{SITENAME}} users will not be blocked from adding it to pages, please make a request at [[MediaWiki talk:Spam-whitelist]]. A [[Project:Sysops|sysop]] will then respond on that page with a decision as to whether it should be whitelisted.

Author and license

SpamBlacklist was originally written by Tim Starling and licensed under the GPL version 2 or any later version (at your option).

Notes

  • This extension examines only the new external links added by wiki editors. To check user agents, add Bad Behavior or Akismet, and to check an editor's IP address against lists of known spambots, supplement this with Check Spambots. As the various tools for combating spam on MediaWiki use different methods to spot abuse, the safeguards are best used in combination.
  • The Extension:SpamBlacklist/update script is a cron script that can automate updates from shared blacklists. If you are using memcached, you will also have to delete the spam_blacklist_regexes key (for example, using maintenance/mcc.php); a sketch follows this list.
  • It is not possible to exempt certain users from the spam blacklist. See bugzilla:34928.
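
A minimal sketch of clearing that key from PHP, assuming the pre-1.27 $wgMemc global and assuming the key is namespaced by the wiki's database name as described under Performance below (the exact key construction may vary between versions):

// Hypothetical cache-key layout: "<db name>:spam_blacklist_regexes"
$wgMemc->delete( "$wgDBname:spam_blacklist_regexes" );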

Usage

Blacklist syntax

If you want to create a custom blacklist, or modify an existing one, here is the syntax:

Everything on a line after the "#" character is ignored (this is used for comments). Any other text string is interpreted as a regular expression fragment that is matched only against URLs.

Notes
  • Do not add "http://". The entry would fail, since the regular expression is only matched against the part of each URL after "http://" (or "https://").
  • Likewise, "www" is not needed, since the regular expression matches any subdomain. By writing "www\." explicitly, you can match specific subdomains.
  • The anchors (?<=//|\.) and $ match the beginning and end of the domain name, not the beginning and end of the URL. The ordinary ^ anchor has no use.
  • Slashes do not need to be escaped with backslashes; the script does this automatically.
Example

The following line will block all URLs that contain the string "example.com", except where it is immediately preceded or followed by a letter or a number.

\bexample\.com\b

These are blocked:

  • http://www.example.com
  • http://www.this-example.com
  • http://www.google.de/search?q=example.com

These are not blocked:

  • http://www.goodexample.com
  • http://www.google.de/search?q=example.commodity

Performance

The extension creates a single regular expression of the form /https?:\/\/[a-z0-9\-.]*(line 1|line 2|line 3|...)/Si (all slashes within the lines are escaped automatically). It stores this in a small "loader" file to avoid loading all the code on every page view. Page view performance will not be affected even if you're not using a bytecode cache, although using a cache is strongly recommended for any MediaWiki installation.
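
An illustrative sketch of that construction (not the extension's actual code; the fragments and variable names here are invented for the example):

# Hypothetical blacklist fragments, as they would appear on a list page
$lines = array( '\bexample\.com\b', 'casino-' );
# Escape any slashes in each fragment, then join them into one alternation
$escaped = array();
foreach ( $lines as $line ) {
	$escaped[] = str_replace( '/', '\/', $line );
}
$regex = '/https?:\/\/[a-z0-9\-.]*(' . implode( '|', $escaped ) . ')/Si';
# int(1): www.example.com is caught; goodexample.com would not be, thanks to \b
var_dump( preg_match( $regex, 'http://www.example.com/page' ) );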

The regex match itself generally adds an insignificant overhead to page saves (on the order of 100ms in our experience). However, loading the spam file from disk or the database, and constructing the regex, may take a significant amount of time depending on your hardware. If you find that enabling this extension slows down saves excessively, try installing a supported bytecode cache. The SpamBlacklist extension will cache the constructed regex if such a system is present.

If you're sharing a server and cache with several wikis, you may improve your cache performance by modifying getSharedBlacklists and clearCache in SpamBlacklist_body.php to use $wgSharedUploadDBname (or a specific DB if you do not have a shared upload DB) rather than $wgDBname. Be sure to get all references! The regexes from the separate MediaWiki:Spam-blacklist and MediaWiki:Spam-whitelist pages on each wiki will still be applied.
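
The gist of that modification, as a hedged sketch only (the actual code in SpamBlacklist_body.php differs and changes between versions): key the cached regex on the shared database name, so every wiki on the server reuses the same cache entry.

// Sketch: prefer the shared DB name for the cache key, falling back to this
// wiki's own DB name when no shared upload DB is configured.
global $wgSharedUploadDBname, $wgDBname;
$keyDb = $wgSharedUploadDBname ? $wgSharedUploadDBname : $wgDBname;
$cacheKey = "$keyDb:spam_blacklist_regexes";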

External blacklist servers (RBL)

In its standard form, this extension requires that the blacklist be constructed manually. While regular expression wildcards are permitted, and a blacklist originated on one wiki may be re-used by many others, there is still some effort required to add new patterns in response to spam or to remove patterns that generate false positives.

Much of this effort may be reduced by supplementing the spam regex with lists of known domains advertised in spam e-mail. The regex will catch common patterns (like "casino-" or "-viagra") while the external blacklist server will automatically update with names of specific sites being promoted through spam.

In the filter() function in SpamBlacklist_body.php, approximately halfway between the file start and end, are the lines:

       # Do the match
       wfDebugLog( 'SpamBlacklist', "Checking text against " . count( $blacklists ) .
           " regexes: " . implode( ', ', $blacklists ) . "\n" );

Directly above this section (which does the actual regex test on the extracted links), one could add additional code to check the external RBL servers:

        # Do RBL checks
        $retVal = false;
        # DNSBL zones to query; note the trailing dots. URI blacklists such as
        # SURBL and URIBL are queried with the bare domain prepended to the zone.
        $wgAreBelongToUs = array( 'l1.apews.org.', 'multi.surbl.org.', 'multi.uribl.com.' );
        foreach ( $addedLinks as $link ) {
            $parts = parse_url( $link );
            $link_url = isset( $parts['host'] ) ? $parts['host'] : null;
            if ( $link_url ) {
                foreach ( $wgAreBelongToUs as $base ) {
                    $host = "$link_url.$base";
                    # A DNSBL lists a domain by answering the lookup with an A record
                    $ipList = gethostbynamel( $host );
                    if ( $ipList ) {
                        wfDebug( "RBL match: Hostname $host is {$ipList[0]}, it's spam says $base!\n" );
                        $ip = wfGetIP();
                        wfDebugLog( 'SpamBlacklistHit', "$ip caught submitting spam: {$link_url} per RBL {$base}\n" );
                        $retVal = $link_url . ' (blacklisted by ' . $base . ')';
                        wfProfileOut( $fname );
                        return $retVal;
                    }
                }
            }
        }

        # If no match is found on any RBL server, continue normally with the regex tests...

This ensures that, if an edit contains URLs from already-blacklisted spam domains, an error is returned to the user indicating which link cannot be saved due to its appearance on an external spam blacklist. If nothing is found, the remaining regex tests are allowed to run normally, so that any manually-specified 'suspicious pattern' in the URL may be identified and blocked.

Note that the RBL servers list just the base domain names, not the full URL path, so http://example.com/casino-viagra-lottery.html will trigger the RBL check only if "example.com" itself is blacklisted by name on the external server. The regex, however, can block on any of the text in the URL and path, from "example" to "lottery" and everything in between. Both approaches carry some risk of false positives: the regex because of the use of wildcard expressions, and the external RBL because these servers are often created for other purposes, such as control of abusive spam e-mail, and may include domains which are not engaged in forum, wiki, blog or guestbook comment spam per se.


Other anti-spam tools

There are various helpful manuals on mediawiki.org on combating spam and other vandalism.

A number of other anti-spam and anti-vandalism extensions are also available.

See also