Manual talk:Combating spam

Pages on meta.wikimedia.org
Were you not aware of these pages?
 * Anti-spam Features
 * Wiki Spam

Seems like a lot of duplicated effort creating this explanation.

Could we move those onto MediaWiki.org?

-- Halz 10:11, 9 October 2007 (UTC)


 * By the way, I am responsible for writing most of the text on those pages, and would be delighted if more people read them and linked to them (not because I wrote them, but because I want to increase awareness of the issues/remedies). So if they're more likely to be found on here, let's move them here.
 * I am also very happy for people to redistribute that content, so if we wanted to move those pages into the Public Domain Help Pages, that would be fine by me. I'm not the sole contributor. Others made various minor tweaks, so maybe there are legal problems with stripping the license... but I did write the bulk of the text.
 * -- Halz 10:55, 26 March 2008 (UTC)


 * It's a nice thought, but anti-spam information is probably outside the scope of the public domain help pages (considering their target audience). Would other wikis really benefit from duplicating that content (though they still can under the GFDL)? —Emufarmers 02:21, 27 March 2008 (UTC)

Anti-spam features is on this wiki now and yes, its content is highly duplicative of information which is also on this page. I'm also tempted to remove much of the commentary on "rel=nofollow" as (1) it is already 'on' in standard MediaWiki installations by default (and there's no useful purpose served by instructing on how to turn it off here) and (2) it doesn't stop spam. --Carlb 05:31, 8 September 2009 (UTC)
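For context, the default Carlb refers to is the $wgNoFollowLinks setting, which is already true in a standard installation, so every external link is emitted with rel="nofollow" without any extra configuration. A sketch of what that (redundant) line would look like in LocalSettings.php:

```php
<?php
// LocalSettings.php - shown for illustration only: this is already the default.
// When true, external links in wikitext are rendered as
// <a rel="nofollow" href="...">, telling search engines not to pass
// ranking credit to the linked site. As noted above, this deters link
// profiteering but does not by itself stop spam edits from being made.
$wgNoFollowLinks = true;
```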

Edit Filtering
With reference to the section on edit blocking, is there a way to block a specific string inside new page titles? For instance, I'm getting a lot of spam that has "jobs" or "job" in the title. Can I block new pages with this specific text in the title? FrankCarroll 05:57, 6 February 2011 (UTC)
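Not core MediaWiki, but one way to do this is the TitleBlacklist extension, which checks new page titles against a list of regular expressions (one per line, case-insensitive) on MediaWiki:Titleblacklist. The pattern below is a hypothetical example; since the entries use PHP's regex syntax, it can be sanity-checked with plain preg_match before deploying it:

```php
<?php
// Hypothetical TitleBlacklist-style pattern: reject any title containing
// "job" or "jobs" as a whole word. The /i flag makes it case-insensitive.
$pattern = '/\bjobs?\b/i';

// Titles like the spam described above would be caught:
var_dump((bool) preg_match($pattern, 'Freelance Jobs Online')); // bool(true)
var_dump((bool) preg_match($pattern, 'Best job offers'));       // bool(true)

// A legitimate title where "job" is only part of a longer word is not:
var_dump((bool) preg_match($pattern, 'Jobson family history')); // bool(false)
```

Where the AbuseFilter extension is installed, a filter testing page titles would be another option.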

Importing the spam list
There's a little hitch with the described process. I followed the process to the letter and uploaded it to my wiki, though there still seemed to be bots registering from IPs banned with that extension. After some searching, the only thing I can find is that php.net mentions that require and include need a 'proper' PHP file to work, and the described process doesn't mention the trailing ?> that I would assume should be added.

Does the wiki just skip incomplete extensions in such cases and is that why bots can still register, or is there something else going on? Reiisha 05:07, 15 March 2011 (UTC)


 * No, ?> is not required; only <?php is. What makes you think bots are registering with IPs you've blocked? Have you used CheckUser to check? —Emufarmers 06:29, 15 March 2011 (UTC)


 * I did use CheckUser and looked up the IPs in the PHP file. They're still registering with them. Reiisha 13:25, 15 March 2011 (UTC)
 * Since I added in the closing ?>, the extension started working correctly... Reiisha 23:07, 18 March 2011 (UTC)
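To make the exchange above concrete: the closing ?> really is optional in PHP (it is often deliberately omitted, because any whitespace after it is sent as output), while the opening <?php tag is mandatory. A minimal sketch of the kind of blacklist file under discussion, assuming it is pulled into LocalSettings.php with require_once; the file name and addresses are placeholders:

```php
<?php
// blockedips.php (hypothetical name), loaded from LocalSettings.php with:
//   require_once "$IP/blockedips.php";
// Addresses in $wgProxyList are treated as open proxies, which blocks
// them from editing. These are RFC 5737 documentation addresses, used
// here purely as placeholders.
$wgProxyList = array(
    '192.0.2.1',
    '198.51.100.23',
);
// No closing ?> tag is needed at the end of the file.
```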


 * I'm using MW 1.18, and tested things by blocking my own IP address. It turns out that I could still register, but that I only had read access.  Once I took my IP address out of $wgProxyList and refreshed the page, I could edit again.  Mr3641 19:32, 13 September 2011 (UTC)