Extension talk:Replace Text/Archive 2018

Large Wikis
For wikis with numerous pages, an approach based on jobs might be more appropriate. Just a thought. Jean-Lou Dupont 15:48, 29 April 2008 (UTC)


 * Asking administrators to set up a cron job might be overkill - what do you think about the idea of including a command-line script instead, for the case of large wikis? Yaron Koren 19:25, 7 May 2008 (UTC)
 * I was thinking about a MediaWiki custom job written as an extension: each page request (from any user) would trigger X number of replace tasks. This is the general way jobs under MediaWiki work. Jean-Lou Dupont 19:38, 7 May 2008 (UTC)


 * Oh, I wasn't even aware of that functionality. I just looked it up; that definitely clarifies some things about how templates and categories work... do you know if there's documentation somewhere on how to add jobs? Yaron Koren 19:44, 7 May 2008 (UTC)
 * I usually just look at the code in those cases ;-) Look at JobQueue.php. Jean-Lou Dupont 19:49, 7 May 2008 (UTC)


 * Okay, thanks; that looks ideal. It could be useful not just for this extension, but for functionality like SMW's refresh-semantic-data action, which currently runs only as a command-line script. Is there any code of yours I can look at that uses the job queue? Yaron Koren 19:55, 7 May 2008 (UTC)
 * Unfortunately no, but it looks straightforward enough. Jean-Lou Dupont 20:02, 7 May 2008 (UTC)
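For readers following along: a minimal sketch of what such a custom job could look like, assuming the MediaWiki 1.12-era Job API ($wgJobClasses, Job::run()). The class name and parameter names here are made up for illustration; this is not the extension's actual code.

 // In the extension's setup file: register a hypothetical job type.
 $wgJobClasses['replaceText'] = 'ReplaceTextJob';
 
 class ReplaceTextJob extends Job {
     function __construct( $title, $params ) {
         parent::__construct( 'replaceText', $title, $params );
     }
 
     // Run later by the job queue, a few jobs per page request.
     function run() {
         $article = new Article( $this->title );
         $text = $article->getContent();
         $new_text = str_replace( $this->params['target_str'],
             $this->params['replacement_str'], $text );
         if ( $new_text !== $text ) {
             $article->doEdit( $new_text, $this->params['edit_summary'] );
         }
         return true;
     }
 }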

Error
I have this error:

Fatal error: Maximum execution time of 30 seconds exceeded in path\includes\Database.php on line 681

--85.18.14.29 10:53, 4 May 2008 (UTC)
 * See Jean-Lou Dupont 11:04, 4 May 2008 (UTC)
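Regarding the timeout above: the 30-second limit is PHP's max_execution_time setting, not something in the extension itself. As a stopgap, and assuming your host allows it, you can raise the limit for the wiki, for example near the top of LocalSettings.php:

 // Allow long-running requests; pick a value that suits your wiki.
 set_time_limit( 300 );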

Regexp and others
Hi Yaron. Could you make this excellent extension support regular expressions, and add a checkbox for a preview mode or a list of the data that will be changed? :)--Roc michael 20:19, 8 May 2008 (UTC)
 * Hi Roc, I totally agree with you. In addition, I think we could have a way to select which namespaces or categories to search in; it could be a multiselect box, something like the first field here: http://lab.arc90.com/tools/multiselect/
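Just to make the regexp part of the request concrete: the replacement step could in principle use PHP's preg_replace() instead of str_replace(). A hypothetical sketch, not anything the extension currently does; $text, $target_str, $replacement_str and $user_pattern are illustrative names only:

 // Plain replacement (roughly what happens now):
 $new_text = str_replace( $target_str, $replacement_str, $text );
 
 // Regexp replacement, with a pattern supplied by the user
 // (the pattern would need validation/escaping of delimiters):
 $new_text = preg_replace( '/' . $user_pattern . '/', $replacement_str, $text );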

Finally!
I was looking for something like this; it's the only thing I would have needed bots for, and I'm too dumb to use those. Thanks!

Still, I get memory problems: on localhost I get this error after ~3 seconds:

Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 3672 bytes) in C:\xampp\htdocs\wiki12clean\includes\Title.php on line 142

On my server, where I have a PHP memory limit of 30 MB, it gives me a blank page. Error log:

PHP Fatal error: Maximum execution time of 30 seconds exceeded in includes/Database.php on line 818 (MW 1.12.0 / PHP 5.1) --Subfader 21:47, 10 May 2008 (UTC)
 * And I also support the "per namespace" feature. --Subfader 15:56, 11 May 2008 (UTC)
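Regarding the memory error above: the 33554432 bytes is PHP's default 32 MB memory_limit. While the extension is being made more efficient, a stopgap (assuming your host allows it) is to raise the limit in php.ini, or near the top of LocalSettings.php:

 // Raise PHP's memory limit for the wiki; adjust the value to your server.
 ini_set( 'memory_limit', '64M' );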

Version 0.2 released
Hi, to everyone who's had problems with the extension before, I just released a new version, 0.2. This version has two big changes: the replacement of text is a two-step process, with the user first shown a list of pages to be replaced so they can choose which ones to replace in; and actual replacement is done through MediaWiki jobs, as Jean-Lou Dupont suggested above. Both of these changes should combine to deal with many of the problems people have experienced, like server timeout, memory overload, and lack of control over replacements. Please try out this new version when you can, and let me know if it improves things for you. Yaron Koren 20:53, 12 May 2008 (UTC)
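For the curious, the job-based flow presumably amounts to queueing one job per page the user ticked on the confirmation form, roughly like this. This is only a sketch against the 1.12-era Job API, reusing the hypothetical ReplaceTextJob class sketched under "Large Wikis" above; the variable names are illustrative, not the extension's actual code.

 $jobs = array();
 foreach ( $chosen_titles as $title ) {
     $params = array(
         'target_str'      => $target_str,
         'replacement_str' => $replacement_str,
         'edit_summary'    => "Text replace: '$target_str' to '$replacement_str'"
     );
     $jobs[] = new ReplaceTextJob( $title, $params );
 }
 // The queued jobs then run a few at a time on later page requests.
 Job::batchInsert( $jobs );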

Still my memory problem
Thanks for the fast work. But still: Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 3672 bytes) in C:\xampp\htdocs\wiki12\includes\Title.php on line 142

I use MW 1.12.0rc, and I've seen that heavy changes have been made to Title.php in SVN these days. --Subfader 21:53, 12 May 2008 (UTC)


 * At which stage of the process do you see this error? Yaron Koren 22:59, 12 May 2008 (UTC)
 * Right after hitting Continue --Subfader 05:16, 13 May 2008 (UTC)
 * Okay, I think I see what's going on - the way the extension currently finds the list of pages is extremely inefficient. I think there's an easy fix, which I'll add to the next version; it should speed things up, and reduce memory usage, considerably. Yaron Koren 14:54, 13 May 2008 (UTC)
 * Maybe use a form of iterator to solve the issue? Insert a job in the queue with the 1st article as the target. When that job concludes, insert a new job with the next article ID, and so on. With this method, there's no more long-lead-time fetching. Of course, you could also fetch a handful of article IDs per job and process them. I would also suggest adding an entry in the log when the whole process is finished. Jean-Lou Dupont 15:01, 13 May 2008 (UTC)
 * Actually, this comes before jobs are used - when the list of pages containing the search string is found. Thanks for the previous jobs tip, though. Yaron Koren 15:50, 13 May 2008 (UTC)
 * Well, you could also have the job go through all the pages one by one; no more searching first and posting replace jobs afterwards. Jean-Lou Dupont 15:54, 13 May 2008 (UTC)
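A rough sketch of that idea, to make it concrete. It assumes the 1.12-era Database and Job APIs (wfGetDB, Db::select, Job::insert); the class name, parameter names and batch size are made up, and none of this is actual extension code.

 class ReplaceTextBatchJob extends Job {
     function __construct( $title, $params ) {
         parent::__construct( 'replaceTextBatch', $title, $params );
     }
 
     function run() {
         // Fetch a small batch of page IDs, starting where the previous job stopped.
         $dbr = wfGetDB( DB_SLAVE );
         $res = $dbr->select( 'page', 'page_id',
             array( 'page_id > ' . intval( $this->params['start_id'] ) ),
             __METHOD__,
             array( 'ORDER BY' => 'page_id', 'LIMIT' => 10 ) );
         $last_id = 0;
         while ( $row = $dbr->fetchObject( $res ) ) {
             $last_id = $row->page_id;
             // ... check this page for the target string and do the replacement here ...
         }
         if ( $last_id > 0 ) {
             // Chain the next batch; when no rows are left, the chain ends -
             // a natural place for the "whole process finished" log entry.
             $next = new ReplaceTextBatchJob( $this->title,
                 array( 'start_id' => $last_id ) + $this->params );
             $next->insert();
         }
         return true;
     }
 }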