Extension talk:Replace Text/Archive 2018

Large Wikis
For wikis with numerous pages, an approach based on jobs might be more appropriate. Just a thought. Jean-Lou Dupont 15:48, 29 April 2008 (UTC)


 * Asking administrators to set up a cron job might be overkill - what do you think about the idea of including a command-line script instead, for the case of large wikis? Yaron Koren 19:25, 7 May 2008 (UTC)
 * I was thinking about a MediaWiki custom job written as an extension: each page request (from any user) would trigger X number of replace tasks. This is the general way jobs under MediaWiki work. Jean-Lou Dupont 19:38, 7 May 2008 (UTC)


 * Oh, I wasn't even aware of that functionality. I just looked it up; that definitely clarifies some things about how templates and categories work... do you know if there's documentation somewhere on how to add jobs? Yaron Koren 19:44, 7 May 2008 (UTC)
 * I usually just look at the code in those cases ;-) Look at JobQueue.php. Jean-Lou Dupont 19:49, 7 May 2008 (UTC)


 * Okay, thanks; that looks ideal. It could be useful not just for this extension, but for functionality like SMW's refresh-semantic-data action, which currently runs only as a command-line script. Is there any code of yours I can look at that uses the job queue? Yaron Koren 19:55, 7 May 2008 (UTC)
 * Unfortunately, no, but it looks straightforward enough. Jean-Lou Dupont 20:02, 7 May 2008 (UTC)
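For anyone reading along, the job-queue mechanism discussed above boils down to subclassing Job and registering the job type. A minimal sketch, assuming the MW 1.12-era API in JobQueue.php — the class name, job name, and parameter keys here are hypothetical, not necessarily what the extension ended up using:

```php
<?php
// Hypothetical sketch of a MediaWiki job class, per the discussion above.
// Assumes the MW 1.12-era Job API (JobQueue.php); all names are illustrative.
class ReplaceTextJob extends Job {
	function __construct( $title, $params = '', $id = 0 ) {
		parent::__construct( 'replaceText', $title, $params, $id );
	}

	// Called by the job runner (triggered by ordinary page requests)
	// until the queue is drained.
	function run() {
		// ...load the page, perform the replacement, save the edit...
		return true;
	}
}

// In the extension's setup file, register the job type:
$wgJobClasses['replaceText'] = 'ReplaceTextJob';

// Queueing one job per affected page:
$job = new ReplaceTextJob( $title,
	array( 'target_str' => 'old text', 'replacement_str' => 'new text' ) );
$job->insert();
```

Each ordinary page request then works off a few queued jobs, which is what spreads the load out over time.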

Error
I have this error:

Fatal error: Maximum execution time of 30 seconds exceeded in path\includes\Database.php on line 681

--85.18.14.29 10:53, 4 May 2008 (UTC)


 * See Jean-Lou Dupont 11:04, 4 May 2008 (UTC)

Regexp and others
Hi Yaron. Could you make this excellent extension support regexps, and add a checkbox giving it a preview mode, or a list of the data that will be changed? :) --Roc michael 20:19, 8 May 2008 (UTC)
 * Hi Roc, I totally agree with you. In addition, I think we could have a way to select which namespaces or categories to search in; it could be a multiselect box, something like this (first field): http://lab.arc90.com/tools/multiselect/
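A rough illustration of what regexp support with a dry-run preview could look like, in plain PHP. This is not the extension's code, just a sketch of the requested feature; the sample text and pattern are made up:

```php
<?php
// Sketch: preview regexp replacements before committing them.
// Not the extension's actual code; just illustrates the idea.
$pageText    = "MediaWiki is great. mediawiki is flexible.";
$pattern     = '/mediawiki/i';   // case-insensitive regexp
$replacement = 'MediaWiki';

// Preview mode: list every match that would be changed, without saving.
preg_match_all( $pattern, $pageText, $matches );
foreach ( $matches[0] as $match ) {
	echo "Would replace: $match\n";
}

// Actual replacement, run only after the user confirms.
$newText = preg_replace( $pattern, $replacement, $pageText );
echo $newText . "\n";
```

The two-step shape (match and list first, replace only on confirmation) is exactly the checkbox/preview behavior being asked for.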

Finally!
I was looking for something like this, since this would be the only reason to use bots, which I'm too dumb to set up. Thanks!

Still, I get memory problems. On localhost I get this error after ~3 seconds:

Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 3672 bytes) in C:\xampp\htdocs\wiki12clean\includes\Title.php on line 142

On my server, where I have a PHP memory limit of 30 MB, it gives me a blank page. Error log:

PHP Fatal error: Maximum execution time of 30 seconds exceeded in includes/Database.php on line 818 (MW 1.12.0 / PHP 5.1) --Subfader 21:47, 10 May 2008 (UTC)
 * And I also support the "per namespace" feature. --Subfader 15:56, 11 May 2008 (UTC)
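For what it's worth, the two fatals quoted above correspond to two PHP runtime limits that can be raised while a large replace runs. A possible workaround, using standard PHP functions (the values below are only examples, not recommendations):

```php
// In LocalSettings.php (or the equivalent settings in php.ini):
// raising PHP's limits can work around the two fatals quoted above.
// Example values only - tune them to your server.
ini_set( 'memory_limit', '128M' ); // "Allowed memory size ... exhausted"
set_time_limit( 300 );             // "Maximum execution time ... exceeded"
```

This only papers over the symptom, of course; the real fix is doing the work in smaller pieces, as discussed in the job-queue thread above.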

Version 0.2 released
Hi, to everyone who's had problems with the extension before, I just released a new version, 0.2. This version has two big changes: the replacement of text is a two-step process, with the user first shown a list of pages to be replaced so they can choose which ones to replace in; and actual replacement is done through MediaWiki jobs, as Jean-Lou Dupont suggested above. Both of these changes should combine to deal with many of the problems people have experienced, like server timeout, memory overload, and lack of control over replacements. Please try out this new version when you can, and let me know if it improves things for you. Yaron Koren 20:53, 12 May 2008 (UTC)

Still my memory problem
Thanks for the fast work. But still: Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 3672 bytes) in C:\xampp\htdocs\wiki12\includes\Title.php on line 142

I use MW 1.12.0rc, and I saw that heavy changes have been made to Title.php in SVN recently. --Subfader 21:53, 12 May 2008 (UTC)


 * At which stage of the process do you see this error? Yaron Koren 22:59, 12 May 2008 (UTC)
 * Right after hitting Continue --Subfader 05:16, 13 May 2008 (UTC)
 * Okay, I think I see what's going on - the way in which the extension currently finds the list of pages is extremely inefficient. I think there's an easy fix, which I'll add to the next version, which should speed things up, and reduce memory usage, considerably. Yaron Koren 14:54, 13 May 2008 (UTC)
 * Maybe use a form of iterator to solve the issue? Insert a job in the queue with 1st article as target. When a job concludes, insert a new job with the next article id and so on. With this method, no more long-lead-time-fetching. Of course, you could fetch a handful of article IDs per-job and process them as well.  I would also suggest adding an entry in the log when the whole process is finished. Jean-Lou Dupont 15:01, 13 May 2008 (UTC)
 * Actually, this comes before jobs are used - when the list of pages containing the search string is found. Thanks for the previous jobs tip, though. Yaron Koren 15:50, 13 May 2008 (UTC)
 * Well, you could also have the job go through all pages one by one; no need to search first and post a replace job afterwards. Jean-Lou Dupont 15:54, 13 May 2008 (UTC)
 * I don't have much of a clue, but would a preview still be possible job by job? Speed is no issue with this extension, imo, since a bot isn't fast either. So if the preview took 10 minutes to load, that's fine for me. At least it would give me a better feeling than bashing everything with no undo. Btw: there are ~8,500 articles and 16,112 total pages in the local wiki I tested this on. --Subfader 19:28, 13 May 2008 (UTC)
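Jean-Lou's iterator idea above could look roughly like this: each job handles a small batch of pages, then queues its own successor, so nothing ever has to scan the whole wiki up front. The class name, parameter keys, helper methods, and batch size below are all hypothetical:

```php
<?php
// Hypothetical sketch of the chained-job iterator suggested above:
// each job processes a small batch of pages, then queues the next batch.
class ReplaceTextBatchJob extends Job {
	function run() {
		$startId   = $this->params['start_id'];
		$batchSize = 10; // pages handled per job; arbitrary choice
		$pageIds   = $this->fetchNextPageIds( $startId, $batchSize ); // hypothetical helper
		foreach ( $pageIds as $pageId ) {
			$this->replaceInPage( $pageId ); // hypothetical helper
		}
		if ( count( $pageIds ) === $batchSize ) {
			// A full batch means more pages may remain: queue the successor.
			$next = new ReplaceTextBatchJob( $this->title,
				array( 'start_id' => end( $pageIds ) + 1 ) );
			$next->insert();
		} else {
			// Short batch: we're done. A log entry could be written here,
			// as Jean-Lou suggested above.
		}
		return true;
	}
}
```

A preview could still work with this scheme: a first pass of jobs that only records matches, followed by a confirmation step before the replacing pass is queued.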

Works fine
Hi, it works smoothly now. The "job" speed varies a lot (on my local wiki): 1-15 actions per minute, or even a 10-minute pause. Is it OK this way?

The preview loads, but is a bit buggy when divs are included in the previewed section containing the "term to be replaced". This broke the layout. The source code of the section is

Thanks anyway, will use it a lot I think :) --Subfader 17:29, 14 May 2008 (UTC)


 * Okay, that's great to hear. I have no idea how long jobs are supposed to take - for me it works almost instantly, on a small wiki, but in any case, I doubt if the process can be sped up. Thanks for the bug report - that's a new bug in the current version; I'll fix it in the next version. Yaron Koren 19:25, 14 May 2008 (UTC)

Using Bots / Recent Changes
You could insert $wgNewUserSupressRC into the code to enable hiding bot-flagged users from recent changes (as Extension:NewUserMessage does), but better would be to insert only one entry into the normal recent changes (User Bla: Global Text Replace 'old text' to 'new text'). Still, all details should show up in the user's contribs. --Subfader 07:54, 15 May 2008 (UTC)


 * Sorry, what's a bot-flagged user? Yaron Koren 15:12, 15 May 2008 (UTC)


 * Could be the term is wrong :) I mean a user whom you flag as a bot / give bot rights. Their edits aren't displayed in Recent Changes. But mine are, with all the replacing actions. --Subfader 15:54, 15 May 2008 (UTC)


 * Oh, I see. Well, maybe having a bot as the user doing the text replace is a bit misleading, since it's not a script but a special page making the changes? Or maybe that's too subtle a distinction to worry about. Yaron Koren 16:31, 15 May 2008 (UTC)


 * My users have no clue about wikis; it's better than letting them think it was done manually by a "user". That's why I meant that a single entry in Recent Changes, describing that it was a batch of changes, would be useful. If you see no need for it, no problem. --Subfader 16:34, 15 May 2008 (UTC)
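On the implementation side, MediaWiki already has a flag for this kind of thing: edits saved with EDIT_FORCE_BOT are hidden from Recent Changes by default (behind the "hide bots" filter) while remaining visible in the user's contributions. A sketch assuming the MW 1.12-era Article::doEdit() API; the variables are illustrative:

```php
// Sketch: saving a replacement as a bot edit so it is hidden from
// Recent Changes by default but still shows in user contributions.
// Assumes the MW 1.12-era Article::doEdit() API; variables illustrative.
$article = new Article( $title );
$summary = "Text replace: 'old text' to 'new text'";
$article->doEdit( $newText, $summary, EDIT_MINOR | EDIT_FORCE_BOT );
```

This sidesteps the need for a separate bot-flagged user account, which may address Yaron's concern above about the "bot" label being misleading.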

Replace text in title (move)
Why is it not possible? I imagine I'm not the only one who needs it this way. Either soft, by additionally moving, or hard (if that's possible and easier). Categories won't need to be moved, though (that's just one manual change, and losing all revisions for the category is OK). So there could be a tickbox before the preview: [_] Change article names as well (move)? --Subfader 16:06, 15 May 2008 (UTC)
 * Well, my thought was that (a) there are challenges when moving a page, if a page already exists at the new location, and (b) the number of pages that would need to be moved is too small to justify the effort. How many pages would you want renamed, anyway? Yaron Koren 17:59, 15 May 2008 (UTC)