Extension talk:LinkTitles/Archive

Category Linking
This is a great extension, and is perfect for what I'm using it for. There is one minor issue I've found, however. When I use my template to auto-generate categories, and the name of one of the categories is also a link, the extension puts a link inside the category syntax and breaks the creation of the categories. (I'm not sure if this happens when the category is created manually; I have not had a chance to check.) Here is what I'm talking about, in case what I'm saying isn't clear.

Template: ]

On page: Test=Blue When rendered: Group]]]

I have fixed this for now by just black-listing the pages so they don't auto-link, but I would like to see an exclusion for anything inside category brackets. (I know, it can get tricky to do, but it's a suggestion.) ^.^

--Taintedsnowqueen (talk) 20:12, 8 October 2012 (UTC)


 * Well, if I understand you correctly, the problem is that the extension parses template parameters. I've quickly added a new option, "$wgLinkTitlesSkipTemplates", that lets you control this behavior. Please see the main extension page to read more about it. The downside is that this setting will prevent all template contents from being automatically linked, but currently I cannot think of another way to accomplish what you want. Is this what you want? -- Bovender (talk) 17:51, 9 October 2012 (UTC)
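For reference, the option described above would be set in LocalSettings.php like any other MediaWiki global. A minimal sketch, assuming the option takes a boolean and that true means "skip" (see the main extension page for the authoritative default):

```php
<?php
// LocalSettings.php -- after loading the LinkTitles extension:
// do not automatically add links inside template calls and parameters
$wgLinkTitlesSkipTemplates = true;
```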


 * Sorry it took so long to get back to you. It wasn't exactly what I was looking for, but it is a decent work-around. (It wouldn't work for my purposes, since I mostly use templates in my wiki to generate content.) After giving it a decent amount of thought, I've realized that, because of how templates and this extension work, that setting is likely the only way to stop the issue. Thanks for the setting! :) Taintedsnowqueen (talk) 23:43, 1 November 2013 (UTC)

Case of linking
This is really hurting me. I hate having to create a stub redirect for all of these items. I have lots of articles whose titles consist of two words, both capitalized. I would like the option to search for an exact match first, then case-insensitively. I know this would double the load, but it would help vastly. - - - ''The extension performs a case-insensitive regexp search. Therefore, brackets may be added to words that have incorrect capitalization, causing 'broken' wiki links to appear. You may want to create redirecting pages for these variants (to also handle different user inputs).'' -- Philipsaj


 * Hi, I've added a new option $wgLinkTitlesIgnoreCase and published version 1.7.0, but I'm afraid there will be no solution that satisfies all needs. The problem is that all articles in a Wiki begin with a capital letter, thus, if you had an article "Snow" and set $wgLinkTitlesIgnoreCase to false (meaning an exact match is required to link a title), none of the occurrences of the word "snow" in an English article about the winter would be linked. That's the reason why I had the extension perform a non-configurable case-insensitive search and replace in the first place. If I understand you correctly, you have lots of articles on "Snow Flakes" and "Snow Men". With case-insensitive linking, all occurrences of "snow flakes" and "snow men" would link to non-existing pages. I guess that's your problem, right? But how about writing "Snow Flakes" and "Snow Men" in your articles' texts in the first place? Would that not be more practical? -- Bovender (talk) 16:03, 22 January 2013 (UTC)


 * Thank you for your quick response and help! I appreciate your help, and you understand the situation quite well with regard to the article titles and the challenge I face. I'm sure my problem sounds easy and simple to fix; unfortunately, your suggested fix is not as easy to implement as it sounds. It is not just me doing the editing, but hundreds of people, many of whom have an extremely limited understanding of what they are doing, but they hold the knowledge we are trying to get recorded. I think the only solution is a two-part scan of the text. I guess I need to find an extension that will scan my text and switch the case-insensitive matches to the correct case that matches the article... I really appreciate your extension and how it helps my wiki. Thanks for your help. -- Philipsaj 7:44, 23 January 2013 (CST)


 * Well, I think I got it now. What is needed is automatic aliasing if the case of a page title and the case of its occurrence on the page do not match, so that links such as Snow ball are generated. I've added this functionality in version 1.8.1, which I uploaded just now. You are probably right that a two-pass algorithm would be more useful, so that case-sensitive matches are preferred. I'll think about this further. Bovender (talk) 17:48, 26 January 2013 (UTC)


 * Version 2.0.0 of the extension should do what you requested, and I think it's a much better way of doing the linking than before: first, a case-sensitive search for page titles is performed (with the exception of the first letter of a page title, which is searched for case-insensitively). In a second pass, a case-insensitive search is performed, and 'piped' links "..." are added as needed. If you find that this two-pass mechanism slows down your wiki, you can turn this behavior off by setting $wgLinkTitleSmartMode = false. Hope this helps! -- Bovender (talk) 20:29, 29 January 2013 (UTC)
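The piped-link behavior described above can be sketched in a few lines of plain PHP. This is an illustrative sketch only, not the extension's actual code, and the function name is made up:

```php
<?php
// Build a wiki link for an occurrence of a page title found in the text.
// MediaWiki treats the first letter of a title case-insensitively, so a
// difference only in the first letter does not require a piped link.
function makeLink(string $pageTitle, string $occurrence): string {
    if (substr($pageTitle, 1) === substr($occurrence, 1)) {
        // Case matches apart from the first letter: a plain link keeps
        // the spelling found in the text and still hits the right page.
        return "[[$occurrence]]";
    }
    // Case differs elsewhere: pipe the link so it targets the actual
    // page title while displaying the original spelling.
    return "[[$pageTitle|$occurrence]]";
}

echo makeLink('Snow Men', 'snow Men'), "\n"; // [[snow Men]]
echo makeLink('Snow Men', 'snow men'), "\n"; // [[Snow Men|snow men]]
```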

Blacklist pages
Thanks for this extension; it really helps when working with people who haven't yet grasped that a wiki is there for linking content. However, we have pages where we would like to have no links at all. So it would be cool if you considered adding a blacklist parameter for pages based on their titles, analogous to the word-based blacklist. What do you think of that idea? (Jonas)
 * Hi Jonas, that should be feasible. I wonder whether the best way to implement this would be a blacklist array in the config file, or something like a "__NOLINKTITLES__" command in the content of the page. Or both. Bovender (talk) 19:14, 14 February 2013 (UTC)
 * Uh, quick, cool :). I think it would be better to have a directive, because then each author can specify it, not only the server admin who can access the config file. Both could also be an option, to allow administrative prescriptions that are not editable. If you do both, it would probably make sense to have flags that (de)activate them.
 * Jonas, I've added a new 'magic word' which can be added to a given page to prevent the extension from adding links. It's in the new 2.1.0 release. Hope it suits your needs! -- Bovender (talk) 13:23, 23 February 2013 (UTC)

We have installed the update, but we get an exception from our wiki (it is the package 'mediawiki 1:1.15.5-2squeeze5' from Debian Squeeze - do you think it could be a problem that we don't have 1.18?): Magic word 'MAG_LINKTITLES_NOAUTOLINKS' not found

Backtrace:

Do you have any idea what might be causing this? (Jonas)
 * #0 /usr/share/mediawiki/includes/MagicWord.php(244): Language->getMagic(Object(MagicWord))
 * #1 /usr/share/mediawiki/includes/MagicWord.php(197): MagicWord->load('MAG_LINKTITLES_...')
 * #2 /var/lib/mediawiki/extensions/LinkTitles/LinkTitles.body.php(206): MagicWord::get('MAG_LINKTITLES_...')
 * #3 [internal function]: LinkTitles::removeMagicWord(Object(Parser), ...)
 * ...
 * #10 /usr/share/mediawiki/includes/specials/SpecialPreferences.php(15): PreferencesForm->execute
 * #11 [internal function]: wfSpecialPreferences(NULL, Object(SpecialPage))
 * #12 /usr/share/mediawiki/includes/SpecialPage.php(771): call_user_func('wfSpecialPrefer...', NULL, Object(SpecialPage))
 * #13 /usr/share/mediawiki/includes/SpecialPage.php(559): SpecialPage->execute(NULL)
 * #14 /usr/share/mediawiki/includes/Wiki.php(229): SpecialPage::executePath(Object(Title))
 * #15 /usr/share/mediawiki/includes/Wiki.php(59): MediaWiki->initializeSpecialCases(Object(Title), Object(OutputPage), Object(WebRequest))
 * #16 /usr/share/mediawiki/index.php(116): MediaWiki->initialize(Object(Title), NULL, Object(OutputPage), Object(User), Object(WebRequest))
 * #17 {main}
 * Hi Jonas, a few things had slipped in that caused problems with PHP < 5.3. I have now fixed that (version 2.1.1). It may well be that this already solves your problem. Are you sure you have the file that was newly introduced with 2.0.0 on your server? If there are still problems, please let me know! -- Bovender (talk) 18:07, 6 March 2013 (UTC)
 * Hi, I just updated from the repo, and the magic file is there, but the exception still occurs. Our PHP is version 5.3 (5.3.3-7+squeeze15). Incidentally, the exception also occurs when simply viewing pages, not only when saving an edit. If I can contribute any experiments/dumps/traces that would help you, let me know.
 * I finally got around to reproducing the error. In a virtual machine with Ubuntu Server 12.04 (64-bit), PHP 5.3.10, and the old MediaWiki 1.15.5, I get the same error. If I then simply install the current MediaWiki 1.20.3 on top of it (and delete the StartProfiler.* files from the wiki root directory, which cause error messages after the update), the extension works. So it depends on the MediaWiki version. In two other wikis (locally with PHP 5.4.6-1ubuntu1.1/MediaWiki 1.17.0 and on the web with PHP 5.2.6-1+lenny16/MediaWiki 1.18.0) it also works without problems. So far I have not found out why it does not work with MediaWiki 1.15.5; as far as I can tell, the change logs offer no clues. The obvious workaround is to upgrade your system... but that is often not a practicable solution. I need to investigate further. -- Bovender (talk) 15:34, 15 March 2013 (UTC)

Slight problem with parsing
I just discovered a problem with the parser. It parses text inside pre and code tags and then converts it to links. This causes the resulting output to show the brackets and link text INSIDE a block where links will never be parsed.

(For some reason I can't currently log in; I think I forgot the password, but I will check back. By the way, look at my next topic.) OK, my sig now: C.Jason.B (talk) 20:50, 6 July 2013 (UTC)


 * This looks feasible, I just need to find time to edit the code. -- Bovender (talk) 19:04, 7 July 2013 (UTC)

Added Functionality
By the way, I am the user who changed the testing tag to include 1.21. I've been abusing it on this wiki for over a week, flawlessly except for the whole pre and code thing.

I needed this extension to work across several namespaces for a project I am working on. Before this I had ZERO experience writing PHP code.

I have now modified the extension to accept the following:
 * Cross-namespace linking can be deactivated by setting the whitelist to include only namespace 0
 * A whitelist of namespaces to link against (if set to include only '0', it overrides the cross-namespace feature)
 * A weighted list of the priority of the namespaces to link against (the current namespace is always top priority)
 * A Boolean flag that reorders the namespace weights based on user group membership
 * Additional weight lists which modify the primary one if the above flag is true

Since I have been coding PHP for only three days, my coding skills kinda suck, but:
 * It works
 * It has caused no errors in extensive testing (save the problem I reported previously, but I don't know if it is caused by your code or mine; I assume both, since I didn't modify the search part of the code at all)

Would you like the code? I have a dump e-mail account at aragornnrogara  at  Yahoo if you want to drop me a line and have me send it to you. All the features can be switched off, so it can't hurt to have them.

Since it took me almost three full days to write under 100 lines of code, I probably won't be maintaining this, but I thought others might want the added functionality.

The only core functionality I altered is in the two callback functions; I added a parameter to the main parse function (I had it pass on the user) and altered $1 in the parser.

Also, since your skills are far superior to mine, I told it to simply return true if it encountered a page in the database of the form Page/subpage/subpage or deeper (an IF that checks whether substr_count of "\/" is more than 1, right after you had it create safe_title or whatever it was called).
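The subpage check described here boils down to counting slashes in the title. A standalone sketch (the function name is invented for illustration; the actual code places this check inline):

```php
<?php
// Skip pages nested deeper than one subpage level, i.e. titles of the
// form Page/subpage/subpage (two or more slashes).
function isDeepSubpage(string $title): bool {
    return substr_count($title, '/') > 1;
}

var_dump(isDeepSubpage('Page/sub/sub')); // bool(true)
var_dump(isDeepSubpage('Page/sub'));     // bool(false)
```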

OK, my sig now

C.Jason.B (talk) 20:51, 6 July 2013 (UTC)
 * Are you familiar with Git? If so, you could issue a pull request on the repository at https://github.com/bovender/LinkTitles -- otherwise, if you want you may send me the code at xltoolbox at gmx net. -- Bovender (talk) 19:07, 7 July 2013 (UTC)


 * You may flog me now: with 10+ years of using Linux, I have no idea how to use either Subversion or Git. I responded to your email address (ignore the MySQL question sent later). C.Jason.B (talk) 10:23, 10 July 2013 (UTC)


 * Sent you the code via your email. This includes a bug fix, and hopefully no new ones crawled in. There is a text doc in the file, suitable for pasting into a wiki, with the update info in it. I suggest sandboxing it for a while; I am still testing. This turned into a complete rewrite of my code, so the changes are now many, but well documented (maybe too much), and the default state is meant to run as a drop-in replacement for the original. Also, most of my stuff is in functions, so if a problem crops up it is easily locked out. C.Jason.B (talk) 10:23, 10 July 2013 (UTC)

A bug crept in
In the code I sent you, line 126 should read if(count($wgLinkTitlesNamespaceNOAUTOLINKS) > 0) { C.Jason.B (talk) 13:20, 10 July 2013 (UTC)

Link Titles Bug (Current version w/ mediawiki 1.21)
When editing a page which contains a full piped wiki link with display text, and a document titled text exists in the MAIN namespace (using the link I gave as an example), the resulting output will be display[[text]]

I discovered this while editing this wiki text:

In my wiki, the word skills exists as a title in NS_MAIN and in another namespace. I manually linked to the page in the other namespace, but after clicking the save button the link was [[skills]]

At first I thought this was due to my modifications of the extension's code. But I pulled my version down, put the original back up, and was able to duplicate the result after clearing all caches.

I tracked the problem to the SmartLinks phase. In the main for loop you begin with $i=0 and increment by 2; this causes the error (I believe because the very first character on the page is a { ). When I rewrite that code to use $i=1, it works fine for this page, but fails everywhere else.

Proposal: prior to the loop's initialization, check the value of substr($text,0,1) for a regex match on $deliminator. If it fails, set $init=0; if it matches (the first char in $text is a wiki text component), set $init=1. Then at the loop: for($i=$init; $i < count($arr); $i += 2){

Unless you know a better way, I will experiment along these lines for a couple of days. C.Jason.B (talk) 05:08, 11 July 2013 (UTC)


 * As it turns out, this bug only seems to crop up when the extension parses template contents. If that setting is true, the bug can occur, as the extension then seems to also re-parse links. I'm not sure about the links part, but since I turned off the parse-templates feature, the bug hasn't recurred. It appears my entire previous premise was false. C.Jason.B (talk) 06:36, 19 July 2013 (UTC)

Batch change
Is it possible to run this script automatically? My wiki site is expanding, and I'd like to run LinkTitles on all pages from time to time. It already has over 3000 pages, so doing this manually is a bit difficult. It would be great to run it during the night.
 * As far as I know, it is not possible to schedule tasks in MediaWiki (I might be wrong). What I can imagine is a Special Page that could be used to trigger an automatic link scan of all pages. One would have to consider runtime limitations on shared servers though. -- The problem here is that I have extremely little time to think about this extension at the moment, but I'll try and come up with something. -- Bovender (talk) 13:52, 31 December 2013 (UTC)

Self Linking via Redirects
First off, thank you for this extension. This is easily one of the most useful extensions I have found since I have been using MediaWiki. Now onto my question. I noticed that LinkTitles will not cause a page to link to itself (which makes sense, given that the title of a page is generally going to be mentioned several times in the body) however it will link to a redirect that links back to that page.

EXAMPLE: In my wiki there is a page called Enclave. There is also an Enclaves page which redirects to Enclave. The Enclave page skips the word enclave when generating links, but creates a link for each instance of Enclaves, which essentially links back to itself.

Would it be possible to have this extension check the redirects before generating the link to avoid it linking to itself?

Thanks.

EDIT AFTER UPDATE: Wow that was faster than I expected. You are awesome.
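The requested check can be sketched independently of MediaWiki, using a plain array in place of the wiki's redirect table. Names here are illustrative assumptions; the extension's real code would query the database instead:

```php
<?php
// $redirects maps a page title to its redirect target (if any).
// A candidate link target should be skipped if it is a redirect that
// points back to the page currently being processed.
function redirectsBackTo(array $redirects, string $candidate, string $current): bool {
    return isset($redirects[$candidate]) && $redirects[$candidate] === $current;
}

$redirects = ['Enclaves' => 'Enclave'];
// While processing the "Enclave" page, "Enclaves" must not be linked:
var_dump(redirectsBackTo($redirects, 'Enclaves', 'Enclave')); // bool(true)
var_dump(redirectsBackTo($redirects, 'Enclaves', 'Other'));   // bool(false)
```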

Version 2.4 with major new features
Today I released version 2.4, which offers several new features that people have been asking for on this page:
 * Prevent linking of pages that redirect to the current page.
 * Prevent linking in PRE segments and in attributes of DIV and SPAN elements.
 * Offer batch processing (by means of a Special:LinkTitles page and a maintenance script to be run from the command line).
I still have the namespaces on my to-do list. -- Bovender (talk) 15:45, 6 June 2014 (UTC)
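For the night-time runs asked about earlier on this page, the command-line maintenance script could be invoked from cron. The script path and file name below are assumptions; check the extension's maintenance directory for the actual file:

```shell
# crontab entry: run the LinkTitles batch script nightly at 03:00
0 3 * * * php /var/www/wiki/extensions/LinkTitles/linktitles-cli.php
```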

Version 3 released
I was unhappy with the performance, which had decreased due to recent feature additions. So I changed the algorithm quite a bit, which resulted in roughly 10x higher speed on my machine. Release 3.0.0 is available for download now.

Error on Batch Processing
When running batch processing via the special page, I see a blank page with the following error (server name removed): Fatal error: Call to undefined method WikiPage::getContent in /MyServer/extensions/LinkTitles/LinkTitles.body.php on line 254

UPDATE: It turns out I just needed to upgrade my MediaWiki (I thought I had the latest version; apparently not). Batch processing now works like a charm. I'll leave this here in case someone else has the same issue.