Extension talk:LinkTitles/Archive

This is a great extension and is perfect for what I'm using it for. There is one minor issue I've found, however. When I'm using my template to auto-generate categories, and one of the category names is also a link, the extension puts a link inside the category syntax and breaks the creation of the categories. (I'm not sure whether this happens if the category is created manually; I have not had the chance to check.) Here is what I'm talking about, in case what I'm saying isn't clear.

Template: ]

On page: Test=Blue
When rendered: Group]]]

I have fixed this for now by just black-listing the pages so they don't auto-link, but I would like to see an exclusion for things in a Category bracket. (I know, it can get tricky to do, but it's a suggestion) ^.^

--Taintedsnowqueen (talk) 20:12, 8 October 2012 (UTC)


 * Well, if I understand you correctly, the problem is that the extension parses template parameters. I've quickly added a new option, "$wgLinkTitlesSkipTemplates", that lets you control this behavior. Please see the main extension page to read more about it. The downside is that this setting will prevent all template contents from being automatically linked; but I currently cannot think of another way to accomplish what you want. Is this what you want? -- Bovender (talk) 17:51, 9 October 2012 (UTC)
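The idea behind the option described above can be sketched roughly as follows. This is an illustrative Python sketch (the function name and the simple template regex are mine, not the extension's actual PHP code): titles are only linked outside of {{...}} template calls.

```python
import re

def autolink_outside_templates(text, titles):
    """Add [[...]] links for known page titles, but leave {{...}}
    template calls untouched -- a rough sketch of the behavior that
    skipping templates is described to produce.  This simple pattern
    does not handle nested templates."""
    # Split into template spans and everything else; the capture group
    # keeps the template spans in the result list.
    parts = re.split(r'(\{\{.*?\}\})', text, flags=re.DOTALL)
    for i, part in enumerate(parts):
        if part.startswith('{{'):
            continue  # leave template contents alone
        for title in titles:
            part = re.sub(r'\b%s\b' % re.escape(title),
                          lambda m: '[[%s]]' % m.group(0), part)
        parts[i] = part
    return ''.join(parts)
```

For example, `autolink_outside_templates("Blue sky. {{Box|Blue}}", ["Blue"])` links the first "Blue" but leaves the template parameter alone.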

Case of linking
This is really hurting me. I hate having to create a stub redirect for all of the items. I have lots of articles whose titles consist of two words, both capitalized. I would like the option to search for an exact match first, then case-insensitively. I know this would double the load, but it would help vastly.
''The extension performs a case-insensitive regexp search. Therefore, brackets may be added to words that have incorrect capitalization, causing 'broken' wiki links to appear. You may want to create redirecting pages for these variants (to also handle different user inputs).'' -- Philipsaj


 * Hi, I've added a new option, $wgLinkTitlesIgnoreCase, and published version 1.7.0, but I'm afraid no solution will satisfy all needs. The problem is that all articles in a wiki begin with a capital letter; thus, if you had an article "Snow" and set $wgLinkTitlesIgnoreCase to false (meaning an exact match is required to link a title), none of the occurrences of the word "snow" in an English article about winter would be linked. That's why I made the extension perform a non-configurable, case-insensitive search and replace in the first place. If I understand you correctly, you have lots of articles such as "Snow Flakes" and "Snow Men". With case-insensitive linking, all occurrences of "snow flakes" and "snow men" would link to non-existing pages. I guess that's your problem, right? But how about writing "Snow Flakes" and "Snow Men" in your articles' texts in the first place? Would that not be more practical? -- Bovender (talk) 16:03, 22 January 2013 (UTC)
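The trade-off discussed here can be illustrated with a minimal Python sketch (the function name and details are mine, not the extension's code): with case-insensitive matching, "snow" gets linked even though the page is "Snow"; with exact matching it does not.

```python
import re

def link_titles(text, titles, ignore_case=True):
    """Minimal sketch of the search-and-replace discussed above.
    ignore_case=True mimics the original, non-configurable behaviour;
    ignore_case=False mimics requiring an exact match, where only
    identically-cased occurrences are linked."""
    flags = re.IGNORECASE if ignore_case else 0
    for title in titles:
        # Keep the original spelling of the matched text inside the link.
        text = re.sub(r'\b%s\b' % re.escape(title),
                      lambda m: '[[%s]]' % m.group(0), text, flags=flags)
    return text
```

Note that `[[snow]]` still works as a link to the page "Snow", because MediaWiki treats the first letter of a title case-insensitively; the false positives arise for multi-word titles like "Snow Men".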


 * Thank you for your quick response and help! I appreciate it, and you understand the situation with the article titles and the challenge I face quite well. I'm sure my problem sounds easy and simple to fix; unfortunately, your suggested fix is not as easy to implement. It is not just me doing the editing but hundreds of people, many of whom have an extremely limited understanding of what they are doing, yet they hold the knowledge we are trying to get recorded. I think the only solution is a two-part scan of the text. I guess I need to find an extension that will scan my text and switch the case-insensitive matches to the correct case that matches the article... I really appreciate your extension and how it helps my wiki. Thanks for your help. -- Philipsaj 7:44, 23 January 2013 (CST)


 * Well, I think I get it now. What is needed is automatic aliasing when the case of a page title and the case of its occurrence on a page do not match, so that links such as Snow ball are generated. I've added this functionality in version 1.8.1, which I uploaded just now. You are probably right that a two-pass algorithm would be more useful, so that case-sensitive matches are preferred. I'll think about this further. Bovender (talk) 17:48, 26 January 2013 (UTC)


 * Version 2.0.0 of the extension should do what you requested, and I think it's a much better way to do the linking than before: first, a case-sensitive search for page titles is performed (with the exception of the first letter of a page title, which is searched for case-insensitively). In a second pass, a case-insensitive search is performed, and 'piped' links "..." are added as needed. If you find that this two-pass mechanism slows down your wiki, you can turn the behavior off by setting $wgLinkTitleSmartMode = false. Hope this helps! -- Bovender (talk) 20:29, 29 January 2013 (UTC)
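The two-pass "smart mode" described here can be sketched in Python roughly like this (my reading of the description above, not the extension's actual implementation):

```python
import re

def smart_link(text, title):
    """Sketch of a two-pass linking scheme.
    Pass 1 matches the title case-sensitively except for its first
    letter; a plain [[...]] link suffices there because MediaWiki
    treats the first letter of a title case-insensitively.
    Pass 2 matches the remaining occurrences case-insensitively and
    emits piped links so the page keeps the text's original spelling."""
    first, rest = title[0], re.escape(title[1:])
    # Pass 1: first letter either case, rest exact.
    pat1 = r'\b[%s%s]%s\b' % (first.upper(), first.lower(), rest)
    text = re.sub(pat1, lambda m: '[[%s]]' % m.group(0), text)
    # Pass 2: fully case-insensitive; the lookbehind skips occurrences
    # already wrapped in [[...]] by pass 1.
    pat2 = r'(?<!\[)\b%s\b' % re.escape(title)
    text = re.sub(pat2, lambda m: '[[%s|%s]]' % (title, m.group(0)),
                  text, flags=re.IGNORECASE)
    return text
```

For instance, `smart_link("Snow ball, snow ball, SNOW BALL.", "Snow ball")` links the first two occurrences directly and turns the last one into a piped link.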

Blacklist pages
Thanks for this extension; it really helps when working with people who haven't yet grasped that a wiki is there for linking content. However, we have pages where we would like to have no links at all. So it would be cool if you considered adding a blacklist parameter for pages, based on their titles, analogous to the word-based blacklist. What do you think of that idea? (Jonas)
 * Hi Jonas, that should be feasible. I wonder whether the best way to implement this would be a blacklist array in the config file, or something like a "__NOLINKTITLES__" command in the content of the page. Or both. Bovender (talk) 19:14, 14 February 2013 (UTC)
 * Uh, quick, cool :). I think it would be better to have a directive, because then each author can specify it, not only the server admin who can access the config file. Doing both could also be an option, to allow administrative prescriptions that are not editable. If you do both, it would probably make sense to have flags that (de)activate them.
 * Jonas, I've added a new 'magic word' that can be added to a given page to prevent the extension from adding links. It's in the new 2.1.0 release. I hope it suits your needs! -- Bovender (talk) 13:23, 23 February 2013 (UTC)
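The magic-word mechanism might look roughly like this as a Python sketch. The marker's exact spelling is not preserved in this archive, so the name below is a placeholder, and the linking logic is illustrative only:

```python
import re

# Placeholder spelling: the archive does not preserve the actual magic
# word's name, so '__NOAUTOLINKS__' here is an assumption.
MAGIC_WORD = '__NOAUTOLINKS__'

def maybe_autolink(page_text, titles):
    """Skip auto-linking entirely when the page carries the magic word;
    otherwise link all known titles (a minimal stand-in for the real
    search-and-replace)."""
    if MAGIC_WORD in page_text:
        # Strip the marker and add no links at all.
        return page_text.replace(MAGIC_WORD, '')
    for title in titles:
        page_text = re.sub(r'\b%s\b' % re.escape(title),
                           lambda m: '[[%s]]' % m.group(0), page_text)
    return page_text
```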

We have installed the update, but we are getting an exception from our wiki (it is the package 'mediawiki 1:1.15.5-2squeeze5' from Debian Squeeze; do you think it could be a problem that we don't have 1.18?): Magic word 'MAG_LINKTITLES_NOAUTOLINKS' not found

Backtrace:

Do you have any idea what might be causing this? (Jonas)
 #0 /usr/share/mediawiki/includes/MagicWord.php(244): Language->getMagic(Object(MagicWord))
 #1 /usr/share/mediawiki/includes/MagicWord.php(197): MagicWord->load('MAG_LINKTITLES_...')
 #2 /var/lib/mediawiki/extensions/LinkTitles/LinkTitles.body.php(206): MagicWord::get('MAG_LINKTITLES_...')
 #3 [internal function]: LinkTitles::removeMagicWord(Object(Parser), ...)
 [frames #4 to #9 are garbled in the original post; only the fragments 'parse', 'wrapWikiMsg' and 'mainPrefsForm' survive]
 #10 /usr/share/mediawiki/includes/specials/SpecialPreferences.php(15): PreferencesForm->execute
 #11 [internal function]: wfSpecialPreferences(NULL, Object(SpecialPage))
 #12 /usr/share/mediawiki/includes/SpecialPage.php(771): call_user_func('wfSpecialPrefer...', NULL, Object(SpecialPage))
 #13 /usr/share/mediawiki/includes/SpecialPage.php(559): SpecialPage->execute(NULL)
 #14 /usr/share/mediawiki/includes/Wiki.php(229): SpecialPage::executePath(Object(Title))
 #15 /usr/share/mediawiki/includes/Wiki.php(59): MediaWiki->initializeSpecialCases(Object(Title), Object(OutputPage), Object(WebRequest))
 #16 /usr/share/mediawiki/index.php(116): MediaWiki->initialize(Object(Title), NULL, Object(OutputPage), Object(User), Object(WebRequest))
 #17 {main}
 * Hi Jonas, a few things had crept in that caused problems with PHP < 5.3. I have fixed that now (version 2.1.1). That may already solve your problem. Are you sure you have the file newly introduced with 2.0.0 on your server? If there are still problems, please let me know! -- Bovender (talk) 18:07, 6 March 2013 (UTC)
 * Hi, I just updated the repo and the magic file is there, too, but the exception still occurs. Our PHP is version 5.3 (5.3.3-7+squeeze15). By the way, the exception also occurs when merely viewing pages, not only when trying to save an edit. If I can contribute any experiments/dumps/traces that would help you, let me know.
 * I finally got around to reproducing the error. In a virtual machine with Ubuntu Server 12.04 (64-bit), PHP 5.3.10 and the old MediaWiki 1.15.5, I get the same error. If I then simply install the current MediaWiki 1.20.3 on top (and delete the StartProfiler.* files from the wiki root directory, which cause error messages after the update), the extension works. So it depends on the MediaWiki version. In two other wikis (locally with PHP 5.4.6-1ubuntu1.1/MediaWiki 1.17.0 and on the web with PHP 5.2.6-1+lenny16/MediaWiki 1.18.0) it also works without problems. So far I have not found out why it does not work with MediaWiki 1.15.5; in my view, the ChangeLogs offer no clues. The obvious workaround would be to upgrade your system... but that is often not a practical solution. I need to investigate further. -- Bovender (talk) 15:34, 15 March 2013 (UTC)

Slight problem with parsing
I just discovered a problem with the parser. It parses text inside <pre> and <code> tags and then converts it to links. This causes the resulting output to show the brackets and link text INSIDE a block where links will never be parsed.

(For some reason I can't currently log in; I think I forgot the password, but I will check back. By the way, look at my next topic.) OK, my sig now: C.Jason.B (talk) 20:50, 6 July 2013 (UTC)
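The fix requested above amounts to excluding <pre>/<code> spans from the search-and-replace. A rough Python sketch (the function name is mine, and the regex is deliberately simple, ignoring attributes and nesting):

```python
import re

def link_outside_pre_code(text, titles):
    """Split the text on <pre>...</pre> and <code>...</code> blocks
    and only add [[...]] links to the stretches in between."""
    parts = re.split(r'(<(?:pre|code)>.*?</(?:pre|code)>)',
                     text, flags=re.DOTALL)
    for i, part in enumerate(parts):
        if part.startswith(('<pre>', '<code>')):
            continue  # never link inside preformatted blocks
        for title in titles:
            part = re.sub(r'\b%s\b' % re.escape(title),
                          lambda m: '[[%s]]' % m.group(0), part)
        parts[i] = part
    return ''.join(parts)
```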


 * This looks feasible; I just need to find time to edit the code. -- Bovender (talk) 19:04, 7 July 2013 (UTC)

Added Functionality
By the way, I am the user who changed the testing tag to include 1.21. I've been abusing it on this wiki for over a week, flawlessly except for the whole pre and code thing.

I needed this extension to work across several namespaces for a project I am working on. Before this I had ZERO experience writing PHP code.

I have now modified the extension to accept the following:
 * Cross-namespace linking, which can be deactivated by setting the whitelist to include only namespace 0
 * A whitelist of namespaces to link against (if set to include only '0', it overrides the cross-namespace behavior)
 * A weighted list giving the priority of the namespaces to link against (the current namespace is always top priority)
 * A Boolean flag that reorders the namespace weights based on user group membership
 * Additional weight lists that modify the primary one if the above flag is true
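The weighted-namespace idea in the list above could be sketched like this (all names and data shapes here are illustrative guesses, not the contributor's actual PHP code):

```python
def pick_namespace(candidates, weights, current_ns):
    """Choose which namespace's page to link to.
    'candidates' maps namespace id -> page title; a lower weight means
    a higher priority; the current namespace always wins, matching the
    'current namespace is always top priority' rule above."""
    def priority(ns):
        if ns == current_ns:
            return -1  # current namespace is always top priority
        return weights.get(ns, len(weights))  # unlisted namespaces last
    best = min(candidates, key=priority)
    return candidates[best]
```

With this sketch, a per-group feature would simply swap in a different `weights` dict before calling `pick_namespace`.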

Since I have only been coding PHP for three days, my coding skills kinda suck, but:
 * It works
 * It has caused no errors in extensive testing (save the problem I reported previously, though I don't know whether it is caused by your code or mine; I assume both, since I didn't modify the search part of the code at all)

Would you like the code? I have a throwaway e-mail account at aragornnrogara  at  Yahoo if you want to drop me a line and have me send it to you. All the features can be switched off, so it can't hurt to have them.

Since it took me almost three full days to write under 100 lines of code, I probably won't be maintaining this, but I thought others might want the added functionality.

The only core functionality I altered is in the two callback functions; I added a parameter to the main parse function (I had it pass on the user) and altered $1 in the parser.

Also, since your skills are far superior to mine, I told it to simply return true if it encounters a page in the database of the form Page/subpage/subpage or deeper (an IF that checks whether substr_count of "/" is more than 1, right after you had it create safe_title or whatever it was called).
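The subpage guard described here is essentially a depth check on the title; a minimal Python equivalent (illustrative only, mirroring PHP's substr_count on "/"):

```python
def is_deep_subpage(title):
    """Return True for titles of the form Page/subpage/subpage or
    deeper, i.e. more than one '/' in the title; such pages are
    skipped by the contributor's early-return guard."""
    return title.count('/') > 1
```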

OK, my sig now

C.Jason.B (talk) 20:51, 6 July 2013 (UTC)
 * Are you familiar with Git? If so, you could issue a pull request on the repository at https://github.com/bovender/LinkTitles -- otherwise, if you want you may send me the code at xltoolbox at gmx net. -- Bovender (talk) 19:07, 7 July 2013 (UTC)


 * You may flog me now: with 10+ years of using Linux, I have no idea how to use either Subversion or Git. I responded to your e-mail address (ignore the MySQL question sent later).