Project:Support desk


About this board

Welcome to the Support desk, where you can ask MediaWiki questions!

There are also other places to ask:

Before you post

Post a new question

  1. To help us answer your questions, please always indicate which versions you are using (reported by your wiki's Special:Version page):
    • MediaWiki
    • PHP
    • Database
  2. Please include the URL of your wiki unless you absolutely can't. It's often a lot easier for us to identify the source of the problem if we can look for ourselves.
  3. To start a new thread, click "Start a new topic".
Previous page history was archived for backup purposes at Project:Support_desk/old on 2015-07-30.

Cannot Access MediaWiki from other networks

1 (talkcontribs)


Thanks for this great tool.

We installed it on our Synology NAS. It works very well when we access it from the local Wi-Fi.

The moment we try to access it from another network (VPN, phone network), we get an error message (below).

We can access the interface of our NAS from any network (so Synology refuses responsibility).

Help would be greatly appreciated.

Thank you!

From Chrome:

The browser you are using may have difficulty receiving images and video. If you experience any issues viewing this page, we suggest using Internet Explorer, Mozilla Firefox, or Apple Safari.

From Internet Explorer:

The website declined to show this webpage

  HTTP 403


Most likely causes:

•This website requires you to log in.


What you can try:


  Go back to the previous page.  


More information

This error (HTTP 403 Forbidden) means that Internet Explorer was able to connect to the website, but it does not have permission to view the webpage.

For more information about HTTP errors, see Help.
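For what it's worth, a 403 that appears only from outside the LAN usually comes from an access rule in the web server (or the NAS's reverse-proxy settings) rather than from MediaWiki itself. As an illustration only, on Apache a LAN-only rule that would produce exactly this behaviour looks like the following (the directory path and address range are assumptions, and a Synology NAS may use nginx instead):

```apache
# Apache 2.4 sketch: allow only the local subnet; every other
# network receives HTTP 403 Forbidden.
<Directory "/var/www/mediawiki">
    Require ip 192.168.1.0/24
</Directory>
```

If a rule like this (or its nginx/DSM equivalent) exists for the wiki's directory but not for the NAS's own interface, that would explain why only the wiki is blocked externally.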


Reply to "Cannot Access MediaWiki from other networks"

What happened to Close Captioning 🙁

Lindamalan (talkcontribs)

Close captioning has disappeared

Malyacko (talkcontribs)

If you explained what "Close Captioning" is or was, and gave clear steps for where to see it in MediaWiki, then someone might be able to answer.

TheDJ (talkcontribs)

Disappeared from where?

Reply to "What happened to Close Captioning 🙁"

Red Links

(talkcontribs)


I have a MediaWiki site with an older version of MediaWiki. Links to articles which don't exist are red. But if I create such an article, the link remains red until I edit the linking article and save it. Is it possible to make all the links blue (not red) automatically when a new article is created?


David. (talkcontribs)

MediaWiki 1.25.1

PHP 5.5.9-1ubuntu4.11 (apache2handler)

MySQL 5.5.43-0ubuntu0.14.04.1

Ciencia Al Poder (talkcontribs)

See Manual:Job queue for recommendations. You'll need to set up a job runner service.
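As a sketch of that recommendation (the wiki path below is a placeholder; $wgJobRunRate and runJobs.php are standard MediaWiki mechanisms): stop running jobs during web requests and let cron process the queue instead, so the link-table updates that turn red links blue run shortly after a new article is created.

```shell
# In LocalSettings.php, stop running jobs during page views:
#   $wgJobRunRate = 0;

# Crontab entry (adjust the path to your wiki):
# process up to 100 pending jobs every five minutes.
*/5 * * * * php /var/www/wiki/maintenance/runJobs.php --maxjobs 100
```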

Reply to "Red Links"

PHP Access Question (2nd try with explanation)

Riventree (talkcontribs)


I am trying to build a "spartan" version of index.php which will ONLY do one thing: display a page given a title.

This should be really short, something like:




$title = htmlspecialchars($_GET["title"]);

GetBodyContent($title, $mw_body_content);

print $mw_body_content;




I know that won't have all the right divs and such, but it explains what I'm looking for.

If I understand the mediawiki architecture properly, each page goes from raw (user submitted) wikitext to "expanded" (substituted) wikitext, and thence to actual (x)html. I believe this "core page image" html is stored in the database somewhere (I'm guessing somewhere in "text", but I don't know)

Obviously, index.php is the entry point, but the query that actually results in html is hidden deep under the skins and WikiPage (I think) and I'm stuck.


1) What is the correct name of the function or functions I am looking for? (GetBodyContent is probably wrong.)

2) What file are they in?

Jonathan3 (talkcontribs)

How about starting with an existing skin and stripping out the bits you don't want?

Riventree (talkcontribs)

Jonathan3: Thank you for replying. Pardon me while I beat my head

Arrrrrrgh. I post WITH a reason why I don't use skins, and everyone gets confused. I summarize the question and leave that bit out, and people ask why I don't do it with a skin...

The answer is: If you take away the links in the skin, someone can still craft a URL with action=foo and do the things I want to disallow. Thus, skins are not the answer.
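For reference, MediaWiki does let you disable actions server-side, independent of the skin, via the $wgActions map in LocalSettings.php. A sketch (the particular actions listed are illustrative, not an exhaustive lockdown):

```php
// LocalSettings.php sketch: map an action name to false and MediaWiki
// rejects it server-side, so a hand-crafted ?action=... URL is refused
// regardless of which links the skin shows.
$wgActions['edit'] = false;
$wgActions['history'] = false;
$wgActions['delete'] = false;
$wgActions['submit'] = false;
```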

Riventree (talkcontribs)

See "PHP question" for a more direct request.

Ciencia Al Poder (talkcontribs)
Jonathan3 (talkcontribs)

I like the "XY problem" page!

Sounds like MediaWiki isn't the answer here (to whatever the question is).

Are Github and Phab-something really the only ways to access the dev forum?

Riventree (talkcontribs)

Seems like a standard email-username-password login ought to be an option.

Malyacko (talkcontribs)

@Riventree: What is a "dev forum"? Neither Github nor Phabricator are "dev forums". See Phabricator for what Phab is [not].

Samwilson (talkcontribs)

I reckon you're referring to and yeah, they are at the moment the only options. Actually, in practice this means that you must have a Wikimedia account (which you do, Riventree), because you can log into Phabricator as that. Which means it's a two-step process, but does work.

When Discourse makes it into production (i.e. off it'll be a one-step thing and you'll still only need your main Wikimedia username.

The reason for not supporting Wikimedia login directly from Discourse is phab:T124691.

Reply to "Are Github and Phab-something really the only ways to access the dev forum?"
Wikibase/Installation

Tofiq Kərimli (talkcontribs)

A basic installation of Wikibase is pretty much straightforward and basically consists of 4 steps:

# getting Wikibase

# fetching dependencies

# modifying LocalSettings.php

# running some maintenance scripts.

The last step is not clear to me; I cannot figure it out.

You write:

# if composer is available as a binary
composer install --no-dev

# if you downloaded composer.phar
php composer.phar install --no-dev

# Or use dockerized version (does not require PHP or composer installed)
docker run -it --rm --user $(id -u):$(id -g) -v ~/.composer:/tmp -v $(pwd):/app install --no-dev

But I cannot understand where the file is located. Can you tell me this? Thanks in advance.
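In case it helps: composer reads the composer.json of whatever directory it is run in, so for an extension these commands are normally run from the extension's own directory. A sketch (the wiki path is a placeholder; extensions/Wikibase is the usual location):

```shell
cd /var/www/wiki/extensions/Wikibase   # placeholder path to your wiki
composer install --no-dev
```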

Malyacko (talkcontribs)

Where is that written? Please provide a link to the documentation that you're following.

Tofiq Kərimli (talkcontribs)
Reply to "Wikibase/Installation"

Problem regarding 'Extension:StructuredDiscussions' a.k.a 'Flow' .

Falcopragati (talkcontribs)

I have installed the version of 'Extension:Flow' which is compatible with my MediaWiki (version 1.32.2).

The 'Flow' extension appears on the 'Special:Version' page too, in the 'Installed extensions' section of my wiki.

But still, it does not run properly, and this error now appears:

Fatal error: Class 'Pimple\Container' not found in C:\wamp64\www\wiki\extensions\Flow\includes\Container.php on line 5

Ciencia Al Poder (talkcontribs)

I think you need to run composer update --no-dev in the Flow directory
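Concretely, that would look something like this (a sketch; the path is taken from the error message above, so adjust it to your setup):

```shell
cd C:\wamp64\www\wiki\extensions\Flow
composer update --no-dev
```

This installs Flow's PHP dependencies, which should include the missing Pimple\Container class.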

Reply to "Problem regarding 'Extension:StructuredDiscussions' a.k.a 'Flow' ."
Riventree (talkcontribs)

Given a page title in $title, how do I get the html for the page's body? What is the function to call?
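One internal route on a 1.25-era install looks roughly like the following. This is a hedged sketch, not a drop-in file: it assumes the code runs inside MediaWiki's environment (e.g. a maintenance script or extension entry point), where the Title, WikiPage, and parser classes are already loaded, and it is not the only way to do this.

```php
// Sketch for MediaWiki ~1.25: render a page's body HTML by title.
// Assumes MediaWiki's environment is already initialised.
$title = Title::newFromText( $_GET['title'] );
if ( $title && $title->exists() ) {
    $page = WikiPage::factory( $title );
    // Canonical parser options: render as for the parser cache.
    $parserOptions = $page->makeParserOptions( 'canonical' );
    $parserOutput = $page->getParserOutput( $parserOptions );
    // Body HTML only, with no skin chrome around it.
    echo $parserOutput->getText();
}
```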

Ciencia Al Poder (talkcontribs)
Reply to "PHP question"
Riventree (talkcontribs)

I have just found this page and am almost certainly asking a dumb question that is documented somewhere, and I've read the documentation pages for "Page" and "Text", but there's either not enough, or far too much, data there for my brain to parse.

In short, I'm feeling a bit lost and I'm asking for help.

I am trying to do something that is slightly against the basic wiki premise: produce a read-only engine that provides lookup-by-title, what-links-here, and search, but no other "actions" whatsoever. Simply adding a new skin won't remove access to the core functionality if someone clever crafts their own url, so I know I need to actually do PHP-level work.

If I understand the mediawiki architecture properly, each page goes from raw (user submitted) wikitext to "expanded" (substituted) wikitext, and thence to actual (x)html. I believe this "core page image" html is stored in the database somewhere (I'm guessing somewhere in "text", but I don't know)

I have many years of programming under my belt, so I'm confident I can muddle my way through once I find the part of code that says "This is the title, give me the HTML". I can handle the PHP, SQL, html, css, javascript et al, but I've been stymied trying to drill down from index.php to the parts of the engine I'm looking for.

Obviously, index.php is the entry point, but the query that actually results in html is hidden deep under the skins and WikiPage (I think) and I'm stuck.

I'm hunting for three bits of code, which I imagine don't look anything like this: :)

Query::GetBodyContent (title, &mw_body_content)


Query::WhatLinksHere(title, &listOfPageTitles)

Can someone give me a pointer to the right php file(s) or class(es) to continue my search?

Thank you kindly,


Ciencia Al Poder (talkcontribs)

I'm not sure what your goal is. Use MediaWiki but prevent other users from editing pages? Then see Manual:Preventing access.

If you want to start your own content management system from scratch, I guess this will be too complicated, and it's better for you not to copy how things are done here, because what you want to do is relatively simple on its own with a proper database design.

Jonathan3 (talkcontribs)

I wonder if you could keep the actual wiki completely hidden (behind .htaccess or whatever) and use the MediaWiki API to create a separate site with the minimal features you need.
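For illustration, the Action API request that returns just the rendered body HTML for a title is action=parse. A minimal sketch in plain PHP (the wiki URL and page title are placeholders; the response shape shown is the default format=json one):

```php
// Sketch: fetch only the rendered body HTML of a page via the Action API.
// The api.php URL is a placeholder; adjust to your wiki.
$api = 'https://example.org/w/api.php';
$params = http_build_query( [
    'action' => 'parse',   // render a page
    'page'   => 'Main_Page',
    'prop'   => 'text',    // only the body HTML
    'format' => 'json',
] );
$response = json_decode( file_get_contents( "$api?$params" ), true );
echo $response['parse']['text']['*'];  // the HTML, no skin chrome
```

This keeps the real wiki behind access controls while the public-facing site exposes only the calls you choose.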

Riventree (talkcontribs)

See "PHP question" for a more direct request.

Custom special page to manage old pages

3 (talkcontribs)

Hi, we're wondering how we can manage old pages. We would like to check once in a while which pages of a certain category exist whose last modified date are more than one year ago. We could then ask the owners to validate them to make sure they are always up-to-date. This process is important for some content.

Can this be done with a custom special page? Can I easily create a page that gives me all pages of a certain category with a last modified date between x and y?
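One way to get that list without writing a special page first is a direct database query against MediaWiki's core schema. A sketch (the category name and interval are placeholders; rev_timestamp is stored as a YYYYMMDDHHMMSS string, and the date arithmetic below is MySQL/MariaDB syntax):

```sql
-- Pages in a given category whose latest revision is over a year old.
SELECT p.page_title,
       r.rev_timestamp AS last_modified
FROM page p
JOIN categorylinks c ON c.cl_from = p.page_id
JOIN revision r      ON r.rev_id  = p.page_latest
WHERE c.cl_to = 'Some_category'  -- placeholder category name
  AND r.rev_timestamp < DATE_FORMAT(NOW() - INTERVAL 1 YEAR, '%Y%m%d%H%i%s');
```

The same query could later back a custom special page that renders the result as a wiki table.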


Jonathan3 (talkcontribs)
Reply to "Custom special page to manage old pages"