Project:Support desk

About this board

Welcome to the MediaWiki Support desk, where you can ask MediaWiki questions!

Post a new question

  1. To help us answer your questions, please indicate which versions you are using, as found on your wiki's Special:Version page:
    • MediaWiki version
    • PHP version
    • Database type and version
  2. Please include the web address (URL) to your wiki if possible. It's often easier for us to identify the source of the problem if we can see the error directly.
  3. To start a new thread, click "Start a new topic".

Requesting help using API action=clientlogin

Mrwassen (talkcontribs)

Hi guys,

As a relative newbie to MediaWiki, I am looking for a little help with a very basic SSO project. I have a PHP-based web site "mysite" which requires user login. I am trying to come up with a PHP script that does the following:


1) User logs into mysite

2) The user login script executes a "wikilogin.php" script

3) The wikilogin.php script logs into mywikisite and creates the required cookies in the browser

4) User can now go to mywikisite and access pages etc. without having to log into mywikisite


I am not trying to build any logic to manage user creation/change/deletion or password change. The assumption for now is simply that credentials are identical across the 2 applications.


I have tried to put together a basic "wikilogin.php" which uses the MediaWiki API as follows:

  a) Get a logintoken using "api?action=query&meta=tokens&type=login&format=json"

  b) Parse out the returned logintoken to a string variable

  c) Perform the login using "api?action=clientlogin&username=joe&password=secret&logintoken=<token from step b>&loginreturnurl=http://mysite.org"


however I am running into the error:

  "code": "badtoken", "info": "Invalid CSRF token."


I have tried changing to type=csrf in step a), however then I get:

  "code": "nologintoken","info": "The \"logintoken\" parameter must be set."


Below is the PHP - any help would be much appreciated.

Thanks

Dennis


<?php
$ch = curl_init();

// Step 1: fetch a login token
curl_setopt($ch, CURLOPT_URL, "http://mywikisite.org/api.php");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS,
            "action=query&meta=tokens&type=login&format=json");
// Receive server response ...
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$server_output = curl_exec($ch);
//echo $server_output;
$logintoken_array = json_decode($server_output);
$logintoken = $logintoken_array->query->tokens->logintoken;
echo $logintoken;

// Step 2: log in with the token (URL-encoded, since the token ends in "+\")
curl_setopt($ch, CURLOPT_URL, "http://mywikisite.org/api.php");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS,
            "action=clientlogin&username=joe&password=secret&logintoken=" . urlencode($logintoken) . "&loginreturnurl=http://mywikisite.org");
// Receive server response ...
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$server_output = curl_exec($ch);
echo $server_output;

curl_close($ch);
?>

Mrwassen (talkcontribs)

Forgot to mention versions:

MediaWiki 1.34.4

PHP 7.2

Bawolff (talkcontribs)

I think you need to tell curl to save and send cookies for the login to work.


You may also be interested in reading about SessionManager, which I think is the more proper way to do what you are trying to do in MediaWiki.
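
Roughly something like this (untested; the URL and credentials are placeholders) - the point is that one cookie jar is shared by both requests, so the login token is presented by the same anonymous session that issued it:

<?php
// Minimal sketch: share one cookie jar between the token request and the
// login request, so the login token stays tied to its session.
$api = 'http://mywikisite.org/api.php'; // placeholder
$jar = tempnam(sys_get_temp_dir(), 'mwcookie');

$ch = curl_init($api . '?action=query&meta=tokens&type=login&format=json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // save cookies the wiki sets
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // ...and send them back
$token = json_decode(curl_exec($ch))->query->tokens->logintoken;

curl_setopt($ch, CURLOPT_URL, $api);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, [ // array form avoids hand-encoding the token
    'action' => 'clientlogin',
    'username' => 'joe',                     // placeholder
    'password' => 'secret',                  // placeholder
    'logintoken' => $token,
    'loginreturnurl' => 'http://mysite.org', // placeholder
    'format' => 'json',
]);
echo curl_exec($ch);
curl_close($ch);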

Mrwassen (talkcontribs)

Hi Bawolff,


Thanks for your help - I was able to make some progress: I rewrote the PHP script to write a cookie file and then log in using action=clientlogin, which thankfully returned the following response:

{ "clientlogin": { "status": "PASS", "username": "Admin" }}

However, I think I am still missing something: once the login succeeded, I expected to be able to open a MediaWiki page in the same browser without logging in, but the main page still shows "not logged in".

I also noticed that after the script successfully logged in, there was nothing listed in the browser's "Storage/Cookies" list under the domain.

Is this a case of me not understanding how cookies work?

Any help appreciated.

Dennis

EDIT: or will I need to programmatically open the wiki page from PHP using something like header() after using setcookie() to set the cookies?

EDIT#2:

OK so a little further progress:

I logged in as normal directly through the wiki main page to determine cookie behavior and saw that 3 cookies are created:

1) session cookie containing a token

2) username cookie

3) user ID cookie

I then added code to the PHP script which replicates these exact cookies after the login completes, using the newly acquired token to create the session cookie.

What I notice is that I can run my PHP script and see the 3 cookies get created, and when the script echoes the page, it shows as logged in.

However, the moment I click on a link to go to a different wiki page, the session and user ID cookies disappear, only the username cookie remains in the browser, and the page logs out.

So it seems I am close, but the elusive part is how to get those cookies to persist so that the session stays logged in?


php:


<?php
$cookie_jar = tempnam('/volume1/web/cookies', 'cookie');

// retrieve token
$c = curl_init('http://mywikisite/api.php?action=query&meta=tokens&type=login&format=json');
curl_setopt($c, CURLOPT_POST, 1);
curl_setopt($c, CURLOPT_HEADER, 0);
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_COOKIEFILE, $cookie_jar);
curl_setopt($c, CURLOPT_COOKIEJAR, $cookie_jar);
$page = curl_exec($c);
$logintoken_array = json_decode($page);
$logintoken = $logintoken_array->query->tokens->logintoken;
echo $logintoken;
curl_close($c);

// log in
$post = [
    'action' => 'clientlogin',
    'username' => 'admin',
    'password' => 'xxxxxxxxx',
    'logintoken' => $logintoken,
    'loginreturnurl' => 'http://mywikisite/index.php'
];
$c = curl_init('http://mywikisite/api.php');
curl_setopt($c, CURLOPT_POST, 1);
curl_setopt($c, CURLOPT_POSTFIELDS, $post);
curl_setopt($c, CURLOPT_HEADER, 0);
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_COOKIEFILE, $cookie_jar);
curl_setopt($c, CURLOPT_COOKIEJAR, $cookie_jar);
$page = curl_exec($c);
curl_close($c);

// create the 3 cookies in the browser (the session cookie is created
// from the login token, which turns out to be the wrong value)
$cookie_name = "tng_upgrade_12_3_wiki__session";
$cookie_value = $logintoken;
setcookie($cookie_name, $cookie_value, time() + (86400 * 30), "/"); // 86400 = 1 day

$cookie_name = "tng_upgrade_12_3_wiki_UserID";
$cookie_value = "1";
setcookie($cookie_name, $cookie_value, time() + (86400 * 30), "/");

$cookie_name = "tng_upgrade_12_3_wiki_UserName";
$cookie_value = "Admin";
setcookie($cookie_name, $cookie_value, time() + (86400 * 30), "/");

// open wiki
$c = curl_init('http://mywikisite/index.php?title=Main_Page');
curl_setopt($c, CURLOPT_HEADER, 0);
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_COOKIEFILE, $cookie_jar);
curl_setopt($c, CURLOPT_COOKIEJAR, $cookie_jar);
$page = curl_exec($c);
echo $page;
curl_close($c);
?>

Bawolff (talkcontribs)

The normal approach would be for your script to set its own cookies, and then use SessionManager so that MediaWiki recognizes those cookies and then sets its own cookies appropriately.


I think the reason your code isn't working is that you are assuming the login token will be the same as the session cookie value (I don't think it is).
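
As a very rough, hypothetical sketch of that route (the cookie name and the MySiteAuth helper are made up, and the exact SessionProvider API should be checked against your MediaWiki version):

<?php
use MediaWiki\Session\ImmutableSessionProviderWithCookie;
use MediaWiki\Session\SessionInfo;
use MediaWiki\Session\UserInfo;

class MySiteSessionProvider extends ImmutableSessionProviderWithCookie {
    public function provideSessionInfo( WebRequest $request ) {
        // Cookie set by "mysite" after its own login (hypothetical name).
        $token = $request->getCookie( 'mysite_sso', '' );
        if ( $token === null ) {
            return null;
        }
        // Validate the token against mysite's session store (hypothetical helper).
        $username = MySiteAuth::usernameForToken( $token );
        if ( $username === null ) {
            return null;
        }
        $id = $this->getSessionIdFromCookie( $request );
        return new SessionInfo( SessionInfo::MAX_PRIORITY, [
            'provider' => $this,
            'id' => $id,
            'userInfo' => UserInfo::newFromName( $username, true ),
            'persisted' => $id !== null,
        ] );
    }
}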

Mrwassen (talkcontribs)

Hi Bawolff,

Thanks again - based on your earlier reply, I managed to figure out that the returned token is NOT the same as the session cookie token (as you also mentioned above). I was able to get curl to write the cookies to a temporary "cookie jar" file, which I then read and parse into an array, after which I submit a set of setcookie() commands which create the cookies in the browser.

The one minor issue I had is that action=clientlogin does not (yet?) seem to support a "rememberMe" attribute (despite documentation showing an example using it), but since I am now fairly fluent in cookie baking, I am able to tweak the expiry time to get a session length I would like.

I am only able to use v. 1.34.4 due to other constraints, but perhaps rememberMe was introduced in 1.35.xx?

In any case, many thanks for your help!

Thanks

Dennis

Sasha Rizzetto (talkcontribs)

Hi Mrwassen,

did you finally manage to get the script working so that the session stays logged in? And if yes, could you please share your code?

Thanks

Bawolff (talkcontribs)

Details might vary depending on which login extensions are installed; your wiki may be different from Wikipedia. Check api.php?action=query&meta=authmanagerinfo&amirequestsfor=login on your wiki for what the remember-me field is named.
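
Something like this (untested; placeholder URL) will dump the field names each login request advertises, which should show the remember-me field if your wiki has one:

<?php
// List the login fields the wiki's AuthManager advertises.
$url = 'http://mywikisite.org/api.php?action=query&meta=authmanagerinfo'
     . '&amirequestsfor=login&format=json';
$info = json_decode( file_get_contents( $url ), true );
foreach ( $info['query']['authmanagerinfo']['requests'] as $req ) {
    echo $req['id'] . ': ' . implode( ', ', array_keys( $req['fields'] ) ) . "\n";
}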

Mrwassen (talkcontribs)

Bawolff: thanks for that advice, I will do some more digging on this.

Sasha: yes, I finally managed to get this working; below is the code. Modify with your login details, URLs and temp folder path, then run the script. Once you have run the script, you should find the cookie jar file in the temp folder as well as "login_details.txt", which will contain the login token and cookie details that were used to create the cookies.

(Since the cookies are sent as HTTP headers, the script cannot echo anything before they are set, hence the "log file".)

Thanks

Dennis

<?php
// open file for logging progress
$temp_folder = '/volume1/web/cookies/';
$log = fopen($temp_folder . 'login_details.txt', 'w');

// create cookie jar file to temporarily store cookie data
$cookie_jar = tempnam($temp_folder, 'cookie');

// acquire MediaWiki login token
$tokenurl = 'http://192.168.xxx.xxx:90xx/wiki/api.php?action=query&meta=tokens&type=login&format=json';
$c = curl_init($tokenurl);
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_COOKIEFILE, $cookie_jar);
curl_setopt($c, CURLOPT_COOKIEJAR, $cookie_jar);
$page = curl_exec($c);
$logintoken_array = json_decode($page);
$logintoken = $logintoken_array->query->tokens->logintoken;
fwrite($log, 'Token = ' . $logintoken . PHP_EOL . PHP_EOL);
curl_close($c);

// log in to MediaWiki using action=clientlogin
$post = [
    'action' => 'clientlogin',
    'username' => '<your username>',
    'password' => '<your password>',
    'logintoken' => $logintoken,
    'loginreturnurl' => 'http://192.168.xxx.xxx:90xx'
];
$loginurl = 'http://192.168.xxx.xxx:90xx/wiki/api.php';
$c = curl_init($loginurl);
curl_setopt($c, CURLOPT_POST, 1);
curl_setopt($c, CURLOPT_POSTFIELDS, $post);
curl_setopt($c, CURLOPT_HEADER, 0);
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_COOKIEFILE, $cookie_jar);
curl_setopt($c, CURLOPT_COOKIEJAR, $cookie_jar);
// curl writes the session cookies to the file $cookie_jar
$page = curl_exec($c);
curl_close($c);

// extract cookies from $cookie_jar; the file is in Netscape cookie format:
// a few comment/blank header lines, then one tab-separated row per cookie
// with columns domain, subdomain flag, path, secure flag, expiry, name, value
$fopen = fopen($cookie_jar, "r");
$fread = fread($fopen, filesize($cookie_jar));
fclose($fopen);
$split = explode("\n", $fread);
$array[] = null; // offset so the first cookie row lands at index 5
foreach ($split as $string) {
    $row = explode("\t", $string);
    array_push($array, $row);
}

// replay the cookies into the visitor's browser
fwrite($log, 'Cookie details:' . PHP_EOL);
for ($x = 5; $x <= 9; $x++) {
    if (isset($array[$x][5])) {
        $cookie_name = $array[$x][5];
        $cookie_expire = $array[$x][4];
        $cookie_value = $array[$x][6];
        setcookie($cookie_name, $cookie_value, (int)$cookie_expire, "/");
        fwrite($log, $cookie_name . '|' . $cookie_value . '|' . $cookie_expire . PHP_EOL);
    }
}

// optionally delete the file $cookie_jar
//if (file_exists($cookie_jar)) {
//    unlink($cookie_jar);
//}

fclose($log);
?>

Reply to "Requesting help using API action=clientlogin"

How to Exclude NoIndex Pages From Sitemap?

Goodman Andrew (talkcontribs)

How do I exclude noindex pages from the sitemap in MW 1.35 when running generatesitemap.php?

The noindex tag is applied via $wgNamespaceRobotPolicies or $wgDefaultRobotPolicy.

Reply to "How to Exclude NoIndex Pages From Sitemap?"

Import failed: No pages to import

Goodman Andrew (talkcontribs)

This happened after I upgraded from MW1.33 to 1.35

What am I missing?

MarkAHershberger (talkcontribs)

You need to provide a few more details about what you did.

Goodman Andrew (talkcontribs)

I exported a page from Wikipedia then tried to import it into my wiki as usual, but it doesn't work. I tried with different pages of varying sizes (some 100+ kB, others 200+ kB), no luck.

Malyacko (talkcontribs)

If "it didn't work" then you have to elaborate why and how it didn't work.

Goodman Andrew (talkcontribs)

@Malyacko: if I knew the why of this problem I wouldn't have come here.

Malyacko (talkcontribs)

@Goodman Andrew I asked because "it doesn't work" is so little information that usually nobody is able to help. As Mark already wrote, this still lacks details and complete steps to reproduce (click by click, step by step), error messages, etc.

Goodman Andrew (talkcontribs)

@Malyacko: pay attention - the topic of this thread is "No pages to import"; that's the error message.

Reply to "Import failed: No pages to import"

Has MW 1.35 changed when/how the HTTP header, Content-Length, is used?

Peculiar Investor (talkcontribs)

I recently upgraded my wiki from MediaWiki 1.31 (PHP 7.3) to MediaWiki 1.35 (PHP 7.4) on a shared hosting plan. We use Extension:MobileFrontend. Since the upgrade, a subset of my mobile device users (seems to be iPhone and iPad) are reporting that pages won't fully load and/or they get a "cannot parse response" error.

I cannot reproduce the problem on a localhost server (CentOS 8.2 with PHP 7.4).

We cannot reproduce the issue with a desktop browser (Chrome or Edge) DevTools emulating these devices, which is frustrating to say the least.

I've recreated my MediaWiki 1.31 wiki in another directory on the shared hosting service and looked for differences in the network traffic. One difference we've noticed is that MediaWiki 1.35's initial header response includes a Content-Length header, which isn't present when the same page is loaded from MediaWiki 1.31. We're using the Special:Version page for testing, so our wiki content isn't a factor (other than common interface elements such as the logo and sidebar).

As a double-check, we checked the Special:Version page with the W3C Markup Validation Service and it reports "IO Error: Premature end of Content-Length delimited message body (expected: 55542; received: 11585)", which we think might explain why the mobile devices are also giving an error.

I've tried changing the PHP version back to 7.3 for the MediaWiki 1.35 wiki and the problem persists.

Does MediaWiki 1.35 code include something that sets the Content-Length header in HTTP responses? Is there a configuration setting involved?

Lady G2016 (talkcontribs)

For the Project: support template:

MediaWiki 1.35.0
PHP 7.4.11 (cgi-fcgi)
MySQL 5.6.41-84.1

I forced the W3C Markup Validation Service check to use mobile view. The page has no output except for this single entry:

IO Error: Premature end of Content-Length delimited message body (expected: 51697; received: 11318) https://www.finiki.org/w/index.php?title=Special:Version&mobileaction=toggle_view_mobile

I can view the page in my desktop Chrome browser in both desktop and mobile views.

(This doesn't answer your question, but it's additional information to help identify the problem.)

Rom2cu (talkcontribs)

Our installation:

MW1.35.0, PHP 7.4.3 (cgi-fcgi), MySQL 5.7.29-log

It works fine except that Safari users get a blank screen with an "NSPOSIXErrorDomain:100" error, and iPads and iPhones also get blank screens. No error with Chrome on the Macs. Is this the upstream bug the MW 1.35 system requirements warn about: "MediaWiki is not compatible with PHP 7.4.0 to 7.4.2 due to an upstream bug"?

Peculiar Investor (talkcontribs)

To address the above point: I'm seeing the problem with PHP 7.4.11, so the incompatibility mentioned in Compatibility does not apply. As mentioned, changing the PHP version back to 7.3.23 doesn't change the behaviour, so I am thinking it is a MediaWiki-related issue.

I've already worked through Manual:Common errors and symptoms, in particular the 'You see a Blank Page' section, and it doesn't help identify any of the mentioned issues or resolve the problem.

I don't know whether the Content-Length HTTP header appearing only with MediaWiki 1.35, and not with MediaWiki 1.31 on the same shared hosting server, is a symptom of the problem or a cause, but nothing else that has been checked or investigated explains why the Apple mobile devices are having issues with blank pages or giving "cannot parse response" errors.

Bawolff (talkcontribs)

Try setting $wgDisableOutputCompression = true;

MediaWiki will set a Content-Length header if the HTTP version is 1.0 and gzipping is enabled.

There are also certain situations, when MW wants to keep executing after it has sent its output and is not running as FastCGI, where it will set a Content-Length header (4f11b614544be).
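
To see whether the header goes away after changing the setting, you can fetch just the response headers, something like (placeholder URL):

<?php
// Fetch only the response headers and look for Content-Length.
$ch = curl_init( 'https://example.org/w/index.php?title=Special:Version' ); // placeholder
curl_setopt( $ch, CURLOPT_NOBODY, true );         // HEAD-style request
curl_setopt( $ch, CURLOPT_HEADER, true );         // include headers in the output
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_ENCODING, 'gzip' );     // advertise gzip, as browsers do
echo curl_exec( $ch );
curl_close( $ch );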

Reply to "Has MW 1.35 changed when/how the HTTP header, Content-Length, are used?"

Wiki Search broken after upgrade to 1.35

Rajeshekv (talkcontribs)

Can someone help me with the below error? It seems the search is broken after the upgrade.


[b2da0aa0b75271aba78dbd75] /index.php?search=jboss&title=Special%3ASearch&go=Go Wikimedia\Rdbms\DBQueryError from line 1699 of C:\wiki\websites\mediawiki\includes\libs\rdbms\database\Database.php: A database query error has occurred. Did you forget to run your application's database schema updater after upgrading?

Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'NATURAL LANGUAGE MODE) DESC LIMIT 21' at line 1 (127.0.0.1)

Function: SearchMySQL::searchInternal

Query: SELECT page_id,page_namespace,page_title FROM `erp_page`,`erp_searchindex` WHERE (page_id=si_page) AND ( MATCH(si_title) AGAINST('+jboss ' IN BOOLEAN MODE) ) AND page_namespace = 0 ORDER BY MATCH(si_title) AGAINST('+jboss ' IN NATURAL LANGUAGE MODE) DESC LIMIT 21

Backtrace:

#0 C:\wiki\websites\mediawiki\includes\libs\rdbms\database\Database.php(1683): Wikimedia\Rdbms\Database->getQueryException()

#1 C:\wiki\websites\mediawiki\includes\libs\rdbms\database\Database.php(1658): Wikimedia\Rdbms\Database->getQueryExceptionAndLog()

#2 C:\wiki\websites\mediawiki\includes\libs\rdbms\database\Database.php(1227): Wikimedia\Rdbms\Database->reportQueryError()

#3 C:\wiki\websites\mediawiki\includes\libs\rdbms\database\Database.php(1907): Wikimedia\Rdbms\Database->query()

#4 C:\wiki\websites\mediawiki\includes\libs\rdbms\database\DBConnRef.php(68): Wikimedia\Rdbms\Database->select()

#5 C:\wiki\websites\mediawiki\includes\libs\rdbms\database\DBConnRef.php(313): Wikimedia\Rdbms\DBConnRef->__call()

#6 C:\wiki\websites\mediawiki\includes\search\SearchMySQL.php(193): Wikimedia\Rdbms\DBConnRef->select()

#7 C:\wiki\websites\mediawiki\includes\search\SearchMySQL.php(179): SearchMySQL->searchInternal()

#8 C:\wiki\websites\mediawiki\includes\search\SearchDatabase.php(74): SearchMySQL->doSearchTitleInDB()

#9 C:\wiki\websites\mediawiki\includes\search\SearchEngine.php(156): SearchDatabase->doSearchTitle()

#10 C:\wiki\websites\mediawiki\includes\search\SearchEngine.php(187): SearchEngine->{closure}()

#11 C:\wiki\websites\mediawiki\includes\search\SearchEngine.php(157): SearchEngine->maybePaginate()

#12 C:\wiki\websites\mediawiki\includes\specials\SpecialSearch.php(387): SearchEngine->searchTitle()

#13 C:\wiki\websites\mediawiki\includes\specials\SpecialSearch.php(179): SpecialSearch->showResults()

#14 C:\wiki\websites\mediawiki\includes\specialpage\SpecialPage.php(600): SpecialSearch->execute()

#15 C:\wiki\websites\mediawiki\includes\specialpage\SpecialPageFactory.php(635): SpecialPage->run()

#16 C:\wiki\websites\mediawiki\includes\MediaWiki.php(307): MediaWiki\SpecialPage\SpecialPageFactory->executePath()

#17 C:\wiki\websites\mediawiki\includes\MediaWiki.php(940): MediaWiki->performRequest()

#18 C:\wiki\websites\mediawiki\includes\MediaWiki.php(543): MediaWiki->main()

#19 C:\wiki\websites\mediawiki\index.php(53): MediaWiki->run()

#20 C:\wiki\websites\mediawiki\index.php(46): wfIndexMain()

#21 {main}

Rajeshekv (talkcontribs)

If I search for existing pages, it pulls up results just fine, but if I search with some keywords in those pages, it gives the above error. Can someone please help with this?

Reply to "Wiki Search broken after upgrade to 1.35"

Visualeditor issues due to nginx???

Gjtokkel (talkcontribs)

Hi,

I am running MediaWiki 1.35 with PHP 7.4 and cannot get VisualEditor to work properly. I had to disable it. Apparently the issue is that the REST API is not called correctly. I am wondering whether this is due to nginx... my phpinfo shows:

$_SERVER['SERVER_SOFTWARE'] nginx/1.10.3

On the Parsoid page I read: "If you're serving MediaWiki with Nginx, you'll need to also add something like this to your server conf:"

location /rest.php/ { try_files $uri $uri/ /rest.php?$query_string; }


Do I need to ask my provider to add this to the nginx config in order to make VisualEditor work? Is there any way I can find out myself?

Reply to "Visualeditor issues due to nginx???"

Short URL when script path and article path are the same

199.104.151.131 (talkcontribs)

I'm trying to set up a wiki to use short URLs with the following directory structure:


Apache DocumentRoot: /var/www/html

The MediaWiki files are in /var/www/wiki; this includes index.php, extensions, images, etc.


I can get short URLs to semi-work using the following setup:


mediawiki.conf:

Alias /wiki/skins /usr/share/mediawiki/skins

Alias /wiki /var/www/wiki

<Directory "/var/www/wiki">

  LogLevel trace8

  SetEnv MW_INSTALL_PATH "/var/www/wiki"

  AllowOverride All

  Require all granted

  Options FollowSymLinks

  RewriteEngine On

  # Rewrite page requests to index.php, passing the title as a parameter

  RewriteRule ^(.*)$ index.php?title=$1 [PT,L,QSA]

  RewriteRule ^/*$ index.php [L,QSA]

</Directory>


In LocalSettings.php I added:

$wgArticlePath = "/wiki/$1";


Now all links in the wiki omit the index.php?title= part and the rewrite rules seem to work. But the wiki only shows bare HTML now, if that makes sense - the skin and the wiki logo are not being used.

I've also tried using

$wgUsePathInfo = true;

but that doesn't change the outcome.


Does anyone know how to fix this?

199.104.151.131 (talkcontribs)

I should withdraw this question because I see that MediaWiki doesn't recommend installing a wiki like this. Manual:Short URL#No Skins says it's a "beginner's mistake". Now I'm trying again, putting the MediaWiki install into the /w directory.
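
For reference, the layout the manual recommends keeps the two paths separate in LocalSettings.php, something like this (assuming MediaWiki is installed under /w):

// Conventional short-URL layout: scripts live under /w, article URLs under
// /wiki, so skin and resource requests never match the article rewrite rule.
$wgScriptPath  = "/w";
$wgArticlePath = "/wiki/$1";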

Reply to "Short URL when script path and article path are the same"
SweeBill (talkcontribs)

Hi everyone, I am using MediaWiki on a hosted site that uses cPanel. We are attempting to set up an alias so that my users can go to a domain that we own as opposed to my hosted domain. We've set up the appropriate CNAME on our DNS for the URL, and an alias in cPanel. However, after going to the site, it does not retain the "normalized" URL. My hosting site's support says they think it's a configuration in the MediaWiki settings that needs to be changed. Any thoughts on how to make this happen?

The site URL we are attempting to use is https://constructionmanual.deldot.gov. It redirects to https://constructionmanual.deldot.a2hosted.com. Any help would be greatly appreciated!

MarkAHershberger (talkcontribs)
SweeBill (talkcontribs)

$wgServer is set to "https://constructionmanual.deldot.a2hosted.com". Changing it to https://constructionmanual.deldot.gov takes me to a default a2hosting page. It's currently set like that right now, if you want to see where it's taking it.

I'll be honest, I know just enough to be dangerous and don't have a ton of support from our IT section on this. They'd prefer I use SharePoint's wiki, so they are relatively hands-off. I am not currently using short URLs, but would be open to it if it'll fix this.

SweeBill (talkcontribs)

I have to walk away from my computer for a few hours, so I am going to change it back so that the site at least works for our users. Let me know if you'd like me to change it back so you can see what it's doing.

MarkAHershberger (talkcontribs)

Nice wiki!

I don't understand why it isn't behaving correctly. Let me know when you are around to change it back.

SweeBill (talkcontribs)

Hi Mark, I'm back. Thanks for the compliments. I've switched it back again to where it's "broken".

@MarkAHershberger

SweeBill (talkcontribs)

Also, just to make sure I mention it: MediaWiki isn't installed in its default location. I created a subdomain that I installed it under. So, it's in a constructionmanual.deldot.a2hosted.com folder on the server. I hope that's the right way to explain it...

MarkAHershberger (talkcontribs)

I think @Bawolff is probably right. This sounds like a cPanel issue. It is probably related to the different directory names you mentioned.

SweeBill (talkcontribs)

Sorry, I ran out of time again and needed to set it back.

I may try to call A2Hosting support again just to make sure I have it all set up right, and hope to get hold of someone in support who has some sort of MediaWiki experience.

They seemed to think that there was something missing in LocalSettings.php that I needed to set, but I'm starting to wonder now if there is just something set up incorrectly, or if I set something up incorrectly when I set the server up.

Bawolff (talkcontribs)

So when you change $wgServer and get a default hosting page, that sounds a lot like something wrong with your (Apache) virtual host settings. Check if cPanel has something about virtual hosts.

SweeBill (talkcontribs)

So, just to update everyone: I had A2Hosting wipe the account and I started over from scratch. We set up the primary domain as deldot.gov and I created the constructionmanual subdomain under the public_html folder. Everything is working now like we want. Feel free to go to https://constructionmanual.deldot.gov to check it out.

Lady G2016 (talkcontribs)

@SweeBill I checked out your wiki.

Your footer links need content - "Privacy policy", "About", and "Disclaimers" all say "There is currently no text in this page."

This is MediaWiki's way of politely saying "Page not found".

Also, consider updating to PHP 7.4. It's faster, but you'll need to consider compatibility with other applications which also run PHP on that server.

Reply to "Mediawiki and CPanel Alias"
Remy170 (talkcontribs)

Hi, I've just installed MediaWiki and I want to change the URL of my page so that more people can access it.

Currently I have to go in via the link localhost/mediawiki, and if I give this link to someone else they obviously can't get in. How could I get a URL that everyone can access? THANKS.

Jörgi123 (talkcontribs)

I think you are basically looking for $wgServer, which you can change in LocalSettings.php. This will change the URLs which MediaWiki creates.

Remy170 (talkcontribs)

I managed to change the URL by modifying the hosts file and then editing $wgServer; however, I think it only changes the appearance of the link, because I still cannot get in from another device (in this case, my cell phone).

Thanks for answering!

2003:CC:ABDD:6000:C083:9F7E:B7FC:92A9 (talkcontribs)

The links should not point to localhost, but to some kind of URL or IP. If that is the case, then the MediaWiki configuration is fine. The next step is to make sure that this URL or IP is reachable via your network so that another device can reach it.

Ciencia Al Poder (talkcontribs)

For more people to be able to access it, the ideal is to install it on an internet server. If the project is very small and you want to keep it on your home PC, you will first need to find out your public IP address (you can see it on a site like http://whatismyipaddress.com/); that will be the address to put in $wgServer and the one you give to the people who want to access your site. You may need to configure port forwarding on your router, normally from port 80 to the PC running MediaWiki, and configure your firewalls to allow external traffic.

Because your home's public IP address can change over time, the best option is to register a domain, or use a dyndns service, so you can use a domain name instead of an IP.
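
In LocalSettings.php it comes down to one line, something like (the address is a placeholder; use your own public IP or, better, a domain name):

// Placeholder address: a home IP can change over time, so a domain or
// dyndns name is more durable.
$wgServer = "http://203.0.113.42";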

Remy170 (talkcontribs)

Thanks, but even so they couldn't get in. :(

La Gata Flowery (talkcontribs)

Do you have the answer yet? I've been trying for weeks to make my wiki public, with no luck (,u.u,)

Ciencia Al Poder (talkcontribs)

What part of my answer didn't you understand?

La Gata Flowery (talkcontribs)

I don't mean to offend, but honestly I don't understand any of it :(

Ciencia Al Poder (talkcontribs)
La Gata Flowery (talkcontribs)

I forgot to thank you. The video has helped me!


Thanks :)!

Reply to "Cambiar URL de mi wiki"