Project:Support desk


About this board

Welcome to MediaWiki.org's Support desk, where you can ask MediaWiki questions!

There are also other places where you can ask:

Before you post

Post a new question

  1. To help us answer your questions, please always indicate which versions you are using (reported by your wiki's Special:Version page):
    • MediaWiki
    • PHP
    • Database
  2. Please include the URL of your wiki unless you absolutely can't. It's often a lot easier for us to identify the source of the problem if we can look for ourselves.
  3. To start a new thread, click "Start a new topic".
Previous page history was archived for backup purposes at Project:Support_desk/old on 2015-07-30.

Add textContent to HTML via MediaWiki template

49.230.76.81 (talkcontribs)

I created a MediaWiki template named after the Hebrew letter א (Template:א):

<code></code>

I call it with {{א|SOME_textContent}}.

The problem is that after I save the page, I get an empty <code> tag ( ).

Why do I get empty <code> content, instead of SOME_textContent (the input I passed after the vertical bar)?

Bawolff (talkcontribs)

You get the contents of the template. Add {{{1}}} in the template where you want the first parameter to be substituted. See Help:Templates for more info.
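For example (a minimal sketch), if the page Template:א contains:

<code>{{{1}}}</code>

then {{א|SOME_textContent}} will render as <code>SOME_textContent</code>.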

Reply to "Add textContent to HTML via MediaWiki template"
2600:1700:F0D0:7BA0:AC0C:E926:29A1:BD3F (talkcontribs)

I cannot log in, and I know my account exists, but the login page says that there is no username that is mine. What gives? ~~~~Nacky

AbelKnight (talkcontribs)

- Check that you are entering the password correctly.

- Check that the username is written exactly right.

Reply to "Logging In"
49.230.98.200 (talkcontribs)

I'd like to comment out all the Sidebar menu links (actually, all of them except one).

The rationale is to avoid deleting them (even if I deleted them, I could generally restore an earlier version from "View history", but let's say I just want to comment them out instead).

Is this possible in version 1.32.0 without installing any extension?

MarkAHershberger (talkcontribs)

You can comment them out using <!-- ** menu link -->
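For example, in MediaWiki:Sidebar (a sketch using the default entries; your link names may differ):

* navigation
** mainpage|mainpage-description
<!-- ** recentchanges-url|recentchanges -->
<!-- ** randompage-url|randompage -->

The commented-out lines stay in the page source, so they can be restored later just by removing the <!-- --> markers.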

49.230.76.81 (talkcontribs)

Thanks

Reply to "Comment all Sidebar menu links"

Elastica\Exception\Connection\HttpException from line 187...\Http.php: Couldn't connect to host, Elasticsearch down?

Bruceillest (talkcontribs)
Product Version:
MediaWiki 1.32.0
PHP 7.2.7 (cgi-fcgi)
MySQL 8.0.15
ICU 61.1
CirrusSearch 0.2 (b1fa4bd)
Elastica 1.3.0.0 (9fcf88c)

I tried to run the commands in the README and it just states that Elasticsearch is down. I've tried to connect using curl localhost:9200 and all it gives me is "curl: (7) Failed to connect to localhost port 9200: Connection refused". I ran netstat and didn't see port 9200. I've also rebooted, and I don't have any firewalls running. Is there a way to start the Elasticsearch service with this setup? This is the first time I've installed CirrusSearch and Elastica.

MarkAHershberger (talkcontribs)

Try sudo systemctl start elasticsearch. If that works (there is no error and curl gives you results), then systemctl can also be used to enable it so it starts at boot: sudo systemctl enable elasticsearch.

Let us know of any problems.
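That is (a sketch, assuming a systemd-based Linux host):

sudo systemctl start elasticsearch    # start the service now
curl localhost:9200                   # should return a JSON banner if it is up
sudo systemctl enable elasticsearch   # start it automatically at boot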

Bruceillest (talkcontribs)

Mark, thanks for your response, but I forgot to mention that I am running all of this on Windows Server 2012 R2. There is no Elasticsearch service installed on Windows; there is only the Elastica extension folder. Is there a batch file or PHP file I have to run to start it?

MarkAHershberger (talkcontribs)

That does make a difference!

You'll need to install Elasticsearch and make sure it is running. MediaWiki will communicate with it to do the searching.
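Once Elasticsearch itself is installed and running, MediaWiki is pointed at it from LocalSettings.php, roughly like this (a sketch for a single local server; adjust the host name for your setup):

wfLoadExtension( 'Elastica' );
wfLoadExtension( 'CirrusSearch' );
$wgCirrusSearchServers = [ 'localhost' ]; // where Elasticsearch listens (port 9200 by default)
$wgSearchType = 'CirrusSearch';           // route wiki searches through CirrusSearch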

Bruceillest (talkcontribs)

Awesome, I'm making progress, but now I'm getting this error:

Elastica\Exception\ResponseException from line 179 of C:\inetpub\wwwroot\CAS\extensions\Elastica\vendor\ruflin\elastica\lib\Elastica\Transport\Http.php: Root mapping definition has unsupported parameters:  [mw_cirrus_metastore : {dynamic=false, properties={mediawiki_version={type=keyword}, mapping_min={type=long}, analysis_maj={type=long}, cirrus_commit={type=keyword}, mapping_maj={type=long}, wiki={type=keyword}, shard_count={type=long}, type={type=keyword}, index_name={type=keyword}, mediawiki_commit={type=keyword}, analysis_min={type=long}, namespace_name={norms=false, analyzer=near_match_asciifolding, type=text, index_options=docs}}}] [reason: Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:  [mw_cirrus_metastore: {dynamic=false, properties={mediawiki_version={type=keyword}, mapping_min={type=long}, analysis_maj={type=long}, cirrus_commit={type=keyword}, mapping_maj={type=long}, wiki={type=keyword}, shard_count={type=long}, type={type=keyword}, index_name={type=keyword}, mediawiki_commit={type=keyword}, analysis_min={type=long}, namespace_name={norms=false, analyzer=near_match_asciifolding, type=text, index_options=docs}}}]]


I installed Elasticsearch v7.2.0. Is this issue due to the version I'm using?

MarkAHershberger (talkcontribs)

Yes, you need Elasticsearch 5.5.x or 5.6.x, as the CirrusSearch page says.

Bruceillest (talkcontribs)

Yep, sorry about that, I noticed that late. I ended up installing Elasticsearch 5.6.0 and was able to run all the commands apparently well, but when I search for a page it won't populate. I added "&action=cirrusDump" on pages that do and don't show up in search: the ones that show up have text information, and the ones that don't show up have no text information. I'm guessing some pages are indexing and others aren't, so I ran updateSearchIndexConfig.php --reindexAndRemoveOk --indexIdentifier=now and forceSearchIndex.php, and still no dice.


Also, when I ran the forceSearchIndex.php --skipLinks --indexOnSkip and forceSearchIndex.php --skipParse commands I didn't get any output; I was wondering if that's normal.
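For reference, the full sequence I ran from the wiki root was roughly this (paths as given in the CirrusSearch README; a sketch, not a verbatim transcript):

php extensions/CirrusSearch/maintenance/updateSearchIndexConfig.php --reindexAndRemoveOk --indexIdentifier=now
php extensions/CirrusSearch/maintenance/forceSearchIndex.php --skipLinks --indexOnSkip
php extensions/CirrusSearch/maintenance/forceSearchIndex.php --skipParse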

MarkAHershberger (talkcontribs)

Does it populate your index even though it isn't printing?

Bruceillest (talkcontribs)

How can I tell if it populates my index?


Bruceillest (talkcontribs)

This is the output I get when I run:

C:\Windows\system32>curl localhost:9200/_cat/indices?v
health status index                      uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   mw_cirrus_metastore_first  U05k9IpgRTyliejRcCPbcw   1   0         21           18     18.3kb         18.3kb
green  open   caswiki_general_1563218277 fO7-X4mmTAGDn-JY0jleNA   4   0          0            0       648b           648b
green  open   caswiki_content_1563218273 bns-287uS36967cO13eY9g   4   0          9            0    652.2kb        652.2kb
green  open   .tasks                     KX7oUmTUSui9XV5NIfqhlA   1   0          0            0       191b           191b


I have around 244 pages created, with a lot of content, so I'm guessing these numbers should be higher.

Reply to "Elastica\Exception\Connection\HttpException from line 187...\Http.php: Couldn't connect to host, Elasticsearch down?"
Wgkderdicke (talkcontribs)

I recently updated this wiki here to v1.31. Now I'm doing this and that to polish up the visual appearance a little bit. Doing so, I discovered an odd behaviour:

Every page that contains the term spam (actually three pages, which deal with this topic) is rejected by a dubious spam protection filter. This filter claims that

The page you wanted to save was blocked by the spam filter. This is probably caused by a link to an external site.

and

The following text was found by the spam filter: Spam

And I have to admit: yes, the term is used on these pages, but to describe something around the topic of spam. As far as I know, that is far from being spam itself.

Furthermore, there is really no way to get around this blocking. Even as a sysop my hands are tied!

So does anybody know a way to stop this edgy spam filter from jumping on a harmless four-letter word like spam, in particular on pages that only explain the term?

Many thanks in advance for an answer!

MarkAHershberger (talkcontribs)

This functionality is not built into MediaWiki. Try disabling some extensions or talk to your host?

Wgkderdicke (talkcontribs)

Well, in the case of this rejected content, the de.json file from the languages/i18n folder contains exactly the error messages shown. There are three messages, called spamprotectiontitle, spamprotectiontext and spamprotectionmatch. The content of those messages, together with the bad word spam, is displayed instead of the article being saved. The en.json file also contains the English counterparts of these messages, starting at line 3020 of en.json (MW 1.31.3).

These messages are also mentioned here: Manual:$wgSpamRegex

But it is not changed in my LocalSettings.php. It comes from DefaultSettings.php as $wgSpamRegex = [];.

The explanation in the above-mentioned manual also matches my experience: even a sysop fails to save if that error occurs.

Wgkderdicke (talkcontribs)

Bingo. I added $wgSpamRegex = false; to my LocalSettings.php. Now the pages which contain the term spam can be saved again. In contrast to Manual:$wgSpamRegex, which claims that the default value is false, the MW 1.31.3 DefaultSettings.php gives me an empty array instead. Maybe this causes confusion, or some weird fallback with an odd regex from elsewhere suddenly becomes effective.
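For the record, the line now in my LocalSettings.php (MW 1.31.3):

$wgSpamRegex = false; # disable the regex spam check entirely; DefaultSettings.php ships [] here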

MarkAHershberger (talkcontribs)

Yes, thanks.

Reply to "Edgy spam filter?"

I am trying to make a new Extension, but getting an error at line '$parser->getOutput()->addModules( 'ext.E.scripts' )';

103.118.50.4 (talkcontribs)

[f49fe744377fe4fd5f717ceb] /wiki/index.php/Main_Page Error from line 7 of C:\wamp64\www\wiki\extensions\E\E_body.php: Call to a member function addModules() on null

Backtrace:

#0 C:\wamp64\www\wiki\includes\Hooks.php(174): EC::onParserInit(Parser)
#1 C:\wamp64\www\wiki\includes\Hooks.php(202): Hooks::callHook(string, array, array, NULL)
#2 C:\wamp64\www\wiki\includes\parser\Parser.php(369): Hooks::run(string, array)
#3 C:\wamp64\www\wiki\includes\cache\MessageCache.php(1190): Parser->firstCallInit()
#4 C:\wamp64\www\wiki\includes\cache\MessageCache.php(1166): MessageCache->getParser()
#5 C:\wamp64\www\wiki\includes\Message.php(1282): MessageCache->transform(string, boolean, LanguageEn, Title)
#6 C:\wamp64\www\wiki\includes\Message.php(883): Message->transformText(string)
#7 C:\wamp64\www\wiki\includes\Message.php(943): Message->toString(string)
#8 C:\wamp64\www\wiki\includes\OutputPage.php(924): Message->text()
#9 C:\wamp64\www\wiki\includes\OutputPage.php(971): OutputPage->setHTMLTitle(Message)
#10 C:\wamp64\www\wiki\includes\page\Article.php(622): OutputPage->setPageTitle(string)
#11 C:\wamp64\www\wiki\includes\actions\ViewAction.php(68): Article->view()
#12 C:\wamp64\www\wiki\includes\MediaWiki.php(501): ViewAction->show()
#13 C:\wamp64\www\wiki\includes\MediaWiki.php(294): MediaWiki->performAction(Article, Title)
#14 C:\wamp64\www\wiki\includes\MediaWiki.php(860): MediaWiki->performRequest()
#15 C:\wamp64\www\wiki\includes\MediaWiki.php(517): MediaWiki->main()
#16 C:\wamp64\www\wiki\index.php(42): MediaWiki->run()
#17 {main}

MarkAHershberger (talkcontribs)

Well, at this point you do not have anything in $parser: "Call to a member function addModules() on null". If you want us to help, you'll need to share your source code.

103.118.50.4 (talkcontribs)

OK, right now it has only 4 files (I am just learning how to create extensions). The JavaScript module is not working.

1. extension.json:

{
    "name": "E",
	"version": "2.9.4",
	"author": [
		"PS",
		"xyz"
	],
	"license-name": "GPL-2.0-or-later",
	"type": "parserhook",
	"requires": {
		"MediaWiki": ">= 1.30.0"
	},
	"AutoloadClasses": {
		"EC": "E_body.php"
	},
	"Hooks": {
		"ParserFirstCallInit": "EC::onParserInit"
		},
	"ResourceFileModulePaths": {
		"localBasePath": "",
		"remoteExtPath": "E"
	},
	"ResourceModules": {
	    "ext.E.scripts": {
			"scripts": "resources/E.js",
			"dependencies": [ "mediawiki.api" ]
		},
		"ext.E.styles": {
			"styles": "resources/E.css"
		}
	},
    "manifest_version": 1	
}


2. E_body.php:

<?php
class EC {
	static function onParserInit( Parser $parser ) {
		$parser->setHook( 'E', array( __CLASS__, 'ERender' ) ); 
		$parser->getOutput()->addModules( 'ext.E.scripts' );
		return true;
	}
	static function ERender( $input, array $args, Parser $parser, PPFrame $frame ) {
		$ret='<style>table.wtable { border-style:solid; border-width:1px; border-collapse:collapse;} </style>';
	    $ret .= '<table class="wtable">';
		$ret .= '<tr>';
		$ret .= '<td>Feedback</td>';
		$ret .= '<td><input id="inp001" type="text" /></td>';
		$ret .= '</tr>';
		$ret .= '<tr>';
		$ret .= '<td>upvote</td>';
		$ret .= '<td><input id="chk001" type="radio" name="r1"/></td>';
		$ret .= '</tr>';
		$ret .= '<tr>';
		$ret .= '<td><p font-colour=red>downvote</p></td>';
		$ret .= '<td><input id="chk001" type="radio" name="r1" /></td>';
		$ret .= '</tr>';
		$ret .= '<tr><p>here</p>';
        $ret .= '<td align="center" colspan=2><input id="btn001" type="button" value="Submit"></td>';
        $ret .= '</tr>';
		$ret .= '</table>';
		$ret .= '<input type="submit" value="Vote" onclick="myFunction()">/>';
		return $ret;
	}
}


3. E.js:

$("#btn001").click(function() {
	$("#inp001").html("Hello <b>world!</b>");
	//inp001.innerHTML("Clicked");
	$('p').html("Hello <b>world!</b>");
	alert("Button clicked.");
});


4. E.css:

table.wtable {
	border-style:solid; border-width:1px; border-collapse:collapse;
}

@MarkAHershberger

103.118.50.4 (talkcontribs)

Will someone look at this, please?

I am trying to learn extension creation but right now I am stuck due to this.

Please see this...

Falcopragati (talkcontribs)

Having a similar problem.

MarkAHershberger (talkcontribs)

The problem is your call

$parser->getOutput()->addModules( 'ext.E.scripts' );

in the ParserFirstCallInit hook. That hook is called too early for getOutput() to be populated.

Move the call so that it is just inside EC::ERender and you'll get what you want.
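Something like this (a sketch of E_body.php, trimmed to the relevant parts):

<?php
class EC {
	static function onParserInit( Parser $parser ) {
		// Register the tag only; no ParserOutput exists yet at this point.
		$parser->setHook( 'E', [ __CLASS__, 'ERender' ] );
		return true;
	}

	static function ERender( $input, array $args, Parser $parser, PPFrame $frame ) {
		// A parse is in progress here, so getOutput() returns a real ParserOutput.
		$parser->getOutput()->addModules( 'ext.E.scripts' );
		$ret = '<table class="wtable">';
		// ... build the rest of the table exactly as before ...
		$ret .= '</table>';
		return $ret;
	}
}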

103.118.50.4 (talkcontribs)

Thanks, Mark! It worked.

Actually, I took this code from [Manual:Tag extensions/Example], which gives a tutorial on creating an interactive tag.

But with '$parser->getOutput()->addModules' in the wrong place, it doesn't work, and that is not the only line of code on that page which needs correcting.

Maybe you should take a look and fix the code on that page.

I can edit, but I don't think I am qualified to edit that page.

Reply to "I am trying to make a new Extension, but getting an error at line '$parser->getOutput()->addModules( 'ext.E.scripts' )';"
23.115.12.250 (talkcontribs)

How can I remove a page that is incorrect?

DannyS712 (talkcontribs)

You can't "remove" a page, but an administrator can delete it.

Reply to "How to remove a page"

Error while thanking a user for an edit.

103.118.50.4 (talkcontribs)

I installed Extension:Thanks, but when I tried to thank a user in my wiki for a revision to an article, this error appeared:

[ab483df2c97b19cad31449d4] /wiki/index.php/Special:Thanks/140 TypeError from line 1223 of C:\wamp64\www\wiki\extensions\Echo\includes\DiscussionParser.php: Argument 1 passed to EchoDiscussionParser::getEditExcerpt() must be an instance of MediaWiki\Revision\RevisionRecord, instance of Revision given, called in C:\wamp64\www\wiki\extensions\Thanks\includes\ApiCoreThank.php on line 50

Backtrace:

#0 C:\wamp64\www\wiki\extensions\Thanks\includes\ApiCoreThank.php(50): EchoDiscussionParser::getEditExcerpt(Revision, LanguageEn)
#1 C:\wamp64\www\wiki\includes\api\ApiMain.php(1570): ApiCoreThank->execute()
#2 C:\wamp64\www\wiki\includes\api\ApiMain.php(500): ApiMain->executeAction()
#3 C:\wamp64\www\wiki\extensions\Thanks\includes\SpecialThanks.php(166): ApiMain->execute()
#4 C:\wamp64\www\wiki\includes\htmlform\HTMLForm.php(665): SpecialThanks->onSubmit(array, OOUIHTMLForm)
#5 C:\wamp64\www\wiki\includes\htmlform\HTMLForm.php(557): HTMLForm->trySubmit()
#6 C:\wamp64\www\wiki\includes\htmlform\HTMLForm.php(572): HTMLForm->tryAuthorizedSubmit()
#7 C:\wamp64\www\wiki\includes\specialpage\FormSpecialPage.php(184): HTMLForm->show()
#8 C:\wamp64\www\wiki\includes\specialpage\SpecialPage.php(569): FormSpecialPage->execute(string)
#9 C:\wamp64\www\wiki\includes\specialpage\SpecialPageFactory.php(568): SpecialPage->run(string)
#10 C:\wamp64\www\wiki\includes\MediaWiki.php(288): MediaWiki\Special\SpecialPageFactory->executePath(Title, RequestContext)
#11 C:\wamp64\www\wiki\includes\MediaWiki.php(860): MediaWiki->performRequest()
#12 C:\wamp64\www\wiki\includes\MediaWiki.php(517): MediaWiki->main()
#13 C:\wamp64\www\wiki\index.php(42): MediaWiki->run()
#14 {main}
Malyacko (talkcontribs)

Which exact MediaWiki version? Which exact Echo version? Which exact Thanks version?

103.118.50.4 (talkcontribs)
MediaWiki 1.32.2
PHP 7.2.18 (apache2handler)
MySQL 5.7.26
ICU 63.1
Thanks 1.2.0 (latest stable version)
The Echo version can't be seen on the Special:Version page, but it is the one used with MediaWiki 1.32.0 or later.
MarkAHershberger (talkcontribs)

If the Echo version doesn't show up on the version page, then it probably isn't being used. Did you make sure it is loaded in LocalSettings.php?
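That is, LocalSettings.php should contain something like this (a sketch, assuming both extensions live in the standard extensions/ directory):

wfLoadExtension( 'Echo' );   // Thanks depends on Echo
wfLoadExtension( 'Thanks' );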

103.118.50.4 (talkcontribs)

I checked again; it is loaded in LocalSettings.php.

And I am sure it is working, because users are getting notifications etc. in the wiki without any error.

I checked Echo's extension.json file, and no version is mentioned in it. Maybe that is why the extension is listed on my Special:Version page but its version does not show up in the designated field.

103.118.50.4 (talkcontribs)

I tried thanking users in StructuredDiscussions, and it is working perfectly!

BUT when I thank users for their contributions to an article (through the article history page), instead of this backtrace an alert box now appears, saying:

Thank action failed (error code: internal_api_error_TypeError).
Please try again.
Reply to "Error while thanking a user for an edit."

update.php Error: 1054 Unknown column 'ar_comment_id' in 'where clause' (localhost)

Beardedfool (talkcontribs)

Seems to be a new field; should this be covered by update.php?


Trying to migrate from 1.32.1 to a new install of 1.33.0 on a new server, via a MySQL dump.

On running update.php I'm getting the above error. Manually running migrateActors.php doesn't seem to fix it either.


The way our setup works, it's a bit awkward to do it another way, but I guess I can install 1.32.1 on the new server and patch up from there if needed. But should I have to?


It could also be something weird in our LocalSettings.php, as it has built up over time and I'm aiming to start again, but I wanted to rule out update.php and ask for advice first.


TheDJ (talkcontribs)

This might be a bug. I filed https://phabricator.wikimedia.org/T227662, to which you can subscribe.

Do you have any migration settings specified in your LocalSettings.php? And is this MySQL/MariaDB, or something else?

81.97.98.8 (talkcontribs)

Thanks for filing that.

10.1.38-MariaDB-0ubuntu0.18.10.2 Ubuntu 18.10

10.1.40-MariaDB-0ubuntu0.18.04.1 Ubuntu 18.04

Can you talk me through the migration settings part? I'm at the limit of my knowledge here.

Iowajason (talkcontribs)

Similar problem with the migrateComments script on a 1.32.x to 1.33 upgrade, except on column 'pt_reason_id'. I checked the MySQL schema and the column does not exist in the database. The DB is MySQL Community 8.0.16. There are no settings in LocalSettings.php that seem migration-related.

TheDJ (talkcontribs)

So this is reporting errors on database columns that were introduced in 1.30. Have both of you run pre-1.30 versions at some point in time?

Iowajason (talkcontribs)

My MediaWiki install definitely predates 1.30. I clearly recall 1.22 to 1.23 upgrade challenges, and I think the initial install might have been 1.08 or so.

TheDJ (talkcontribs)

Also, you seem to have a comment table; can you check what the structure of that table is and report back here (or in the ticket)?

Iowajason (talkcontribs)

MySQL DDL reports:

CREATE TABLE `comment` (
  `comment_id` bigint(20) unsigned NOT NULL AUTO_INCREMENT,
  `comment_hash` int(11) NOT NULL,
  `comment_text` blob NOT NULL,
  `comment_data` blob,
  PRIMARY KEY (`comment_id`),
  KEY `comment_hash` (`comment_hash`)
) ENGINE=InnoDB AUTO_INCREMENT=8651 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci

Iowajason (talkcontribs)

My pre-upgrade (May) backup also doesn't have the pt_reason_id column:

CREATE TABLE `protected_titles` (
  `pt_namespace` int(11) NOT NULL,
  `pt_title` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin NOT NULL,
  `pt_user` int(10) unsigned NOT NULL,
  `pt_reason` tinyblob,
  `pt_timestamp` binary(14) NOT NULL,
  `pt_expiry` varbinary(14) NOT NULL DEFAULT '',
  `pt_create_perm` varbinary(60) NOT NULL,
  PRIMARY KEY (`pt_namespace`,`pt_title`),
  KEY `pt_timestamp` (`pt_timestamp`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;

Bttfvgo (talkcontribs)

I have the exact same problem! I started out using 1.29 (followed by 1.30) and just updated to 1.32. Everything seemed to be working and update.php worked great. I then tried updating to 1.33 and ran into the same problem: "Error: 1054 Unknown column 'ar_comment_id' in 'where clause' (localhost)". I'm dumbfounded. I did have the Comments extension at one time, but haven't used it in years. I'm not sure how to get the table structure; is this what you mean? Do I need to revert to 1.32? Is it because of the Comments extension, which hasn't been updated in years since I disabled it? I am SO glad it isn't just me having this problem!

mysql> DESCRIBE (or EXPLAIN) comment;
+--------------+---------------------+------+-----+---------+----------------+
| Field        | Type                | Null | Key | Default | Extra          |
+--------------+---------------------+------+-----+---------+----------------+
| comment_id   | bigint(20) unsigned | NO   | PRI | NULL    | auto_increment |
| comment_hash | int(11)             | NO   | MUL | NULL    |                |
| comment_text | blob                | NO   |     | NULL    |                |
| comment_data | blob                | YES  |     | NULL    |                |
+--------------+---------------------+------+-----+---------+----------------+
Bttfvgo (talkcontribs)

I also have a table "Comments" in addition to "comment", but I'm sure the error refers to the latter.

When I use "describe archive;" I get all of the "ar_" fields (ar_comment is one of them) but the only names ending in "_id" are "ar_id", "ar_rev_id", "ar_text_id", "ar_page_id", and "ar_parent_id". No "ar_comment_id".

Bttfvgo (talkcontribs)

Looking at the page Manual:Archive_table, my archive table looks identical to the MediaWiki 1.25 – 1.29 version of the table. So I don't have the column it wants to migrate! :( Can I create the fields myself?

TheDJ (talkcontribs)

Do any of you have a backup to check whether you had those _id columns before upgrading?

The weird thing is that those *_id columns should have been added at the same time as the comment table (and Bttfvgo's comment table describe does look correct). Also weird is that some of you have different missing fields than the others.

So either something went wrong when they were originally added during 1.30, or they were later dropped, or there is an error in the migration script or the 1.33 SQL updates.

Iowajason (talkcontribs)

My backup from May of this year has a (seemingly identical) schema:

CREATE TABLE `comment` (
  `comment_id` bigint(20) unsigned NOT NULL AUTO_INCREMENT,
  `comment_hash` int(11) NOT NULL,
  `comment_text` blob NOT NULL,
  `comment_data` blob,
  PRIMARY KEY (`comment_id`),
  KEY `comment_hash` (`comment_hash`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
/*!40101 SET character_set_client = @saved_cs_client */;

Iowajason (talkcontribs)

I didn't experience a problem with that column, or with several that other reports listed. I did have ar_comment_id, ipb_reason_id, fa_deleted_reason_id, and log_comment_id in my May backup. My wiki is just missing the protected_titles ID column. I didn't lose other columns due to the 1.33 update; it's just that page display now seems to depend on some of those columns.

Bttfvgo (talkcontribs)

Okay, I went through my DB backup from MW 1.30 (dated 8-17-2018) and the archive table is identical to the one I have now. Well, with one exception: the field named "ar_text" was in the 1.30 backup but no longer exists.

Bttfvgo (talkcontribs)

Can we just recreate the missing table fields? Should I switch back to an older version and try the migrateComments.php command?

I went back through a DB backup from MW 1.29 (dated 5-17-2017) and the archive table is there (with the same 19 fields), but the comment table is not.

Bttfvgo (talkcontribs)

Okay, so I decided to:

mysql> ALTER TABLE archive
    -> ADD COLUMN ar_comment_id INT NOT NULL;

and ran the upgrade script again. This time it actually gave me a different error:

Error: 1054 Unknown column 'ipb_reason_id' in 'where clause' (localhost)

So if I can create all of the columns it claims are missing, do you think it might actually go through? I'll test the theory, missing column by missing column, and let you know what happens!

Bttfvgo (talkcontribs)

Okay so I did:

mysql> ALTER TABLE ipblocks
    -> ADD COLUMN ipb_reason_id BIGINT NOT NULL;

and ran the upgrade script, and it passed this stage (updated 1039 row(s) with 7 new comment(s)). The next step, 'img_description_id', went through with no problem (updated 2893 row(s) with 6 new comment(s)). After this I get the error:

Error: 1054 Unknown column 'oi_description_id' in 'where clause' (localhost)

I'll add this and let you know how it goes!

Bttfvgo (talkcontribs)
mysql> ALTER TABLE oldimage
    -> ADD COLUMN oi_description_id BIGINT NOT NULL;

and that worked too (updated 121 row(s) with 6 new comment(s)). Next error is 'fa_deleted_reason_id'. Will report back soon.

Bttfvgo (talkcontribs)

fa_deleted_reason_id

mysql> ALTER TABLE filearchive
    -> ADD COLUMN fa_deleted_reason_id BIGINT NOT NULL;

led to fa_description_id

mysql> ALTER TABLE filearchive
    -> ADD COLUMN fa_description_id BIGINT NOT NULL;

which led to rc_comment_id

mysql> ALTER TABLE recentchanges
    -> ADD COLUMN rc_comment_id BIGINT NOT NULL;

which led to log_comment_id

mysql> ALTER TABLE logging
    -> ADD COLUMN log_comment_id BIGINT NOT NULL;

which led to pt_reason_id

mysql> ALTER TABLE protected_titles
    -> ADD COLUMN pt_reason_id BIGINT NOT NULL;

My wiki's pretty big, so the last two took forever to update, but IT WORKS NOW! YEA!!! Hopefully these findings will help many more users, including the two whose thread I inadvertently hijacked. Thanks again for a superb and superior product. MediaWiki for life!

Beardedfool (talkcontribs)

Zero problem on the hijack, as you seem to have an answer, thanks! Though I'll wait for the more official answer on what to do.

@TheDJ The version before 1.32.1 was so old it didn't have the comment table in it. Not sure what version that was, I'm afraid.


10.1.38-MariaDB-0ubuntu0.18.10.2 Ubuntu 18.10 (version 1.32.1)

+--------------+---------------------+------+-----+---------+----------------+
| Field        | Type                | Null | Key | Default | Extra          |
+--------------+---------------------+------+-----+---------+----------------+
| comment_id   | bigint(20) unsigned | NO   | PRI | NULL    | auto_increment |
| comment_hash | int(11)             | NO   | MUL | NULL    |                |
| comment_text | blob                | NO   |     | NULL    |                |
| comment_data | blob                | YES  |     | NULL    |                |
+--------------+---------------------+------+-----+---------+----------------+
4 rows in set (0.00 sec)


10.1.40-MariaDB-0ubuntu0.18.04.1 Ubuntu 18.04 (version 1.33.0)

MariaDB [wiki]> DESCRIBE wiki_comment;
+--------------+---------------------+------+-----+---------+----------------+
| Field        | Type                | Null | Key | Default | Extra          |
+--------------+---------------------+------+-----+---------+----------------+
| comment_id   | bigint(20) unsigned | NO   | PRI | NULL    | auto_increment |
| comment_hash | int(11)             | NO   | MUL | NULL    |                |
| comment_text | blob                | NO   |     | NULL    |                |
| comment_data | blob                | YES  |     | NULL    |                |
+--------------+---------------------+------+-----+---------+----------------+
4 rows in set (0.00 sec)
TheDJ (talkcontribs)

Right, so it seems all of you have had the comment table since 1.30, but not the new *_id columns in the other tables. That likely means no accidental data loss or anything scary like that, but it is still very weird, as the comment table and the new _id fields in the other tables were added in the same SQL change file.

None of you remember running into an error with previous upgrades? Did any of you perhaps run git versions of MediaWiki?

The changes that SHOULD have run, but apparently for some of you didn't run completely are: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/master/maintenance/archives/patch-comment-table.sql

If you make backups etc. and manually try to run these instructions, maybe you can figure out where things went wrong. Please don't do things you are not comfortable with. P.S. The /*_*/ syntax in the SQL commands is where the table prefix is generally inserted (often that's just wiki_).
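For example (a sketch; the wiki_ prefix and the table name follow the comment table shown earlier in this thread), a patch statement like

CREATE TABLE /*_*/comment ( ... );

becomes

CREATE TABLE wiki_comment ( ... );

on a database that uses the wiki_ table prefix.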

Beardedfool (talkcontribs)

I didn't see any errors before. I wouldn't want to bet my life on that, but I'm pretty sure. When I get a chance I'll try to set up a new instance to test that for you.

I'm not totally comfortable with the SQL side, but I will try those, as I have backups and am happy enough dropping the database entirely, and MariaDB if necessary. The server isn't doing anything else at present.


Unlikely to be today though, hopefully tomorrow. Thanks for the help!


A couple of questions, for my learning, please?

So that's mostly copy-and-enter on the SQL command line, after replacing /*_*/ with wiki_, e.g. /*_*/comment becomes wiki_comment?

1) What's /*i*/ on lines 12/19?

2) On the CREATE INDEX lines, what does the last part of the line in brackets do, e.g. (imgcomment_name)? Is that a column name?

Jagatronic (talkcontribs)

I think this may be collateral damage from the "max key length is 767 bytes" issue with MySQL / UTF support / varchar(255) fields. I hit the max key length problem upgrading to 1.31.

Now, trying to upgrade 1.32.1 -> 1.33, I hit this bug. Trying the SQL from patch-comment-table.sql line by line, it fails at "CREATE TABLE /*_*/image_comment_temp" because of the max key length problem. It looks like none of the rest of this file was applied.

Are there likely to be other bits of updates that have been missed along the way? Is there some way to validate the whole schema is up to date?

Jagatronic (talkcontribs)

Having fixed my max key length problem and applied everything in patch-comment-table.sql, the update progressed until I hit Topic:V3adkmqnefp8bf30. This is where I first hit the max key length bug a couple of upgrades ago. It turns out my database doesn't have the changes in maintenance/archives/patch-actor-table.sql either (except for the first table: actor).


Reply to "update.php Error: 1054 Unknown column 'ar_comment_id' in 'where clause' (localhost)"
Jer Hughes (talkcontribs)

Our wiki is publicly available and editable. However, several times a year, our top contributors go to remote areas without reliable internet access for long periods of time. This is also when they have a lot of free time to edit the wiki if they could access it.

We're considering setting up a Raspberry Pi as an off-grid LAMP server with MediaWiki, copying over the database and upload directory with the most up-to-date version of the site, and letting them take it with them. Once they get back, we would merge the forked wikis.

But that's where the problem is. What's the best way to merge the two together? Theoretically some pages would be edited by one group and unedited by the other. But some pages could have edits by both groups, some of which don't conflict while others do.

MarkAHershberger (talkcontribs)

This is currently a project that I'm working on via MABS (also this grant).

If you have interest in it, I'd like to get you involved.

Reply to "Best way to merge forked wikis"