Topic on Project:Support desk

Can't solve this error in Parser.php

200.142.207.244 (talkcontribs)

Hi friends.

MediaWiki 1.24.2, PHP 5.6.8 (apache2handler), MySQL 5.6.24

I have a local wiki at my company, and I'm the head of the project. We're very close to getting the project done, but some categories have a lot of articles and content. I really don't know why this is happening (I googled a lot before coming here). The error I get in some categories is:

Fatal error: Call to a member function getMaxIncludeSize() on null in C:\xampp\htdocs\wiki-hdti\includes\parser\Parser.php on line 3266

I already tried setting $wgMaxArticleSize to various values and changed the max POST size in php.ini too. I really don't know what else to do, and this is the only issue still affecting the project.
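
For context, this is roughly what I changed; the numbers below are only examples, not the actual values I tried:

 # LocalSettings.php -- maximum article size, in kilobytes (default: 2048); example value
 $wgMaxArticleSize = 4096;

 # php.ini (not PHP code, shown here for completeness) -- example value only:
 #   post_max_size = 64M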

Please, I ask humbly: if someone could help me, it would be great!

Thanks in advance!

88.130.107.90 (talkcontribs)

Cross-posted on Stack Overflow:

http://stackoverflow.com/questions/30351810/cant-solve-this-error-in-parser-php-of-mediawiki

This is the function call that is failing in your case:

 public function replaceVariables( $text, $frame = false, $argsOnly = false ) {
     # Is there any text? Also, Prevent too big inclusions!
     if ( strlen( $text ) < 1 || strlen( $text ) > $this->mOptions->getMaxIncludeSize() ) { # line 3266
         return $text;
     }
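
getMaxIncludeSize() is a method of the ParserOptions object, so the fatal error means $this->mOptions is still null, i.e. the parser is being invoked before any ParserOptions have been set on it. For comparison, here is a minimal sketch of a call that does supply options; it is illustrative only, not code from your wiki or from any particular extension:

 // Minimal sketch (illustrative only): parse a piece of wikitext with
 // ParserOptions supplied explicitly, so $parser->mOptions is not null
 // by the time replaceVariables() runs.
 $wikitext = 'This wiki has {{NUMBEROFARTICLES}} articles.';
 $parser   = new Parser();
 $title    = Title::newFromText( 'Sandbox' );        // any valid page title
 $options  = ParserOptions::newFromUser( $wgUser );  // or simply: new ParserOptions()
 $output   = $parser->parse( $wikitext, $title, $options );
 echo $output->getText();

Whatever is calling the parser on your wiki (core or one of your extensions) apparently skips that step.
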
200.142.207.244 (talkcontribs)

Hi friend.

Yes, that was me on Stack Overflow.

I THINK I got it fixed with $wgMaxArticleSize and by removing the category pages that had the error and creating them again. But I don't know if this fix is correct, or whether the error may come back.

If someone has run into something like this and would like to share the experience, it would help.

Thanks!

200.142.207.244 (talkcontribs)

It happened again. Maybe all the pages will need to be removed and created again.

Ciencia Al Poder (talkcontribs)

How did you end up with pages of more than 2048 kilobytes (the default for $wgMaxArticleSize)?

You should probably split them into separate articles.
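
If you want to double-check, the page table stores the size of each page's current wikitext in page_len (in bytes), and Special:LongPages shows the same information from inside the wiki. A rough sketch of such a check is below; the host, credentials and database name are guesses for a local XAMPP install and will differ on your wiki:

 <?php
 // Hypothetical standalone script (not part of MediaWiki): list pages whose
 // stored wikitext exceeds the default $wgMaxArticleSize of 2048 kilobytes.
 // If you use a $wgDBprefix, add it to the table name below.
 $db    = new mysqli( 'localhost', 'root', '', 'wikidb' ); // adjust to your setup
 $limit = 2048 * 1024; // $wgMaxArticleSize is in KB, page_len is in bytes
 $res   = $db->query(
     'SELECT page_namespace, page_title, page_len
        FROM page
       WHERE page_len > ' . $limit . '
       ORDER BY page_len DESC'
 );
 while ( $row = $res->fetch_assoc() ) {
     echo "{$row['page_namespace']}:{$row['page_title']} ({$row['page_len']} bytes)\n";
 }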

200.169.48.9 (talkcontribs)

Hi,

I'm at another computer right now.

Actually, that was a mistake; I don't think my pages have more than 2048 kB. Looking more closely, I think the problem may come from the Comments extension or from the Header Footer extension.

It's very strange, because it happened again right after a user inserted a new comment in an article.

Ciencia Al Poder (talkcontribs)

Well, if the Comments extension is trying to parse the entire page, including comments, that's a very bad idea!

200.142.207.244 (talkcontribs)

Hi Ciencia,

I disabled the Comments extension, and that's exactly where the problem is. I think it's what you said.

Any idea if there's something I can do to stop the extension from parsing like that?

Thanks!

200.142.207.244 (talkcontribs)

I was thinking about the <noinclude> tag; maybe that's the correct way?

Thanks!

(Sorry for the new post; I can't edit and save the previous one.)

Ciencia Al Poder (talkcontribs)

You said you have Extension:Header Footer. If you have comments in the header/footer, you may need to put them directly on each page.

200.142.207.244 (talkcontribs)

Hm... so the problem is with the header/footer handling?

Ciencia Al Poder (talkcontribs)

I don't know, but I've never seen this parser error before.

109.232.208.230 (talkcontribs)

Actually not :(

109.232.208.230 (talkcontribs)

OK, I nailed the problem down to MediaWiki's parser cache. A workaround is to disable it by setting:

$wgEnableParserCache = false;

Of course this is not a solution, but it's something...
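
For reference, that line goes into LocalSettings.php; just keep in mind what it costs:

 # Workaround only, not a fix: with the parser cache disabled, every page
 # view re-parses the wikitext, which can be noticeably slower on large pages.
 $wgEnableParserCache = false;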

Reply to "Can't solve this error in Parser.php"