Hello, I'm having a problem with MediaWiki Parsoid running out of memory. Can someone help me?
On a very large page (I can't tell the exact size, but the original, written in MS Word, has more than 70 pages) I get the following error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 135168 bytes) in /var/www/html/mediawiki-1.35.1/vendor/wikimedia/parsoid/src/Html2Wt/WikitextSerializer.php on line 1683

The failing line is a call to explode().
As you can see, it reports a memory limit of 128M, but phpinfo() says 750M; I configured it in several places to make sure (php.ini, php-fpm.conf).
From my phpinfo():
memory_limit 750M 750M
Here's a grep -r memory_limit on my /etc:
php-fpm.d/www.conf:php_admin_value[memory_limit] = 750M
php.ini:memory_limit = 750M
So both php.ini and FPM are configured with 750M.
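To rule out the FPM worker reading a different configuration than the CLI, I used a small probe script in the web root and requested it through the browser (probe.php is just a throwaway name I chose):

```php
<?php
// probe.php — throwaway diagnostic, deleted after use.
// Prints the memory_limit the *web* SAPI actually enforces, which can
// differ from what `php -i` shows on the CLI if FPM loads a different
// php.ini or a pool-level override wins.
echo 'SAPI: ' . PHP_SAPI . "\n";
echo 'memory_limit: ' . ini_get( 'memory_limit' ) . "\n";
echo 'Loaded php.ini: ' . ( php_ini_loaded_file() ?: 'none' ) . "\n";
```

When fetched via the web server, this confirmed the FPM worker itself reports 750M, so the 128M cap in the fatal error is apparently not coming from php.ini or the pool config.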
I already tried raising the memory limit in LocalSettings.php as well, but that made no difference either.
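For reference, what I tried in LocalSettings.php looked roughly like the sketch below. I'm assuming here that MediaWiki's $wgMemoryLimit setting (which MediaWiki applies on top of PHP's own memory_limit) also governs the Parsoid code path; I'm not certain it does:

```php
<?php
// LocalSettings.php excerpt — a sketch of what I tried, not a confirmed fix.
// MediaWiki enforces its own limit via $wgMemoryLimit in addition to
// PHP's memory_limit; if this stays at a lower value, raising the ini
// settings alone may not help.
$wgMemoryLimit = '750M';

// Belt and braces: also raise PHP's own limit for this request.
ini_set( 'memory_limit', '750M' );
```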
PHP 7.4.14 (fpm-fcgi)
MediaWiki 1.35.1
Lua 5.1.5
ICU 65.1
MySQL 5.6.35-80.0-log
wikimedia/parsoid 0.12.1
Can someone help me? This is preventing my team and me from creating long and important documents.
Thank you!