Extension talk:Collection


About this board

Archives 

/Archive 1

readapidenied: Collection extension is not working for private wikis

3
Summary by S0ring
S0ring (talkcontribs)

For private wikis, Extension:Collection doesn't work when anonymous read access is disabled:

$wgGroupPermissions['*']['read'] = false;

Is there any solution?


Here are the readapidenied errors from mwlib:

   creating nuwiki in u'/var/cache/mwlib/7e/7e607dca372f1977/tmpXYsGQC/nuwiki'

   ERR: You need read permission to use this module.: [fetching https://<domain_name>/wiki/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords|rightsinfo&format=json]

   ERR: You need read permission to use this module.: [fetching https://<domain_name>/wiki/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords&format=json]

   ERR: You need read permission to use this module.: [fetching https://<domain_name>/wiki/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases&format=json]

   ERR: You need read permission to use this module.: [fetching https://<domain_name>/wiki/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap&format=json]

   removing tmpdir u'/var/cache/mwlib/7e/7e607dca372f1977/tmpXYsGQC'

   memory used: res=33.8 virt=230.7

   1% error Traceback (most recent call last):

     File "/usr/local/bin/mw-zip", line 11, in <module>

       sys.exit(main())

     File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main

       make_zip(output, options, env.metabook, podclient=podclient, status=status)

     File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 50, in make_zip

       make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)

     File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 192, in make_nuwiki

       pool.join(raise_error=True)

     File "/usr/local/lib/python2.7/dist-packages/gevent/pool.py", line 433, in join

       greenlet._raise_exception()

     File "src/gevent/greenlet.py", line 317, in gevent._greenlet.Greenlet._raise_exception

     File "src/gevent/greenlet.py", line 766, in gevent._greenlet.Greenlet.run

     File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 119, in run

       self.fetch_pages_from_metabook(api)

     File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 59, in fetch_pages_from_metabook

       fetch_images=not self.options.noimages)

     File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 282, in __init__

       siteinfo = self.get_siteinfo_for(self.api)

     File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 453, in get_siteinfo_for

       return m.get_siteinfo()

     File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 183, in get_siteinfo

       raise RuntimeError("could not get siteinfo")

   RuntimeError: could not get siteinfo

    in function system, file /usr/local/lib/python2.7/dist-packages/mwlib/nslave.py, line 64

Ckepper (talkcontribs)

The error indicates that your API is not accessible to the renderer. The renderer must be able to fetch data via the MediaWiki API in order to render the content. Unfortunately, I can't give you any specific tips on how to resolve this.
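A quick way to see what the renderer is actually getting back is to feed the body returned by api.php into a small classifier. This is a diagnostic sketch, not part of mwlib: the sample payload mirrors the error shape in the mwlib log above, and the suggested remedies are the usual ones rather than guaranteed fixes.

```python
import json

def diagnose_api_response(body):
    """Return a short diagnosis for a MediaWiki api.php reply body.

    The renderer needs read access to the API; on a private wiki the
    API answers with a readapidenied error instead of siteinfo.
    """
    try:
        data = json.loads(body)
    except ValueError:
        return "not JSON: check that the URL really points at api.php"
    error = data.get("error") or {}
    if error.get("code") == "readapidenied":
        return ("read access denied: give the render server credentials "
                "(e.g. via $wgCollectionMWServeCredentials) or allow API reads")
    return "ok"

# Sample payload in the shape MediaWiki uses for this error:
denied = ('{"error": {"code": "readapidenied",'
          ' "info": "You need read permission to use this module."}}')
print(diagnose_api_response(denied))
```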

S0ring (talkcontribs)
Reply to "readapidenied: Collection extension is not working for private wikis"

Low traffic sites that are accessible from the internet don't need to install their own PDF Server - does this functionality still work?

1
Peculiar Investor (talkcontribs)

Our wiki is running on MediaWiki 1.31.7 and using Collection 1.7.0 (af3a0b8) 14:23, 15 April 2018. The ''Download as PDF'' link is constantly failing and directing the user to Reading/Web/PDF Functionality, which doesn't specifically address the reason for the "Book rendering failed" message. Reading through Talk:Reading/Web/PDF Functionality doesn't clear up the situation much either. It does seem to indicate there is a new render server available at https://pediapress.com/collector, but that doesn't seem to work for non-Wikipedia sites. The existing render server https://tools.pediapress.com/mw-serve/ does still seem to be active.

Is the functionality via this extension dead for low-traffic sites that don't need, or cannot install (i.e. shared hosting), their own PDF server?

Reply to "Low traffic sites that are accessible from the internet don't need to install their own PDF Server - does this functionality still work?"

No capability for right-to-left languages (Arabic)?

1
Summary by S0ring

The pyfribidi package (an mwlib dependency) must be installed to render right-to-left text.

S0ring (talkcontribs)

Hi,

Extension:Collection is installed on MW 1.31 and is running against its own rendering server; mwlib runs as a standalone server. But Arabic text is rendered left-to-right only. Is there a known bug, or am I missing something in the configuration?

Thank you in advance!
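Before blaming the renderer's bidi support, it can help to confirm the source text really contains right-to-left characters. A minimal sketch using only the standard library (no pyfribidi required):

```python
import unicodedata

def has_rtl(text):
    """True if the text contains right-to-left characters (Arabic, Hebrew, ...).

    If mwlib renders such text left-to-right, the bidi reordering step
    (provided by the pyfribidi package) is most likely missing.
    """
    return any(unicodedata.bidirectional(ch) in ("R", "AL", "AN") for ch in text)

print(has_rtl(u"\u0645\u0631\u062d\u0628\u0627"))  # Arabic "marhaba" -> True
print(has_rtl(u"hello"))                           # -> False
```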

en.wikipedia facility under discussion

4
Steelpillow (talkcontribs)

There is a discussion here suggesting that the Wikipedia Books function be withdrawn. The RfC concerns user-interface hooks into the Collection extension, and the discussion has broadened to the whole Books namespace and this software. You are invited to contribute.

I understand that PediaPress are, or were, heavily involved in maintaining this extension. Anybody who knows their full relationship with the WMF would also be more than welcome to give a clarification, as the upload link to their print-on-demand (PoD) service is also being criticised.

Ckepper (talkcontribs)

Thank you for the heads up. I will try to clarify as much as I can.

Tinss (talkcontribs)

@Steelpillow, thanks for vouching for the Collection extension. It's a valuable tool for the wiki that I'm running (http://wikimedi.ca). Once we have the resources, we'll try and figure out a solution to get it back up and running.

We're anxiously waiting for a new version of the Collection extension, and we will happily devote some resources to maintaining and improving it.

Steelpillow (talkcontribs)

@Tinss: The book rendering service is new; the Collection extension will need only small updates to link to it.

User:Ckepper at PediaPress is writing a new PDF renderer. I do not know if he will accept help, but you can ask. (His user page here seems very out of date).

User:Dirk Hünniger maintains the MediaWiki2LaTeX renderer. It is available but needs some work, both on the software, which is written in an unusual language called Haskell, and on the service configuration. I think he will be glad of some proper help.

Reply to "en.wikipedia facility under discussion"

[Resolved] RuntimeError: could not get siteinfo

5
Wmat (talkcontribs)

I have 2 environments that I think (and hope) are identical. They are both set up with:

MW 1.25alpha PHP 5.3.17 Collection 1.7.0 (1618025)

In the Dev environment, Collection works perfectly. In the Prod environment, I'm seeing this error:

<Greenlet at 0xd43eb0: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0xdbba50>>> failed with  RuntimeError
   creating nuwiki in u'/u1/wiki_pdf/cache/2e/2e58aa5a7230f6a6/tmpJuZHDI/nuwiki'
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   removing tmpdir u'/u1/wiki_pdf/cache/2e/2e58aa5a7230f6a6/tmpJuZHDI'
   memory used: res=18.2 virt=152.7
   1% error Traceback (most recent call last):
     File "/usr/local/bin/mw-zip", line 9, in <module>
       load_entry_point('mwlib==0.15.14', 'console_scripts', 'mw-zip')()
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/buildzip.py", line 155, in main
       make_zip(output, options, env.metabook, podclient=podclient, status=status)
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip
       make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/make_nuwiki.py", line 192, in make_nuwiki
       pool.join(raise_error=True)
     File "/usr/local/lib64/python2.6/site-packages/gevent/pool.py", line 98, in join
       raise greenlet.exception
   RuntimeError: could not get siteinfo
    in function system, file /usr/local/lib64/python2.6/site-packages/mwlib/nslave.py, line 64

So it's failing on the start_fetcher.run method, but I can't seem to figure out why. Is this likely some system configuration thing?

Wmat (talkcontribs)

Solved.

This was a DNS issue. We had to add an entry to /etc/hosts so that the server could talk to itself, basically.
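The diagnosis above can be reproduced with a short resolution test: if the wiki's hostname doesn't resolve from the render server, every API fetch fails before the wiki is even reached. A sketch (the hostnames are placeholders):

```python
import socket

def can_resolve(hostname):
    """Check that this machine can resolve the wiki's hostname.

    mwlib's "could not get siteinfo" often hides a plain DNS failure:
    the render server cannot look up the wiki host, so every API fetch
    dies with "Connection refused" or a resolver error.  An /etc/hosts
    entry mapping the hostname to the wiki's IP is one workaround.
    """
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.error:
        return False

print(can_resolve("localhost"))             # -> True on any sane system
print(can_resolve("no-such-host.invalid"))  # -> False (.invalid is reserved)
```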

66.77.160.179 (talkcontribs)

Hi, I am getting the same error. Do you happen to know what you put in your /etc/hosts file to resolve this?

Ahsan96 (talkcontribs)

I am getting the same error after enabling SSL. Could you share how you resolved this issue?

Ablum010777 (talkcontribs)

Yes, please, show us what you did to your /etc/hosts file. I need help with that, too.

Reply to "[Resolved] RuntimeError: could not get siteinfo"
Withdrawn functionality

Steelpillow (talkcontribs)

OCG has been withdrawn. The only conversion functionality specified for its replacement is to PDF. Should the information given on this page be updated accordingly, or should it be left because the Collection extension remains theoretically capable of supporting it all?

Reply to "Withdrawn functionality"

Collection Extension: RuntimeError: command failed with returncode 256

17
Anewuserformediawiki (talkcontribs)

Hello,

I'm using MediaWiki 1.16.3 with a Windows XAMPP-Stack 1.7.4 (Apache 2.2.17, MySQL 5.5.8, PHP 5.3.5 (VC6 X86 32bit) + PEAR).

I tried to install and configure the Collection extension with the following LocalSettings.php configuration:

require_once("$IP/extensions/Collection/Collection.php");

$wgCollectionMWServeURL = "http://tools.pediapress.com/mw-serve/";

$wgCollectionFormats = array('rl' => 'PDF', 'odf' => 'ODT', );

When I try to download the PDF/ODT offline Version of a new book I get an error message:

An error occurred on the render server (message originally in German):

   RuntimeError: RuntimeError: command failed with returncode 256: ['mw-zip', '-o', u'cache/ee/ee57ce6375c56696/collection.zip', '-m', u'cache/ee/ee57ce6375c56696/metabook.json', '--status', u'qserve://localhost:14311/ee57ce6375c56696:makezip', '--config', u'http://s10vitl01/wiki', '--template-blacklist', u'MediaWiki:PDF Template Blacklist', '--template-exclusion-category', u'Vom Druck ausschlie\xdfen', '--print-template-prefix', u'Drucken', '--print-template-pattern', u'$1/Druck']
   Last Output: 2011-11-04T07:40:17 mwlib.options.warn >> Both --print-template-pattern and --print-template-prefix (deprecated) specified. Using --print-template-pattern only.
   1% creating nuwiki in u'cache/ee/ee57ce6375c56696/tmpl0yZuX/nuwiki'
   ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found>
   ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found>
   ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found>
   ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found>
   Traceback (most recent call last):
     File "/home/pp/local/lib/python2.7/site-packages/gevent/greenlet.py", line 402, in run
       result = self._run(*self.args, **self.kwargs)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 120, in run
       self.fetch_pages_from_metabook(api)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 60, in fetch_pages_from_metabook
       fetch_images=not self.options.noimages)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/net/fetch.py", line 268, in __init__
       siteinfo = self.get_siteinfo_for(self.api)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/net/fetch.py", line 417, in get_siteinfo_for
       return m.get_siteinfo()
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/net/sapi.py", line 166, in get_siteinfo
       raise RuntimeError("could not get siteinfo")
   RuntimeError: could not get siteinfo
   <Greenlet at 0x24aff30: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0x2540c50>>> failed with RuntimeError
   removing tmpdir u'cache/ee/ee57ce6375c56696/tmpl0yZuX'
   memory used: res=16.6 virt=128.9
   1% error Traceback (most recent call last):
     File "/home/pp/local/bin/mw-zip", line 9, in <module>
       load_entry_point('mwlib==0.12.17', 'console_scripts', 'mw-zip')()
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 155, in main
       make_zip(output, options, env.metabook, podclient=podclient, status=status)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip
       make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 193, in make_nuwiki
       pool.join(raise_error=True)
     File "/home/pp/local/lib/python2.7/site-packages/gevent/pool.py", line 105, in join
       raise greenlet.exception
   RuntimeError: could not get siteinfo
    in function system, file ./bin/nslave.py, line 37
    in function qaddw, file /home/pp/local/lib/python2.7/site-packages/qs/slave.py, line 66

Any help would be appreciated.

212.66.144.68 (talkcontribs)

Hello, I have the same error, whether I use my own render server or the $wgCollectionMWServeURL = "http://tools.pediapress.com/mw-serve/"; server. Can anyone help, please? I'm using MediaWiki 1.17.0.

46.180.199.47 (talkcontribs)

I have the same problem. I use the trunk version of this extension and MW 1.19.2. My settings:

require_once( "$IP/extensions/Collection/Collection.php" );
$wgCollectionFormats = array(
	  'rl' => 'PDF',
	  'odf' => 'ODT',
);

  $wgCollectionArticleNamespaces = array(
	  NS_MAIN,
	  NS_PROJECT,
);

$wgCollectionMWServeURL = ("http://tools.pediapress.com/mw-serve/");

Error message:

   RuntimeError: RuntimeError: command failed with returncode 256: ['mw-zip', '-o', u'/home/pp/cache/ce/ce21a325a1031a95/collection.zip', '-m', u'/home/pp/cache/ce/ce21a325a1031a95/metabook.json', '--status', u'qserve://localhost:14311/ce21a325a1031a95:makezip', '--template-blacklist', u'MediaWiki:PDF Template Blacklist', '--template-exclusion-category', u'\u0418\u0441\u043a\u043b\u044e\u0447\u0435\u043d\u0438\u044f \u0438\u0437 \u043f\u0435\u0447\u0430\u0442\u0438', '--print-template-prefix', u'\u041f\u0435\u0447\u0430\u0442\u044c', '--print-template-pattern', u'$1/\u041f\u0435\u0447\u0430\u0442\u044c']
   Last Output: 2012-04-20T22:43:24 mwlib.options.warn >> Both --print-template-pattern and --print-template-prefix (deprecated) specified. Using --print-template-pattern only.
   1% creating nuwiki in u'/home/pp/cache/ce/ce21a325a1031a95/tmpkvsX2V/nuwiki'
   removing tmpdir u'/home/pp/cache/ce/ce21a325a1031a95/tmpkvsX2V'
   memory used: res=15.7 virt=122.5
   1% error Traceback (most recent call last):
     File "/home/pp/local/bin/mw-zip", line 9, in <module>
       load_entry_point('mwlib==0.13.3', 'console_scripts', 'mw-zip')()
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 151, in main
       make_zip(output, options, env.metabook, podclient=podclient, status=status)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip
       make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
     File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 152, in make_nuwiki
       assert x.wikiident in id2wiki, "no wikiconf for %r (%s)" % (x.wikiident, x)
   AssertionError: no wikiconf for None (<article {'_env': <mwlib.wiki.Environment object at 0x1666250>, 'content_type': u'text/x-wiki', 'title': u'\u0413\u043b\u0430\u0432\u043d\u0430\u044f \u0441\u0442\u0440\u0430\u043d\u0438\u0446\u0430', 'timestamp': u'1326466208', 'type': 'article', 'revision': u'2311'}>)
    in function system, file ./bin/nslave.py, line 63
    in function qaddw, file /home/pp/local/lib/python2.7/site-packages/qs/slave.py, line 66

85.158.224.252 (talkcontribs)

You need to specify credentials for the pdf generation to work.

First, try whether these commands work on your system:

mw-zip -c http://hostname/wiki --username=yourusername --password=yourpassword -o startpage.zip Main_Page
mw-render -c startpage.zip -o startpage.pdf -w rl

If this works, then you are good to go - without --username and --password it would not work.

If this worked, you can proceed by adding the following below your require_once:
$wgCollectionMWServeCredentials = "yourusername:yourpassword";

This worked for me :)

This post was posted by 85.158.224.252, but signed as User:Friesoft.

Varnent (talkcontribs)

Still not working for me. Anyone else having better luck?

Pastakhov (talkcontribs)

Apache access_log when running mw-zip with --username and --password:

92.46.175.36 - - [31/May/2012:13:10:26 +0600] "POST /w/api.php HTTP/1.1" 301 317 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:10:26 +0600] "GET /w/api.php HTTP/1.1" 500 128 "-" "Python-urllib/2.7"

Without --username and --password:

92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords|rightsinfo&format=json HTTP/1.1" 301 446 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords|rightsinfo&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords&format=json HTTP/1.1" 301 435 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases&format=json HTTP/1.1" 301 424 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap&format=json HTTP/1.1" 301 407 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"


mw-zip says

...
ERR: HTTP Error 500: MediaWiki configuration Error
...
RuntimeError: could not get siteinfo
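The failing pattern in an access log like the one above is easier to spot with a tiny parser. A diagnostic sketch (the sample line is trimmed from the log in this post; it is not part of mwlib):

```python
import re

# Pull method, URL and HTTP status out of combined-log lines, so a
# 301 -> 500 sequence on api.php jumps out at a glance.
LOG_RE = re.compile(r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3})')

def api_statuses(log_text):
    """Return (method, status) pairs for api.php requests."""
    out = []
    for m in LOG_RE.finditer(log_text):
        if "api.php" in m.group("url"):
            out.append((m.group("method"), int(m.group("status"))))
    return out

sample = ('92.46.175.36 - - [31/May/2012:13:10:26 +0600] '
          '"POST /w/api.php HTTP/1.1" 301 317 "-" "Python-urllib/2.7"')
print(api_statuses(sample))  # [('POST', 301)]
```

A 301 here means the API URL is being redirected (mwlib's POST body is lost on redirect), and the 500 with "MediaWiki configuration Error" is what led to the $wgEnableAPI fix below.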
Pastakhov (talkcontribs)

In LocalSettings.php, add $wgEnableAPI = true;
This worked for me :)

Unikum~mediawikiwiki (talkcontribs)
Inkydjango (talkcontribs)

I'm a newbie with this extension, and I spent quite some time finding the solution to this.

You need more than just the extension: you must install a local render server to convert your wiki pages into PDF files.

First, you must install the tools needed to fetch the modules; install pip:

# wget http://pypi.python.org/packages/source/s/setuptools/setuptools-0.6c11.tar.gz
# tar zxvf setuptools-0.6c11.tar.gz
# cd setuptools-0.6c11
# python setup.py build
# python setup.py install 
$ wget http://pypi.python.org/packages/source/p/pip/pip-1.2.tar.gz
$ tar xvf pip-1.2.tar.gz
$ cd pip-1.2
# python setup.py install


  • When that is done, install the dependencies:
# yum install g++ perl python python-dev python-setuptools python-imaging python-lxml libevent-devel
# yum install python-devel 
# yum install libxml2-python.x86_64 libxslt-python.x86_64 libxslt-devel.x86_64
# yum install python-imaging python-lxml pdftk
# wget http://python-distribute.org/distribute_setup.py
# sudo python distribute_setup.py
# sudo easy_install -U virtualenv
$ wget http://effbot.org/downloads/Imaging-1.1.7.tar.gz
$ gtar -zxvf Imaging-1.1.7.tar.gz
$ cd Imaging-1.1.7
$ ~/plone-python/bin/python setup.py install 
# pip install -i http://pypi.pediapress.com/simple/ pil
# pip install -i http://pypi.pediapress.com/simple/ mwlib
# pip install -i http://pypi.pediapress.com/simple/ mwlib.rl
# pip install -i http://pypi.pediapress.com/simple/ pyfribidi
  • Verify the install:
# mw-zip -c :en -o test.zip Acdc Number
# mw-render -c test.zip -o test.pdf -w rl
  • Example boot script for mw-serve:
#!/bin/sh
#
#chkconfig: 345 20 80
#
#description: mw-serve

PATH=/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/bin:/usr/local/sbin

case "$1" in
    start)
        su - www-data -c 'nserve.py >> /tmp/mwcache/log.txt 2>&1  &'
        su - www-data -c 'mw-qserve >> /tmp/mwcache/log.txt 2>&1 &'
        su - www-data -c 'nslave.py --cachedir /tmp/mwcache/ >> /tmp/mwcache/log.txt 2>&1 &'
        su - www-data -c 'postman.py >> /tmp/mwcache/log.txt 2>&1 &'
    ;;
  stop)
        mv /data/mwcache/log.txt /data/mwcache/log.old
        killall nserve.py
        killall mw-qserve
        killall nslave.py
        killall postman.py
    ;;
  force-reload|restart)
    $0 stop
    $0 start
    ;;
  *)
    echo "Usage: /etc/init.d/mw-serve {start|stop}"
    exit 1
    ;;
esac

exit 0
# mkdir /tmp/mw-serve
# adduser www-data
# chown www-data /etc/init.d/mw-serve
# chown -R www-data /tmp/mw-serve
# chkconfig --add mw-serve
# /etc/init.d/mw-serve start

Config file

Following is an extract of my LocalSettings.php file:

require_once("$IP/extensions/PdfExport/PdfExport.php");
$wgPdfExportMwLibPath = '/usr/bin/mw-render';
$wgPdfExportBackground = "/tmp/background-image/image.jpg";
$wgPdfExportAttach = true;

########################
# creation pdf 
require_once("$IP/extensions/Collection/Collection.php");
$wgCollectionFormats=array( 'rl' => 'PDF', );

#$wgCollectionPODPartners = false;
$wgGroupPermissions['*']['collectionsaveascommunitypage'] = true;
$wgGroupPermissions['*']['collectionsaveasuserpage']      = true;
$wgEnableAPI = true;
$wgCollectionMWServeCredentials = "collection_user:password";
$wgCollectionMaxArticles = 150;

# Collection tool for the PediaPress extension
$wgCollectionMWServeURL="http://localhost:8899/";
$wgCommunityCollectionNamespace=NS_MEDIAWIKI;

  • You can use this link to help complete the process above:

http://edutechwiki.unige.ch/en/Mediawiki_collection_extension_installation

Inkydjango (talkcontribs)

So I have followed this process, but it is still not working. The command line is OK, and creating a PDF from the special page is OK, but creating a book from the main menu is not.

In log.txt it looks like the PDF file is being created, but the log shows that mw-zip cannot actually create the file: a file of 0 bytes stays in the cache directory.

The firewall is open on port 8899. The cache directory has 777 permissions for the moment. I can't find the solution. Does anybody have a complete, working install process?

Thanks a lot
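When the command-line tools work but the wiki cannot create books, the first thing to check is that something is actually listening on the render port. A minimal sketch (host and port 8899 follow the setup above; adjust to your $wgCollectionMWServeURL):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Check that the mw-serve render service is reachable.

    Collection POSTs render jobs to $wgCollectionMWServeURL; if nothing
    is listening there, book creation fails even though mw-zip and
    mw-render work from the command line.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except socket.error:
        return False

print(port_open("127.0.0.1", 8899))
```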

Inkydjango (talkcontribs)

My error is:

File "/usr/lib64/python2.6/site-packages/mwlib/apps/make_nuwiki.py", line 152, in make_nuwiki
        assert x.wikiident in id2wiki, "no wikiconf for %r (%s)" % (x.wikiident,  x)
    AssertionError: no wikiconf for None (<article {'_env': <mwlib.wiki.Environment object at 0x8eec90>, 'content_type': u'text/x-wiki', 'title': u'Recontruction des handlers', 'timestamp': u'1286529761', 'type': 'article', 'revision': u'661'}>)
     in function system, file /usr/bin/nslave.py, line 64
62.231.10.170 (talkcontribs)

The only way to deal with

no wikiconf for None

is to define $wgScriptPath in your config, making it different from "", e.g.

$wgScriptPath       = "/w";

or

$wgScriptPath       = "http://server_name"; # WITHOUT A SLASH AT THE END

but not

$wgScriptPath       = "";

It's a bug in the Collection extension (see https://github.com/pediapress/Collection/issues/1).
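The failure mode can be sketched as follows. This is a hypothetical illustration of why an empty $wgScriptPath bites, not the extension's or mwlib's actual code: keying a wiki by the path component of its base URL degenerates when that path is empty.

```python
try:
    from urllib.parse import urlsplit  # Python 3
except ImportError:
    from urlparse import urlsplit      # Python 2, the era of this thread

def wiki_ident(base_url):
    """Hypothetical sketch: derive a wiki identifier from the URL path.

    With $wgScriptPath = "" the base URL has no path component, so the
    identifier degenerates to None and the article cannot be matched to
    a wiki config - the "no wikiconf for None" assertion quoted above.
    """
    path = urlsplit(base_url).path.strip("/")
    return path or None

print(wiki_ident("http://example.org/w"))  # 'w'
print(wiki_ident("http://example.org"))    # None -> lookup fails
```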

Kghbln (talkcontribs)

This one is a life-saver. I just added this info to the extension's page. Thank you for sharing. :) Cheers

188.22.199.103 (talkcontribs)
97.64.134.182 (talkcontribs)

When following your instructions I discovered some issues on my Centos server. Below is a quote from your instructions:

start)

       su - www-data -c 'nserve.py >> /tmp/mwcache/log.txt 2>&1  &'
       su - www-data -c 'mw-qserve >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'nslave.py --cachedir /tmp/mwcache/ >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'postman.py >> /tmp/mwcache/log.txt 2>&1 &'
   ;;
 stop)
       mv /data/mwcache/log.txt /data/mwcache/log.old
       killall nserve.py
       killall mw-qserve
       killall nslave.py
       killall postman.py
   ;;

I made the following changes (the commands lose their .py suffix, and the log paths in the stop section now use /tmp/mwcache):

start)

       su - www-data -c 'nserve >> /tmp/mwcache/log.txt 2>&1  &'
       su - www-data -c 'mw-qserve >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'nslave --cachedir /tmp/mwcache/ >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'postman >> /tmp/mwcache/log.txt 2>&1 &'
   ;;
 stop)
       mv /tmp/mwcache/log.txt /tmp/mwcache/log.old
       killall nserve
       killall mw-qserve
       killall nslave
       killall postman
   ;;


Thank you for all of your help

This post was posted by 97.64.134.182, but signed as Jecker.

WessexOne (talkcontribs)

Hi,

I know this thread is quite old, but I encountered this problem and none of the suggested fixes worked for me. Just in case anyone else still has the same issue, I thought I'd post my solution.

I'm running a private render server on the same server as our Wiki. CentOS 6.5 x64 MediaWiki 1.22.6 PHP 5.3.3 MySQL 5.1.73 Collection Version 1.6.1

The first step is to ensure SELinux is disabled. If it was not, curl_exec($ch) didn't work for me (and I couldn't figure out how to make it work with SELinux enabled).

The second step is to edit

$IP/extensions/Collection/Collection.body.php

In the function

static function mwServeCommand( $command, $args ) {

Replace

$serveURL = $wgCollectionMWServeURL;

with

$serveURL = 'myservername:8899/';

For me, this got rid of the "RuntimeError: command failed with returncode 256" error as it seems that, regardless of what is set in LocalSettings.php, $wgCollectionMWServeURL was being set to pediapress. However, I was then left with the "The POST request to server:8899/ failed ($2)" error. I fixed this by replacing

$response = Http::post($serveURL, array('postData' => $args));

with

$boundary = '--myboundary-bps';
$retval= '';
foreach($args as $key => $value){
    $retval .= "--$boundary\nContent-Disposition: form-data; name=\"$key\"\n\n$value\n";
}
$retval .= "--$boundary--";
$body = $retval;
$ch = curl_init();
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: multipart/form-data; boundary=$boundary"));
curl_setopt($ch, CURLOPT_URL, $serveURL);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true );
$response = curl_exec($ch);
curl_close($ch); // release the cURL handle


I got the code from this site and tweaked it: http://scraperblog.blogspot.co.uk/2013/07/php-curl-multipart-form-posting.html
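The same multipart body can be sketched in a few lines. This mirrors what the PHP snippet above builds, except it uses the CRLF line endings the multipart spec asks for (the PHP version's bare "\n" is usually tolerated by servers); the boundary string is an arbitrary placeholder.

```python
def multipart_body(fields, boundary="myboundary-bps"):
    """Build a multipart/form-data body from a dict of form fields.

    Each field becomes one part; the closing delimiter carries the
    trailing "--".  The matching Content-Type header must advertise
    the same boundary string.
    """
    parts = []
    for key, value in fields.items():
        parts.append("--%s\r\n"
                     'Content-Disposition: form-data; name="%s"\r\n'
                     "\r\n%s\r\n" % (boundary, key, value))
    parts.append("--%s--\r\n" % boundary)
    return "".join(parts)

body = multipart_body({"command": "render"})
print(body.startswith("--myboundary-bps"))  # True
```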

Nemo bis (talkcontribs)

I don't see a bug filed for $wgCollectionMWServeURL being overridden, please file one or even better submit a patch. Thanks!

Reply to "Collection Extension: RuntimeError: command failed with returncode 256"
Summary by BluAlien

Solved: again the old issue with "localhost" and "192.168". I supposed that the issue had been removed, but I was wrong.

There is a check on the incoming address in the file nserve.py, at around line 286, which returns False if the calling server (the MediaWiki Collection address) starts with 192.168 or is localhost.

Comment out the statement or make it return True. Done.

BluAlien (talkcontribs)

I'm trying to setup a render server on a Ubuntu 17.10 server, MediaWiki is 1.30 and Collection extension is the one in snapshot ba6fa49 for MediaWiki REL1_30. Render server is a separate server.

My LocalSettings is configured as:

//Extension Collection

require_once "$IP/extensions/Collection/Collection.php";

$wgCollectionPODPartners = false;

$wgCollectionFormats = array(

   'rl' => 'PDF', # enabled by default

   'odf' => 'ODT',

   'epub' => 'e-book (EPUB)',

);

$wgGroupPermissions['user']['collectionsaveascommunitypage'] = true;

$wgGroupPermissions['user']['collectionsaveasuserpage']      = true;

$wgCollectionMWServeCredentials = "Myuser:MyPassword";

$wgCollectionMWServeURL = "http://myRenderServer:8899";

I checked my render server against Wikipedia Article with

mw-zip -c :en -o test.zip Acdc Number

mw-render -c test.zip -o test.pdf -w rl

and also with MyWiki Article with

mw-zip --username="Myuser" --password="MyPassword" -c http://myIPAddress/w -o test.zip TestPage

mw-render -c test.zip -o test.pdf -w rl

In both cases the article was rendered correctly, but when I try to render the same article from inside my wiki, I get the error reported below. I checked and re-checked: the base_url path is correct. I saw an old bug about this, but it was solved as of REL1_27. Any idea?

new-collection 1        'http://myIPAddress/w'        'rl'

2018-01-31T02:01:08 mwlib.serve.bad >> bad base_url: 'http://myIPAddress/w'

myIPAddress - - [2018-01-31 02:01:08] "POST /cache HTTP/1.0" 200 283 0.005379

Thanks in advance.

BluAlien (talkcontribs)

Solved: again the old issue with "localhost" and "192.168". I supposed that the issue had been removed, but I was wrong.

There is a check on the incoming address in the file nserve.py, at around line 286, which returns False if the calling server (the MediaWiki Collection address) starts with 192.168 or is localhost.

Comment out the statement or make it return True. Done.
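The check described above can be approximated as follows. This is reconstructed from the description in this thread, not copied from mwlib's nserve.py source, so treat the details as an assumption:

```python
try:
    from urllib.parse import urlsplit  # Python 3
except ImportError:
    from urlparse import urlsplit      # Python 2, which mwlib targets

def is_good_baseurl(url):
    """Approximation of nserve.py's base-url sanity check.

    It rejects callers whose host looks private - "localhost" or a
    192.168.x.x address - which is exactly what trips up an internal
    render server and produces "bad base_url" in the log above.  The
    workaround in the summary is to neuter this check on your own
    installation.
    """
    host = urlsplit(url).hostname or ""
    if host == "localhost" or host.startswith("192.168."):
        return False
    return True

print(is_good_baseurl("http://192.168.1.10/w"))    # False -> "bad base_url"
print(is_good_baseurl("http://wiki.example.org/w"))  # True
```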

How to debug when using default value http://tools.pediapress.com/mw-serve/ for $wgCollectionMWServeURL

1
Peculiar Investor (talkcontribs)

I am the admin on a low volume wiki that uses this extension. Our configuration uses the default value http://tools.pediapress.com/mw-serve/ for $wgCollectionMWServeURL.

Some of my users are periodically reporting that when they click on ''Download as PDF'', they sometimes get an error message: ''book rendering failed - there was an error while attempting to render your book''

Are there any log files or debug options that I can use and/or configure on my side to figure out the cause of the failed PDF rendering?

Reply to "How to debug when using default value http://tools.pediapress.com/mw-serve/ for $wgCollectionMWServeURL"

Cannot get my rendering server to work with anything apart from Wikipedia.

6
Solanki (talkcontribs)

Hi, I've been trying to set up the Collection extension with my own rendering server, so I can generate PDF files from my wiki. So far, no luck.


Here's where I am:

I followed this guide:

http://edutechwiki.unige.ch/en/Mediawiki_collection_extension_installation

And I can create PDF files from Wikipedia using:

mw-zip -c :en -o test.zip Acdc Number

mw-render -c test.zip -o test.pdf -w pdf

mw-zip works just as one would expect.


mw-zip -c :en -o test.zip Acdc Number

   creating nuwiki in u'tmpuIdHyY/nuwiki'
   2013-09-07T10:10:58 mwlib.utils.info >> fetching 'http://en.wikipedia.org/w/index.php?title=Help:Books/License&action=raw&templates=expand'
   removing tmpdir u'tmpuIdHyY'
   memory used: res=25.0 virt=816.4


I can read those pdf files, so I know my basic render farm setup is working.

The problem is that I cannot get it to work with anything other than Wikipedia.

If I try the URL in the guide:


mw-zip -c http://edutechwiki.unige.ch/mediawiki/ -o test2.zip Mediawiki_collection_extension_installation

creating nuwiki in u'tmpRnDvRH/nuwiki'
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line 328, in run
    result = self._run(*self.args, **self.kwargs)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 747, in refcall_fun
    fun(*args, **kw)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 632, in handle_new_basepath
    api = self._get_mwapi_for_path(path)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 684, in _get_mwapi_for_path
    raise RuntimeError("cannot guess api url for %r" % (path,))
RuntimeError: cannot guess api url for 'http://edutechwiki.unige.ch/en'
<Greenlet at 0x24d2cd0: refcall_fun> failed with RuntimeError
WARNING: (u'Mediawiki_collection_extension_installation', None) could not be fetched
removing tmpdir u'tmpRnDvRH'
memory used: res=19.3 virt=226.7
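The "cannot guess api url" failure means mwlib could not derive the location of api.php from the base URL passed with -c. As a rough illustration of what has to happen (this is a hypothetical helper of my own, not mwlib's actual `_get_mwapi_for_path` code), the fetcher has to turn the URL you give it into an api.php URL, and it fails when the path doesn't match a layout it recognizes:

```python
def guess_api_url(base_url):
    """Hypothetical sketch of the kind of guessing mwlib does:
    derive an api.php URL from a wiki base URL. Raises when no
    recognizable layout is found, which is roughly what produces
    the 'cannot guess api url' RuntimeError above."""
    base = base_url.rstrip('/')
    # strip a trailing index.php script name if present
    if base.endswith('/index.php'):
        base = base[:-len('/index.php')]
    # MediaWiki normally exposes api.php next to index.php
    if '/' in base.partition('//')[2]:
        return base + '/api.php'
    raise RuntimeError("cannot guess api url for %r" % base_url)
```

The practical takeaway is to make sure the URL given to -c resolves to a directory where api.php actually lives, rather than to an article path.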



and if I try my own:



mw-zip -c http://IP:PortNo/wiki/index.php/ --username=uuu --password=ppp -o test2.zip Test

creating nuwiki in u'tmpG82RPH/nuwiki'
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line 328, in run
    result = self._run(*self.args, **self.kwargs)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 114, in run
    api = self.get_api()
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 28, in get_api
    api.login(self.username, self.password, self.domain)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 186, in login
    res = self._post(**args)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 106, in _post
    res = loads(self._fetch(req))
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 23, in loads
    return json.loads(s)
  File "/usr/lib/python2.7/dist-packages/simplejson/__init__.py", line 413, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line 402, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line 420, in raw_decode
    raise JSONDecodeError("No JSON object could be decoded", s, idx)
JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)
<Greenlet at 0x1a7b870: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0x1acf790>>> failed with JSONDecodeError
removing tmpdir u'tmpG82RPH'
memory used: res=16.8 virt=152.5
Traceback (most recent call last):
  File "/usr/local/bin/mw-zip", line 9, in <module>
    load_entry_point('mwlib==0.15.11', 'console_scripts', 'mw-zip')()
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main
    make_zip(output, options, env.metabook, podclient=podclient, status=status)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 50, in make_zip
    make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 189, in make_nuwiki
    pool.join(raise_error=True)
  File "/usr/local/lib/python2.7/dist-packages/gevent/pool.py", line 98, in join
    raise greenlet.exception
simplejson.decoder.JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)
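A "No JSON object could be decoded" at char 0 means the login request reached a URL that returned something other than JSON, typically an HTML page, which happens when -c points at an index.php article path instead of the directory containing api.php. A small sketch of that diagnosis (the function name and classification are mine, purely illustrative):

```python
import json

def classify_api_response(body):
    """Classify what a supposed api.php endpoint returned.
    mw-zip dies with JSONDecodeError exactly when the body is
    not JSON -- most often an HTML page served because the -c
    URL points at index.php rather than at api.php."""
    try:
        json.loads(body)
        return 'json'      # a real API endpoint answered
    except ValueError:
        stripped = body.lstrip().lower()
        if stripped.startswith(('<!doctype', '<html')):
            return 'html'  # wrong endpoint or an error page
        return 'unknown'
```

Fetching the api.php URL in a browser and checking that it returns JSON (for example with `?action=query&meta=siteinfo&format=json`) is a quick way to rule this out.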


I am using MediaWiki 1.23, I am not behind any proxy, and I have disabled SELinux.

The relevant variables in my LocalSettings.php file are as follows:

$wgServer = "http://IP:portno";

$wgScriptPath = "/wiki";

require_once "$IP/extensions/Collection/Collection.php";

$wgCollectionMWServeURL = 'http://IP:8899'; // default port of mw-serve

$wgCollectionMWServeCredentials = "username:password";

$wgEnableAPI = true;
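For reference, with $wgServer and $wgScriptPath set as above, the API endpoint mw-zip must be able to reach is $wgServer plus $wgScriptPath plus "/api.php". A tiny sketch of that composition (the helper name is mine, just to make the expected URL explicit):

```python
def api_endpoint(wg_server, wg_script_path):
    # MediaWiki serves its API at <$wgServer><$wgScriptPath>/api.php;
    # this is the URL mw-zip needs to fetch JSON from
    return (wg_server.rstrip('/') + '/'
            + wg_script_path.strip('/') + '/api.php')
```

So with the settings above, mw-zip should be pointed at http://IP:portno/wiki/ (where api.php lives), not at an index.php article path.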


I can't even begin to work on the actual extension interface until I have this working. Any suggestions? Where do I go next?

Any help would be appreciated!


Thanks!

Solanki (talkcontribs)

Guys! I would really appreciate any kind of help, or even just a pointer in the right direction, because I am banging my head against the wall here.

Thanks!

Jongfeli (talkcontribs)
Solanki (talkcontribs)

Hi Felipe. No, my server is running on RHEL 6.5. The strange thing I am encountering is the different behavior with different sites: as I mentioned above, it works perfectly fine for Wikipedia, but it gives one error for http://edutechwiki.unige.ch/mediawiki/ and a different one for my own site, i.e. mw-zip -c http://IP:PortNo/wiki/api.php/ --username=uuu --password=ppp -o test2.zip Test.


Now, this is what's boggling my mind. I have gone through almost every document available.

(Errors related to different sites are given above.)

MarkAHershberger (talkcontribs)

Yay! I'm getting this exact error ("cannot guess api url...")

And my server was working before....

Anu8791 (talkcontribs)

Hi,

We have a similar setup on our RHEL 6 server and we render PDF documents successfully. However, the generated PDF does not contain all of the embedded JPEG images from the wiki article page; it only includes the .PNG images.

Could someone please advise on a solution for this issue? It would be really appreciated!

Thanks in advance,

Sanjay

Reply to "Can not get my rendering server to work apart from wikipedia."