Extension talk:Collection



Archives 

/Archive 1

Collection Extension: RuntimeError: command failed with returncode 256

Anewuserformediawiki (talkcontribs)

Hello,

I'm using MediaWiki 1.16.3 with a Windows XAMPP-Stack 1.7.4 (Apache 2.2.17, MySQL 5.5.8, PHP 5.3.5 (VC6 X86 32bit) + PEAR).

I tried to install and configure the Collection extension with the following LocalSettings.php configuration:

require_once("$IP/extensions/Collection/Collection.php");

$wgCollectionMWServeURL = "http://tools.pediapress.com/mw-serve/";

$wgCollectionFormats = array('rl' => 'PDF', 'odf' => 'ODT', );

When I try to download the PDF/ODT offline version of a new book, I get an error message:

An error occurred on the render server: RuntimeError: RuntimeError: command failed with returncode 256: ['mw-zip', '-o', u'cache/ee/ee57ce6375c56696/collection.zip', '-m', u'cache/ee/ee57ce6375c56696/metabook.json', '--status', u'qserve://localhost:14311/ee57ce6375c56696:makezip', '--config', u'http://s10vitl01/wiki', '--template-blacklist', u'MediaWiki:PDF Template Blacklist', '--template-exclusion-category', u'Vom Druck ausschlie\xdfen', '--print-template-prefix', u'Drucken', '--print-template-pattern', u'$1/Druck'] Last Output: 2011-11-04T07:40:17 mwlib.options.warn >> Both --print-template-pattern and --print-template-prefix (deprecated) specified. Using --print-template-pattern only. 1% creating nuwiki in u'cache/ee/ee57ce6375c56696/tmpl0yZuX/nuwiki' ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found> ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found> ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found> ERR: <urlopen error [Errno 4] ARES_ENOTFOUND: Domain name not found> Traceback (most recent call last): File "/home/pp/local/lib/python2.7/site-packages/gevent/greenlet.py", line 402, in run result = self._run(*self.args, **self.kwargs) File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 120, in run self.fetch_pages_from_metabook(api) File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 60, in fetch_pages_from_metabook fetch_images=not self.options.noimages) File "/home/pp/local/lib/python2.7/site-packages/mwlib/net/fetch.py", line 268, in __init__ siteinfo = self.get_siteinfo_for(self.api) File "/home/pp/local/lib/python2.7/site-packages/mwlib/net/fetch.py", line 417, in get_siteinfo_for return m.get_siteinfo() File "/home/pp/local/lib/python2.7/site-packages/mwlib/net/sapi.py", line 166, in get_siteinfo raise RuntimeError("could not get siteinfo") RuntimeError: could not get siteinfo <Greenlet at 0x24aff30: <bound method start_fetcher.run of
<mwlib.apps.make_nuwiki.start_fetcher object at 0x2540c50>>> failed with RuntimeError removing tmpdir u'cache/ee/ee57ce6375c56696/tmpl0yZuX' memory used: res=16.6 virt=128.9 1% error Traceback (most recent call last): File "/home/pp/local/bin/mw-zip", line 9, in <module> load_entry_point('mwlib==0.12.17', 'console_scripts', 'mw-zip')() File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 155, in main make_zip(output, options, env.metabook, podclient=podclient, status=status) File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status) File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 193, in make_nuwiki pool.join(raise_error=True) File "/home/pp/local/lib/python2.7/site-packages/gevent/pool.py", line 105, in join raise greenlet.exception RuntimeError: could not get siteinfo in function system, file ./bin/nslave.py, line 37 in function qaddw, file /home/pp/local/lib/python2.7/site-packages/qs/slave.py, line 66

Any help would be appreciated.
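The repeated ARES_ENOTFOUND lines above mean the render server cannot resolve the wiki's hostname (here the intranet name s10vitl01). A quick check one can run on the render host, assuming Python is available:

```python
import socket

def can_resolve(host):
    """Return True if this machine can resolve `host` to an IPv4 address."""
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

# Run this on the render server; 's10vitl01' is the wiki hostname
# from the --config URL in the error above.
print(can_resolve("s10vitl01"))
```

If this prints False, the render server needs a DNS entry (or an /etc/hosts entry) for the wiki host before mw-zip can fetch anything.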

212.66.144.68 (talkcontribs)

Hello, I have the same error, whether I use my own render server or the $wgCollectionMWServeURL = "http://tools.pediapress.com/mw-serve/"; server. Can anyone help? I'm using MediaWiki 1.17.0.

46.180.199.47 (talkcontribs)

I have the same problem. I use the trunk version of this extension and MW 1.19.2. My settings:

require_once( "$IP/extensions/Collection/Collection.php" );
$wgCollectionFormats = array(
	'rl' => 'PDF',
	'odf' => 'ODT',
);

$wgCollectionArticleNamespaces = array(
	NS_MAIN,
	NS_PROJECT,
);

$wgCollectionMWServeURL = "http://tools.pediapress.com/mw-serve/";

Error message:

RuntimeError: RuntimeError: command failed with returncode 256: ['mw-zip', '-o', u'/home/pp/cache/ce/ce21a325a1031a95/collection.zip', '-m', u'/home/pp/cache/ce/ce21a325a1031a95/metabook.json', '--status', u'qserve://localhost:14311/ce21a325a1031a95:makezip', '--template-blacklist', u'MediaWiki:PDF Template Blacklist', '--template-exclusion-category', u'\u0418\u0441\u043a\u043b\u044e\u0447\u0435\u043d\u0438\u044f \u0438\u0437 \u043f\u0435\u0447\u0430\u0442\u0438', '--print-template-prefix', u'\u041f\u0435\u0447\u0430\u0442\u044c', '--print-template-pattern', u'$1/\u041f\u0435\u0447\u0430\u0442\u044c'] Last Output: 2012-04-20T22:43:24 mwlib.options.warn >> Both --print-template-pattern and --print-template-prefix (deprecated) specified. Using --print-template-pattern only. 1% creating nuwiki in u'/home/pp/cache/ce/ce21a325a1031a95/tmpkvsX2V/nuwiki' removing tmpdir u'/home/pp/cache/ce/ce21a325a1031a95/tmpkvsX2V' memory used: res=15.7 virt=122.5 1% error Traceback (most recent call last): File "/home/pp/local/bin/mw-zip", line 9, in <module> load_entry_point('mwlib==0.13.3', 'console_scripts', 'mw-zip')() File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 151, in main make_zip(output, options, env.metabook, podclient=podclient, status=status) File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status) File "/home/pp/local/lib/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 152, in make_nuwiki assert x.wikiident in id2wiki, "no wikiconf for %r (%s)" % (x.wikiident, x) AssertionError: no wikiconf for None (<article {'_env': <mwlib.wiki.Environment object at 0x1666250>, 'content_type': u'text/x-wiki', 'title': u'\u0413\u043b\u0430\u0432\u043d\u0430\u044f \u0441\u0442\u0440\u0430\u043d\u0438\u0446\u0430', 'timestamp': u'1326466208', 'type': 'article', 'revision': u'2311'}>) in function system, file 
./bin/nslave.py, line 63 in function qaddw, file /home/pp/local/lib/python2.7/site-packages/qs/slave.py, line 66

85.158.224.252 (talkcontribs)

You need to specify credentials for the pdf generation to work.

First, check whether these commands work on your system:

mw-zip -c http://hostname/wiki --username=yourusername --password=yourpassword -o startpage.zip Main_Page
mw-render -c startpage.zip -o startpage.pdf -w rl

If this works, you are good to go; without the --username and --password it would not work.

You can then proceed by adding the following below your require_once:
$wgCollectionMWServeCredentials = "yourusername:yourpassword";

This worked for me :)

This post was posted by 85.158.224.252, but signed as User:Friesoft.

Varnent (talkcontribs)

Still not working for me. Anyone else having better luck?

Pastakhov (talkcontribs)

Apache access_log for mw-zip with --username and --password:

92.46.175.36 - - [31/May/2012:13:10:26 +0600] "POST /w/api.php HTTP/1.1" 301 317 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:10:26 +0600] "GET /w/api.php HTTP/1.1" 500 128 "-" "Python-urllib/2.7"

without the --username and --password

92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords|rightsinfo&format=json HTTP/1.1" 301 446 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords|rightsinfo&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords&format=json HTTP/1.1" 301 435 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases|magicwords&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases&format=json HTTP/1.1" 301 424 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap|namespacealiases&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap&format=json HTTP/1.1" 301 407 "-" "Python-urllib/2.7"
92.46.175.36 - - [31/May/2012:13:15:14 +0600] "GET /w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|interwikimap&format=json HTTP/1.1" 500 128 "-" "Python-urllib/2.7"


mw-zip says

...
ERR: HTTP Error 500: MediaWiki configuration Error
...
RuntimeError: could not get siteinfo
Pastakhov (talkcontribs)

In LocalSettings.php, add $wgEnableAPI = true;
This worked for me :)
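To see why mw-zip gives up, one can issue the same siteinfo request it makes and inspect the response by hand. A small sketch (mirroring the URLs in the access log above; calling check_api requires network access to your wiki, and the hostname is a placeholder):

```python
import json
try:
    from urllib.request import urlopen   # Python 3
except ImportError:
    from urllib2 import urlopen          # Python 2, as mwlib uses

def siteinfo_url(api_base):
    """Build the siteinfo query mwlib issues first; if this URL returns
    HTTP 500, Collection fails with 'could not get siteinfo'."""
    return api_base.rstrip("/") + "?action=query&meta=siteinfo&siprop=general&format=json"

def check_api(api_base):
    """Fetch siteinfo and confirm the response parses as JSON."""
    return json.loads(urlopen(siteinfo_url(api_base)).read())

# Example (hypothetical host): check_api("http://s10vitl01/w/api.php")
```

If check_api raises an HTTP 500 or a JSON error, the wiki's API is misconfigured or disabled, which is exactly what the $wgEnableAPI fix addresses.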

Unikum~mediawikiwiki (talkcontribs)
Inkydjango (talkcontribs)

I'm a newbie with this extension and spent quite some time finding the solution, so here it is.

You need more than just the extension: you must also install a local render server that turns your wiki pages into PDF files.

First, install setuptools so that you can then install pip:

# wget http://pypi.python.org/packages/source/s/setuptools/setuptools-0.6c11.tar.gz
# tar zxvf setuptools-0.6c11.tar.gz
# cd setuptools-0.6c11
# python setup.py build
# python setup.py install 
$ wget http://pypi.python.org/packages/source/p/pip/pip-1.2.tar.gz
$ tar xvf pip-1.2.tar.gz
$ cd pip-1.2
# python setup.py install


  • When that is done, install the dependencies:
# yum install g++ perl python python-dev python-setuptools python-imaging python-lxml libevent-devel
# yum install python-devel 
# yum install libxml2-python.x86_64 libxslt-python.x86_64 libxslt-devel.x86_64
# yum install python-imaging python-lxml pdftk
# wget http://python-distribute.org/distribute_setup.py
# sudo python distribute_setup.py
# sudo easy_install -U virtualenv
$ wget http://effbot.org/downloads/Imaging-1.1.7.tar.gz
$ gtar -zxvf Imaging-1.1.7.tar.gz
$ cd Imaging-1.1.7
$ ~/plone-python/bin/python setup.py install 
# pip install -i http://pypi.pediapress.com/simple/ pil
# pip install -i http://pypi.pediapress.com/simple/ mwlib
# pip install -i http://pypi.pediapress.com/simple/ mwlib.rl
# pip install -i http://pypi.pediapress.com/simple/ pyfribidi
  • Verify the install:
# mw-zip -c :en -o test.zip Acdc Number
# mw-render -c test.zip -o test.pdf -w rl
  • Example boot script for mw-serve:
#!/bin/sh
#
#chkconfig: 345 20 80
#
#description: mw-serve

PATH=/bin:/usr/bin:/sbin:/usr/sbin:/usr/local/bin:/usr/local/sbin

case "$1" in
    start)
        su - www-data -c 'nserve.py >> /tmp/mwcache/log.txt 2>&1  &'
        su - www-data -c 'mw-qserve >> /tmp/mwcache/log.txt 2>&1 &'
        su - www-data -c 'nslave.py --cachedir /tmp/mwcache/ >> /tmp/mwcache/log.txt 2>&1 &'
        su - www-data -c 'postman.py >> /tmp/mwcache/log.txt 2>&1 &'
    ;;
  stop)
        mv /data/mwcache/log.txt /data/mwcache/log.old
        killall nserve.py
        killall mw-qserve
        killall nslave.py
        killall postman.py
    ;;
  force-reload|restart)
    $0 stop
    $0 start
    ;;
  *)
    echo "Usage: /etc/init.d/mw-serve {start|stop}"
    exit 1
    ;;
esac

exit 0
# mkdir /tmp/mw-serve
# adduser www-data
# chown www-data /etc/init.d/mw-serve
# chown -R www-data /tmp/mw-serve
# chkconfig --add mw-serve
# /etc/init.d/mw-serve start
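After starting the services, it is worth confirming that nserve is actually listening before pointing the wiki at it. A minimal check, assuming the default port 8899:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the wiki host, against the render server (hostname is a placeholder):
# print(port_open("renderserver", 8899))
```

If this returns False, the init script did not start nserve, or a firewall is blocking the port; check log.txt in the cache directory.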

Config file

The following is an extract of my LocalSettings.php:

require_once("$IP/extensions/PdfExport/PdfExport.php");
$wgPdfExportMwLibPath = '/usr/bin/mw-render';
$wgPdfExportBackground = "/tmp/background-image/image.jpg";
$wgPdfExportAttach = true;

########################
# creation pdf 
require_once("$IP/extensions/Collection/Collection.php");
$wgCollectionFormats=array( 'rl' => 'PDF', );

#$wgCollectionPODPartners = false;
$wgGroupPermissions['*']['collectionsaveascommunitypage'] = true;
$wgGroupPermissions['*']['collectionsaveasuserpage']      = true;
$wgEnableAPI = true;
$wgCollectionMWServeCredentials = "collection_user:password";
$wgCollectionMaxArticles = 150;

# Collection tool for the PediaPress extension
$wgCollectionMWServeURL="http://localhost:8899/";
$wgCommunityCollectionNamespace=NS_MEDIAWIKI;

  • This link may help you complete the process above:

http://edutechwiki.unige.ch/en/Mediawiki_collection_extension_installation

Inkydjango (talkcontribs)

So I have done this process, but it is still not working. The command line is OK, and the Create PDF page from the special page is OK, but creating a book from the main menu does not work.

In log.txt, the PDF file appears to be created, but the log shows that mw-zip cannot create the file: a zero-byte file remains in the cache directory.

The firewall port 8899 is open, and the cache directory has 777 permissions for the moment. I can't find the solution. Does anybody have a complete, working install process?

Thanks a lot

Inkydjango (talkcontribs)

my error is

File "/usr/lib64/python2.6/site-packages/mwlib/apps/make_nuwiki.py", line 152, in make_nuwiki
        assert x.wikiident in id2wiki, "no wikiconf for %r (%s)" % (x.wikiident,  x)
    AssertionError: no wikiconf for None (<article {'_env': <mwlib.wiki.Environment object at 0x8eec90>, 'content_type': u'text/x-wiki', 'title': u'Recontruction des handlers', 'timestamp': u'1286529761', 'type': 'article', 'revision': u'661'}>)
     in function system, file /usr/bin/nslave.py, line 64
62.231.10.170 (talkcontribs)

The only way to deal with

no wikiconf for None

is to define $wgScriptPath in config, making it different from "", i.e.

$wgScriptPath       = "/w";

or

$wgScriptPath       = "http://server_name"; #WITHOUT SLASH ON END

but not

$wgScriptPath       = "";

it's a bug in collection extension (see https://github.com/pediapress/Collection/issues/1)


Kghbln (talkcontribs)

This one is a life-saver. I just added this info to the extensions page. Thank you for sharing. :) Cheers

188.22.199.103 (talkcontribs)
97.64.134.182 (talkcontribs)

When following your instructions I discovered some issues on my CentOS server. Below is a quote from your instructions:

start)

       su - www-data -c 'nserve.py >> /tmp/mwcache/log.txt 2>&1  &'
       su - www-data -c 'mw-qserve >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'nslave.py --cachedir /tmp/mwcache/ >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'postman.py >> /tmp/mwcache/log.txt 2>&1 &'
   ;;
 stop)
       mv /data/mwcache/log.txt /data/mwcache/log.old
       killall nserve.py
       killall mw-qserve
       killall nslave.py
       killall postman.py
   ;;

I made the following changes (dropping the .py suffixes and fixing the log paths):

start)

       su - www-data -c 'nserve >> /tmp/mwcache/log.txt 2>&1  &'
       su - www-data -c 'mw-qserve >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'nslave --cachedir /tmp/mwcache/ >> /tmp/mwcache/log.txt 2>&1 &'
       su - www-data -c 'postman >> /tmp/mwcache/log.txt 2>&1 &'
   ;;
 stop)
       mv /tmp/mwcache/log.txt /tmp/mwcache/log.old
       killall nserve
       killall mw-qserve
       killall nslave
       killall postman
   ;;


Thank you for all of your help

This post was posted by 97.64.134.182, but signed as Jecker.

WessexOne (talkcontribs)

Hi,

I know this thread is quite old but encountered this problem and none of the suggested fixes worked for me. Just in case anyone else has the same issue still, I thought I'd post my solution.

I'm running a private render server on the same server as our wiki: CentOS 6.5 x64, MediaWiki 1.22.6, PHP 5.3.3, MySQL 5.1.73, Collection 1.6.1.

The first step is to ensure SELinux is disabled. If it was not, curl_exec($ch) did not work for me (and I couldn't figure out how to make it work with SELinux enabled).

The second step is to edit

$IP/extensions/Collection/Collection.body.php

In the function

static function mwServeCommand( $command, $args ) {

Replace

$serveURL = $wgCollectionMWServeURL;

with

$serveURL = 'myservername:8899/';

For me, this got rid of the "RuntimeError: command failed with returncode 256" error as it seems that, regardless of what is set in LocalSettings.php, $wgCollectionMWServeURL was being set to pediapress. However, I was then left with the "The POST request to server:8899/ failed ($2)" error. I fixed this by replacing

$response = Http::post($serveURL, array('postData' => $args));

with

$boundary = '--myboundary-bps';
$retval= '';
foreach($args as $key => $value){
    $retval .= "--$boundary\nContent-Disposition: form-data; name=\"$key\"\n\n$value\n";
}
$retval .= "--$boundary--";
$body = $retval;
$ch = curl_init();
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: multipart/form-data; boundary=$boundary"));
curl_setopt($ch, CURLOPT_URL, $serveURL);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true );
$response = curl_exec($ch);


I got the code from this site and tweaked it http://scraperblog.blogspot.co.uk/2013/07/php-curl-multipart-form-posting.html
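For reference, the same manual multipart encoding can be sketched in Python; the boundary and field names are arbitrary, and strict HTTP parsers expect CRLF line endings (which the PHP snippet above omits):

```python
def multipart_body(fields, boundary="myboundary-bps"):
    """Build a multipart/form-data body by hand, mirroring the PHP
    workaround above. The request must also send a header of
    Content-Type: multipart/form-data; boundary=<boundary>."""
    parts = []
    for key, value in sorted(fields.items()):
        parts.append('--%s\r\nContent-Disposition: form-data; name="%s"\r\n\r\n%s\r\n'
                     % (boundary, key, value))
    parts.append("--%s--\r\n" % boundary)
    return "".join(parts)
```

Hand-rolling the body like this is only worthwhile when the framework's own POST helper fails, as it did for WessexOne here.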

Nemo bis (talkcontribs)

I don't see a bug filed for $wgCollectionMWServeURL being overridden, please file one or even better submit a patch. Thanks!


[Resolved] RuntimeError: could not get siteinfo

Wmat (talkcontribs)

I have 2 environments that I think (and hope) are identical. They are both set up with:

MW 1.25alpha, PHP 5.3.17, Collection 1.7.0 (1618025)

In the Dev environment, Collection works perfectly. In the Prod environment, I'm seeing this error:

<Greenlet at 0xd43eb0: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0xdbba50>>> failed with  RuntimeError
   creating nuwiki in u'/u1/wiki_pdf/cache/2e/2e58aa5a7230f6a6/tmpJuZHDI/nuwiki'
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   removing tmpdir u'/u1/wiki_pdf/cache/2e/2e58aa5a7230f6a6/tmpJuZHDI'
   memory used: res=18.2 virt=152.7
   1% error Traceback (most recent call last):
     File "/usr/local/bin/mw-zip", line 9, in <module>
       load_entry_point('mwlib==0.15.14', 'console_scripts', 'mw-zip')()
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/buildzip.py", line 155, in main
       make_zip(output, options, env.metabook, podclient=podclient, status=status)
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip
       make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/make_nuwiki.py", line 192, in make_nuwiki
       pool.join(raise_error=True)
     File "/usr/local/lib64/python2.6/site-packages/gevent/pool.py", line 98, in join
       raise greenlet.exception
   RuntimeError: could not get siteinfo
    in function system, file /usr/local/lib64/python2.6/site-packages/mwlib/nslave.py, line 64

So it's failing on the start_fetcher.run method, but I can't seem to figure out why. Is this likely some system configuration thing?

Wmat (talkcontribs)

Solved.

This was a DNS issue. We had to add an entry to /etc/hosts so that the server could talk to itself, basically.
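For anyone hitting the same thing: the fix is to make sure the server can resolve its own public name, for example with an /etc/hosts entry on the render host (the name and address below are placeholders):

```
127.0.0.1    localhost
10.0.0.5     wiki.example.com wiki
```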

66.77.160.179 (talkcontribs)

Hi, I am getting the same error. Do you happen to know what you put in your /etc/hosts file to resolve this?

Ahsan96 (talkcontribs)

I am getting the same error after enabling SSL. Could you share how you resolved this issue?

Summary by BluAlien

Solved: again the old issue with "localhost" and "192.168". I assumed the issue had been fixed, but I was wrong.

There is a check on the incoming address in nserve.py, at around line 286, which returns False if the calling server (the MediaWiki Collection address) starts with 192.168 or is localhost.

Comment out the statement or make it return True. Done.

BluAlien (talkcontribs)

I'm trying to set up a render server on an Ubuntu 17.10 server; MediaWiki is 1.30 and the Collection extension is the one in snapshot ba6fa49 for MediaWiki REL1_30. The render server is a separate machine.

My LocalSettings is configured as:

//Extension Collection
require_once "$IP/extensions/Collection/Collection.php";
$wgCollectionPODPartners = false;
$wgCollectionFormats = array(
   'rl' => 'PDF', # enabled by default
   'odf' => 'ODT',
   'epub' => 'e-book (EPUB)',
);
$wgGroupPermissions['user']['collectionsaveascommunitypage'] = true;
$wgGroupPermissions['user']['collectionsaveasuserpage']      = true;
$wgCollectionMWServeCredentials = "Myuser:MyPassword";
$wgCollectionMWServeURL = "http://myRenderServer:8899";

I checked my render server against a Wikipedia article with

mw-zip -c :en -o test.zip Acdc Number
mw-render -c test.zip -o test.pdf -w rl

and also with a MyWiki article with

mw-zip --username="Myuser" --password="MyPassword" -c http://myIPAddress/w -o test.zip TestPage
mw-render -c test.zip -o test.pdf -w rl

In both cases the article was rendered correctly, but when I try to render the same article from inside MyWiki I get the error reported below. I checked and re-checked: the base_url path is correct. I saw an old bug about this, but it was supposedly solved since REL1_27. Any idea?

new-collection 1        'http://myIPAddress/w'        'rl'

2018-01-31T02:01:08 mwlib.serve.bad >> bad base_url: 'http://myIPAddress/w'

myIPAddress - - [2018-01-31 02:01:08] "POST /cache HTTP/1.0" 200 283 0.005379

Thanks in advance.

BluAlien (talkcontribs)

Solved: again the old issue with "localhost" and "192.168". I assumed the issue had been fixed, but I was wrong.

There is a check on the incoming address in nserve.py, at around line 286, which returns False if the calling server (the MediaWiki Collection address) starts with 192.168 or is localhost.

Comment out the statement or make it return True. Done.
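The check described above can be pictured roughly like this (an illustration of the described behavior, not the actual nserve.py source):

```python
def is_rejected_base_url(url):
    """Reject base URLs whose host is loopback or a private 192.168
    address, as the nserve.py check described above does."""
    host = url.split("//", 1)[-1].split("/", 1)[0].split(":")[0]
    return host == "localhost" or host.startswith("192.168.")
```

With a check like this, a wiki served from a 192.168 address is refused with "bad base_url", matching the log line above; serving the wiki under a resolvable public hostname avoids patching the render server at all.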

How to debug when using default value http://tools.pediapress.com/mw-serve/ for $wgCollectionMWServeURL

Peculiar Investor (talkcontribs)

I am the admin on a low volume wiki that uses this extension. Our configuration uses the default value http://tools.pediapress.com/mw-serve/ for $wgCollectionMWServeURL.

Some of my users are periodically reporting that when they click on ''Download as PDF'', they sometimes get an error message: ''book rendering failed - there was an error while attempting to render your book''

Is there any log files/debug options that I can use and/or configure on my side to figure out the cause of the failed PDF rendering?


Can not get my rendering server to work apart from wikipedia.

Solanki (talkcontribs)

Hi, I've been trying to set up the Collection extension on my own rendering server, so I can generate pdf files from my wiki. So far no luck.


Here's where I am:

I followed this guide:

http://edutechwiki.unige.ch/en/Mediawiki_collection_extension_installation

And I can create PDF files from Wikipedia using:

mw-zip -c :en -o test.zip Acdc Number

mw-render -c test.zip -o test.pdf -w pdf

mw-zip works just as one would expect.


mw-zip -c :en -o test.zip Acdc Number

creating nuwiki in u'tmpuIdHyY/nuwiki' 2013-09-07T10:10:58 mwlib.utils.info >> fetching 'http://en.wikipedia.org/w/index.php?title=Help:Books/License&action=raw&templates=expand' removing tmpdir u'tmpuIdHyY' memory used: res=25.0 virt=816.4


I can read those PDF files, so I know my basic render-farm setup is working.

The problem is that I cannot get it to work with anything other than Wikipedia.

If I try the URL in the guide:


mw-zip -c http://edutechwiki.unige.ch/mediawiki/ -o test2.zip Mediawiki_collection_extension_installation

creating nuwiki in u'tmpRnDvRH/nuwiki' Traceback (most recent call last): File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line 328, in run result = self._run(*self.args, **self.kwargs)

File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 747, in refcall_fun fun(*args, **kw)

File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 632, in handle_new_basepath api = self._get_mwapi_for_path(path)

File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 684, in _get_mwapi_for_path raise RuntimeError("cannot guess api url for %r" % (path,))

RuntimeError: cannot guess api url for 'http://edutechwiki.unige.ch/en' <Greenlet at 0x24d2cd0: refcall_fun> failed with RuntimeError

WARNING: (u'Mediawiki_collection_extension_installation', None) could not be fetched removing tmpdir u'tmpRnDvRH' memory used: res=19.3 virt=226.7
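The "cannot guess api url" error means mwlib could not locate api.php starting from the base URL it was given; passing a URL from which api.php can be derived (or the api.php URL itself) avoids the guessing. A simplified sketch of the kind of candidates involved (hypothetical, not the actual fetch.py logic):

```python
def candidate_api_urls(base_url):
    """Candidate api.php locations for a page base URL (simplified)."""
    base = base_url.rstrip("/")
    root = "/".join(base.split("/")[:3])   # scheme://host
    return [base + "/api.php", root + "/w/api.php", root + "/api.php"]
```

If none of the candidates answers with the API, mw-zip fails as above, so it is usually quicker to hand -c the script path that contains api.php directly.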



and if I try my own:



mw-zip -c http://IP:PortNo/wiki/index.php/ --username=uuu --password=ppp -o test2.zip Test

creating nuwiki in u'tmpG82RPH/nuwiki' Traceback (most recent call last):

File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line 328, in run result = self._run(*self.args, **self.kwargs)

File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 114, in run api = self.get_api()

File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 28, in get_api api.login(self.username, self.password, self.domain)

File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 186, in login res = self._post(**args)

File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 106, in _post res = loads(self._fetch(req))

File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 23, in loads return json.loads(s)

File "/usr/lib/python2.7/dist-packages/simplejson/__init__.py", line 413, in loads return _default_decoder.decode(s)

File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line 402, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end())

File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line 420, in raw_decode raise JSONDecodeError("No JSON object could be decoded", s, idx)

JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0) <Greenlet at 0x1a7b870: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0x1acf790>>> failed with JSONDecodeError

removing tmpdir u'tmpG82RPH' memory used: res=16.8 virt=152.5 Traceback (most recent call last):

File "/usr/local/bin/mw-zip", line 9, in <module> load_entry_point('mwlib==0.15.11', 'console_scripts', 'mw-zip')()

File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main make_zip(output, options, env.metabook, podclient=podclient, status=status)

File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 50, in make_zip make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)

File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 189, in make_nuwiki pool.join(raise_error=True)

File "/usr/local/lib/python2.7/dist-packages/gevent/pool.py", line 98, in join raise greenlet.exception

simplejson.decoder.JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)


I was using MediaWiki 1.23, I am not behind any proxy, and I also disabled SELinux.

The variables I am using in my LocalSettings.php file are as follows:

$wgServer = "http://IP:portno";
$wgScriptPath = "/wiki";
require_once "$IP/extensions/Collection/Collection.php";
$wgCollectionMWServeURL = 'http://IP:8899'; # default port of mw-serve
$wgCollectionMWServeCredentials = "username:password";
$wgEnableAPI = true;


I can't even begin to work on the actual extension interface until I have this working. Any suggestions? Where do I go next?

Any help would be appreciated!


Thanks!
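The JSONDecodeError at the end of the trace above means api.php answered with something that is not JSON, typically an HTML error page, or because the -c URL points at index.php rather than the script path that contains api.php. A quick sketch for checking what a fetched body actually is:

```python
import json

def is_json(body):
    """Return True if `body` parses as JSON (what mwlib expects from api.php)."""
    try:
        json.loads(body)
        return True
    except ValueError:
        return False
```

If the body fetched from your -c URL fails this check, try pointing -c at the script path (here http://IP:PortNo/wiki) instead of .../wiki/index.php/.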

Solanki (talkcontribs)

Guys! I would really appreciate any kind of help, or even just a pointer in the right direction, because I am banging my head against the wall here.

Thanks!

Jongfeli (talkcontribs)
Solanki (talkcontribs)

Hi Felipe. No, my server is running on RHEL 6.5. The strange thing I am encountering is the different behavior with different sites: as I mentioned above, it works perfectly fine for Wikipedia, but gives one error for http://edutechwiki.unige.ch/mediawiki/ and a different one for my own site, i.e. mw-zip -c http://IP:PortNo/wiki/api.php/ --username=uuu --password=ppp -o test2.zip Test.


Now, this is what's boggling my mind. I have gone through almost every document possible.

(Errors related to different sites are given above.)

MarkAHershberger (talkcontribs)

Yay! I'm getting this exact error ("cannot guess api url...")

And my server was working before....

Anu8791 (talkcontribs)

Hi,

we have a similar kind of setup on our RHEL 6 server and render PDF documents successfully. But the issue here is that the PDF does not contain all the embedded JPEG images from the wiki article page; only the PNG pictures end up in the PDF document.

Could someone please advise on a solution for this issue? It would be really appreciated!

Thanks in advance,

Sanjay


urllib2 error when running your own rendering server

Jamal22066 (talkcontribs)

Getting the following error when running my own rendering server:

new-collection 1        'https://epss-dev-mw01.example.com/mediawiki'     'rl'
2017-10-20T08:12:31 mwlib.serve.info >> render 130c6a8548c2745b rl
10.102.177.204 - - [2017-10-20 08:12:31] "POST / HTTP/1.0" 200 200 0.004170
10.102.177.204 - - [2017-10-20 08:12:31] "POST / HTTP/1.0" 200 215 0.003583
10.102.177.204 - - [2017-10-20 08:12:32] "POST / HTTP/1.0" 200 229 0.002870
10.102.177.204 - - [2017-10-20 08:12:32] "POST / HTTP/1.0" 200 229 0.003326
10.102.177.204 - - [2017-10-20 08:12:33] "POST / HTTP/1.0" 200 229 0.003243
10.102.177.204 - - [2017-10-20 08:12:34] "POST / HTTP/1.0" 200 229 0.003381
10.102.177.204 - - [2017-10-20 08:12:34] "POST / HTTP/1.0" 200 229 0.002904
10.102.177.204 - - [2017-10-20 08:12:35] "POST / HTTP/1.0" 200 229 0.002736
10.102.177.204 - - [2017-10-20 08:12:36] "POST / HTTP/1.0" 200 229 0.002985
256 5.48739695549 ['mw-zip', '-o', '/root/cache/13/130c6a8548c2745b/collection.zip', '-m', '/root/cache/13/130c6a8548c2745b/metabook.json', '--status', 'qserve://localhost:14311/130c6a8548c2745b:makezip', '--config', 'https://epss-dev-mw01.serco.cms/mediawiki', '--username', 'jamal.nasir.adm', '--password', '{OMITTED}']
1%  Traceback (most recent call last):
  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 534, in run
    result = self._run(*self.args, **self.kwargs)
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 117, in run
    api = self.get_api()
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 31, in get_api
    api.login(self.username, self.password, self.domain)
  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 194, in login
    res = self._post(**args)
  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 114, in _post
    res = loads(self._fetch(req))
  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 80, in _fetch
    f = self.opener.open(url)
  File "/usr/lib64/python2.7/urllib2.py", line 446, in open
    response = meth(req, response)
  File "/usr/lib64/python2.7/urllib2.py", line 559, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib64/python2.7/urllib2.py", line 484, in error
    return self._call_chain(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 418, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 567, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 401: Unauthorized
<Greenlet at 0x29fb910: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0x28991d0>>> failed with HTTPError
creating nuwiki in u'/root/cache/13/130c6a8548c2745b/tmpInmbhl/nuwiki'
removing tmpdir u'/root/cache/13/130c6a8548c2745b/tmpInmbhl'
memory used: res=21.4 virt=331.2
1% error Traceback (most recent call last):
  File "/bin/mw-zip", line 9, in <module>
    load_entry_point('mwlib==0.15.14', 'console_scripts', 'mw-zip')()
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/buildzip.py", line 155, in main
    make_zip(output, options, env.metabook, podclient=podclient, status=status)
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip
    make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 192, in make_nuwiki
    pool.join(raise_error=True)
  File "/usr/lib64/python2.7/site-packages/gevent/pool.py", line 524, in join
    greenlet._raise_exception()
  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 171, in _raise_exception
    reraise(*self.exc_info)
  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 534, in run
    result = self._run(*self.args, **self.kwargs)
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 117, in run
    api = self.get_api()
  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 31, in get_api
    api.login(self.username, self.password, self.domain)
  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 194, in login
    res = self._post(**args)
  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 114, in _post
    res = loads(self._fetch(req))
  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 80, in _fetch
    f = self.opener.open(url)
  File "/usr/lib64/python2.7/urllib2.py", line 446, in open
    response = meth(req, response)
  File "/usr/lib64/python2.7/urllib2.py", line 559, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib64/python2.7/urllib2.py", line 484, in error
    return self._call_chain(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 418, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 567, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 401: Unauthorized

Any idea on how to get around the 'urllib2.HTTPError: HTTP Error 401: Unauthorized' error?

Reply to "urllib2 error when running your own rendering server"

Getting error when using PdfBook on render server

6
88.74.203.177 (talkcontribs)

I'm using MediaWiki 1.28 with the PdfBook and Collection extensions, running on kernel 4.4.0-67-generic #88-Ubuntu SMP. I installed mw-render following this tutorial: https://www.mediawiki.org/wiki/Setup_a_render_server_on_Ubuntu_12.04_LTS. The following error appears after the PDF download action:

/***************************************** Startof log.txt *********************************************************/
/data/mwcache/log.txt                                                                                                                                                                                           179535/175K              100%
have 0 jobs
count: 18
all channels idle


error finish: bab34137f302840d:makezip: 'RuntimeError: command failed with returncode 256: [\'mw-zip\', \'-o\', \'/data/mwcache/ba/bab34137f302840d/collection.zip\', \'-m\', \'/data/mwcache/ba/bab34137f302840d/metabook.json\', \'--status\', \'qserve://localhost:14311/bab34137f302840d:makezip\', \'--config\', \'https://wiki.senedo.de\', \'--username\', \'senedo\', \'--password\', \'{OMITTED}\']\nLast Output:\n    1%  creating nuwiki in u\'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc/nuwiki\'\n    removing tmpdir u\'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc\'\n    memory used: res=17.1 virt=124.3\n    1% error Traceback (most recent call last):\n      File "/usr/local/bin/mw-zip", line 11, in <module>\n        sys.exit(main())\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main\n        make_zip(output, options, env.metabook, podclient=podclient, status=status)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 49, in make_zip\n        from mwlib.apps.make_nuwiki import make_nuwiki\n      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n        result = _import(*args, **kwargs)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 6, in <module>\n        from mwlib.net import fetch, sapi as mwapi\n      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n        result = _import(*args, **kwargs)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 7, in <module>\n        import gevent, gevent.pool, gevent.coros, gevent.event\n     File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n        result = _import(*args, **kwargs)\n    ImportError: No module named coros\n     in function system, file /usr/local/lib/python2.7/dist-packages/mwlib/nslave.py, line 64'
error finish: bab34137f302840d:render-rl: 'RuntimeError: RuntimeError: command failed with returncode 256: [\'mw-zip\', \'-o\', \'/data/mwcache/ba/bab34137f302840d/collection.zip\', \'-m\', \'/data/mwcache/ba/bab34137f302840d/metabook.json\', \'--status\', \'qserve://localhost:14311/bab34137f302840d:makezip\', \'--config\', \'https://wiki.senedo.de\', \'--username\', \'senedo\', \'--password\', \'{OMITTED}\']\nLast Output:\n    1%  creating nuwiki in u\'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc/nuwiki\'\n    removing tmpdir u\'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc\'\n    memory used: res=17.1 virt=124.3\n    1% error Traceback (most recent call last):\n      File "/usr/local/bin/mw-zip", line 11, in <module>\n        sys.exit(main())\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main\n        make_zip(output, options, env.metabook, podclient=podclient, status=status)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 49, in make_zip\n        from mwlib.apps.make_nuwiki import make_nuwiki\n      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n        result = _import(*args, **kwargs)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 6, in <module>\n        from mwlib.net import fetch, sapi as mwapi\n      File "/usr/local/lib/python2.7/dist-
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/qs/slave.py", line 150, in handle_one_job
    result = workhandler(qs).dispatch(job)
  File "/usr/local/lib/python2.7/dist-packages/qs/slave.py", line 50, in dispatch
    return m(**tmp)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/nslave.py", line 171, in rpc_render
    return doit(**params)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/nslave.py", line 158, in doit
    self.qaddw(channel="makezip", payload=dict(params=params), jobid="%s:makezip" % (collection_id, ), timeout=20 * 60)
  File "/usr/local/lib/python2.7/dist-packages/qs/slave.py", line 66, in qaddw
    raise RuntimeError(error)
RuntimeError: RuntimeError: command failed with returncode 256: ['mw-zip', '-o', '/data/mwcache/ba/bab34137f302840d/collection.zip', '-m', '/data/mwcache/ba/bab34137f302840d/metabook.json', '--status', 'qserve://localhost:14311/bab34137f
302840d:makezip', '--config', 'https://wiki.senedo.de', '--username', 'senedo', '--password', '{OMITTED}']
Last Output:
    1%  creating nuwiki in u'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc/nuwiki'
    removing tmpdir u'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc'
    memory used: res=17.1 virt=124.3
    1% error Traceback (most recent call last):
      File "/usr/local/bin/mw-zip", line 11, in <module>
        sys.exit(main())
      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main
        make_zip(output, options, env.metabook, podclient=podclient, status=status)
      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 49, in make_zip
        from mwlib.apps.make_nuwiki import make_nuwiki
      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__
        result = _import(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 6, in <module>
        from mwlib.net import fetch, sapi as mwapi
      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__
        result = _import(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 7, in <module>
        import gevent, gevent.pool, gevent.coros, gevent.event
      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__
        result = _import(*args, **kwargs)
    ImportError: No module named coros
     in function system, file /usr/local/lib/python2.7/dist-packages/mwlib/nslave.py, line 64
127.0.0.1 - - [2017-04-12 13:25:52] "POST / HTTP/1.0" 200 2187 0.005049
127.0.0.1 - - [2017-04-12 13:25:53] "POST / HTTP/1.0" 200 2187 0.004951
/***************************************** End of log.txt *********************************************************/
82.135.96.88 (talkcontribs)

Hi, I have the same problem. Did you find any solution?

217.17.16.146 (talkcontribs)

It's a problem with gevent: gevent.coros was deprecated and has since been removed.

Downgrade to a previous version with

pip install gevent==1.0

82.135.96.88 (talkcontribs)

Thanks a lot, it seems to work now, at least for simple articles.

I had an older version of the render server running on Ubuntu 14.04 and it seemed to work better:

- The rendering page updated automatically. Now I have to reload the page manually.

- It seemed to work for more complex pages. Now I very often get the error "WARNING: Article could not be rendered - ouputting plain text. Potential causes of the problem are: (a) a bug in the pdf-writer software (b) problematic Mediawiki markup (c) table is too wide"

82.135.96.88 (talkcontribs)

Okay, I hacked Image.py of Pillow to make tostring available again. Now more pages can be rendered properly.

I wish the render server components would be upgraded as they seem to be quite outdated.

ChristophJahn (talkcontribs)

I ran into the same problem. I found out that gevent.coros was renamed to gevent.lock. Therefore you could also update the code of mwlib by doing:

find . -type f -exec sed -i 's/coros/lock/g' {} \;
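Note that the command above replaces every occurrence of "coros" in every file under the current directory. A slightly narrower variant touches only the dotted module name and only Python sources; this is a sketch, and the demo below runs against a scratch directory (standing in for the real dist-packages/mwlib path from the logs above) so it can be tried safely:

```shell
# Scratch directory standing in for .../dist-packages/mwlib;
# point MWLIB_DIR at the real install path to apply the fix for real.
MWLIB_DIR=$(mktemp -d)
printf 'import gevent.coros\nsem = gevent.coros.Semaphore()\n' > "$MWLIB_DIR/fetch_demo.py"

# Rename only the dotted module reference, only in .py files.
find "$MWLIB_DIR" -name '*.py' -exec sed -i 's/gevent\.coros/gevent.lock/g' {} +

cat "$MWLIB_DIR/fetch_demo.py"
```

Limiting the pattern to `gevent\.coros` avoids accidentally rewriting unrelated strings that happen to contain "coros".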

I also got the error KeyError: 'revisions', and solved it by replacing revs = e["revisions"] with revs = e.get("revisions","") in pp/local/lib/python2.7/site-packages/mwlib/net/sapi.py:311.
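The fix above swaps a hard key lookup for a defaulted one. A minimal sketch of the difference, using hypothetical page entries (not mwlib's actual response structure) to show why the hard lookup raises KeyError:

```python
# A page entry from the MediaWiki query API may lack the "revisions"
# key, e.g. when a page was deleted between listing and fetching.
page_with_revs = {"title": "Main Page", "revisions": [{"revid": 123}]}
page_without_revs = {"title": "Deleted page"}

def get_revisions(entry):
    # entry["revisions"] would raise KeyError on the second entry;
    # entry.get("revisions", "") falls back to an empty value instead.
    return entry.get("revisions", "")

assert get_revisions(page_with_revs) == [{"revid": 123}]
assert get_revisions(page_without_revs) == ""
```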

Reply to "Getting error when using PdfBook on render server"

File not found. The file you are trying to download does not exist: it may have been deleted, or it may need to be regenerated.

2
92.214.160.71 (talkcontribs)

Running the newest mwlib and Collection...

Jongfeli (talkcontribs)
Reply to "File not found. The file you are trying to download does not exist: it may have been deleted, or it may need to be regenerated."

Print/Export menu not showing on custom namespaces

3
70.120.85.152 (talkcontribs)

Can't get the print/export menu in the sidebar to show up in custom namespaces. It shows everywhere else. Any ideas of how to fix this?

Kghbln (talkcontribs)

I believe you will have to add this namespace to the $wgCollectionArticleNamespaces configuration parameter.
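For reference, a LocalSettings.php sketch; the namespace IDs 3000/3001 and the name "Manual" are placeholders for whatever custom namespace you actually use:

```php
// Custom namespace definition (IDs and name are placeholders).
define( 'NS_MANUAL', 3000 );
define( 'NS_MANUAL_TALK', 3001 );
$wgExtraNamespaces[NS_MANUAL] = 'Manual';
$wgExtraNamespaces[NS_MANUAL_TALK] = 'Manual_talk';

require_once "$IP/extensions/Collection/Collection.php";

// The extension only shows the print/export box for pages whose
// namespace is listed here, so append the custom namespace.
$wgCollectionArticleNamespaces[] = NS_MANUAL;
```

Appending after the require_once keeps the extension's default list and just adds the extra namespace.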

70.120.85.152 (talkcontribs)

Thanks. I had already done that but rechecked and had a typo. All good now.

Reply to "Print/Export menu not showing on custom namespaces"
License: unknown in PDFs

Wmat (talkcontribs)

I'm running the latest version of the extensions on MW1.25alpha and I'm seeing that when I click 'Download as PDF', the licensing information shows License: unknown. I have the following variables configured for the extension, as well as the default for the whole wiki:

$wgLicenseName = "Creative Commons Attribution-Share Alike 3.0 license";
$wgLicenseURL = "http://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License";

Shouldn't the licensing info appear on the PDFs as configured?

Thanks

Wmat (talkcontribs)

I tried setting:

$wgLicenseName = null;
$wgLicenseURL = null;

As I have the following configured:

$wgRightsUrl = "http://creativecommons.org/licenses/by-sa/3.0/";
$wgRightsText = "Creative Commons Attribution-ShareAlike";

According to the README in the Collection source tree, the license should default to the rights text in this case. It doesn't: rendered PDFs and books still show License: unknown.

Kghbln (talkcontribs)
Wmat (talkcontribs)
Kghbln (talkcontribs)

Thank you for doing this. This issue is actually a bit worrisome since licensing is an integral part of providing content. So the correct attribution is something that should not be missed.

Nemo bis (talkcontribs)

Indeed, I've already argued that the extension should not output any content at all that cannot be attributed.

Wmat (talkcontribs)

I'm very curious how this works on WP but not on my wiki. Can anyone else reproduce this?

49.207.57.238 (talkcontribs)

Same here, I'm on MW 1.27 and Collection 1.7

License for all the images is Unknown even though it is clearly given as CC-BY-SA 4.0, and the same happens for Commons images used via InstantCommons.

Reply to "License: unknown in PDFs"