Extension talk:Collection


About this board

Archives 

/Archive 1

Can not get my rendering server to work apart from wikipedia.

6
Solanki (talkcontribs)

Hi, I've been trying to set up the Collection extension with my own rendering server so I can generate PDF files from my wiki. So far, no luck.


Here's where I am:

I followed this guide:

http://edutechwiki.unige.ch/en/Mediawiki_collection_extension_installation

And I can create pdf files from wikipedia using:

mw-zip -c :en -o test.zip Acdc Number

mw-render -c test.zip -o test.pdf -w pdf

mw-zip works just as one would expect.


mw-zip -c :en -o test.zip Acdc Number

creating nuwiki in u'tmpuIdHyY/nuwiki'
2013-09-07T10:10:58 mwlib.utils.info >> fetching 'http://en.wikipedia.org/w/index.php?title=Help:Books/License&action=raw&templates=expand'
removing tmpdir u'tmpuIdHyY'
memory used: res=25.0 virt=816.4


I can read those pdf files, so I know my basic render farm setup is working.

The problem is that I cannot get it to work with anything other than wikipedia.

If I try the URL in the guide:


mw-zip -c http://edutechwiki.unige.ch/mediawiki/ -o test2.zip Mediawiki_collection_extension_installation

creating nuwiki in u'tmpRnDvRH/nuwiki'
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line 328, in run
    result = self._run(*self.args, **self.kwargs)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 747, in refcall_fun
    fun(*args, **kw)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 632, in handle_new_basepath
    api = self._get_mwapi_for_path(path)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 684, in _get_mwapi_for_path
    raise RuntimeError("cannot guess api url for %r" % (path,))
RuntimeError: cannot guess api url for 'http://edutechwiki.unige.ch/en'
<Greenlet at 0x24d2cd0: refcall_fun> failed with RuntimeError
WARNING: (u'Mediawiki_collection_extension_installation', None) could not be fetched
removing tmpdir u'tmpRnDvRH'
memory used: res=19.3 virt=226.7
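The "cannot guess api url" error comes from mwlib trying to derive the location of api.php from the URL it was given; on a wiki with a nonstandard path layout none of the conventional script paths match and the guess fails. The following is a simplified, hypothetical sketch of that kind of guessing (not mwlib's actual code; the candidate paths here are assumptions), just to illustrate why an unusual URL layout triggers the error:

```python
# Simplified sketch (hypothetical, NOT mwlib's real logic) of guessing
# the api.php URL from a page URL: try the conventional MediaWiki
# script paths; if none match, fail the same way mwlib does.
from urllib.parse import urlsplit

def guess_api_url(page_url):
    parts = urlsplit(page_url)
    base = "%s://%s" % (parts.scheme, parts.netloc)
    # Conventional script paths used by most MediaWiki installs.
    for script_path in ("/w", "/wiki", "/mediawiki"):
        if parts.path.startswith(script_path + "/"):
            return "%s%s/api.php" % (base, script_path)
    # A nonstandard layout matches nothing, hence the RuntimeError seen above.
    raise RuntimeError("cannot guess api url for %r" % page_url)

print(guess_api_url("http://edutechwiki.unige.ch/mediawiki/Some_Page"))
# A path like '/en/...' matches no conventional script path and raises.
```

Giving mw-zip a `-c` URL from which api.php is directly reachable (rather than a rewritten article path) may avoid the guessing step entirely.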



and if I try my own:



mw-zip -c http://IP:PortNo/wiki/index.php/ --username=uuu --password=ppp -o test2.zip Test

creating nuwiki in u'tmpG82RPH/nuwiki'
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/gevent/greenlet.py", line 328, in run
    result = self._run(*self.args, **self.kwargs)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 114, in run
    api = self.get_api()
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 28, in get_api
    api.login(self.username, self.password, self.domain)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 186, in login
    res = self._post(**args)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 106, in _post
    res = loads(self._fetch(req))
  File "/usr/local/lib/python2.7/dist-packages/mwlib/net/sapi.py", line 23, in loads
    return json.loads(s)
  File "/usr/lib/python2.7/dist-packages/simplejson/__init__.py", line 413, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line 402, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/dist-packages/simplejson/decoder.py", line 420, in raw_decode
    raise JSONDecodeError("No JSON object could be decoded", s, idx)
JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)
<Greenlet at 0x1a7b870: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0x1acf790>>> failed with JSONDecodeError
removing tmpdir u'tmpG82RPH'
memory used: res=16.8 virt=152.5
Traceback (most recent call last):
  File "/usr/local/bin/mw-zip", line 9, in <module>
    load_entry_point('mwlib==0.15.11', 'console_scripts', 'mw-zip')()
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main
    make_zip(output, options, env.metabook, podclient=podclient, status=status)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 50, in make_zip
    make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 189, in make_nuwiki
    pool.join(raise_error=True)
  File "/usr/local/lib/python2.7/dist-packages/gevent/pool.py", line 98, in join
    raise greenlet.exception
simplejson.decoder.JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)
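The JSONDecodeError above means mwlib's login POST got back something that is not JSON, typically an HTML page, which happens when `-c` points at an index.php-style URL instead of a base URL from which api.php can be reached. A quick way to see the failure mode (a hypothetical diagnostic, not part of mwlib):

```python
# Diagnostic sketch: mw-zip's login step expects a JSON reply from
# api.php. If the server answers with HTML (an index.php page, a login
# form, a 404 page), json.loads fails exactly as in the traceback above.
import json

def looks_like_json(body):
    """Return True if the response body parses as JSON."""
    try:
        json.loads(body)
        return True
    except ValueError:  # JSONDecodeError is a subclass of ValueError
        return False

# A real api.php reply is JSON; an index.php page is HTML:
print(looks_like_json('{"login": {"result": "NeedToken"}}'))  # True
print(looks_like_json('<!DOCTYPE html><html>...</html>'))     # False
```

So the first thing to verify is that fetching `<your -c URL>/api.php` in a browser returns API output rather than a wiki page.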


I am using MediaWiki 1.23, I am not behind any proxy, and I have also disabled SELinux.

The variables I am using in my LocalSettings.php file are as follows:

$wgServer = "http://IP:portno";

$wgScriptPath = "/wiki";

require_once "$IP/extensions/Collection/Collection.php";

$wgCollectionMWServeURL = 'http://IP:8899'; // default port of mw-serve

$wgCollectionMWServeCredentials = "username:password";

$wgEnableAPI = true;


I can't even begin to work on the actual extension interface until I have this working. Any suggestions? Where do I go next?

Any help would be appreciated!


Thanks!

Solanki (talkcontribs)

Guys! I would really appreciate any kind of help, or just a pointer in the right direction, because I am banging my head against the wall here.

Thanks!

Jongfeli (talkcontribs)

Hello Solanki. Is your server running on Ubuntu? If so, did you read Setup a render server on Ubuntu 12.04 LTS? If you follow the guide you should be able to get your server up and running. It currently explains how to set up on Ubuntu 12.04 LTS, but I am testing it on 14.04 LTS and it seems to work just fine there too. When I am done I will update Setup a render server on Ubuntu 12.04 LTS. Regards.

Solanki (talkcontribs)

Hi Felipe. No, my server is running on RHEL 6.5. The strange thing I am encountering is its different behavior on different sites: as I mentioned above, it works perfectly fine for Wikipedia, but it gives one error for http://edutechwiki.unige.ch/mediawiki/ and a different one for my own site, i.e. mw-zip -c http://IP:PortNo/wiki/api.php/ --username=uuu --password=ppp -o test2.zip Test.


This is what's boggling my mind. I have gone through almost every document possible.

(Errors related to different sites are given above.)

MarkAHershberger (talkcontribs)

Yay! I'm getting this exact error ("cannot guess api url...")

And my server was working before....

Anu8791 (talkcontribs)

Hi,

We have a similar setup on our RHEL 6 server and we render PDF documents successfully. However, the PDF does not contain all the embedded JPEG images from the wiki article page; it only includes the .PNG images in the PDF document.

Could someone please advise on a solution for this issue? It would be really appreciated!

Thanks in Adv,

Sanjay

Reply to "Can not get my rendering server to work apart from wikipedia."

urllib2 error when running your own rendering server

1
Jamal22066 (talkcontribs)

Getting the following error when running my own rendering server:

new-collection 1        'https://epss-dev-mw01.example.com/mediawiki'     'rl'

2017-10-20T08:12:31 mwlib.serve.info >> render 130c6a8548c2745b rl

10.102.177.204 - - [2017-10-20 08:12:31] "POST / HTTP/1.0" 200 200 0.004170

10.102.177.204 - - [2017-10-20 08:12:31] "POST / HTTP/1.0" 200 215 0.003583

10.102.177.204 - - [2017-10-20 08:12:32] "POST / HTTP/1.0" 200 229 0.002870

10.102.177.204 - - [2017-10-20 08:12:32] "POST / HTTP/1.0" 200 229 0.003326

10.102.177.204 - - [2017-10-20 08:12:33] "POST / HTTP/1.0" 200 229 0.003243

10.102.177.204 - - [2017-10-20 08:12:34] "POST / HTTP/1.0" 200 229 0.003381

10.102.177.204 - - [2017-10-20 08:12:34] "POST / HTTP/1.0" 200 229 0.002904

10.102.177.204 - - [2017-10-20 08:12:35] "POST / HTTP/1.0" 200 229 0.002736

10.102.177.204 - - [2017-10-20 08:12:36] "POST / HTTP/1.0" 200 229 0.002985

256 5.48739695549 ['mw-zip', '-o', '/root/cache/13/130c6a8548c2745b/collection.zip', '-m', '/root/cache/13/130c6a8548c2745b/metabook.json', '--status', 'qserve://localhost:14311/130c6a8548c2745b:makezip', '--config', 'https://epss-dev-mw01.serco.cms/mediawiki', '--username', 'jamal.nasir.adm', '--password', '{OMITTED}']

1%  Traceback (most recent call last):

  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 534, in run

    result = self._run(*self.args, **self.kwargs)

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 117, in run

    api = self.get_api()

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 31, in get_api

    api.login(self.username, self.password, self.domain)

  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 194, in login

    res = self._post(**args)

  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 114, in _post

    res = loads(self._fetch(req))

  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 80, in _fetch

    f = self.opener.open(url)

  File "/usr/lib64/python2.7/urllib2.py", line 446, in open

    response = meth(req, response)

  File "/usr/lib64/python2.7/urllib2.py", line 559, in http_response

    'http', request, response, code, msg, hdrs)

  File "/usr/lib64/python2.7/urllib2.py", line 484, in error

    return self._call_chain(*args)

  File "/usr/lib64/python2.7/urllib2.py", line 418, in _call_chain

    result = func(*args)

  File "/usr/lib64/python2.7/urllib2.py", line 567, in http_error_default

    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)

HTTPError: HTTP Error 401: Unauthorized

<Greenlet at 0x29fb910: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0x28991d0>>> failed with HTTPError

creating nuwiki in u'/root/cache/13/130c6a8548c2745b/tmpInmbhl/nuwiki'

removing tmpdir u'/root/cache/13/130c6a8548c2745b/tmpInmbhl'

memory used: res=21.4 virt=331.2

1% error Traceback (most recent call last):

  File "/bin/mw-zip", line 9, in <module>

    load_entry_point('mwlib==0.15.14', 'console_scripts', 'mw-zip')()

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/buildzip.py", line 155, in main

    make_zip(output, options, env.metabook, podclient=podclient, status=status)

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip

    make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 192, in make_nuwiki

    pool.join(raise_error=True)

  File "/usr/lib64/python2.7/site-packages/gevent/pool.py", line 524, in join

    greenlet._raise_exception()

  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 171, in _raise_exception

    reraise(*self.exc_info)

  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 534, in run

    result = self._run(*self.args, **self.kwargs)

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 117, in run

    api = self.get_api()

  File "/usr/lib64/python2.7/site-packages/mwlib/apps/make_nuwiki.py", line 31, in get_api

    api.login(self.username, self.password, self.domain)

  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 194, in login

    res = self._post(**args)

  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 114, in _post

    res = loads(self._fetch(req))

  File "/usr/lib64/python2.7/site-packages/mwlib/net/sapi.py", line 80, in _fetch

    f = self.opener.open(url)

  File "/usr/lib64/python2.7/urllib2.py", line 446, in open

    response = meth(req, response)

  File "/usr/lib64/python2.7/urllib2.py", line 559, in http_response

    'http', request, response, code, msg, hdrs)

  File "/usr/lib64/python2.7/urllib2.py", line 484, in error

    return self._call_chain(*args)

  File "/usr/lib64/python2.7/urllib2.py", line 418, in _call_chain

    result = func(*args)

  File "/usr/lib64/python2.7/urllib2.py", line 567, in http_error_default

    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)

urllib2.HTTPError: HTTP Error 401: Unauthorized

Any idea on how to get around the 'urllib2.HTTPError: HTTP Error 401: Unauthorized' error?
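A 401 at this point usually means the web server in front of the wiki is demanding HTTP Basic authentication: mw-zip's `--username`/`--password` are sent to api.php as MediaWiki credentials, so a Basic-auth challenge is rejected before the wiki ever sees the login. A hedged Python 3 sketch (the URL is a placeholder; mwlib itself is Python 2 and uses urllib2 in the same way) of how a urllib opener answers such a challenge:

```python
# Sketch: the 401 comes from the web server, not MediaWiki. An opener
# built with HTTPBasicAuthHandler retries a 401 response with Basic
# credentials; a plain opener (what mwlib uses) just raises HTTPError.
import urllib.request

def opener_with_basic_auth(api_url, user, password):
    """Build a urllib opener that answers HTTP Basic auth challenges."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, api_url, user, password)
    handler = urllib.request.HTTPBasicAuthHandler(mgr)
    return urllib.request.build_opener(handler)

opener = opener_with_basic_auth(
    "https://epss-dev-mw01.example.com/mediawiki/api.php", "user", "secret")
# opener.open(api_url) would now retry with Basic auth on a 401.
```

If this is the cause, either drop the Basic-auth layer for api.php or allow the render server's requests through it; checking `curl -v <api.php URL>` for a `WWW-Authenticate` header confirms the diagnosis.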

Reply to "urllib2 error when running your own rendering server"

[Resolved] RuntimeError: could not get siteinfo

3
Wmat (talkcontribs)

I have two environments that I think (hope) are identical. They are both set up with:

MW 1.25alpha, PHP 5.3.17, Collection 1.7.0 (1618025)

In the Dev environment, Collection works perfectly. In the Prod environment, I'm seeing this error:

<Greenlet at 0xd43eb0: <bound method start_fetcher.run of <mwlib.apps.make_nuwiki.start_fetcher object at 0xdbba50>>> failed with  RuntimeError
   creating nuwiki in u'/u1/wiki_pdf/cache/2e/2e58aa5a7230f6a6/tmpJuZHDI/nuwiki'
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   ERR: <urlopen error [Errno 111] Connection refused>
   removing tmpdir u'/u1/wiki_pdf/cache/2e/2e58aa5a7230f6a6/tmpJuZHDI'
   memory used: res=18.2 virt=152.7
   1% error Traceback (most recent call last):
     File "/usr/local/bin/mw-zip", line 9, in <module>
       load_entry_point('mwlib==0.15.14', 'console_scripts', 'mw-zip')()
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/buildzip.py", line 155, in main
       make_zip(output, options, env.metabook, podclient=podclient, status=status)
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/buildzip.py", line 50, in make_zip
       make_nuwiki(fsdir, metabook=metabook, options=options, podclient=podclient, status=status)
     File "/usr/local/lib64/python2.6/site-packages/mwlib/apps/make_nuwiki.py", line 192, in make_nuwiki
       pool.join(raise_error=True)
     File "/usr/local/lib64/python2.6/site-packages/gevent/pool.py", line 98, in join
       raise greenlet.exception
   RuntimeError: could not get siteinfo
    in function system, file /usr/local/lib64/python2.6/site-packages/mwlib/nslave.py, line 64

So it's failing in the start_fetcher.run method, but I can't seem to figure out why. Is this likely some system configuration issue?

Wmat (talkcontribs)

Solved.

This was a DNS issue. We had to add an entry to /etc/hosts so that the server could talk to itself, basically.
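The underlying issue is that the render server fetches the wiki through the hostname in $wgServer, so that name must resolve from the render box itself. A quick check of that, as a hedged sketch (substitute your wiki's hostname for the placeholder):

```python
# "could not get siteinfo" with repeated "Connection refused" often
# means the render box cannot reach the wiki under its own public
# hostname. This checks whether a name resolves locally at all.
import socket

def resolves(hostname):
    """Return the IPv4 address this box resolves hostname to, or None."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

print(resolves("localhost"))  # sanity check; substitute your wiki's host
```

If the wiki's hostname returns None here (or an address the box cannot connect to), adding a line like `127.0.0.1 your.wiki.hostname` to /etc/hosts on the render server is the fix described above.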

66.77.160.179 (talkcontribs)

Hi, I am getting the same error. Do you happen to know what you put in your /etc/hosts file to resolve this?

Reply to "[Resolved] RuntimeError: could not get siteinfo"

Getting error when using PdfBook on render server

6
88.74.203.177 (talkcontribs)

I'm using MediaWiki 1.28 with the PdfBook and Collection extensions, running kernel 4.4.0-67-generic #88-Ubuntu SMP. I installed mw-render and used this tutorial: https://www.mediawiki.org/wiki/Setup_a_render_server_on_Ubuntu_12.04_LTS I get the following error after the PDF download action:

/***************************************** Startof log.txt *********************************************************/
/data/mwcache/log.txt                                                                                                                                                                                           179535/175K              100%
have 0 jobs
count: 18
all channels idle


error finish: bab34137f302840d:makezip: 'RuntimeError: command failed with returncode 256: [\'mw-zip\', \'-o\', \'/data/mwcache/ba/bab34137f302840d/collection.zip\', \'-m\', \'/data/mwcache/ba/bab34137f302840d/metabook.json\', \'--status
\', \'qserve://localhost:14311/bab34137f302840d:makezip\', \'--config\', \'https://wiki.senedo.de\', \'--username\', \'senedo\', \'--password\', \'{OMITTED}\']\nLast Output:\n    1%  creating nuwiki in u\'/data/mwcache/ba/bab34137f302840
d/tmpqb3ZCc/nuwiki\'\n    removing tmpdir u\'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc\'\n    memory used: res=17.1 virt=124.3\n    1% error Traceback (most recent call last):\n      File "/usr/local/bin/mw-zip", line 11, in <module>\n
        sys.exit(main())\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main\n        make_zip(output, options, env.metabook, podclient=podclient, status=status)\n      File "/usr/local/lib/pyth
on2.7/dist-packages/mwlib/apps/buildzip.py", line 49, in make_zip\n        from mwlib.apps.make_nuwiki import make_nuwiki\n      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n        result = _
import(*args, **kwargs)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 6, in <module>\n        from mwlib.net import fetch, sapi as mwapi\n      File "/usr/local/lib/python2.7/dist-packages/gevent/bu
iltins.py", line 93, in __import__\n        result = _import(*args, **kwargs)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 7, in <module>\n        import gevent, gevent.pool, gevent.coros, gevent.event\n
     File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n        result = _import(*args, **kwargs)\n    ImportError: No module named coros\n     in function system, file /usr/local/lib/python2.7/dist
-packages/mwlib/nslave.py, line 64'
error finish: bab34137f302840d:render-rl: 'RuntimeError: RuntimeError: command failed with returncode 256: [\'mw-zip\', \'-o\', \'/data/mwcache/ba/bab34137f302840d/collection.zip\', \'-m\', \'/data/mwcache/ba/bab34137f302840d/metabook.js
on\', \'--status\', \'qserve://localhost:14311/bab34137f302840d:makezip\', \'--config\', \'https://wiki.senedo.de\', \'--username\', \'senedo\', \'--password\', \'{OMITTED}\']\nLast Output:\n    1%  creating nuwiki in u\'/data/mwcache/ba
/bab34137f302840d/tmpqb3ZCc/nuwiki\'\n    removing tmpdir u\'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc\'\n    memory used: res=17.1 virt=124.3\n    1% error Traceback (most recent call last):\n      File "/usr/local/bin/mw-zip", line 1
1, in <module>\n        sys.exit(main())\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main\n        make_zip(output, options, env.metabook, podclient=podclient, status=status)\n      File "/us
r/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 49, in make_zip\n        from mwlib.apps.make_nuwiki import make_nuwiki\n      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__\n
      result = _import(*args, **kwargs)\n      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 6, in <module>\n        from mwlib.net import fetch, sapi as mwapi\n      File "/usr/local/lib/python2.7/dist-Tr
aceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/qs/slave.py", line 150, in handle_one_job
    result = workhandler(qs).dispatch(job)
  File "/usr/local/lib/python2.7/dist-packages/qs/slave.py", line 50, in dispatch
    return m(**tmp)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/nslave.py", line 171, in rpc_render
    return doit(**params)
  File "/usr/local/lib/python2.7/dist-packages/mwlib/nslave.py", line 158, in doit
    self.qaddw(channel="makezip", payload=dict(params=params), jobid="%s:makezip" % (collection_id, ), timeout=20 * 60)
  File "/usr/local/lib/python2.7/dist-packages/qs/slave.py", line 66, in qaddw
    raise RuntimeError(error)
RuntimeError: RuntimeError: command failed with returncode 256: ['mw-zip', '-o', '/data/mwcache/ba/bab34137f302840d/collection.zip', '-m', '/data/mwcache/ba/bab34137f302840d/metabook.json', '--status', 'qserve://localhost:14311/bab34137f
302840d:makezip', '--config', 'https://wiki.senedo.de', '--username', 'senedo', '--password', '{OMITTED}']
Last Output:
    1%  creating nuwiki in u'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc/nuwiki'
    removing tmpdir u'/data/mwcache/ba/bab34137f302840d/tmpqb3ZCc'
    memory used: res=17.1 virt=124.3
    1% error Traceback (most recent call last):
      File "/usr/local/bin/mw-zip", line 11, in <module>
        sys.exit(main())
      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 155, in main
        make_zip(output, options, env.metabook, podclient=podclient, status=status)
      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/buildzip.py", line 49, in make_zip
        from mwlib.apps.make_nuwiki import make_nuwiki
      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__
        result = _import(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/mwlib/apps/make_nuwiki.py", line 6, in <module>
        from mwlib.net import fetch, sapi as mwapi
      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__
        result = _import(*args, **kwargs)
      File "/usr/local/lib/python2.7/dist-packages/mwlib/net/fetch.py", line 7, in <module>
        import gevent, gevent.pool, gevent.coros, gevent.event
      File "/usr/local/lib/python2.7/dist-packages/gevent/builtins.py", line 93, in __import__
        result = _import(*args, **kwargs)
    ImportError: No module named coros
     in function system, file /usr/local/lib/python2.7/dist-packages/mwlib/nslave.py, line 64
127.0.0.1 - - [2017-04-12 13:25:52] "POST / HTTP/1.0" 200 2187 0.005049
127.0.0.1 - - [2017-04-12 13:25:53] "POST / HTTP/1.0" 200 2187 0.004951
/***************************************** End of log.txt *********************************************************/
82.135.96.88 (talkcontribs)

Hi, I have the same problem. Did you find any solution?

217.17.16.146 (talkcontribs)

It's a problem with gevent: gevent.coros was deprecated and has now been removed. Downgrade to a previous version with:

pip install gevent==1.0

82.135.96.88 (talkcontribs)

Thanks a lot, it seems to work now, at least for simple articles.

I had an older version of the render server running on Ubuntu 14.04 and it seemed to work better:

- The rendering page updated automatically. Now I have to reload the page manually.

- It seemed to work for more complex pages. Now I very often get the error "WARNING: Article could not be rendered - ouputting plain text. Potential causes of the problem are: (a) a bug in the pdf-writer software (b) problematic Mediawiki markup (c) table is too wide"

82.135.96.88 (talkcontribs)

Okay, I hacked Image.py of Pillow to make tostring available again. Now more pages can be rendered properly.

I wish the render server components would be updated, as they seem to be quite outdated.

ChristophJahn (talkcontribs)

I ran into the same problem. I found out that gevent.coros was renamed to gevent.lock, so you could also update the code of mwlib by running:

find . -type f -exec sed -i 's/coros/lock/g' {} \;

I also got the error KeyError: 'revisions', and solved it by replacing revs = e["revisions"] with revs = e.get("revisions","") in pp/local/lib/python2.7/site-packages/mwlib/net/sapi.py:311.
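The `revisions` fix above is the standard defensive-lookup pattern, shown here as a standalone sketch (the function name and sample data are made up for illustration):

```python
# The sapi.py fix described above, in pattern form: a page entry
# returned by the MediaWiki API may lack the "revisions" key (e.g. a
# deleted or restricted page), so dict.get with a default avoids the
# KeyError that aborts the render.
def extract_revisions(page_entry):
    # page_entry["revisions"] raises KeyError when the key is absent;
    # page_entry.get("revisions", "") falls back to an empty value.
    return page_entry.get("revisions", "")

print(extract_revisions({"title": "Test", "revisions": [{"*": "wikitext"}]}))
print(extract_revisions({"title": "Missing"}))  # no KeyError, returns ""
```

Note that pages whose entry has no revisions will then simply be skipped rather than rendered.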

Reply to "Getting error when using PdfBook on render server"

File not found: The file you are trying to download does not exist. It may have been deleted, or it may need to be regenerated.

2
92.214.160.71 (talkcontribs)

Running the newest mwlib and Collection...

Jongfeli (talkcontribs)

Not a real fix, but see: https://www.mediawiki.org/wiki/Topic:Rokm611ol9px8vcg

Reply to "File not found: The file you are trying to download does not exist. It may have been deleted, or it may need to be regenerated."

Print/Export menu not showing on custom namespaces

3
70.120.85.152 (talkcontribs)

I can't get the print/export menu in the sidebar to show up in custom namespaces. It shows everywhere else. Any ideas on how to fix this?

Kghbln (talkcontribs)

I believe you will have to add this namespace to the $wgCollectionArticleNamespaces configuration parameter.
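For reference, $wgCollectionArticleNamespaces holds an array of namespace IDs, so a custom namespace can be appended in LocalSettings.php after the extension is loaded. A hedged example (the namespace ID 3000 is a placeholder; use your own custom namespace's ID):

```php
// Hypothetical example: make Collection's sidebar portlet appear in a
// custom namespace by adding its ID (3000 here is a placeholder).
$wgCollectionArticleNamespaces[] = 3000;
```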

70.120.85.152 (talkcontribs)

Thanks. I had already done that but rechecked and had a typo. All good now.

Reply to "Print/Export menu not showing on custom namespaces"
License: unknown in PDFs

Wmat (talkcontribs)

I'm running the latest version of the extension on MW 1.25alpha, and when I click 'Download as PDF', the licensing information shows "License: unknown". I have the following variables configured for the extension, as well as the default for the whole wiki:

$wgLicenseName = "Creative Commons Attribution-Share Alike 3.0 license";
$wgLicenseURL = "http://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License";

Shouldn't the licensing info appear on the PDFs as configured?

Thanks

Wmat (talkcontribs)

I tried setting:

$wgLicenseName = null;
$wgLicenseURL = null;

As I have the following configured:

$wgRightsUrl = "http://creativecommons.org/licenses/by-sa/3.0/";
$wgRightsText = "Creative Commons Attribution-ShareAlike";

According to the README in the Collection source tree, the license should default to the rights text in this case. It doesn't: rendered PDFs and books still show "License: unknown".

Kghbln (talkcontribs)

It's probably time to report an issue at phabricator.

Wmat (talkcontribs)

You're right.

https://phabricator.wikimedia.org/T91262

Kghbln (talkcontribs)

Thank you for doing this. This issue is actually a bit worrisome, since licensing is an integral part of providing content; correct attribution is something that should not be omitted.

Nemo bis (talkcontribs)

Indeed, I've already argued that the extension should not output any content that cannot be attributed.

Wmat (talkcontribs)

I'm very curious why this works on Wikipedia but not on my wiki. Can anyone else reproduce this?

49.207.57.238 (talkcontribs)

Same here; I'm on MW 1.27 and Collection 1.7.

The license for all images is "Unknown" even though it is clearly set as CC-BY-SA 4.0, and the same happens for Commons images used via InstantCommons.

Reply to "License: unknown in PDFs"

How to create collection_id from my article list

1
140.115.51.162 (talkcontribs)

Hello guys, I would like to generate a collection_id from a set of articles and use this collection_id to create a PDF file from Wikipedia.

Reply to "How to create collection_id from my article list"

How to get code review for this extension

1
Mutante (talkcontribs)

How can https://gerrit.wikimedia.org/r/#/c/336342/ get merged? Are there any active maintainers of this extension? Is there any deployment step besides the merge in Gerrit itself?

Reply to "How to get code review for this extension"

Can this extension be deployed to [https://zh.wikipedia.org zh.wikipedia]?

2
星耀晨曦 (talkcontribs)

See the community consensus.

Kghbln (talkcontribs)

Such requests are piped in via Phabricator. I suggest creating a task for this.

Reply to "Can this extension be deployed to [https://zh.wikipedia.org zh.wikipedia]?"