Extension talk:EPubExport

Requested features

 * Additional languages
 * A way to export the whole wiki in one operation
 * Permissions:
 * Add an "exportable" property to pages. Pages with exportable=false won't be allowed to be exported, and the "ePub export" link will not be shown on these pages. When exporting a group of pages, if one of them is not exportable, the export will fail with a detailed message.
 * Administrators should be able to change the exportable permission for any page.
 * Configuration of a default exportable permission shall be added.

Done on version 0.02

 * Version Warning - place a warning on the top of each page, that this is a static version of the article, from a specific date and time (as in printable version).

Done on version 0.03

 * Image support: Images are now embedded in the epub file.

Done in version 0.04

 * User-defined CSS

Bug?
I have an Italian MediaWiki installation. When I click the tools link to export to ePub, the page http://mywiki.org/w/index.php?title=Speciale:Stampa_ePub&page=Page_Name is loaded, but I get the error "La pagina speciale richiesta non è stata riconosciuta." ("The requested special page has not been recognized."). But the "Speciale:Stampa_ePub" special page itself loads!

My installation uses short urls.

It seems that the problem is with the "raw URL" special page form; the other special pages load fine with the raw URL.

Is it a bug?

Thank you for your attention. DonPaolo 19:47, 6 July 2010 (UTC)

Confirmation
Yes, I got the same message (in English), i.e. a Firefox popup: the web client tries to retrieve an epub file from a local tmp dir but can't find it.

Content of the FF popup: /tmp/zD1EI3lu.part could not be saved, because the source file could not be read. Try again later, or contact the server administrator.

Version of ePubExport: 0.5.4 (2010-05-16); from CVS it says version 47.

My setup:
 MediaWiki 1.16.0
 PHP 5.2.6-3ubuntu4.6 (apache2handler)
 MySQL 5.0.75-0ubuntu10.5

Short URLs (LocalSettings.php):
 $wgSitename        = "EduTech Wiki";
 $wgScriptPath      = "/mediawiki";
 $wgScript          = "$wgScriptPath/index.php";
 $wgRedirectScript  = "$wgScriptPath/redirect.php";
 $wgArticlePath     = "/en/$1";

Apache aliases:
 Alias /mediawiki "/data/portails/mediawiki"
 Alias /en "/data/portails/mediawiki/index.php"

So it does get stuck somewhere, but I can't figure out where. By the way, the documentation doesn't say that one should create an epub/temp directory and chown it to the web server user. I did that, but it doesn't help (the path to temp is fine, by the way).

A MediaWiki version problem, or a short URL problem?

cheers ! - Daniel K. Schneider 19:50, 29 November 2010 (UTC)

Graphic embedding
I'm sorry to say that thumbnails of graphics do not end up embedded in the epub file. The Calibre web inspector shows a correct href entry, and a correctly named file is put into /../images, but the file is 0 bytes long. Manually substituting the file by downloading and inserting it doesn't help either. Tested on MediaWiki 1.16.0 with Calibre (running on Vista) and iBooks. Regards, F. Helm
 * It seems that lines 96 and 97 of ePubExport_body.php duplicate an action, since on the majority of systems the variable $wgScriptPath is set to /w/. Therefore, on my installation I commented out/deleted line 97; this resolved the above issue with images. Regards, Uzgen

Looking to test suitability for English Wikisource
The ability to export our completed works at English Wikisource is something that is of interest to that community, and we see that this extension may be a great tool to have on offer. We develop our works hierarchically as subpages of a work, usually chapters.  With that sort of design, would this extension be able to generate an EPUB of the whole work easily?  Or is that not going to work?  If it isn't going to work to generate our e-works, what sort of structure or guidance would be required to implement the extension so that we can offer whole works. Thanks. — billinghurst sDrewth 10:39, 17 May 2011 (UTC)


 * What would be the impact if we just loaded EPubExport into Wikisource like it is? Jeepday 22:33, 15 November 2011 (UTC)

Trouble ePub from it.ws
There's something wrong in the new version of the ePub generator: it seems to mishandle annotations made with the ref tag. See the ePub from it:s:La cavalleria italiana e le sue riforme. The left menu of the epub file contains an abnormal entry for each annotation; it also contains some abnormal menu entries for links pointing to the Autore namespace. --Alex brollo (talk) 12:12, 11 June 2012 (UTC)

Seems to be broken in MW 1.21.2
Hello,

After updating my MediaWiki to the latest version 1.21.2 I began receiving this error message:

Fatal error: Call to undefined function wfLoadExtensionMessages in /var/www/clients/client1/web4/web/w/extensions/ePubExport/ePubExport.php on line 38

Could you help me find a way to get the extension to work again? Thanks!

Same problem in MW 1.22.x
Fix:

In the extension's PHP files, remove all the lines containing wfLoadExtensionMessages. It works for me :)

Disclaimer: I am not a developer, but I just read this

- Daniel K. Schneider (talk) 19:02, 11 November 2013 (UTC)
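A less destructive variant of the same fix, sketched here under stated assumptions: instead of deleting the lines, guard the call so it only runs on MediaWiki versions old enough to still have the function. The message-group name 'ePubExport' is a guess based on the extension's name; use whatever name your copy of the extension actually registers.

```php
// ePubExport.php fragment (not standalone): wfLoadExtensionMessages()
// was removed in MediaWiki 1.21, so guard the call instead of making it
// unconditionally. The group name 'ePubExport' is an assumption.
if ( function_exists( 'wfLoadExtensionMessages' ) ) {
	wfLoadExtensionMessages( 'ePubExport' );
}
// On MW >= 1.21 extension messages are loaded automatically from the
// i18n files, so no replacement call is needed.
```

This keeps the file working on older installations while avoiding the fatal error on 1.21+.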

Tried, not working
I did what Daniel K. Schneider suggested, but unfortunately clicking on the "Export to ePub" link in the toolbox still produces all sorts of warnings.


 * Sorry I can't help (my technical skills are low), but if you want proof: http://edutechwiki.unige.ch/t/Main_Page (MW 1.22) - Daniel K. Schneider (talk) 17:00, 12 December 2013 (UTC)


 * Could you please post the code after your modifications? I can't figure out how much to remove/comment out, actually. Thank you!

Other errors in MW 1.22
I'm getting the following error:


 * Notice: Undefined variable: splitDefaultSize in /var/www/clients/client1/web4/web/w/extensions/ePubExport/epub/EPubChapterSplitter.php on line 36
 * Output buffer is not empty. Now contains

Can someone help me troubleshoot it, please?

Fix:

I had the same problem and solved it with the trim function. In ePubExport_body.php there's a function CleanBOMs which should clean the output buffer, but in my case the output buffer wasn't empty at the end: there were still some spaces, which can be deleted with PHP's trim function. Just add the line "$output = trim($output);" in the right place.

 private function CleanBOMs() {
     $BOM = chr(0xef) . chr(0xbb) . chr(0xbf);
     $output = ob_get_contents();
     // Characters that may already have been written to the output buffer and must be cleaned.
     $toRemove = array();
     $toRemove[] = $BOM;
     $toRemove[] = chr(10);
     $toRemove[] = chr(13);
     if ( $output !== false ) {
         if ( $output != "" ) {
             $output = str_replace( $toRemove, "", $output ); // delete all BOMs
             $output = trim( $output );
             if ( $output != "" ) {
                 return false;
             }
         }
         // Only BOM(s) in the output, so the output buffer may be cleaned.
         ob_clean();
     }
     return true;
 }
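The fix above addresses the "Output buffer is not empty" message. The first notice (Undefined variable: splitDefaultSize) usually just means a variable is read before it is assigned. A hedged sketch of the kind of guard that silences it follows: the variable name comes from the error message, while the default value and the surrounding code are assumptions.

```php
// EPubChapterSplitter.php fragment (not standalone): make sure the
// variable reported in the notice exists before it is read. The default
// of 250000 characters per chapter is an assumption, not the library's
// documented value; adjust to taste.
if ( !isset( $splitDefaultSize ) ) {
	$splitDefaultSize = 250000;
}
```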

Headlines
Could the sections of a Wikipedia article be embedded as subchapters in the TOC of the ePub file? That would make navigation easier.

Export by Category
It would be pretty useful (to me at least) to be able to specify a Category to be aggregated into an exported ePub. --JosefAssad (talk) 11:00, 1 September 2015 (UTC)