Extension talk:DumpHTML/LQT Archive 1

Current issues:
--92.195.50.177 14:17, 20 April 2008 (UTC)
 * 1) Notice: Use of OutputPage::setParserOptions is deprecated in ...\GlobalFunctions.php on line 2480
 * 2) Pagenames with non-standard characters (äöüß etc.) crash the script with a "can't open file" error

Installation instructions
Some installation instructions would be helpful. Here's what I did. Hopefully someone more knowledgeable than me can edit this and move it to the article page.

If you have web access from your MediaWiki server, this should suffice:

cd /whatever/mediawiki/extensions
svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/DumpHTML

I don't, so I had to do this on a separate machine:

cd /tmp
svn export http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/DumpHTML

Subversion retrieves the files and reports their names and the revision number.

tar cjvf ~/DumpHTML-version.tar.bz2 DumpHTML
rm -rf DumpHTML

Then on my MediaWiki machine:

cd /whatever/mediawiki/extensions
tar xjvf ~/DumpHTML-version.tar.bz2

Invocation:

php /whatever/mediawiki/extensions/DumpHTML/dumpHTML.php options
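For example, a concrete invocation using the same options as the cron example further down (the output directory and skin are placeholders, adjust to your setup):

```shell
# Hypothetical paths; -d is the output directory, -k the skin to render with
php /whatever/mediawiki/extensions/DumpHTML/dumpHTML.php -d /var/www/static-wiki -k monobook
```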

LocalSettings.php
Is it possible to call this extension with a line in LocalSettings.php? --Rovo 01:43, 13 June 2008 (UTC)


 * Rovo, sorry no. It's built to be run from a shell. --Gadlen 09:12, 18 August 2008 (UTC)

Usage Instructions
dumpHTML.php expects to be run from the maintenance directory. The "skin" directory won't get included in the HTML package if you run it from another directory. So if you are running it on a cron job and putting together a .tar.gz of your wiki for downloading, your shell script might look something like this:


#!/bin/sh

cd /YourWikiDirectory/extensions/DumpHTML

/bin/php dumpHTML.php -d /YourTargetDirectoryForHTML -k monobook --image-snapshot --force-copy

/bin/tar -czf /TemporaryDirectoryForTarball/YourWikiAllWrappedUp.tar.gz /YourTargetDirectoryForHTML

mv /TemporaryDirectoryForTarball/YourWikiAllWrappedUp.tar.gz /YourWebAccessibleDirectory/YourWikiAllWrappedUp.tar.gz

--Gadlen 09:12, 18 August 2008 (UTC)

Downloading external links
I'm developing a resource that will most likely include links to other sites or at least the reports hosted on those other sites. Is it possible for this extension to perhaps download the first level/depth of external links too? --Charlener 02:53, 15 September 2008 (UTC)


 * You could perhaps modify my script (see Bugzilla 8147) accordingly. --Wikinaut 06:43, 15 September 2008 (UTC)
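Not Wikinaut's script, but a generic sketch of the idea with wget: feed it an already-dumped page and let it fetch one level of the links that page contains. All paths here are placeholders, and you would still need to filter for external links specifically:

```shell
# Hedged sketch (not the Bugzilla 8147 script): fetch one level of the
# links found in a dumped HTML page. Paths are placeholders.
wget --recursive --level=1 --span-hosts --force-html \
     --directory-prefix=/YourTargetDirectoryForHTML/external \
     --input-file=/YourTargetDirectoryForHTML/index.html
```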

Dumping pages for commons images
I'm able to generate image pages for local images. What data dumps must be loaded to generate the shared (commons) image pages? Best regards. Naudefj 15:06, 7 October 2008 (UTC)


 * I don't know, I haven't used that feature yet. --Wikinaut 16:25, 7 October 2008 (UTC)

I've noticed that the provided HTML dumps include commons image pages. Are they generated with this script? If not, what dumper program is used to generate them? Best regards. Naudefj 21:05, 8 October 2008 (UTC)


 * Have a look at the source. As far as I understand, they _are_ indeed copied. --Wikinaut 06:08, 9 October 2008 (UTC)

Usage with symlinked MW core?
I inherited a setup that has a single MW install and a number of wikis. Each wiki is set up as symlinks to all of MW except ./images and LocalSettings.php. My problem is that this script refers to the installation directory of MW instead of the "home" of each wiki. I tried export MW_INSTALL_PATH=".../home_of_wiki", which almost works, except the image code still goes to the actual install path. I've noticed this behavior in other MW scripts. The core maintenance scripts are supposed to have an option to solve this problem. --Cymen 18:13, 21 October 2008 (UTC)
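A sketch of the MW_INSTALL_PATH workaround described above, assuming a hypothetical per-wiki home /srv/wiki1 that holds the symlinked core plus that wiki's own LocalSettings.php and images/:

```shell
# /srv/wiki1 is hypothetical; point MW_INSTALL_PATH at the wiki's home
# so maintenance scripts pick up that wiki's LocalSettings.php.
export MW_INSTALL_PATH=/srv/wiki1
echo "$MW_INSTALL_PATH"
# then run the dump from the extension directory:
#   cd "$MW_INSTALL_PATH/extensions/DumpHTML"
#   php dumpHTML.php -d /tmp/wiki1-html
```

As noted above, the image code may still resolve the real install path, so this is only a partial fix.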

getFriendlyName
What's the delay in moving to getFriendlyName? Why is getHashedFilename used?