Extension:Offline/Install

You can either install MediaWiki and the Offline extension yourself, or download a binary package preconfigured for standalone use.

See the Extension:Offline overview for what to add to LocalSettings.php. Once your system is running, visit the Special:Offline page for status and diagnostic hints.
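As a rough sketch, extensions of this era are usually loaded with a require_once line; the file name and path below are assumptions, so check the Extension:Offline overview for the exact lines:

require_once( "$IP/extensions/Offline/Offline.php" );  // entry-point file name assumed; adjust to your install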

Dependencies
a) Linux
 * yum install php xapian-core xapian-bindings php-pecl-apc
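To confirm that the Xapian binding and APC were actually picked up by PHP, a quick sanity check (module names can vary by distribution) is:

php -m | grep -i xapian
php -m | grep -i apc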

b) Windows
 * The Windows package comes with all dependencies installed in subdirectories, but you might still have to take these steps:
 * Copy zlib1.dll to C:\windows\system32\.
 * Install the MSVC 2008 Runtime compatibility library.
 * If you are not using the bundled mongoose web server and PHP, you will need to include windows/php.ini during PHP startup, as in the example below.
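A hedged example of doing that when starting php-cgi by hand; the port is arbitrary and only has to match whatever your web server's FastCGI configuration expects:

php-cgi.exe -c windows\php.ini -b 2005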

c) MacOS
 * If you are running MacOS 10.4, you will need to upgrade to PHP 5; installing it into `/usr/local` is recommended. Install the "apc" accelerator and the Xapian PHP binding, and enable both in `php.ini` (see the example after this list).
 * Somehow install the Xapian library.
 * Build the Xapian PHP binding against your PHP install, for example:
   PHP_CONFIG=/usr/local/php5/bin/php-config XAPIAN_CONFIG=/usr/local/bin/xapian-config ./configure --with-php
 * You may need to edit "-Wstrict-null-sentinel" out of Xapian's configure script and recompile the indexer.
 * You might also need to deal with `ld.config` and `ldd`.
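Enabling the two extensions in php.ini usually comes down to a pair of lines like the following; the exact shared-object names are assumptions and depend on how the binding was built:

extension=apc.so
extension=xapian.so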

Your Encyclopedia and its Index
Download an XML dump of your Wikipedia, usually distributed in a file named `pages-articles.xml.bz2`, from a disc or over the net; BitTorrent networks often carry them. If you can, also download the corresponding index files in any of the supported formats (well, only Xapian for the moment). Otherwise, you will have to generate the index yourself; read on, brave soul. An index is necessary in order to match a given article with its location in your dump files. Fortunately, index creation is performed by a script in the `extensions/Offline` distribution, `indexer.py`.
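Dumps are published at https://dumps.wikimedia.org/; a small wiki makes a quick first test. The exact URL below is only illustrative:

wget https://dumps.wikimedia.org/simplewiki/latest/simplewiki-latest-pages-articles.xml.bz2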

python indexer.py PAGES-ARTICLES.xml.bz2

This could take some time, on the order of days for a large encyclopedia. It will create a directory `splits` which contains the index, and a copy of your bz2 dump split into smaller chunks. This convention will probably all change in the future.

Now, edit `LocalSettings.php` and set the `$wgOfflineWikiPath` variable to point to the full path to this "splits" directory.
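For example, if indexer.py left its output in /home/user/wikipedia/splits (an illustrative path), the line would read:

$wgOfflineWikiPath = "/home/user/wikipedia/splits";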

Serve the Pages
You should have received a bundle that includes a portable web server for your platform, or you can just as well run your own. Follow the instructions below and point a browser at `http://127.0.0.1:8000/`.

You are now running the same software as da Big One, not an imitation!

If you need to install a web server, pick one that supports Last-Modified caching and running PHP as a CGI. FastCGI support is easy to configure and will help performance.

To install "hiawatha":

wget http://www.hiawatha-webserver.org/files/hiawatha-7.4.tar.gz
tar xzf hiawatha-7.4.tar.gz
cd hiawatha-7.4
./configure --prefix=`pwd`/../hiawatha/ --disable-ipv6 --disable-ssl --disable-xslt && make && make install

cd wikipedia-offline-patch/
vi config/hiawatha.conf   # ...then do :%s/WebsiteRoot//g
cp config/hiawatha.conf hiawatha/etc/hiawatha/

 * 1) Run a FastCGI server (otherwise, edit the conf file to disable it; a sample stanza is shown below):

php-cgi -b 2005 < /dev/null &

 * 2) Start hiawatha (it listens on port 8000 by default):

./hiawatha/sbin/hiawatha
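If you keep FastCGI enabled, the relevant stanza in config/hiawatha.conf looks roughly like the following; directive names may differ between Hiawatha versions, and the port must match the php-cgi command above:

FastCGIserver {
    FastCGIid = PHP
    ConnectTo = 127.0.0.1:2005
    Extension = php
}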

Or, to use another web server such as Apache, edit your config and add:

DocumentRoot mediawiki-1.16.1-sa/
<Directory /wikipedia-offline-patch/mediawiki-1.16.1-sa>
    # enable the rewriter
    RewriteEngine On
    RewriteBase /wiki
    # anything under /wiki is treated as an article title, unless it is a file
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^/(.+)$ index.php?title=$1 [PT,L,QSA]
</Directory>

= Appendix: Plans for an Automated Installer =
 * Select content manager (similar to the books project?) to modify; hook appropriate dump import code
 * Configure languages
 * Download from dump directories
 * Generate LocalSettings.php lines to read from configured sources, or a web-writable configuration?