Extension:LocalS3Repo

LocalS3Repo2 (aka LocalS3Repo)

  • Release status: stable
  • Implementation: File repository
  • Description: File repo for the Amazon S3 system. Simulates a local filesystem where the data is stored on the S3 system.
  • Author(s): antonydandrea
  • Latest version: 2.0 (2017-07-20)
  • MediaWiki: 1.21 – 1.28
  • PHP: 5.2+
  • License: GNU General Public License 2.0 or later


The LocalS3Repo extension implements Amazon S3 as a file repository. The local repository model was used as a base; the wiki uses S3 instead of the local disk drive for the storage of all uploaded files.

Usage

Once installed, the extension operates automatically; there is no user interface to configure. You must have an account with Amazon S3.

Download instructions

  • Version by antonydandrea, stable for 1.28 and may work for 1.21+: repository.
  • Version by Cariaso for 1.23+ (note from the author: "I'm the author of that, and can warn you that while it mostly works, it's a bit crashy"): repository.
  • Very old version: LocalS3Repo.zip.

Choose a version, download it, and unzip it to $IP/extensions/, preserving the folder structure. The files should end up in the $IP/extensions/LocalS3Repo/ directory. Note: $IP stands for the root directory of your MediaWiki installation, the same directory that holds LocalSettings.php.
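The download step above can be sketched on the command line. This is only a sketch: `<repository-url>` is a placeholder for whichever repository you chose above (the actual URLs are linked from the list), and `$IP` must already point at your MediaWiki root.

```shell
# Sketch only: <repository-url> is a placeholder for the repository chosen above.
cd "$IP/extensions"
git clone <repository-url> LocalS3Repo

# Or, if you downloaded a zip archive instead:
# unzip LocalS3Repo.zip -d "$IP/extensions/"

# The extension files should now be under:
ls "$IP/extensions/LocalS3Repo"
```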

Installation

Below are two versions of what should be added to the bottom of your LocalSettings.php file. The first is believed to work from v1.21 onwards just as well as the second, along with the code it is attached to; if it does not, the repository should be tagged accordingly.

/*
	Modified to work with 1.25.
	Antony D'Andrea - contactme at antonydandrea dot com

        This also has a couple of bug fixes on the code, but the config below
        along with the bug fixes definitely works in MediaWiki 1.25.
*/

// s3 filesystem repo
$wgUploadDirectory = 'wiki';
$wgUploadS3Bucket = 'YOUR S3 BUCKET';
$wgUploadS3SSL = false; // true if SSL should be used
$wgPublicS3 = true; // true if public, false if authentication should be used

$wgS3BaseUrl = "http".($wgUploadS3SSL?"s":"")."://s3.amazonaws.com/$wgUploadS3Bucket";

// Viewing needs a different URL from uploading: uploading doesn't work with the URL below, and viewing doesn't work with the one above.
$wgS3BaseUrlView = "http".($wgUploadS3SSL?"s":"")."://".$wgUploadS3Bucket.".s3.amazonaws.com";
$wgUploadBaseUrl = "$wgS3BaseUrlView/$wgUploadDirectory";

// leave $wgCloudFrontUrl blank to not render images from CloudFront
$wgCloudFrontUrl = '';//"http".($wgUploadS3SSL?"s":"").'://YOUR_CLOUDFRONT_SUBDOMAIN.cloudfront.net/';
$wgLocalFileRepo = array(
        'class' => 'LocalS3Repo',
        'name' => 's3',
        'directory' => $wgUploadDirectory,
        'url' => $wgUploadBaseUrl ? $wgUploadBaseUrl . $wgUploadPath : $wgUploadPath,
        'urlbase' => $wgS3BaseUrl ? $wgS3BaseUrl : "",
        'hashLevels' => $wgHashedUploadDirectory ? 2 : 0,
        'thumbScriptUrl' => $wgThumbnailScriptPath,
        'transformVia404' => !$wgGenerateThumbnailOnParse,
        'initialCapital' => $wgCapitalLinks,
        'deletedDir' => $wgUploadDirectory.'/deleted',
        'deletedHashLevels' => $wgFileStore['deleted']['hash'],
        'AWS_ACCESS_KEY' => 'YOUR_AWS_ACCESS_KEY',
        'AWS_SECRET_KEY' => 'YOUR_AWS_SECRET_KEY',
        'AWS_S3_BUCKET' => $wgUploadS3Bucket,
        'AWS_S3_PUBLIC' => $wgPublicS3,
        'AWS_S3_SSL' => $wgUploadS3SSL,
        'cloudFrontUrl' => $wgCloudFrontUrl,
);
require_once("$IP/extensions/LocalS3Repo/LocalS3Repo.php");
/*
	Modified to work with 1.21 and CloudFront.
	Owen Borseth - owen at borseth dot us

	LocalS3Repo modified to work with MediaWiki 1.21, maybe others, and CloudFront CDN. A maintenance script that I used to move my current
	files over to S3 has been included; it will probably need to be slightly modified to work for you.
*/

// s3 filesystem repo settings - start
// Modify below with your settings and paste it all into your LocalSettings.php file.
// Basically, just modify the values that are in all uppercase and all should be fine.

// $wgUploadDirectory is the directory in your bucket where the image directories and images will be stored.
// If "images" doesn't work for you, change it.
$wgUploadDirectory = 'images';
$wgUploadS3Bucket = 'YOUR S3 BUCKET';
$wgUploadS3SSL = false; // true if SSL should be used
$wgPublicS3 = true; // true if public, false if authentication should be used
$wgS3BaseUrl = "http".($wgUploadS3SSL?"s":"")."://s3.amazonaws.com/$wgUploadS3Bucket";
$wgUploadBaseUrl = "$wgS3BaseUrl/$wgUploadDirectory";
// leave $wgCloudFrontUrl blank to not render images from CloudFront
$wgCloudFrontUrl = "http".($wgUploadS3SSL?"s":"").'://YOUR_CLOUDFRONT_SUBDOMAIN.cloudfront.net/';
$wgLocalFileRepo = array(
        'class' => 'LocalS3Repo',
        'name' => 's3',
        'directory' => $wgUploadDirectory,
        'url' => $wgUploadBaseUrl ? $wgUploadBaseUrl . $wgUploadPath : $wgUploadPath,
        'urlbase' => $wgS3BaseUrl ? $wgS3BaseUrl : "",
        'hashLevels' => $wgHashedUploadDirectory ? 2 : 0,
        'thumbScriptUrl' => $wgThumbnailScriptPath,
        'transformVia404' => !$wgGenerateThumbnailOnParse,
        'initialCapital' => $wgCapitalLinks,
        'deletedDir' => $wgUploadDirectory.'/deleted',
        'deletedHashLevels' => $wgFileStore['deleted']['hash'],
        'AWS_ACCESS_KEY' => 'YOUR_AWS_ACCESS_KEY',
        'AWS_SECRET_KEY' => 'YOUR_AWS_SECRET_KEY',
        'AWS_S3_BUCKET' => $wgUploadS3Bucket,
        'AWS_S3_PUBLIC' => $wgPublicS3,
        'AWS_S3_SSL' => $wgUploadS3SSL,
        'cloudFrontUrl' => $wgCloudFrontUrl,
);
require_once("$IP/extensions/LocalS3Repo/LocalS3Repo.php");
// s3 filesystem repo settings - end

To transfer your files from an existing wiki:

  • Create the bucket and the "folder" (the S3Fox add-on for Firefox works well for this)
  • Transfer all the files/folders in your existing wiki's image directory to your "folder".
  • Set the ACL for the folder (and subfolders, etc) to be readable by everyone, if you do not use authentication
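The three steps above can also be scripted. The original page suggests the S3Fox add-on; the AWS CLI shown here is an alternative the page does not mention, and the bucket name and local path are placeholders for your own values.

```shell
# Hedged sketch using the AWS CLI (an alternative to the S3Fox add-on);
# YOUR_S3_BUCKET and the local path are placeholders.
aws s3 mb s3://YOUR_S3_BUCKET                 # create the bucket

# Copy your existing wiki's image directory into the "folder",
# making each object readable by everyone (skip --acl if you use authentication):
aws s3 sync /path/to/wiki/images \
    s3://YOUR_S3_BUCKET/images \
    --acl public-read
```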

I have tested this on Windows 7 and on Linux, with MediaWiki versions 1.15.4 and 1.16.0beta3. I currently have a wiki running on Linux with this extension. I have tested the functionality of upload, thumbnails, delete, archive, and rollback. There could be other functions, used by other extensions, which I did not implement or test, so watch out!

This extension isn't by any means perfect: on upload, files are sent first to the wiki server and from there to the S3 system. On file reading, they come directly from the S3 system, which is correct. I am waiting for a new upload screen with a progress bar; then I will implement sending files directly to S3 on upload. So for now, the file size limit of your PHP server is the effective file size limit. I deal with that on my system by doing it manually, because I have few files larger than the limit.
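If the PHP-side limit mentioned above is too small, it can be raised. The values below are examples, not recommendations from the original page; note that the effective limit is the smallest of the MediaWiki setting and PHP's own ini limits.

```php
// Illustrative fragment for LocalSettings.php (values are examples only):
// MediaWiki's own cap on upload size, in bytes.
$wgMaxUploadSize = 100 * 1024 * 1024; // 100 MB

// The matching PHP limits must also be raised in php.ini (or .htaccess),
// otherwise they remain the bottleneck:
//   upload_max_filesize = 100M
//   post_max_size       = 100M
```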

Note that the S3.php file, from Donovan Schönknecht, is included in the zip file and requires that curl be installed on your PHP server (check your phpinfo).
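Since the bundled S3.php depends on curl, a guard near the top of LocalSettings.php can fail early with a clear message instead of a confusing error later. This snippet is an illustrative addition, not part of the extension.

```php
// Illustrative guard for LocalSettings.php (not part of the extension):
// abort with a clear message if the curl extension is missing,
// since the bundled S3.php depends on it.
if ( !extension_loaded( 'curl' ) ) {
    die( 'LocalS3Repo requires the PHP curl extension; please install or enable it.' );
}
```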

Configuration

  • $wgUploadS3Bucket --> The name of the bucket you created for the wiki
  • $wgUploadDirectory --> The "folder" under the bucket where you wish the files to be stored
  • $wgUploadS3SSL --> true if SSL should be used
  • $wgPublicS3 --> true if public S3 file access, false if signature authentication should be used
  • 'AWS_ACCESS_KEY' --> Your S3 access key, a long string of characters assigned by Amazon
  • 'AWS_SECRET_KEY' --> Your S3 secret key, a long string of characters assigned by Amazon
