Manual:Parameters to Special:Export

See also: Manual:DumpBackup.php

Wiki pages can be exported in a special XML format to upload into another MediaWiki.[1] See Help:Export for more details.

Available parameters

Below is the list of available parameters for Special:Export as of version 1.16, with the variable type, where known, in parentheses. Not all parameters are accessible from the user interface.

action (string)
    Unused; set to "submit" in the export form.

Page selection

/ (no parameter)
    Selects up to one page, e.g. Special:Export/Sandbox.

pages
    A list of page titles, separated by linefeed (%0A) characters. Maximum of 35 pages.

addcat/catname (string), addns/nsindex
    These were added later. addcat returns all members of the category catname added to it. If $wgExportFromNamespaces is enabled, addns and nsindex do the same, but with namespaces and their numerical indexes. A maximum of 5000 page titles will be returned.

    For example, the following returns all pages in en:Category:Books:

    https://en.wikipedia.org/w/index.php?title=Special:Export&addcat&catname=Books&pages=XXXX

Sorting

dir[2] (string)
    Should be set to "desc" to retrieve revisions in reverse chronological order. The default, with this parameter omitted, is to retrieve revisions in ascending order of timestamp (oldest to newest).

Limiting results

offset[2]
    The timestamp at which to start, which is non-inclusive. The timestamp may be in several formats, including the 14-character format usually used by MediaWiki, and an ISO 8601 format like the one that is output by the XML dumps.

limit[2] (integer)
    The maximum number of revisions to return. If you request more than a site-specific maximum (defined in $wgExportMaxHistory: 1000 on Wikimedia projects at present), it will be reduced to this number. This limit is cumulative across all the pages specified in the pages parameter. For example, if you request a limit of 100 for two pages with 70 revisions each, you will get 70 from one and 30 from the other.[3]

curonly (boolean)
    Include only the current revision (the default for GET requests).

history
    Include the full history, overriding dir, limit, and offset. This does not work in all cases: for example, https://en.wikipedia.org/w/index.php?title=Special:Export&pages=US_Open_(tennis)&history=1&action=submit works fine and returns all revisions, but https://en.wikipedia.org/w/index.php?title=Special:Export&pages=India&history=1&action=submit does not.

Other

templates
    Includes any templates transcluded in the pages listed for export.

listauthors (boolean)
    Includes, for each page, the list of all contributor names and their user IDs. This functionality is disabled by default; it can be enabled by changing $wgExportAllowListContributors.

pagelink-depth (integer)
    Includes all linked pages down to the given depth. Limited to $wgExportMaxLinkDepth (defaults to 0, disabling the feature), or 5 if the user does not have permission to change limits.

wpDownload
    Saves the export to a downloaded file whose name includes the current timestamp. Implemented through the content-disposition: attachment HTTP header.
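
Several of these parameters can be combined in a single request. The sketch below (the category name and output file are placeholders) would export the current revisions of all pages in a category together with their transcluded templates, using cURL's -d "" to force a POST (see the next section for why that matters):

curl -d "" 'https://en.wikipedia.org/w/index.php?title=Special:Export&addcat&catname=Books&pages=&templates=1&curonly=1&action=submit' -o books.xml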

URL parameter requests do not work

The dir, offset and limit parameters only work for POST requests. GET requests sent via a URL are ignored.

When you open the URL in a browser, the request is submitted with GET; a script must use POST instead, as the cURL examples below do.

As an example, the following request does not work; it returns all revisions of the page despite the parameter limit=5:

https://en.wikipedia.org/w/index.php?title=Special:Export&pages=XXXX&offset=1&limit=5&action=submit

Retrieving the first 5 revisions

A POST request is generated by cURL when passing -d "". Below we retrieve the first 5 revisions of the English Wikipedia Main Page and its talk page:

curl -d "" 'https://en.wikipedia.org/w/index.php?title=Special:Export&pages=Main_Page%0ATalk:Main_Page&offset=1&limit=5&action=submit'

And here are the next 5 revisions of the Main Page only:

curl -d "" 'https://en.wikipedia.org/w/index.php?title=Special:Export&pages=Main_Page&offset=2002-01-27T20:25:56Z&limit=5&action=submit'

Here the timestamp of the last revision returned by the previous query is copied into the offset field of the URL. Because the offset field is non-inclusive, that 5th revision is not displayed again, and instead we get revisions 6-10.[4]
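
Following that logic, an entire history can be fetched chunk by chunk by feeding the last timestamp of each response back in as the next offset. A rough sketch (the grep/sed scrape of the final <timestamp> element is a convenience assumption about the dump layout, not an official interface):

page='Main_Page'
offset='1'
while : ; do
  # POST (via -d "") so that offset and limit are honoured
  curl -s -d "" "https://en.wikipedia.org/w/index.php?title=Special:Export&pages=${page}&offset=${offset}&limit=100&action=submit" -o "chunk-${offset}.xml"
  # pull the timestamp of the last revision in this chunk
  last=$(grep -o '<timestamp>[^<]*</timestamp>' "chunk-${offset}.xml" | tail -1 | sed 's/<[^>]*>//g')
  # stop once a request returns no new revisions
  if [ -z "$last" ] || [ "$last" = "$offset" ]; then break; fi
  offset=$last
done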

POST requests to download

A more explicit example, especially if you also want to save the output to a file, would be:

curl -d "&pages=Main_Page&offset=1&action=submit" https://en.wikipedia.org/w/index.php?title=Special:Export -o "somefilename.xml"

The URL root needs to follow the -d parameters. Also note that you must add the -o option at the end to save the output under a file name; otherwise the results will scroll across your screen and nothing will be saved. (If the Wikipedia servers happen to be under maintenance, the request above returns an error page instead of the XML.)

If you instead have the list of titles in a file, say title-list, you must pass the list as a parameter to curl and encode the linefeeds correctly (for some reason, --data-urlencode and @ do not work):

curl -d "&action=submit&pages=$(cat title-list | hexdump -v -e '/1 "%02x"' | sed 's/\(..\)/%\1/g' )" https://en.wikipedia.org/w/index.php?title=Special:Export -o "somefilename.xml"

If you want to save bandwidth, you can also append the following arguments:

--compressed -H 'Accept-Encoding: gzip,deflate'
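
Put together with the title-list command above, the whole request becomes something like (the output file name is a placeholder):

curl --compressed -H 'Accept-Encoding: gzip,deflate' -d "&action=submit&pages=$(hexdump -v -e '/1 "%02x"' title-list | sed 's/\(..\)/%\1/g')" 'https://en.wikipedia.org/w/index.php?title=Special:Export' -o somefilename.xml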

Disabling export on your MediaWiki

Keep in mind that if your users have trouble backing up their work, it will discourage them from contributing to your wiki.

If $wgExportAllowHistory is set to false in LocalSettings.php, only the current version can be exported, not the full history.
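
That is a single line in LocalSettings.php:

$wgExportAllowHistory = false; // Special:Export will only emit current revisions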

By default, GET requests return only the latest (current) version of each page.

If the $wgExportAllowHistory parameter is true in LocalSettings.php, and the "Include only the current revision, not the full history" box is unchecked, then all versions of each page are returned.
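
For example, a plain GET on the one-page form from the table above returns only the current revision of the page (the page name is a placeholder):

curl 'https://en.wikipedia.org/wiki/Special:Export/Sandbox' -o sandbox.xml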

To disable export completely, you need to register a callback function in your LocalSettings.php:

function removeExportSpecial(&$aSpecialPages)
{
	// Unregister Special:Export so the page is no longer available
	unset($aSpecialPages['Export']);
	return true;
}
$wgHooks['SpecialPage_initList'][] = 'removeExportSpecial';

If you want to define a permission for export, put the following in your LocalSettings.php:

// Override SpecialExport; this works for MW 1.35
// (the parameters of __construct() changed in later versions)
class SpecialExport2 extends SpecialExport {
    public function __construct() {
        parent::__construct();
        $this->mRestriction = 'export'; // require the 'export' user right
    }
    public function execute( $par ) {
        $this->checkPermissions();
        parent::execute( $par );
    }
}
function adjustExportSpecial(&$aSpecialPages)
{
	$aSpecialPages['Export'] = SpecialExport2::class;
	return true;
}
$wgHooks['SpecialPage_initList'][] = 'adjustExportSpecial';
$wgGroupPermissions['sysop']['export'] = true; // Add export permission to sysop only

Note that export is still possible through the API if it is enabled.
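
For reference, the API equivalent looks like this (action=query with the export flag; exportnowrap returns the bare XML), so the API's export route has to be restricted as well:

curl 'https://en.wikipedia.org/w/api.php?action=query&titles=Main_Page&export&exportnowrap'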

Notes

  1. Provided the import function is enabled on the destination wiki and the user is a sysop there. The export can also be used for analyzing the content. See also Syndication feeds for exporting information other than pages, and Help:Import on importing pages.
  2. These parameters are ignored if either curonly or history is supplied, or if they are passed via a GET request (e.g., from a browser address bar). See URL parameter requests do not work for more information.
  3. The order is by page_id, pages with lower page_id get more revisions. The reason for this is that Special:Export only ever does one database query per HTTP request. If you want to request all the history of several pages with many revisions each, you have to do it one page at a time.
  4. This parameter convention is very similar to the one for UI history pages.

See also

External links