Manual:Sitemap

From mediawiki.org

Sitemaps are files that help search engine crawlers (such as Googlebot) index a website more efficiently, provided the crawler supports the Sitemaps protocol.

Starting with MediaWiki 1.45, MediaWiki provides a REST API that generates a sitemap on demand. You can add a Sitemap directive to your website's robots.txt that points at this API:

Sitemap: https://example.com/w/rest.php/site/v1/sitemap/0

If you cannot use the REST API, the generateSitemap.php maintenance script can be used to generate static sitemap files instead.
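A minimal sketch of running the maintenance script from the wiki's installation directory. The paths and server name here are placeholders for your own setup, and the exact options available may vary by MediaWiki version, so check the script's --help output first:

```shell
# Write sitemap files into a web-accessible directory
# (the directory must exist and be writable).
# --fspath:  filesystem directory where the sitemap files are written
# --urlpath: URL path corresponding to that directory, used inside the index
# --server:  the wiki's server name, if not already set in LocalSettings.php
php maintenance/generateSitemap.php \
    --fspath /var/www/html/sitemap \
    --urlpath /sitemap \
    --server https://example.com
```

The script produces a sitemap index plus one or more sitemap files; the index is what you would reference from robots.txt. Running it from cron keeps the files current as pages change.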

In the past, Sitemap extensions were used to generate these files.