User:NeilK/Sitemaps

Google Image Search asks about Sitemaps. Meeting on February 28, 2011.

pre-meeting TODO

 * Review sitemaps protocol to understand what Jens is talking about re: permission...
 * Probably the problem is that we would use a single domain (e.g. downloads, files, secure?) to publish sitemaps, while the protocol expects them to be published per site. Still, that's got to be easy to fix, maybe with RewriteRules
 * Question: does Google (or any other consumer) ever re-read the change frequency of a URL, or do we just get one shot at setting a value? (See docs at Sitemaps.org.) What is the best value for URLs which we guess might change frequently, or might be speedily deleted?
 * Question: Google may have had special arrangements to crawl Wikipedia, perhaps via Special:AllPages or Special:RecentChanges. Jens asserts that our sitemaps weren't being used -- even when we stopped publishing them, new pages still appeared quickly.
 * For Commons, Special:AllPages can be used to browse the File: namespace, so this is doable. Special:RecentChanges works okay for similar purposes; it doesn't seem to track uploads per se, but since File: page changes and uploads have a nearly 1:1 relationship, that's okay. (A sketch of enumerating File: pages via the web API follows this list.)
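
A minimal sketch of that enumeration (my illustration, not part of the TODO): the web API's list=allpages exposes the same listing as Special:AllPages, so File: pages (namespace 6) can be walked programmatically. The endpoint, batch size, User-Agent, and continuation handling below reflect the current API and are assumptions, not a tested crawler.

    # Sketch: walk the File: namespace (ns 6) on Commons via the web API,
    # the same listing Special:AllPages exposes. URL and limits are illustrative.
    import json
    import urllib.parse
    import urllib.request

    API = "https://commons.wikimedia.org/w/api.php"

    def all_file_pages():
        params = {
            "action": "query",
            "list": "allpages",
            "apnamespace": "6",   # File: namespace
            "aplimit": "500",
            "format": "json",
        }
        while True:
            url = API + "?" + urllib.parse.urlencode(params)
            # A descriptive User-Agent is expected when hitting Wikimedia servers.
            req = urllib.request.Request(url, headers={"User-Agent": "sitemap-notes-sketch/0.1"})
            with urllib.request.urlopen(req) as resp:
                data = json.load(resp)
            for page in data["query"]["allpages"]:
                yield page["title"]
            cont = data.get("continue")
            if not cont:
                break
            params.update(cont)   # carry the continuation token forward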

Resources
Manual:GenerateSitemap.php is the standard way to do this.

The manual page indicates the generated sitemaps are not compatible with Google as of 1.16, but a patch can fix that. Did this get fixed in 1.17? See Manual talk:GenerateSitemap.php
 * Yes, it was fixed in 75650. Max Semenik 21:48, 25 February 2011 (UTC)

There are a number of other tools: Extension:Google Sitemap (may be obsolete)

User:DaSch/generateSitemap.php is another script; it is unclear why the author thought it was necessary to write his own -- perhaps it works better with multiple sites.

Questions
How well do these tools scale? Is it feasible to redump all the titles on a frequent basis, or should we go to a more incremental strategy?

The standard sitemaps script is well written, but it selects all pages in alphabetical order and then iterates, writing entries as it goes. For enwiki this is obviously going to take a long time. (Jens says there was no issue: it was run from a cron job and completed just fine, at least in late 2007, when enwiki was about half its current size; there are also more wikis overall today.)

Still, this is very inefficient, since almost all the results are the same from run to run, and search engines will catch a 404 on the rare occasion they look up a page that has since been deleted. So mostly it's pointless to iterate through them all.

Interestingly, the sitemaps script sets a different priority for every namespace. It creates a new sitemap for every namespace (identified by number, which is the only sane way given that namespace names vary across languages and wikis) and then creates one index file for them all.
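
For reference, here is a rough sketch of that output shape -- one gzipped sitemap per namespace plus a single index pointing at them all. This is illustrative Python, not the actual PHP script; the file names, priority values, base URL, and URL layout are assumptions.

    # Sketch of the output layout described above: one sitemap per namespace
    # plus one index file. Names, priorities and URLs are illustrative only.
    import gzip
    from datetime import datetime, timezone

    BASE = "https://en.wikipedia.org"           # assumed site
    PRIORITY = {0: "1.0", 6: "0.5", 14: "0.5"}  # per-namespace priority (assumed values)

    def write_namespace_sitemap(ns, pages):
        """pages: iterable of (title_path, lastmod) tuples for one namespace."""
        name = "sitemap-ns%d.xml.gz" % ns       # hypothetical naming scheme
        with gzip.open(name, "wt", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for path, lastmod in pages:
                f.write("  <url><loc>%s/wiki/%s</loc>"
                        "<lastmod>%s</lastmod>"
                        "<priority>%s</priority></url>\n"
                        % (BASE, path, lastmod, PRIORITY.get(ns, "0.5")))
            f.write("</urlset>\n")
        return name

    def write_index(sitemap_names):
        now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
        with open("sitemap-index.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for name in sitemap_names:
                f.write("  <sitemap><loc>%s/%s</loc><lastmod>%s</lastmod></sitemap>\n"
                        % (BASE, name, now))
            f.write("</sitemapindex>\n")

(Titles would of course need URL- and XML-escaping; the sketch omits that.)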

See Ideas below for a more incremental approach.

History
Consensus from Brion Vibber, Ariel T. Glenn, et al. is that we used to run sitemaps but haven't since 2008; the exact date was apparently 2007-12-27.

Brion believes the standard generateSitemap.php script was the one being used.

It is unclear why we stopped. Brion believes that Jens Frank (JeLuF on IRC) was the one in charge of this. E-mailing him to find out.

Jens replies:
 * Google was not really using it. They apparently also had some special engine to crawl Wikipedia, so there wasn't a real need for it.
 * My note: However, this says nothing about other search engines.
 * Additionally, there were some problems with the configuration. If I remember correctly, there were some issues with the location of the sitemaps and our disk layout. We have only one DocumentRoot per project, so e.g. de.wikipedia.org and en.wikipedia.org share the same /sitemap/ directory and - even worse - they share the file that Google needs to verify that I'm allowed to generate sitemaps for *.wikipedia.org. This was reported to Google, but they never came back to us - probably because they have their special crawler already.
 * My note: But this could be fixed with a simple change to the script? (See the sketch after this list for one possible shape of that change.)
 * Not that easy. Lots of scripts assume that there's one directory per project. And it would increase the size of our installation dramatically. We already have problems with the rollout of changes (aka "scap") since it takes too long, and the Apache farm is having different software releases during the scap.
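
My note, continued: one possible shape for such a change (a hedged sketch under assumptions, not a worked-out proposal) is to give each wiki its own subdirectory under the shared /sitemap/ directory and announce each index with a Sitemap: line in that wiki's robots.txt, which the sitemaps protocol allows; whether that would satisfy Google's ownership verification would still need to be confirmed with them. The DocumentRoot path, database names, and URL layout below are all illustrative.

    # Sketch: per-wiki sitemap paths inside the one shared DocumentRoot, plus the
    # robots.txt "Sitemap:" lines that would announce them. Paths are illustrative.
    WIKIS = {
        "enwiki": "en.wikipedia.org",
        "dewiki": "de.wikipedia.org",
    }

    DOCROOT = "/srv/docroot/wikipedia"   # assumed shared DocumentRoot

    def sitemap_paths(dbname, host):
        directory = "%s/sitemap/%s" % (DOCROOT, dbname)   # where the script would write
        index_url = "https://%s/sitemap/%s/sitemap-index.xml" % (host, dbname)
        robots_line = "Sitemap: %s" % index_url           # goes in that host's robots.txt
        return directory, index_url, robots_line

    for dbname, host in sorted(WIKIS.items()):
        directory, index_url, robots_line = sitemap_paths(dbname, host)
        print("%s -> %s" % (directory, robots_line))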

Ideas
Dumps are NOT running regularly again yet, so perhaps sitemap generation should be decoupled from the dump process...

It is also sometimes important to be timely. It would be nice if we had a script to run hourly (or more frequently) to append new articles to the last leaf of a tree of sitemap index files (or entire new leaves as appropriate). Then we can regenerate the entire tree now and then, perhaps at the same moment as a dump, to get rid of deleted pages.
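
A hedged sketch of what that hourly job could look like, assuming new page entries arrive from somewhere like recent changes and that each leaf respects the protocol's 50,000-URL cap; the file naming scheme and helper functions are hypothetical, not an existing tool.

    # Sketch of an hourly incremental job: append new pages to the newest leaf
    # sitemap, starting a fresh leaf when the 50,000-URL protocol limit is hit.
    import glob
    import re

    URL_LIMIT = 50000   # per-sitemap cap from sitemaps.org

    def load_leaf(path):
        """Return the list of <url> entries currently in a leaf sitemap."""
        with open(path, encoding="utf-8") as f:
            return re.findall(r"<url>.*?</url>", f.read(), re.S)

    def write_leaf(path, entries):
        with open(path, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            f.writelines("  %s\n" % e for e in entries)
            f.write("</urlset>\n")

    def append_new_pages(new_entries, prefix="sitemap-incremental"):
        """new_entries: ready-made <url>...</url> strings for pages created since last run."""
        leaves = sorted(glob.glob("%s-*.xml" % prefix))
        if leaves:
            number = len(leaves)
            current = leaves[-1]
            entries = load_leaf(current)
        else:
            number = 1
            current = "%s-%05d.xml" % (prefix, number)
            entries = []
        for entry in new_entries:
            if len(entries) >= URL_LIMIT:          # leaf is full: start a new one
                write_leaf(current, entries)
                number += 1
                current = "%s-%05d.xml" % (prefix, number)
                entries = []
            entries.append(entry)
        write_leaf(current, entries)
        # The sitemap index would then be rewritten to list every leaf; a periodic
        # full regeneration (e.g. alongside a dump) drops deleted pages.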

Change frequency
Question: does Google allow changes to the change frequency of an item? It might be interesting to set change frequency on items based on the date of their last edit. This presumes that edits come in "waves".

Note: the generateSitemap.php script sets lastmod, but does not set change-frequency. Presumably we would have to dig into article history to make a genuine estimate of change frequency, which would be hella slow. But we could also take a wild guess based on the difference between script run time and the last modification time. Or maybe that's what Google does anyway, so we should just let them handle it.
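
A rough sketch of that wild guess: bucket each page's change-frequency by how long before the script run it was last modified. The thresholds are invented for illustration; the changefreq values themselves are the set defined by Sitemaps.org.

    # Sketch of the "wild guess" approach: pick a changefreq bucket from how long
    # ago the page was last modified, relative to when the script runs.
    # The thresholds are invented; the changefreq values are the sitemaps.org set.
    from datetime import datetime, timedelta, timezone

    def guess_changefreq(lastmod, now=None):
        now = now or datetime.now(timezone.utc)
        age = now - lastmod
        if age < timedelta(days=1):
            return "hourly"     # edited today: assume it is still being worked on
        if age < timedelta(days=7):
            return "daily"
        if age < timedelta(days=30):
            return "weekly"
        if age < timedelta(days=365):
            return "monthly"
        return "yearly"

    # e.g. a page last touched three days ago would be marked "daily"
    print(guess_changefreq(datetime.now(timezone.utc) - timedelta(days=3)))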