Talk:Interwiki integration

"All configuration settings will need to be unified for the one big wikifarm. This is not necessarily a disadvantage from the point of view of efficiency." Yes it is. It makes it more difficult to split up wikis between different db servers. One giant glob of stuff is going to be less efficient than multiple globs of stuff. Bawolff (talk) 19:36, 14 August 2012 (UTC)
 * Is it a big difference in efficiency? Is there any way to mitigate or get around that problem? Task-based servers were mentioned as one way of dealing with giant globs of stuff (e.g. enwiki). Leucosticte (talk) 20:02, 14 August 2012 (UTC)
 * I'm not a db-efficiency expert, so the answer to those questions is mostly "I don't know". I do imagine that having all of Wikimedia's wikis on a single master db would represent a significant performance challenge. For wikifarms that don't have Wikipedia-scale wikis in them, it is probably less of an issue. I'm not even sure what domas meant in that email by task-based db servers. (I would guess something like watchlist queries going to one server and other things going to another, or something along those lines). Bawolff (talk) 13:05, 15 August 2012 (UTC)
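To make the guess above concrete, here is a minimal sketch of what task-based routing could look like: queries are tagged with a workload type and sent to a replica dedicated to that task, rather than all load hitting one pool. All host names and the task groups themselves are hypothetical, since the email in question doesn't spell out a design.

```python
# Hypothetical task-based db routing: each query type is mapped to a
# dedicated replica host, so e.g. heavy watchlist queries can't starve
# ordinary page reads. Host names here are made up for illustration.

TASK_REPLICAS = {
    "watchlist": "db-watchlist.example.internal",
    "recentchanges": "db-rc.example.internal",
}
DEFAULT_REPLICA = "db-general.example.internal"


def replica_for(task: str) -> str:
    """Return the replica host assigned to this query's task group,
    falling back to the general-purpose pool for untagged work."""
    return TASK_REPLICAS.get(task, DEFAULT_REPLICA)
```

The point of the design is isolation: an expensive class of queries gets its own hardware, and adding capacity for one task doesn't require touching the others.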

Wikispaces and langspaces
I don't mind implementing this wikispaces and langspaces idea as core functionality, provided that the patch would be accepted. Given how extensive the necessary code changes would be, I don't feel like forking the core code and then trying to sync up the fork with every new MW release to keep it up-to-date with the other MW changes. And I don't know of a way the functionality could be implemented via an extension. On the other hand, system administrators wouldn't have to use the functionality; it would be optional. Leucosticte (talk) 12:32, 15 August 2012 (UTC)
 * Getting this sort of thing accepted would probably be a "hard sell", mostly because, as you say, the changes are so extensive. Furthermore, even if that's accomplished, convincing Wikimedia to use such a setup would probably be difficult (if that is your goal). (btw, if you aim to do this, this page should be a subpage of RFC) Bawolff (talk) 13:05, 15 August 2012 (UTC)

Central wiki
I notice that CentralAuth relies on a central database. This does seem to make more sense than having each wiki maintain a copy of the same table, which would have to be updated every time there is a change. Aside from that, I wonder: what are the efficiency implications of retrieving data from dozens or hundreds of databases in order to, say, generate a list of recent changes on the whole wiki farm, or a list of recent changes to watchlisted pages on the whole wiki farm? Similar issues are involved in parsing a page that has transclusions from a bunch of different wikis, or interwiki links to a bunch of different wikis, if we're going to have page existence detection. Leucosticte (talk) 01:44, 16 August 2012 (UTC)
 * CentralAuth isn't the only one. There's also extension:GlobalUsage. Scanning through the local recentchanges table on each and every wiki individually to compute a global RC is scary once you get to have a lot of wikis [Wikimedia currently operates 853 wikis, for example]. And even before that, you end up getting much more information than you need, as you can't just limit the results since you don't know how many each wiki will contribute... Plus, if you want watchlists you need to do joins, and if each rc table is in a separate db, that isn't going to work very well... so it's basically unfeasible. I think the approach most likely to succeed, for what you want to do, is to have a central db, and have all wikis put RC entries into both the global rc table and their local rc table any time an edit is made. One would also have to replicate the watchlist table similarly. Bawolff (talk) 18:38, 16 August 2012 (UTC)
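The dual-write approach described above can be sketched in a few lines. This is an illustration only: the table and column names are made up and are not MediaWiki's actual recentchanges schema. Each edit is recorded in the wiki's own table and in a shared global table on the central db, so a farm-wide RC list becomes one ordered, limitable query against a single database instead of a scan of every wiki's table.

```python
import sqlite3

# Two in-memory databases stand in for one wiki's local db and the
# central db shared by the whole farm. Schema is illustrative only.
local = sqlite3.connect(":memory:")
central = sqlite3.connect(":memory:")

local.execute("CREATE TABLE recentchanges (rc_page TEXT, rc_ts INTEGER)")
central.execute(
    "CREATE TABLE global_rc (rc_wiki TEXT, rc_page TEXT, rc_ts INTEGER)")


def record_edit(wiki: str, page: str, ts: int) -> None:
    """Dual write: record the edit in the wiki's local rc table AND in
    the global rc table on the central db."""
    local.execute("INSERT INTO recentchanges VALUES (?, ?)", (page, ts))
    central.execute(
        "INSERT INTO global_rc VALUES (?, ?, ?)", (wiki, page, ts))


record_edit("enwiki", "Main_Page", 1)
record_edit("dewiki", "Hauptseite", 2)

# Farm-wide recent changes is now a single query on the central db,
# with ORDER BY and LIMIT working exactly as on a single wiki:
rows = central.execute(
    "SELECT rc_wiki, rc_page FROM global_rc ORDER BY rc_ts DESC LIMIT 10"
).fetchall()
```

The trade-off is that every edit now costs two writes (and the global table grows with the whole farm's edit rate), but reads of the global RC list no longer fan out across hundreds of databases.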