Manual talk:Database layout


The list of tables is good...

The list of tables is good, but can it be sorted based on whether it can be regenerated, contains archive info, etc.? -- 20:08, 26 Oct 2004 (UTC)

Automatic generation of Wiki Pages?

I would like to automatically generate a series of MediaWiki pages. These pages may link to one another or have images. How can I do this, say from Java or PHP? Thanks for your help.

Same question from here... :(

Description is here!

Tables.sql -- Richardguk (talk) 01:18, 6 January 2013 (UTC)

Profiling table creation and description?

I'm a new MediaWiki hacker, so please bear with my lack of knowledge!

I'd like to get this profiling stuff working on 1.5.5. But I have 2 BIG problems:

  • The profiling table is not created by default, nor is it in the tables.sql script
  • It does not seem to be documented anywhere (e.g. I would have thought the Help:Database layout page would be a good place to look)

Please help!

I can't get the newsreader thingie working yet either, so I have to come here!
Mediashrek 15:45, 25 January 2006 (UTC)

Uploading Articles in Bulk

I have 1,000 articles to be uploaded (imported). I assume that doing so at the database level would be easiest. What tables need to be updated?

Importing articles directly into the database is never a good idea, because you don't know what the software does in other tables. The best thing you can do is to use the import function. I thought there was an extension that allowed you to import articles from Word files, but I'm not sure. --GrandiJoos 09:50, 27 September 2006 (UTC)
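For reference, the import function mentioned above (Special:Import, or importDump.php from the maintenance folder) consumes XML in MediaWiki's export format. A minimal sketch of a single page entry might look like the fragment below; this shows only a subset of elements, the titles and names are invented for illustration, and the exact `xmlns` version varies with your MediaWiki release:

```xml
<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/" xml:lang="en">
  <page>
    <title>Example article</title>
    <ns>0</ns>
    <revision>
      <timestamp>2006-09-27T09:50:00Z</timestamp>
      <contributor><username>Importer</username></contributor>
      <comment>Bulk import</comment>
      <text xml:space="preserve">Wikitext of the article goes here.</text>
    </revision>
  </page>
</mediawiki>
```

A bulk upload could concatenate many `<page>` elements into one file and feed it to importDump.php, which lets MediaWiki itself update all the related tables consistently.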
This is actually a similar issue to my question below, but the question (well, for me) isn't so much whether it's a good idea to import articles directly into the database, but whether it is even possible. In my case, the articles were all built within MediaWiki and are therefore still in database form, which I assume wouldn't be a problem for importing individually. The problem is that I'm trying to import an entire wiki database over a brand new MediaWiki installation, and I'm not sure how to make it happen, because you can't install the software without having it automatically create and/or change the database it is being associated with. -- Randall00 20:25, 13 July 2007 (UTC)

Deleting tables?

I have had a MediaWiki installation for a few years and did every update up to 1.6.8. Now in the database there are a lot more tables than needed by MW 1.6 (blobs, brokenlinks...). Can I delete these tables without problems? And another question: there is a problem with one of the links tables. On image pages, under "The following pages link to this file:", the image page of the image itself is always listed, and so the special page Unusedimages gives no results. Any ideas? -- 10:15, 9 February 2007 (UTC)

Hmm... try rebuilding the link tables by running the rebuildall.php script in the maintenance folder. As for the tables: you can try, but be advised to delete tables only one at a time, and to back up your database before you begin. Those tables should no longer be in use. Titoxd(?!?) 20:40, 11 February 2007 (UTC)

Fresh MediaWiki installation over existing MediaWiki database?

With regard to the database layout: is the information provided here enough to be able to extract the text of users' web-side submissions from a MySQL database in its raw form (or exported [CSV, etc.], I guess)?

Basically, I created a wiki a little while back (still version 1.10 though) and it ceased to function due to a web host/network disaster. I still have the MySQL database and would like to be able to port that content into a new installation of MediaWiki. This must be possible since the database didn't exist prior to installing MediaWiki and all of the contributions to said database were made through the MediaWiki software on the user side.

Nevertheless, I can't find much in the way of documentation about how to do something like that, and since the base procedure requires you to specify an administrator user just before installation, there must be a conflict with my existing database's administrator user that prevents me from being able to install and link directly to that database.

So I'm trying to engineer a way to do that manually (even if it does require copying all of the users' inputs straight out of the database into fresh articles on a totally fresh install), but if you happened to know of an easier way, by all means let me know! The key is that I need to keep the data from the original wiki. Skins and styles are less of a concern. -- Randall00 20:19, 13 July 2007 (UTC)

Simplified Diagram

There's a simplified diagram in this slideshare presentation, which might go well on this page. Don't know the copyright status of that, though -- Harry Wood 14:59, 1 November 2007 (UTC)

In which table does the wiki save all the articles?

I've seen the database-layout and searched with phpmyadmin in the wiki-database, but I can't find the table in which the articles are saved.

They are in the table "text". See the article about this table here in this wiki. In the introduction it tells you how to find the current revision of a text when you know the page name. -- 20:06, 12 January 2012 (UTC)
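The lookup described above follows two joins in the pre-1.31 schema: page.page_latest points at revision.rev_id, and revision.rev_text_id points at text.old_id. Below is a runnable sketch using an in-memory SQLite mock with a minimal column subset; the table and column names match the MediaWiki schema of that era, but the real tables have many more fields, and old_text may be compressed or stored externally depending on old_flags:

```python
import sqlite3

# In-memory mock of the three tables involved (column subset only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE page (page_id INTEGER PRIMARY KEY, page_namespace INTEGER,
                   page_title TEXT, page_latest INTEGER);
CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_page INTEGER,
                       rev_text_id INTEGER);
CREATE TABLE text (old_id INTEGER PRIMARY KEY, old_text TEXT, old_flags TEXT);

INSERT INTO text VALUES (1, 'Old wikitext', 'utf-8'),
                        (2, 'Current wikitext', 'utf-8');
INSERT INTO revision VALUES (10, 7, 1), (11, 7, 2);
INSERT INTO page VALUES (7, 0, 'Main_Page', 11);
""")

# Fetch the current text of a page by title (namespace 0 = main namespace).
row = cur.execute("""
    SELECT old_text
      FROM page
      JOIN revision ON rev_id = page_latest
      JOIN text     ON old_id = rev_text_id
     WHERE page_namespace = 0 AND page_title = ?
""", ("Main_Page",)).fetchone()
print(row[0])  # -> Current wikitext
```

Note that page titles are stored with underscores instead of spaces, so "Main Page" must be queried as "Main_Page".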

Missing tables

Updated from 1.16.0 to 1.17.0, and the script didn't create the new tables. It would be really cool if the developers could update the manual too, so people like me can create those tables manually. -- 11:44, 12 July 2011 (UTC)

I'd recommend running update.php again. It can be re-run any number of times without a problem. It basically covers anything that ever changed in the database, so if there's a table missing that was added in a new version, it should be added by running that. I'd recommend against manually creating tables. Krinkle 23:31, 12 January 2012 (UTC)
I updated from 1.11 to 1.18 and also had the problem that new tables were not created. I've gone through the update process many times before, and this was the first time that I had any problems. I ran update.php several times without success. I ended up recreating all the missing tables manually, and had to guess at a couple of indices (they don't all seem to be documented), but I finally got everything to run. If I hadn't been a database programmer in a past life (way back in the 1990s!) I would have been totally lost. I've found some other comments from people who have had this problem, so I hope it can be looked into. -- Sam 07:05, 23 January 2012 (UTC)
Sam, you are crazy! ;-)
Have you created an issue in the bugtracker? You should do that! -- 00:39, 8 March 2013 (UTC)

"Red rows indicate tables that have been dropped in a particular version."

By "dropped" does it mean there's an actual DROP executed when you run update.php, or just that the installer won't add the table anymore, and it won't be used by the core? Leucosticte (talk) 14:45, 9 December 2013 (UTC)

Tables are never dropped. That's just a quick answer without checking all cases, but there definitely are important cases where such a drop has not been executed. The point is that users might have actually used the tables - otherwise they would not have been added. However, if the only point were that the tables might have been used, then we would not need the "end=xx" in our overview, as the tables stay - in many cases obviously forever. I am, for example, running a wiki which still had the "cur" table in the 1.19 DB until I manually removed it. In the past I have always set the "end" column to the version in which the MediaWiki core no longer added the according table when installed. And that makes sense. It is true that in older installations outdated data from the table might still be used by external tools; however, we are speaking about core usage here, and when you set up MediaWiki 1.22 the table will no longer be created. The tables "blobs", "cur" and friends also have not been dropped, and it would oppose the purpose of the table to show them in green, as that way you would basically have to show all tables that were ever there in green. There would be no red, as no tables were ever dropped. No, that is not the purpose of this table. -- 01:03, 10 December 2013 (UTC)
PS: I also thought about whether it would make sense to actually drop these old tables, but we do not know whether some old tools still rely on them, and - even more importantly - we also have to think of the performance of running such queries on major Wikipedia projects; there is no need to risk breaking them just to delete "unused" data. I am not a DB expert and I always like to learn things, but at least that's what I was told. -- 01:06, 10 December 2013 (UTC)
Okay, well, the terminology used by the page was ambiguous/misleading. I don't see any drop table items in MysqlUpdater.php. Leucosticte (talk) 02:51, 10 December 2013 (UTC)
Agreed. To wrap this up: are there cases where a table is no longer used by the core, but the same table is still used by an extension written for the new core version? I am particularly thinking about the Math extension. Do the versions of the Math extension for 1.18 and newer still use the table "math"? If such cases exist, we should add a note to the introductory text that these tables are no longer used by the core, but might still be in use by extensions. -- 03:02, 10 December 2013 (UTC)
I looked that up now: according to the documentation, which was created this October, that is the case for the table "math". I added a comment accordingly. -- 03:21, 13 December 2013 (UTC)

Want to add all the database table pages to your watchlist?

What a noble goal! Here ya go, copy and paste this into Special:EditWatchlist/raw:

Leucosticte (talk) 14:27, 13 March 2014 (UTC)

singular and plural table names

Some tables are singular (archive, category, change_tag, config, filearchive, hitcounter, image, interwiki, job, l10n_cache, logging, log_search, msg_resource, objectcache, oldimage, page, querycache, querycachetwo, querycache_info, redirect, revision, searchindex, tag_summary, text, transcache, updatelog, uploadstash, user, valid_tag, watchlist), at least grammatically (as watchlist or updatelog imply a collection), and some are plural (bot_passwords, categorylinks, externallinks, imagelinks, ipblocks, iwlinks, langlinks, msg_resource_links, module_deps, pagelinks, page_props, page_restrictions, protected_titles, recentchanges, sites, site_identifiers, site_stats, templatelinks, user_former_groups, user_groups, user_properties). What are the reasons/history/convention behind this? --Ilya (talk) 08:28, 20 August 2017 (UTC)

ipblocks_old table

The page says that the ipblocks_old table was added in 1.6 and was removed in 1.7 (Special:Diff/484529). Really?

The table was added in r15482. --Shirayuki (talk) 12:53, 16 July 2019 (UTC)

How outdated is that schema drawing?

I see the schema drawing was commented out in this edit because it was outdated (I had been wondering where it had gone). The large "Version history" table is a bit confusing; could it maybe be stored in a subpage, while this page would only show the current tables? Þjarkur (talk) 19:51, 29 August 2019 (UTC)

See Topic:Vf90jqxbapc69dlh. --Krinkle (talk) 20:28, 20 January 2020 (UTC)

How does MediaWiki/Wikipedia exactly store all of their pages+revisions, etc and how does the whole process work?

Hello everyone, so I am currently creating my own website that involves a "wiki" system almost exactly like MediaWiki, but with some changes to fit the site's goal. One of the biggest questions I have been asking myself is: what exactly is the process MediaWiki/Wikipedia uses to send newly created pages to the database, how is the content stored in there and in what format, how does Wikipedia search through it without any delays, and how is it extracted to then be seen and used?

So first off, I downloaded MediaWiki to look through it to see if I could get any ideas, along with looking through the Manual:Contents section too. I found that each page on the page table is stored inside a blob of data, but my question is, how exactly is it formatted? I saw a video on YouTube of a guy who was playing around with the Wikipedia API, and it seemed like Wikipedia stores a JSON file that contains all the contents of the page, among other pieces of information, inside the blob of data. I asked a similar question on Manual talk:Page table and got a reply from an administrator who told me that what was inside the blob wasn't JSON data like I thought. I also asked about it being stored as XML, and he said no, and that XML is used for importing and exporting pages, which makes it a whole lot more confusing.

I also want to point out a few things: first, I am not implementing wikitext formatting like MediaWiki does; rather, I am just going to use an HTML editor (pretty much like the visual editor) and save the HTML like that in the database.

So can someone fully and clearly explain to me the entire process of how Wikipedia/MediaWiki creates new pages, stores them inside the database (and in what format), how it archives each version of the page, how a page is delivered to your computer and displayed as a normal Wikipedia page? And I would also like to know how exactly Wikipedia is able to search through the blob of data for each page in the database when using the search bar? If anyone could answer these questions I would be beyond thankful!
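On the archiving part of the question: MediaWiki never overwrites a saved revision. Saving an edit inserts a new row into the text table and a new row into the revision table, then points page.page_latest at the new revision, so the full history stays in place (this describes the pre-1.31 layout; newer versions add a slots/content indirection). A minimal sketch with an in-memory SQLite mock, using a column subset with names borrowed from that old schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE page (page_id INTEGER PRIMARY KEY, page_title TEXT,
                   page_latest INTEGER);
CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_page INTEGER,
                       rev_text_id INTEGER);
CREATE TABLE text (old_id INTEGER PRIMARY KEY, old_text TEXT);
""")

def save_edit(page_title, new_wikitext):
    """Append a new revision; earlier revisions are never overwritten."""
    # 1. Store the new text blob.
    cur.execute("INSERT INTO text (old_text) VALUES (?)", (new_wikitext,))
    text_id = cur.lastrowid
    # 2. Find or create the page row.
    row = cur.execute("SELECT page_id FROM page WHERE page_title = ?",
                      (page_title,)).fetchone()
    if row is None:
        cur.execute("INSERT INTO page (page_title, page_latest) VALUES (?, 0)",
                    (page_title,))
        page_id = cur.lastrowid
    else:
        page_id = row[0]
    # 3. Record the revision and repoint page_latest at it.
    cur.execute("INSERT INTO revision (rev_page, rev_text_id) VALUES (?, ?)",
                (page_id, text_id))
    cur.execute("UPDATE page SET page_latest = ? WHERE page_id = ?",
                (cur.lastrowid, page_id))

save_edit("Sandbox", "First draft")
save_edit("Sandbox", "Second draft")
history = cur.execute("SELECT COUNT(*) FROM revision").fetchone()[0]
print(history)  # -> 2: both revisions kept, page_latest points at the newest
```

As for searching, Wikipedia does not scan these blobs on every query; search runs against a separate full-text index (a dedicated search backend in production, or the searchindex table in a default install) that is updated when pages are saved.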

P.S. Don't try to convince me to just "use MediaWiki", because I have had quite a bit of trouble and wasted time just trying to download some stuff for it (and I need to get this website up and running as soon as physically possible). Plus, the "wiki" part of the site is just one major part; the other part is something beyond MediaWiki's use.

Again I would be highly thankful if someone could give me an answer to my questions

Thanks, --TheSplatGuy (talk) 21:04, 15 July 2020 (UTC)