Project:Support desk/Sections/Database

__NEWSECTIONLINK__

= MediaWiki Database Support =

Sorting of category page

 * MediaWiki: 1.11.0
 * PHP: 5.1.4 (apache2handler)
 * MySQL: 5.0.21-community-nt

On a Category:XX page of my wiki, page links are grouped by first letter, which is fine, but the groups (the letters themselves) appear in random order (the page names are non-English!). Is it possible to alphabetize them?

—Konstbel 16:55, 7 February 2008 (UTC)


 * The order is imposed by the database, and it is probably not "random" but byte-wise rather than alphabetical. You may get better results using the "experimental utf8" variation of the database setup instead of "compatibility mode". This is an option during installation; I have no idea how to change it later. Also note that "experimental utf8" mode relies on MySQL's own utf8 support, which is incomplete. It may work better than "compatible" (binary) mode for your language, but as soon as you use a character from some ''very'' odd language (like, for example, Gothic), it will not work at all, producing a fatal error.
 * Sadly, I don't know a good way out of this dilemma, short of waiting for real unicode support in mysql. -- Duesentrieb ⇌ 10:09, 8 February 2008 (UTC)
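To illustrate the difference, here is a small Python sketch (not MediaWiki code, just an analogy): byte-wise ordering, which is what a binary collation gives you, pushes every accented title past "Z", while even a crude accent-stripping sort key restores something closer to alphabetical order.

```python
import unicodedata

titles = ["Zebra", "Äpfel", "Apfel", "Übung", "Uhr"]

# Byte-wise ordering (what a binary collation does): every accented
# letter sorts after "Z", because its UTF-8 bytes are numerically larger.
bytewise = sorted(titles, key=lambda t: t.encode("utf-8"))

# A crude "alphabetical" key: decompose accents (NFD) and drop the
# combining marks, so "Ä" sorts next to "A".
def accent_key(t):
    return "".join(c for c in unicodedata.normalize("NFD", t)
                   if not unicodedata.combining(c))

alphabetical = sorted(titles, key=accent_key)

print(bytewise)      # accented titles land at the end
print(alphabetical)  # accented titles interleave with their base letters
```

Real collation rules are language-specific, which is why the database's collation setting (and not the wiki code) decides the category order.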


 * I looked at the wiki database and noticed that most varchar fields have the utf8_bin collation instead of utf8_general_ci. Is this the reason? I tried changing to utf8_general_ci, but without any visible effect :-( --Konstbel 15:13, 12 February 2008 (UTC)


 * Is your Wiki set to the language that you are writing in? And can you give a URL? -PatPeter, [[Image:Tournesol.png|20px]] MediaWiki Support Team  23:29, 28 February 2008 (UTC)


 * Yes, the language is the same. Sorry, it is an intranet wiki, without external access --89.175.73.253 12:41, 19 March 2008 (UTC)

Incorrect UTF-8 chars conversion

 * MediaWiki: 1.11.0
 * PHP: 5.1.4 (apache2handler)
 * MySQL: 5.0.21-community-nt

I suspect that UTF-8 characters are converted to lower/uppercase incorrectly. This causes really HUGE problems in the wiki: the search does not find what I'm looking for, the categories sort incorrectly (see the problem above), and Extension:SearchLog also displays incorrect characters (again, see above).

Is it possible to fix that?

—Konstbel 10:12, 14 February 2008 (UTC)


 * Searching and sorting are done by the database, and by default MediaWiki tells MySQL to treat all data as binary. The reason is that MySQL's utf-8 support is broken for some "rare" scripts (those using 4-byte codes), like Gothic. You can specify "experimental utf-8 mode" during installation; then Unicode collation should apply correctly (but you will get database errors when you try to use "unsupported" characters). I don't know how this can be changed after the wiki has already been installed. I suppose you would have to change the charset/collations on all tables manually. -- Duesentrieb ⇌ 11:54, 14 February 2008 (UTC)
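The 3-byte limitation mentioned above can be seen from the UTF-8 encodings themselves; a small Python illustration (not tied to MediaWiki): MySQL 5.x's "utf8" charset stores at most 3 bytes per character, so any code point outside the Basic Multilingual Plane, which needs 4 bytes in UTF-8, cannot be stored in that mode. Gothic script is one example.

```python
# Byte length of the UTF-8 encoding for characters of increasing "rarity":
# ASCII, Latin-1 range, CJK (still inside the BMP), and Gothic (outside it).
byte_lengths = {ch: len(ch.encode("utf-8"))
                for ch in ["a", "\u00e4", "\u4e2d", "\U00010330"]}
print(byte_lengths)  # the Gothic letter needs 4 bytes
```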


 * Do I need to convert only the database, or do I also need to make some modifications to the wiki code? --Konstbel 09:32, 15 February 2008 (UTC)


 * Have there been any updates on this? --Kimon 20:09, 6 August 2008 (UTC)
 * Never mind, I fixed it. Add the "$filteredText = $wgContLang->lc($filteredText);" line to the function parseQuery in SearchMySQL4.php:

--Kimon 20:47, 6 August 2008 (UTC)
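The idea behind that one-line fix: with a binary collation the database cannot case-fold, so the application has to normalize case on both sides before the bytes are compared. A Python sketch of the same principle (illustrative only, not the MediaWiki code path):

```python
# With a binary collation the database compares raw bytes, so "Ä" and
# "ä" never match. Case-folding both the indexed text and the query
# in application code restores case-insensitive search.
index_entry = "Äpfel und Birnen".lower()   # what gets stored in the index

def matches(query):
    return query.lower() in index_entry    # fold the query the same way

print(matches("ÄPFEL"))  # matches only because we folded the query
print("Ä".encode("utf-8") == "ä".encode("utf-8"))  # byte-wise: no match
```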


 * Newbie question. Does not using "experimental utf-8 mode" explain why some characters display fine at Wikipedia but show up as little boxes on my wiki? For instance this reference, ˈwɪkə, which doesn't seem to convert here either. It certainly does at Wikipedia though. --WilHatfield 21:29, 14 March 2008 (UTC)

Pagetitle Encoding Problems (Chinese) after Move & Upgrade of MediaWiki

 * MediaWiki: 1.12.0
 * PHP: 5.2.4 (cgi)
 * MySQL: 4.1.13
 * URL: Rock in China Wiki

Hi, I upgraded MediaWiki and moved servers from the old setup below to the one listed above.
 * MediaWiki: 1.6.8
 * PHP: 4.3.10-22 (cgi-fcgi)
 * MySQL: 4.1.11-Debian_4sarge7-log
 * URL: http://web125.burns.kundenserver42.de/rockinchina/wiki/

I exported the complete database via phpMyAdmin 2.11.0-rc1 on the old server and imported it via mysql command line. Settings for collation on database and tables seem to be the same. Files have been moved completely and then upgraded to the new version.

The problem now is that pages with Chinese titles are broken. An example can be seen here:
 * Old/Good: http://web125.burns.kundenserver42.de/rockinchina/wiki/index.php?title=Badhead_3_%E8%8A%B1%E5%9B%AD%E6%9D%91_%28VA%29
 * New/Broken: http://wiki.rockinchina.com/index.php?title=Badhead_3_%C3%A8%C5%A0%C2%B1%C3%A5%E2%80%BA%C2%AD%C3%A6%3F%E2%80%98_%28VA%29

The pages show up wrong in category lists and links to these pages are broken.
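For what it's worth, the broken title above looks like classic double-encoding: the page's UTF-8 bytes were read as Windows-1252 somewhere during the dump/import and then re-encoded as UTF-8. A Python sketch (assuming cp1252 was the intermediate charset) reproduces exactly the bytes seen in the broken URL:

```python
good = "花"                      # one character from the old, working title
# UTF-8 bytes misread as Windows-1252, then re-encoded as UTF-8:
mangled = good.encode("utf-8").decode("cp1252")
print(mangled)                   # matches %C3%A8%C5%A0%C2%B1 in the broken URL

# Reversing the mistake recovers the original...
recovered = mangled.encode("cp1252").decode("utf-8")
print(recovered == good)

# ...but only when every byte survived the round trip: cp1252 has no
# character for bytes like 0x9D, so some converters substitute "?"
# (the %3F in the broken URL), and those characters are then lost.
```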

Any pointers on this problem would be highly appreciated.

Thank you very much!

—Matsch 22:49, 25 March 2008 (UTC)

Identifying the Name of the Namespace from the ID in the Page table
Where is the name of the namespace stored in the database tables? I want to use the namespace feature to organize user information.
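For reference: the namespace name is not stored in the database at all. page.page_namespace holds only a numeric ID, and the names come from MediaWiki's localization files and LocalSettings.php ($wgExtraNamespaces). A sketch of the default core mapping (simplified: "Project" is really replaced by the site name, and custom namespaces get IDs of 100 and up):

```python
# page.page_namespace stores only a number; the names live in
# MediaWiki's PHP configuration, not in the database.
NAMESPACES = {
    0: "", 1: "Talk", 2: "User", 3: "User talk",
    4: "Project", 5: "Project talk", 6: "Image", 7: "Image talk",
    8: "MediaWiki", 9: "MediaWiki talk", 10: "Template",
    11: "Template talk", 12: "Help", 13: "Help talk",
    14: "Category", 15: "Category talk",
}

def full_title(ns, title):
    """Rebuild the displayed title from a (page_namespace, page_title) row."""
    prefix = NAMESPACES.get(ns, "")
    return f"{prefix}:{title}" if prefix else title

print(full_title(2, "Alice"))      # User:Alice
print(full_title(0, "Main_Page"))  # Main_Page
```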

Does MediaWiki work if I change tables from InnoDB to MyISAM?

 * MediaWiki: 1.9.3
 * PHP: 5.2.5 (cgi)
 * MySQL: 5.0.45-log
 * URL: http://aspic-dti.ovh.org/index.php?title=Accueil

Hello,

To save SQL disk space, I have converted all InnoDB tables to the MyISAM engine.

Will there be problems, or will it just be a bit slower?

Best Regards

—Aspic 22:28, 31 March 2008 (UTC)

DB Error

 * MediaWiki: 1.12.0.
 * PHP:
 * MySQL: 5.0.33-log
 * URL:

Hi

I need help. I configured the wiki, and then got this :(

Database error A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was:

(SQL query hidden)

from within function "Article::pageData". MySQL returned error "1267: Illegal mix of collations (latin2_bin,IMPLICIT) and (latin1_swedish_ci,COERCIBLE) for operation '=' (db2.clevernet.cz)".

Could anyone tell me what I can do about this?

Sari

—85.160.33.115 19:03, 4 April 2008 (UTC)


 * and mine too :(

Wikipedia
I've downloaded the database dump from Wikipedia, but I didn't find any explanation of how to integrate it into my wiki website. The dump is in XML format. Please tell me how I can import it into my wiki site.

Thank you!

—79.119.149.93 16:04, 6 April 2008 (UTC)
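For context: that XML file is MediaWiki's export format, and it is normally loaded with maintenance/importDump.php (or Special:Import for small files) rather than imported into MySQL directly. To get a feel for the structure, a dump can be inspected like this. This is a minimal, hand-made sample; real dumps declare an xmlns on the <mediawiki> element that must be matched or stripped when parsing.

```python
import xml.etree.ElementTree as ET

# A minimal stand-in for a pages-articles dump (real dumps carry an
# XML namespace and much more revision metadata).
sample = """<mediawiki>
  <page>
    <title>Example</title>
    <revision><text>Page text here.</text></revision>
  </page>
</mediawiki>"""

root = ET.fromstring(sample)
extracted = {page.findtext("title"): page.findtext("revision/text")
             for page in root.iter("page")}
print(extracted)
```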

NDB Cluster issue during installation

 * MediaWiki: 1.11.2
 * PHP: 5.2.5
 * MySQL: 5.1.23
 * URL: http://lab-lamp.ssd.loral.com

 * PHP 5.2.5 installed
 * Found database drivers for: MySQL
 * PHP server API is apache2handler; ok, using pretty URLs (index.php/Page_Title)
 * Have XML / Latin1-UTF-8 conversion support.
 * Warning: A value for session.save_path has not been set in PHP.ini. If the default value causes problems with saving session data, set it to a valid path which is read/write/execute for the user your web server is running under.
 * PHP's memory_limit is 128M.
 * Couldn't find Turck MMCache, eAccelerator, APC or XCache; cannot use these for object caching.
 * Found GNU diff3: /usr/bin/diff3.
 * Couldn't find GD library or ImageMagick; image thumbnailing disabled.
 * Installation directory: /mnt/www/apache/html/MediaWiki
 * Script URI path:
 * Installing MediaWiki with php file extensions
 * Environment checked. You can install MediaWiki.
 * Generating configuration file...
 * Database type: MySQL
 * Loading class: DatabaseMysql
 * Attempting to connect to database server as wikIT...success.
 * Connected to 5.1.23-rc
 * Database wikidb exists
 * There are already MediaWiki tables in this database. Checking if updates are needed...
 * Warning: you requested the InnoDB storage engine, but the existing database uses the ndbcluster engine. This upgrade script can't convert it, so it will remain ndbcluster.

...hitcounter table already exists.
Creating querycache table...ok
Creating objectcache table...ok
...categorylinks table already exists.
Creating logging table...ok
...user_newtalk table already exists.
...transcache table already exists.
...trackbacks table already exists.
Creating externallinks table...Query "CREATE TABLE `externallinks` (
  el_from int(8) unsigned NOT NULL default '0',
  el_to blob NOT NULL,
  el_index blob NOT NULL,
  KEY (el_from, el_to(40)),
  KEY (el_to(60), el_from),
  KEY (el_index(60))
) TYPE=ndbcluster" failed with error code "BLOB column 'el_to' can't be used in key specification with the used table type (lab-mysql)"

—158.184.23.102 19:48, 9 April 2008 (UTC)


 * In the MySQL Cluster Documentation they explicitly state:

Indexes and keys in NDB tables. Keys and indexes on MySQL Cluster tables are subject to the following limitations: TEXT and BLOB columns. You cannot create indexes on NDB table columns that use any of the TEXT or BLOB data types.


 * So it seems like this simply does not work on NDBCLUSTER engines ... alas
 * —dec 16:00, 18 July 2008 (CET)

active/active mysql setup

 * MediaWiki: (Reported by your Wiki's Special:Version page)
 * PHP: NA
 * MySQL: NA
 * URL: NA

I was debating how to go about setting up a MediaWiki farm. Does MediaWiki work with or support an active/active MySQL setup?

Thank you for your time, Danko

—63.115.78.28 10:56, 11 April 2008 (UTC)

Wiki site launch preparation

 * MediaWiki: 1.12
 * PHP: 5.x
 * MySQL: 5.x

What should be done to prepare a wiki (in development) for release? I can think of removing old revisions (with deleteOldRevisions.php) and permanently removing the history of deleted pages:

DELETE FROM archive;

But is there more to do? For example, how can the hit counters be cleared? Should I also flush caches, or clear the logs?

—Rebbyte 09:46, 18 April 2008 (UTC)

Failed move of Mediawiki between servers

 * MediaWiki: 1.6.5
 * PHP: 5.2.5
 * MySQL: 5.0.51a
 * URL: intranet

My moved MediaWiki gives a blank screen on entering UTF-8 text (Chinese multibyte characters) in a page, on preview or save.

My log of the procedure is as follows.

Failing to move a wiki to another server:

I hope this is a frequently asked question: I have been asked to move a small wiki of maybe 100-200 pages from Solaris to Linux. It seems OK except for multibyte character entry. What am I missing? I have tried creating the MySQL database in utf8 and in the default (latin1) character set. What should I do next?

Some older utf8 threads suggest that collation could be important. The actual error: the browser screen goes white on previewing or saving Chinese multibyte characters.

I tried strace on httpd and mysqld, but there was no obvious smoking gun, i.e. no obvious error return code. I suppose there could be different PHP modules, but I would expect Linux to have more PHP options than Sun, not fewer.

Here is a rough log of what I tried so far:

Resources:
 * http://www.mediawiki.org/wiki/Manual:Moving_a_wiki
 * http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki
 * http://kb.mediatemple.net/article.php?id=138
 * mediawiki mailing list search on utf8

The procedure: get a dump of the database:

 /usr/bin/nice -n 19 /usr/bin/mysqldump -u $USER -p$PASSWORD --default-character-set=$CHARSET $DATABASE -c > mydumpout.sql

[Tried with and without --default-character-set]

Back up the whole mediawiki (or wiki) folder:

 cp -Rp mediawiki ~username/tmp
 cd ~username/tmp
 zip -r mediawiki.zip mediawiki

Copy the files to the destination machine.

Unzip mediawiki.zip and alter the database login.

Create the database on the destination machine and add the user:

 mysql> create database wikifinal default character set utf8;   [tried with and without utf8]
 mysql> grant all privileges on wikifinal.* to wikiuser2@localhost identified by ' ' with grant option;

Build the tables:
 /opt/lampp/bin/mysql -u wikiuser2 -p wikifinal < dumpoutnonlatinalt.sql
 Enter password:

A few tables with TYPE=InnoDB DEFAULT CHARSET=utf8 failed with "Specified key was too long; max key length is 1000 bytes" (utf8 errors, since I am putting utf8 everywhere). Dubious workaround: on those two tables I switched to latin1, i.e. TYPE=InnoDB DEFAULT CHARSET=latin1.

Stop Apache, link the MediaWiki directory to the same path as before (in this case "wiki"), make the images directory and its subdirectories writable, then start Apache.

Tests:
 * Viewing wiki contents: PASS
 * Uploading an image: PASS
 * Editing with multibyte content: FAIL

The multibyte edit failure could well be a symptom of something else, like trying to copy a MediaWiki installation without in-depth MediaWiki knowledge.

 * Source and destination MediaWiki version: 1.6.5
 * Source MySQL version: 4.0.27-standard
 * Destination MySQL version: 5.0.51a
 * Source OS: Solaris; destination OS: Linux with XAMPP

Addendum Apr24 21:00 Dublin Time.

Just tried to put in all the collations and sort orders by hand, to no real benefit. Funny though: I have a Chinese names page and I can view it, but previewing or posting with Chinese text results in a blank screen. There are some SSL warnings; I am assuming those would be a red herring to follow.

Addendum 19 May 2008


 * 1) I created a new wiki
 * 2) I exported/imported the 240 wiki pages using XML; there is a script for doing this in the maintenance directory.
 * 3) I then copied over the 20 or so images by hand.
 * 4) About 20 people had to re-register, but that was not a big issue.
 * 5) Added a few redirects to Apache so that requests to the old server were redirected to the MediaWiki, and made the MediaWiki the home page.

Worked for me; your mileage may vary.

—193.32.3.83 11:05, 24 April 2008 (UTC)

Creating a MediaWiki mirror

 * MediaWiki: (Reported by your Wiki's Special:Version page)
 * PHP:
 * MySQL:
 * URL:

Hey. I wasn't sure where to post this, as it's not really an error but a question about functionality.

Are there any known or pre-made ways to create a MediaWiki mirror?

We want to create three separate wikis that mirror each other's content; every update and every edit should be mirrored instantly.

Thanks

—213.113.166.32 20:53, 28 April 2008 (UTC)

Database Error

 * MediaWiki: 1.10.1
 * PHP: 5.1.6 (apache2handler)
 * MySQL: 5.0.22
 * URL: http://redbookswiki.tap.ibm.com (note this is not accessible from the internet - it is an intranet only accessible wiki running within the IBM intranet. Including the URL just to be complete w/this post...)

Problem

Loading http://redbookswiki.tap.ibm.com/index.php/Special:Popularpages generates a database error:

This error roughly coincides with the installation and first use of the DPL extension, and it is the only page that generates this database error (that we've found so far). I'm including the Extensions and Hooks sections of the version page output for the wiki (HTML paste) below my sig.

Thanks in advance for any help on this! - Chris -Calmo 17:36, 7 May 2008 (UTC)

Extensions (from affected server Version page) Hooks (from affected server Version page) —Calmo 17:36, 7 May 2008 (UTC)

We have exactly the same error!

--Danielp 07:37, 19 May 2008 (UTC)
 * MediaWiki: 1.12.0
 * PHP: 5.2.6 (apache2handler)
 * MySQL: 5.0.32-Debian_7etch5-log


 * We had a similar error on Fan History after upgrading from 1.10 to 1.12. The problem was resolved when we upgraded our php version to 5.2.6.  For DPL, you might also want to post the request here as they've been helpful. --PurplePopple 03:30, 18 June 2008 (UTC)

Extracting wiki text as plain text from MySQL database

 * MediaWiki: ??? that's part of my problem
 * PHP: 5.2.5
 * MySQL: 4.1.12
 * URL: internal

Hi, I've backed up my wiki install but stupidly kept only part of the filesystem, though I have all of the database. I can't work out what version of MediaWiki I had, and I can't figure out how to extract the content so I can rebuild a new wiki by hand.

Does anybody know how to extract the BLOBs from the text table as plain text so I can rebuild my wiki?

Thanks in advance, Jase. —150.101.163.28 22:30, 16 May 2008 (UTC)
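One hedged sketch of decoding a row dumped from the text table: the old_flags column tells you how old_text is stored. This covers the common "gzip" (PHP's gzdeflate, i.e. raw DEFLATE with no header) and "utf-8" flags only; rows flagged "object" or "external" need more work, and the fallback charset is an assumption that depends on the wiki's configuration.

```python
import zlib

def decode_old_text(old_text: bytes, old_flags: str) -> str:
    """Decode one row of MediaWiki's `text` table (a sketch: handles
    the 'gzip' and 'utf-8' flags, not 'object' or 'external')."""
    flags = old_flags.split(",") if old_flags else []
    data = old_text
    if "gzip" in flags:
        # MediaWiki compresses with PHP's gzdeflate(): raw DEFLATE,
        # no zlib/gzip header, hence wbits=-15.
        data = zlib.decompress(data, -15)
    charset = "utf-8" if "utf-8" in flags else "latin-1"  # assumption
    return data.decode(charset, errors="replace")

# Round-trip demo with a fake row:
compressor = zlib.compressobj(wbits=-15)
blob = compressor.compress("Hello ''wiki'' text".encode("utf-8"))
blob += compressor.flush()
print(decode_old_text(blob, "utf-8,gzip"))
```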

error message when running configure wiki - Could not find a suitable database driver

 * MediaWiki: (Reported by your Wiki's Special:Version page)
 * PHP:
 * MySQL:
 * URL:

I am getting this error when running the config page for the wiki. I have both PHP and MySQL installed, with a user created.

 * PHP 5.2.3 installed
 Could not find a suitable database driver!
  o For MySQL, compile PHP using --with-mysql, or install the mysql.so module
  o For PostgreSQL, compile PHP using --with-pgsql, or install the pgsql.so module

—Groedel99 16:40, 6 June 2008 (UTC)

backed-up version shows all articles as ???????

 * MediaWiki: (Reported by your Wiki's Special:Version page)
 * PHP:
 * MySQL:
 * URL:

Hi! I have a problem with a custom backup of my wiki DB. On both servers the default-character-set is UTF-8. Everything is in UTF-8, and the data in the backed-up database is UTF-8 too. But the backed-up version of the wiki shows all articles like this: ???????, and I can't find anything from the main page. The data in the tables is the same as in the original wiki. What can I do?

How can I solve this problem?

Regards Nik —62.143.184.50 07:38, 18 July 2008 (UTC)

Put data into MediaWiki with SQL-Dump

 * MediaWiki: 1.12.0
 * PHP: 5.2.5
 * MySQL: 5.0.51a
 * URL: it's local

I wanted to ask if it is possible to move data into MediaWiki using SQL dumps.

I have tonnes of text and pictures/media files that I need to insert into MediaWiki. So I've created a small tool which converts the text into MediaWiki format, with its style tags for titles and so on. But now I need to create an SQL dump (not writing directly to the database; first I have to make a dump) to insert the data into the database. The problem is, I can't figure it out by looking through the MediaWiki database.

I don't know which tables I must insert into. I looked around, but I didn't find any examples. So I would be glad if someone could help.

—193.26.130.189 09:29, 25 July 2008 (UTC)
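A word of caution: raw SQL inserts must keep the page, revision, and text tables consistent with each other, which is why generating an XML import file and loading it with Special:Import or maintenance/importDump.php is usually easier. A minimal sketch of building such a file (hypothetical helper; a real export carries more revision metadata and an xmlns on <mediawiki> that the importer may require, so verify against the export schema):

```python
import xml.etree.ElementTree as ET

def build_import_xml(pages):
    """Build a minimal MediaWiki import file from (title, wikitext)
    pairs -- a sketch of the export format's core structure."""
    root = ET.Element("mediawiki")
    for title, text in pages:
        page = ET.SubElement(root, "page")
        ET.SubElement(page, "title").text = title
        rev = ET.SubElement(page, "revision")
        ET.SubElement(rev, "text").text = text
    return ET.tostring(root, encoding="unicode")

xml_doc = build_import_xml([("My Article", "== Title ==\nSome text.")])
print(xml_doc)
```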

Searchindex always crashing

 * MediaWiki: 1.12.0
 * PHP: 5.2.5
 * MySQL: 5.0.51a-log
 * URL:

Hello everybody,

I am new here, but I've got a big issue. My wiki is showing me this error:

"SearchMySQL4::update". MySQL returned error "145: Table './wikidb/searchindex' is marked as crashed and should be repaired (localhost)"

I tried to repair it, but MySQL's repair function doesn't help. It seems that MediaWiki crashes the searchindex again and again.

My configuration:
 * Server: Mac OS X Server Leopard
 * Environment: XAMPP for Mac OS X 0.7.2 (Apache, MySQL 5.0.51a-log, phpMyAdmin 2.11.4)

The failure: I configure the wiki (/config/...). After that, I move LocalSettings.php to the right place. Then I open the wiki website and change something, adding some text. -> OK. The first time, everything is OK, no failure. But when I then edit the wiki again and save ->

SearchMySQL4::update "145: Table './wikidb/searchindex' is marked as crashed and should be repaired (localhost)".

(i installed mediawiki and the mysql database two times now..)

Try to repair:
 * phpMyAdmin - check table: "Table is marked as crashed - Can't read key from filepos: 0 - Incorrect key file for table './wikidb/searchindex.MYI'; try to repair it. - Corrupt"
 * phpMyAdmin - repair table 'searchindex': repair - status - ok
 * Then I check again with phpMyAdmin - check table: "Table is marked as crashed - Can't read key from filepos: 0 - Incorrect key file for table './wikidb/searchindex.MYI'; try to repair it. - Corrupt"

I also tried it on mysql commandline and with "myisamchk -r /path/to/the/table"

like described in: http://www.karakas-online.de/EN-Book...upt-table.html

But I can't repair it! It is a real pain, because I wanted to put data into the wiki and work on it all the time, and now I am just reading through Google results saying "try repair table '$tablename'".

Does anyone know about this problem and how to fix it?

Best regards,

eibi —213.3.36.247 14:15, 30 July 2008 (UTC)

delete old revisions

 * MediaWiki: 1.13.0rc1
 * PHP: 	5.2.6
 * MySQL: 5.0.51a
 * URL:

Hello,

I use MediaWiki as a private CMS. To save webspace, I want to delete the old revisions.

The problems:
 * I have no shell access
 * phpShell won't work on my webspace, because ClamAV claims phpShell is a virus and deletes it immediately
 * Extension:SpecialDeleteOldRevisions doesn't work since 1.12. Well, I could copy the missing htmlform.php from an older MediaWiki version, but I still don't know if Extension:SpecialDeleteOldRevisions will mess up my database due to changes since 1.9.

Is there a way to use the maintenance scripts without shell access, or can I use Extension:SpecialDeleteOldRevisions without danger, or are there other ways to delete old revisions?

Regards, —Tuwan 20:43, 30 July 2008 (UTC)

SQL Database and Media-Wiki Issue (Is there a way to restore the database or pages to a specific date and time if the database was copied over)

 * MediaWiki: (Reported by your Wiki's Special:Version page)
 * PHP: PHP5, PHP6
 * MySQL: Appserv SQL Database
 * URL: [http://69.21.97.17/Dictionary/index.php Aoiro Joukai Dictionary]

Database/Media-wiki issue:

I was working on my wiki the other day and noticed that two pages were mysteriously missing. I don't work with SQL much and only recently started working with MediaWiki for my site, so after trying to find the pages, I ended up copying over the database from an earlier version to see if I could recover them.

I knew that, more or less, I was rolling back pages that had since been updated, but I was somewhat surprised that it wiped out 3 pages I had worked on just that day. I was wondering if there is any way to restore those pages in the database, or an area that would list those three pages for me to simply restore. I know there is a restore feature in MediaWiki, but I am not sure how to use it.

I am not sure whether, by copying the older database over my current one, the pages that weren't in the old version were simply and completely removed, or whether they are just hidden. Everything else is up to date; I just cannot find those three particular pages anymore.

If there is a way to restore the database, that would be great. I am not sure there is, but I would love it if anyone who knows how to solve this could tell me how to get back the pages from the past 24 hours. Is there a dump file or something that holds that information?

—69.21.97.170 06:47, 5 August 2008 (UTC)

"Something's not quite right yet; make sure everything below is filled out correctly."

 * MediaWiki: 1.11.0
 * PHP: 5.2.6
 * MySQL: 5.0.51b
 * URL:

Hi. I get this error after hitting submit during installation: "Something's not quite right yet; make sure everything below is filled out correctly." It is referring to my SQL details, which are correct; I have re-entered them 14 times and asked 2 other people to do it for me to check I'm not being a fool. If someone could help me sort out this problem it would be great.

Chris christopher.phillip.king@googlemail.com —91.111.20.170 14:43, 11 August 2008 (UTC)

"Reading page text from the database - the text in the DB doesn't correspond to the text in the wiki"
Hello,

I can't find the actual page text in the database; what I find isn't the current version but an old or incorrect one. I match the last rev_text_id (of a page) in the revision table with old_id in the text table, but the text isn't the current one (it's old or completely different). Did I do something wrong? Am I missing something? What can I do? I need to access the pages' current text (and all revisions if possible).

António antonioavf@gmail.com —213.138.228.242 23:54, 16 August 2008 (UTC)
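One likely culprit: the revision the wiki actually renders is selected through page.page_latest, not by taking the largest rev_text_id, since text-table IDs are not guaranteed to follow edit order. The lookup chain, sketched with plain dictionaries standing in for the three tables (hypothetical data):

```python
# Miniature stand-ins for the three MediaWiki tables involved.
page = {"Sample": {"page_latest": 12}}          # title -> current rev_id
revision = {11: {"rev_text_id": 101},           # rev_id -> text row id
            12: {"rev_text_id": 107}}
text = {101: "old wording", 107: "current wording"}  # old_id -> old_text

def current_text(title):
    rev_id = page[title]["page_latest"]        # page.page_latest
    text_id = revision[rev_id]["rev_text_id"]  # revision.rev_text_id
    return text[text_id]                       # text.old_id -> old_text

print(current_text("Sample"))  # current wording
```

The same chain as a SQL join (page_latest = rev_id, rev_text_id = old_id) would return the current text directly; walking revision rows for a page gives all revisions.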