Topic on Project:Support desk

Easy way to upgrade from version 1.8 to 1.22?

203.25.255.17 (talkcontribs)

I have a well-established mediawiki, currently at version 1.8.

Is there an easy upgrade path/method?

I'd like to install on a new VM if that is possible, as the old one is running an out of date older version of Linux. I intend to create a new VM with a new OS. That way I don't have to worry about updating all the various support packages for the old system to support the new version of mediawiki.

88.130.77.169 (talkcontribs)

Hi!

Yes, there is! When you set up the new machine, import the database into the new MySQL server and put the wiki files back into place on the new machine. (At that point the wiki may well be broken.) Then go through the steps on Upgrade and you will be able to go directly to version 1.22. Also update your extensions, and should you only get a blank page, make sure to check for the underlying PHP error so that you can get it solved.
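Roughly, the whole path looks like this. This is only a sketch; the paths and the values in [brackets] are placeholders (in the same style used elsewhere in this thread) that you must fill in for your own setup:

```shell
# on the old server: dump the wiki database
mysqldump -u [user] -p[password] [db-name] > wiki.sql

# on the new server: unpack a fresh MediaWiki 1.22 tree
tar -xzf mediawiki-1.22.0.tar.gz -C /var/www/html/

# import the old database into the new MySQL server
mysql -u [user] -p[password] [db-name] < wiki.sql

# carry over settings and uploaded files from the old installation
cp [old-wiki]/LocalSettings.php /var/www/html/mediawiki-1.22.0/
cp -r [old-wiki]/images /var/www/html/mediawiki-1.22.0/

# adjust LocalSettings.php (DB credentials, changed paths),
# then let MediaWiki migrate the database schema
php /var/www/html/mediawiki-1.22.0/maintenance/update.php
```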

203.25.255.17 (talkcontribs)

Thank you for your help. OK, what I did was:

  1. backup the mysql database
  2. backup the old mediawiki directory
  3. transfer them to the new VM
  4. I then unpacked the old mediawiki directory
  5. restored the mysql database to the new mysql system
  6. edited the httpd.conf file to create a virtual server to allow access to the mediawiki directory as the root directory of the webserver
  7. ran the update script.

Here I encountered an error:

# php maintenance/update.php
PHP Notice:  Undefined index: HTTP_USER_AGENT in /var/www/html/mediawiki/extensions/FCKeditor/fckeditor/fckeditor_php5.php on line 37
PHP Fatal error:  Call to undefined function wfLoadExtensionMessages() in /var/www/html/mediawiki/extensions/CategoryWatch/CategoryWatch.php on line 239

Any ideas?

88.130.66.189 (talkcontribs)

I cannot say much about step 6; I have no idea how your virtual host needs to be set up. But since you obviously can access the files, that part should be fine.

Step 7 is wrong: You need to update the MediaWiki source code first. After putting the old files in place again, overwrite them with the files from the 1.22 tarball.

The FCKeditor extension is no longer maintained; remove it from LocalSettings.php and delete its folder from extensions/.
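For illustration, disabling the extension just means commenting out (or deleting) its include line; this is a hypothetical LocalSettings.php fragment, and the exact path depends on how the extension was installed:

```php
<?php
// ... rest of LocalSettings.php ...

// Disabled: FCKeditor is unmaintained and breaks under newer MediaWiki.
// require_once( "$IP/extensions/FCKeditor/FCKeditor.php" );
```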

Then try running update.php again.

203.25.255.17 (talkcontribs)

Again, thanks for your help.

I removed FCKeditor as you suggested and then got an error when rerunning update.php:

PHP Fatal error: Call to undefined function wfLoadExtensionMessages() in /var/www/html/mediawiki/extensions/CategoryWatch/CategoryWatch.php on line 239

So I also removed CategoryWatch.

That made the Update script run successfully.

When I then browsed to the updated mediawiki, it failed to show any of the content from the old mediawiki. All I got was the replaced logo image and the default main page. :(

203.25.255.17 (talkcontribs)

I decided to start again from scratch and follow the upgrade instructions explicitly.

After several problems with the upgrade script complaining about various extensions, I fixed all those and ended up with the following error:

php maintenance/update.php
A database query error has occurred.
Query: SELECT page_id,page_len,page_is_redirect,page_latest,page_content_model FROM `page` WHERE page_namespace = '10' AND page_title = 'Extension_DPL' LIMIT 1
Function: LinkCache::addLinkObj
Error: 1146 Table 'wikidb.page' doesn't exist (localhost)

Ciencia Al Poder (talkcontribs)

The page table should be there. Be sure you imported your old database correctly and that the target database is the correct one (wikidb).

88.130.66.189 (talkcontribs)

You do not always have to remove extensions that do not work properly; often updating them is enough. For CategoryWatch, though, the last version seems to be from 2011, so it is likely broken nowadays as well.

I guess that you have new MySQL credentials and that LocalSettings.php still contains the old ones, making MediaWiki think you want to start a new installation. That is why you only see the default content. Make sure to fix the database username, host and password in LocalSettings.php and then try running update.php again. It should then update the correct database. ;-)

203.25.255.17 (talkcontribs)

I have set up my new MySQL database with the same names and credentials as the old. I have tested this by logging into mysql.

I have restored my old MySQL database into the new one and that was successful. However, when I attempted to run the update script, I got the result I've already mentioned. Having checked the table structure, from the error it appears that my old database doesn't have the same structure as the new one, missing several fields. Have there been significant changes made between the versions?

88.130.66.189 (talkcontribs)

Yes, the database structure changes with basically every major version. Coming from 1.8 to 1.22 there will be a lot of changes, and any of them can be significant. update.php is the script used to apply these changes to your DB.

A clean way to solve this is to empty the new DB and import it from the backup again. Before you do anything else, make sure that the complete database really got imported! E.g. the table page obviously was part of a 1.8 installation, so it must be present in the new DB as well. Depending on how you do the import, it might get interrupted in the middle, leaving you with an incomplete DB. See Manual:Database layout for the tables and columns which must be there for each version.
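One way to do that verification from the shell before importing is to check the dump file for the core table definitions and for mysqldump's completion footer. This is a sketch; the table list and the assumption that the dump came from mysqldump are mine:

```shell
# Report whether a mysqldump file contains the core MediaWiki tables
# and mysqldump's "Dump completed" footer (written only on a clean finish).
check_dump() {
  dump="$1"; missing=0
  for tbl in page revision text; do
    grep -q "CREATE TABLE \`$tbl\`" "$dump" || { echo "MISSING: $tbl"; missing=1; }
  done
  grep -q "Dump completed" "$dump" || { echo "WARNING: no completion footer"; missing=1; }
  [ "$missing" -eq 0 ] && echo "dump looks complete"
}

# Demo against a tiny stand-in file (a real dump is of course far larger):
printf 'CREATE TABLE `page` (...);\nCREATE TABLE `revision` (...);\nCREATE TABLE `text` (...);\n-- Dump completed on 2014-01-23\n' > /tmp/sample.sql
check_dump /tmp/sample.sql
```

If any table is reported missing, re-run mysqldump on the old server rather than importing a truncated file.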

Once you have verified everything is there, try running update.php again...

203.25.255.17 (talkcontribs)

Thank you for your help thus far.

I am, though, becoming increasingly frustrated by this process. It does not appear to be doing what is required of it.

  • I have backed up my original database using both mysqldump and dumpBackup.php.
  • I have backed up my old mediawiki directory.
  • I have transferred these files to my new server.
  • I have unpacked the tarball of the new version of mediawiki on my new server.
  • I have unpacked my old mediawiki directory into a different directory to the new version of mediawiki.
  • I have copied the LocalSettings.php and the images/ directory from my old mediawiki to the new mediawiki directory.
  • I have copied the extensions directory contents from my old mediawiki to the new mediawiki directory.
  • I have restored the mysqldump created file successfully.
# /usr/bin/mysql -u root --password=9xbMkl7M wikidb < /home/mydirectory/mysql_backup-wikidb-20140123.sql
  • I have restored the files created with dumpBackup.php to the new mysql database.
  • I now get an error:
"# php /var/www/html/mediawiki/maintenance/importDump.php < /home/mydirectory/mediawiki_backup-wikidb-20140123.xml
PHP Warning:  Invalid argument supplied for foreach() in /var/www/html/mediawiki/includes/Block.php on line 288
100 (88.73 pages/sec 88.73 revs/sec)
PHP Fatal error:  Call to a member function fetch_array() on a non-object in /var/www/html/mediawiki/includes/db/DatabaseMysqli.php on line 144"

I am sorry but I appear to just be running around in circles. This is very frustrating. I have followed the process on the page I was directed to and it DOES NOT WORK.

Ciencia Al Poder (talkcontribs)

I do not understand why you do this:

  • I have unpacked my old mediawiki directory into a different directory to the new version of mediawiki.

You only need to unpack the MediaWiki tarball; don't transfer the old files to the new server!

About this one:

  • I have copied the LocalSettings.php and the images/ directory from my old mediawiki to the new mediawiki directory.

You also need to edit this file to point it to the new database server, to correct any filesystem path that has changed from the old installation, and maybe $wgServer.

  • I have copied the extensions directory contents from my old mediawiki to the new mediawiki directory.

You shouldn't be doing this. You're basically copying outdated, incompatible extensions into your new installation, and that won't work. Instead, download the new versions of those extensions and place them in the new extensions directory. If you keep having problems, it would make sense to upgrade without enabling any extension at all; once you get the wiki upgraded, enable the extensions, run update.php again (in case any extension needs to perform some schema changes) and test it again.

  • I have restored the files created with dumpBackup.php to the new mysql database.

If you have imported the mysqldump file into the new database, you don't have to import the XML dump as well, because the contents are already present in the database. Please skip this step.

After you have imported the database to the new server, and corrected LocalSettings.php, you should run update.php.

This is basically what Manual:Upgrading says. I don't know how this got so badly confused!

203.25.255.17 (talkcontribs)

Thanks for your help again.

My understanding is that if I don't copy LocalSettings.php and the images directory from my old mediawiki, I lose the customised settings in place and the image files which have been uploaded.

I am, after all, trying to transfer an existing mediawiki which has extensive content to the new mediawiki server.

As for the extensions, several of them have no updated versions available. If I don't copy them, I lose that functionality.

It increasingly looks to me like there is no sure process to transfer the old mediawiki content to the new. I find it bizarre. I've now done this a dozen times or more. As you should understand, I am becoming increasingly frustrated by what is described as "easy" not working. I will attempt it again, using your method but if the content doesn't appear to have been transferred, I will throw in the towel.

88.130.111.240 (talkcontribs)

I understand that this situation is frustrating. However, I do not know a way around this trial-and-error process. :-(

Try seeing it this way: If you had done each single update on its own, you would have done fourteen(!) major updates in the last six years. From my experience I can tell you that an update from one version to the next mostly causes only minor trouble or none at all. I estimate 8 or maybe even 10 of these updates would have worked flawlessly, while a few would not. The only difference for you is that you now see all those errors, which you normally would have seen over the course of six years, on one day. Whether you updated back then or you do it now, the number of errors does not increase. But you do have to solve them.

The problem is that update.php does not have anything like a dry-run mode. You will not know whether it works properly until you test it. And when it does not, you have a half-updated DB and you have to replace it with a backup.

To make things easier, I would put the wiki database into the MySQL server twice: one DB for testing the upgrade (e.g. called "actual-wiki-db") and another one, not upgraded or touched in any way, as a backup ("backup-db"). Each time update.php crashes, copy over the backup again, solve the PHP error or whatever update.php complained about, and try again. I would make this a one-liner in the shell, basically with this syntax:

 mysqldump -h [server] -u [user] -p[password] backup-db | mysql -h [server] -u [user] -p[password] actual-wiki-db

With that, restoring is really just pressing Enter and waiting some time until actual-wiki-db is OK again.

You have to keep LocalSettings.php and the contents of the folder images/. Which extensions do you have in use? To make the update even easier, you can try deactivating ALL extensions in LocalSettings.php by commenting out the require_once lines. Then you can at least be sure that, should there be an error, it is not caused by any extension.
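One way to sketch that mass-deactivation (assuming the extension includes all load from an extensions/ directory, as is conventional) is a sed one-liner that comments out just those require_once lines, keeping a .bak copy for easy reverting:

```shell
# Stand-in LocalSettings.php for demonstration; point sed at your real file.
cat > /tmp/LocalSettings.php <<'PHP'
<?php
$wgSitename = "MyWiki";
require_once( "$IP/extensions/CategoryWatch/CategoryWatch.php" );
require_once( "$IP/extensions/FCKeditor/FCKeditor.php" );
PHP

# Comment out only the require_once lines that load from extensions/;
# other settings and includes are left untouched. Keeps LocalSettings.php.bak.
sed -i.bak '/extensions\//s/^require_once/# require_once/' /tmp/LocalSettings.php
grep require_once /tmp/LocalSettings.php
```

Undoing the change is just restoring the .bak file, or deleting the leading "# " on one line at a time while you hunt for the extension that breaks update.php.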

After you did the above, what is the error message you get when you run update.php now?

203.25.255.17 (talkcontribs)

OK, these are the steps I did.

[root@pdcvwik050 html]# hp /var/www/html/mediawiki/maintenance/importDump.php < /home/bross/mediawiki_backup-wikidb-20140123.xml^C
[root@pdcvwik050 html]# service httpd.stop
httpd.stop: unrecognized service
[root@pdcvwik050 html]# service httpd stop
Stopping httpd:                                            [  OK  ]
[root@pdcvwik050 html]# hp /var/www/html/mediawiki/maintenance/importDump.php < /home/bross/mediawiki_backup-wikidb-20140123.xml
-bash: hp: command not found
[root@pdcvwik050 html]# php /var/www/html/mediawiki/maintenance/importDump.php < /home/bross/mediawiki_backup-wikidb-20140123.xml
A copy of your installation's LocalSettings.php
must exist and be readable in the source directory.
Use --conf to specify it.
[root@pdcvwik050 html]# service httpd start
Starting httpd:                                            [  OK  ]
[root@pdcvwik050 html]# vi mediawiki/LocalSettings.php
[root@pdcvwik050 html]# service httpd restart
Stopping httpd:                                            [  OK  ]
Starting httpd:                                            [  OK  ]
[root@pdcvwik050 html]# service httpd stop
Stopping httpd:                                            [  OK  ]
[root@pdcvwik050 html]# php /var/www/html/mediawiki/maintenance/importDump.php < /home/bross/mediawiki_backup-wikidb-20140123.xml
100 (2.42 pages/sec 2.42 revs/sec)
200 (3.46 pages/sec 3.46 revs/sec)
300 (3.24 pages/sec 3.24 revs/sec)
400 (2.84 pages/sec 2.84 revs/sec)
500 (3.06 pages/sec 3.06 revs/sec)
600 (3.34 pages/sec 3.34 revs/sec)
700 (3.26 pages/sec 3.26 revs/sec)
800 (3.06 pages/sec 3.06 revs/sec)
900 (2.93 pages/sec 2.93 revs/sec)
1000 (2.70 pages/sec 2.70 revs/sec)
1100 (2.42 pages/sec 2.42 revs/sec)
1200 (2.19 pages/sec 2.19 revs/sec)
1300 (2.18 pages/sec 2.18 revs/sec)
1400 (2.17 pages/sec 2.17 revs/sec)
1500 (1.82 pages/sec 1.82 revs/sec)
1600 (1.82 pages/sec 1.82 revs/sec)
1700 (1.82 pages/sec 1.82 revs/sec)
1800 (1.89 pages/sec 1.89 revs/sec)
1900 (1.98 pages/sec 1.98 revs/sec)
2000 (2.02 pages/sec 2.02 revs/sec)
2100 (2.04 pages/sec 2.04 revs/sec)
2200 (2.08 pages/sec 2.08 revs/sec)
2300 (2.13 pages/sec 2.13 revs/sec)
2400 (2.14 pages/sec 2.14 revs/sec)
2500 (2.08 pages/sec 2.08 revs/sec)
2600 (2.08 pages/sec 2.08 revs/sec)
2700 (2.09 pages/sec 2.09 revs/sec)
2800 (2.11 pages/sec 2.11 revs/sec)
2900 (2.15 pages/sec 2.15 revs/sec)
3000 (2.19 pages/sec 2.19 revs/sec)
3100 (2.22 pages/sec 2.22 revs/sec)
3200 (2.25 pages/sec 2.25 revs/sec)
3300 (2.27 pages/sec 2.27 revs/sec)
3400 (2.30 pages/sec 2.30 revs/sec)
3500 (2.29 pages/sec 2.29 revs/sec)
Done!
You might want to run rebuildrecentchanges.php to regenerate RecentChanges
[root@pdcvwik050 html]# ls -al mediawiki/maintenance/rebuildrecentchanges.php
-rw-rw-r-- 1 root asgadmin 9179 Dec  7 06:13 mediawiki/maintenance/rebuildrecentchanges.php
[root@pdcvwik050 html]# php mediawiki/maintenance/rebuildrecentchanges.php
Loading from page and revision tables...
$wgRCMaxAge=7862400 (91 days)
Updating links and size differences...
Loading from user, page, and logging tables...
Flagging bot account edits...
Flagging auto-patrolled edits...
Deleting feed timestamps.
Done.

And then

[root@pdcvwik050 html]# php mediawiki/maintenance/update.php
MediaWiki 1.22.0 Updater

Going to run database updates for wikidb
Depending on the size of your database this may take a while!
Abort with control-c in the next five seconds (skip this countdown with --quick) ... 0
...have ipb_id field in ipblocks table.
...have ipb_expiry field in ipblocks table.
...already have interwiki table
...indexes seem up to 20031107 standards.
...hitcounter table already exists.
...have rc_type field in recentchanges table.
...have user_real_name field in user table.
...querycache table already exists.
...objectcache table already exists.
...categorylinks table already exists.
...have pagelinks; skipping old links table updates
...il_from OK
...have rc_ip field in recentchanges table.
...index PRIMARY already set on image table.
...have rc_id field in recentchanges table.
...have rc_patrolled field in recentchanges table.
...logging table already exists.
...have user_token field in user table.
...have wl_notificationtimestamp field in watchlist table.
...watchlist talk page rows already present.
...user table does not contain user_emailauthenticationtimestamp field.
...page table already exists.
...have log_params field in logging table.
...logging table has correct log_title encoding.
...have ar_rev_id field in archive table.
...have page_len field in page table.
...revision table does not contain inverse_timestamp field.
...have rev_text_id field in revision table.
...have rev_deleted field in revision table.
...have img_width field in image table.
...have img_metadata field in image table.
...have user_email_token field in user table.
...have ar_text_id field in archive table.
...page_namespace is already a full int (int(11)).
...ar_namespace is already a full int (int(11)).
...rc_namespace is already a full int (int(11)).
...wl_namespace is already a full int (int(11)).
...qc_namespace is already a full int (int(11)).
...log_namespace is already a full int (int(11)).
...have img_media_type field in image table.
...already have pagelinks table.
...image table does not contain img_type field.
...already have unique user_name index.
...user_groups table exists and is in current format.
...have ss_total_pages field in site_stats table.
...user_newtalk table already exists.
...transcache table already exists.
...have iw_trans field in interwiki table.
...wl_notificationtimestamp is already nullable.
...index times already set on logging table.
...have ipb_range_start field in ipblocks table.
...no page_random rows needed to be set
...have user_registration field in user table.
...templatelinks table already exists
...externallinks table already exists.
...job table already exists.
...have ss_images field in site_stats table.
...langlinks table already exists.
...querycache_info table already exists.
...filearchive table already exists.
...have ipb_anon_only field in ipblocks table.
...index rc_ns_usertext already set on recentchanges table.
...index rc_user_text already set on recentchanges table.
...have user_newpass_time field in user table.
...redirect table already exists.
...querycachetwo table already exists.
...have ipb_enable_autoblock field in ipblocks table.
...index pl_namespace on table pagelinks includes field pl_from.
...index tl_namespace on table templatelinks includes field tl_from.
...index il_to on table imagelinks includes field il_from.
...have rc_old_len field in recentchanges table.
...have user_editcount field in user table.
...page_restrictions table already exists.
...have log_id field in logging table.
...have rev_parent_id field in revision table.
...have pr_id field in page_restrictions table.
...have rev_len field in revision table.
...have rc_deleted field in recentchanges table.
...have log_deleted field in logging table.
...have ar_deleted field in archive table.
...have ipb_deleted field in ipblocks table.
...have fa_deleted field in filearchive table.
...have ar_len field in archive table.
...have ipb_block_email field in ipblocks table.
...index cl_sortkey on table categorylinks includes field cl_from.
...have oi_metadata field in oldimage table.
...index usertext_timestamp already set on archive table.
...index img_usertext_timestamp already set on image table.
...index oi_usertext_timestamp already set on oldimage table.
...have ar_page_id field in archive table.
...have img_sha1 field in image table.
...protected_titles table already exists.
...have ipb_by_text field in ipblocks table.
...page_props table already exists.
...updatelog table already exists.
...category table already exists.
Populating category table, printing progress markers. For large databases, you
may want to hit Ctrl-C and do this manually with maintenance/
populateCategory.php.
Category population complete.
Done populating category table.
...have ar_parent_id field in archive table.
...have user_last_timestamp field in user_newtalk table.
Populating rev_parent_id fields, printing progress markers. For large
databases, you may want to hit Ctrl-C and do this manually with
maintenance/populateParentId.php.
Populating rev_parent_id column
...doing rev_id from 1 to 200
...doing rev_id from 201 to 400
...doing rev_id from 401 to 600
...doing rev_id from 601 to 800
...doing rev_id from 801 to 1000
...doing rev_id from 1001 to 1200
...doing rev_id from 1201 to 1400
...doing rev_id from 1401 to 1600
...doing rev_id from 1601 to 1800
...doing rev_id from 1801 to 2000
...doing rev_id from 2001 to 2200
...doing rev_id from 2201 to 2400
...doing rev_id from 2401 to 2600
...doing rev_id from 2601 to 2800
...doing rev_id from 2801 to 3000
...doing rev_id from 3001 to 3200
...doing rev_id from 3201 to 3400
...doing rev_id from 3401 to 3600
rev_parent_id population complete ... 0 rows [0 changed]
...protected_titles table has correct pt_title encoding.
...have ss_active_users field in site_stats table.
...ss_active_users user count set...
...have ipb_allow_usertalk field in ipblocks table.
...pl_namespace, tl_namespace, il_to indices are already UNIQUE.
...change_tag table already exists.
...tag_summary table already exists.
...valid_tag table already exists.
...user_properties table already exists.
...log_search table already exists.
...have log_user_text field in logging table.
Populating log_user_text field, printing progress markers. For large
databases, you may want to hit Ctrl-C and do this manually with
maintenance/populateLogUsertext.php.
Nothing to do.
done.
Populating log_search table, printing progress markers. For large
databases, you may want to hit Ctrl-C and do this manually with
maintenance/populateLogSearch.php.
Nothing to do.
done.
...l10n_cache table already exists.
...index ls_field_val already set on log_search table.
...index change_tag_rc_tag already set on change_tag table.
...have rd_interwiki field in redirect table.
Converting tc_time from UNIX epoch to MediaWiki timestamp ...done.
Altering all *_mime_minor fields to 100 bytes in size ...done.
...iwlinks table already exists.
...index iwl_prefix_title_from already set on iwlinks table.
...have ul_value field in updatelog table.
...have iw_api field in interwiki table.
...iwl_prefix key doesn't exist.
...have cl_collation field in categorylinks table.
Updating categorylinks (again) ...done.
...collations up-to-date.
...msg_resource table already exists.
...module_deps table already exists.
...ar_page_revid key doesn't exist.
...index ar_revid already set on archive table.
...ll_lang is up-to-date.
...user_last_timestamp is already nullable.
...index user_email already set on user table.
Modifying up_property field of table user_properties ...done.
...uploadstash table already exists.
...user_former_groups table already exists.
...index type_action already set on logging table.
...have rev_sha1 field in revision table.
...batch conversion of user_options: nothing to migrate. done.
...user table does not contain user_options field.
...have ar_sha1 field in archive table.
...index page_redirect_namespace_len already set on page table.
...have us_chunk_inx field in uploadstash table.
...have job_timestamp field in job table.
...index page_user_timestamp already set on revision table.
...have ipb_parent_block_id field in ipblocks table.
...index ipb_parent_block_id already set on ipblocks table.
...category table does not contain cat_hidden field.
...have rev_content_format field in revision table.
...have rev_content_model field in revision table.
...have ar_content_format field in archive table.
...have ar_content_model field in archive table.
...have page_content_model field in page table.
...site_stats table does not contain ss_admins field.
...recentchanges table does not contain rc_moved_to_title field.
...sites table already exists.
...have fa_sha1 field in filearchive table.
...have job_token field in job table.
...have job_attempts field in job table.
...have us_props field in uploadstash table.
Modifying ug_group field of table user_groups ...done.
Modifying ufg_group field of table user_former_groups ...done.
...index pp_propname_page already set on page_props table.
...index img_media_mime already set on image table.
...iwl_prefix_title_from index is already non-UNIQUE.
...index iwl_prefix_from_title already set on iwlinks table.
...have ar_id field in archive table.
...have el_id field in externallinks table.
...site_stats is populated...done.
Checking existence of old default messages...done.
Populating rev_len column
...doing rev_id from 1 to 200
...doing rev_id from 201 to 400
...doing rev_id from 401 to 600
...doing rev_id from 601 to 800
...doing rev_id from 801 to 1000
...doing rev_id from 1001 to 1200
...doing rev_id from 1201 to 1400
...doing rev_id from 1401 to 1600
...doing rev_id from 1601 to 1800
...doing rev_id from 1801 to 2000
...doing rev_id from 2001 to 2200
...doing rev_id from 2201 to 2400
...doing rev_id from 2401 to 2600
...doing rev_id from 2601 to 2800
...doing rev_id from 2801 to 3000
...doing rev_id from 3001 to 3200
...doing rev_id from 3201 to 3400
...doing rev_id from 3401 to 3600
rev_len population complete ... 0 rows changed (0 missing)
Populating rev_sha1 column
...doing rev_id from 1 to 200
...doing rev_id from 201 to 400
...doing rev_id from 401 to 600
...doing rev_id from 601 to 800
...doing rev_id from 801 to 1000
...doing rev_id from 1001 to 1200
...doing rev_id from 1201 to 1400
...doing rev_id from 1401 to 1600
...doing rev_id from 1601 to 1800
...doing rev_id from 1801 to 2000
...doing rev_id from 2001 to 2200
...doing rev_id from 2201 to 2400
...doing rev_id from 2401 to 2600
...doing rev_id from 2601 to 2800
...doing rev_id from 2801 to 3000
...doing rev_id from 3001 to 3200
...doing rev_id from 3201 to 3400
...doing rev_id from 3401 to 3600
Populating ar_sha1 column
...archive table seems to be empty.
Populating ar_sha1 column legacy rows
rev_sha1 and ar_sha1 population complete [0 revision rows, 0 archive rows].
Populating img_sha1 field

Done 0 files in 0.0 seconds
Fixing protocol-relative entries in the externallinks table...
Done, 0 rows updated.
Populating fa_sha1 field from fa_storage_key

Done 0 files in 0.0 seconds
Purging caches...done.

Done.
[root@pdcvwik050 html]# service httpd start
Starting httpd:                                            [  OK  ]

It does not appear to have transferred any of the content. I cannot see any of the many, many pages present in the old mediawiki.

All I get is the primitive start page from when I set up the new mediawiki. I have tried reloading. No pages. I am at my wits' end.

Why is there no clear and easy means of updating this product? 14 updates or one, it should just work.

88.130.85.234 (talkcontribs)

Ok, with the database please now do the following:

  • Remove the (upgraded) database. (Btw.: It looks like update.php now finishes successfully!!! But(!) I fear you might have broken the DB with the importDump.php script. Do not use it.)
  • Replace it with a backup.
  • Before running update.php again, check that the tables "page" and "text" are present. They must contain rows: "page" one for each page (not one row in total, but many), "text" basically one for each revision (read: very many).

Is that the case?

203.25.255.17 (talkcontribs)

I am unsure what you want me to do:

  • Remove the (upgraded) database. (Btw.: It looks like update.php now finishes successfully!!! But(!) I fear you might have broken the DB with the importDump.php script. Do not use it.)
  • Replace it with a backup.

Backup? I have only a backup from the old mediawiki. I used importDump.php to import it. Is that the one you want me to use? If I don't use importDump.php, how do I get the data from the old mediawiki database into the new one?

  • Before running update.php again, check that the tables "page" and "text" are present. They must contain rows: "page" one for each page (not one row in total, but many), "text" basically one for each revision (read: very many).

How do I do that?

88.130.85.234 (talkcontribs)

No, you should NOT use importDump.php. You only have to import the database into MySQL; the pages and all the old wiki content come along when you import it. That you used importDump.php on that database can cause problems in the future, and that is what I want to save you from.

What I want is this:

On the new MySQL server, throw the upgraded DB away. Instead, put the backup of the old DB in place: the one that has not been updated.

You can look into the DB with phpMyAdmin if you have it (easy to use, nice GUI) or with MySQL on the command line:

mysql -u [user] -h [host] -p[password]
use `actual-wiki-db`;
SELECT COUNT(*) FROM page;
SELECT COUNT(*) FROM text;

203.25.255.17 (talkcontribs)

OK, you don't want me to use importDump.php.

Do you want me to use the backup I made with mysqldump?

Then use the command to import it:

/usr/bin/mysql -u root --password=[password] wikidb < /home/mydirectory/mysql_backup-wikidb-20140123.sql

Then use the mysql commands you've just supplied to inspect the database?

Ciencia Al Poder (talkcontribs)

This is exactly what I said in my last message. It is also frustrating for me that you keep coming back to the same steps I have already pointed out are wrong.

88.130.85.234 (talkcontribs)

YES, YES, YES!!!

203.25.255.17 (talkcontribs)

Apologies for the delay in replying but we had a long weekend downunder.

This is the result I got:

# mysql -u root -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 89
Server version: 5.1.66 Source distribution

Copyright (c) 2000, 2012, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> use wikidb;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> SELECT COUNT(*) FROM page;
+----------+
| COUNT(*) |
+----------+
|     3540 |
+----------+
1 row in set (0.00 sec)

mysql> SELECT COUNT(*) FROM text;
+----------+
| COUNT(*) |
+----------+
|    14071 |
+----------+
1 row in set (0.24 sec)

So, now what?

88.130.100.168 (talkcontribs)

Earlier you said that the table "page" was missing in your DB, which made me ask whether your DB got imported completely (it obviously had not at that time).

Now you know that this time this database contains a total of 3540 pages (including deleted pages). There are 14071 text revisions in the DB for these pages, which means that on average each page has around 4 revisions. These numbers should reflect what you expect to have inside the wiki. If they do, then double-check that in LocalSettings.php you have set $wgDBuser, $wgDBname and $wgDBpassword to the correct values. Check by using

mysql -u [user] -h [host] -p[password]
use [db-name];

on the shell. Using the username, password and DB name from LocalSettings.php should not give you an error message.
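To take typos out of the equation, those values can be read straight out of LocalSettings.php and echoed back before you try them against MySQL. This is a sketch: the sed pattern assumes the conventional $wgDBname = "..."; style of assignment, and the heredoc stands in for your real file:

```shell
# Stand-in LocalSettings.php; in practice point the function at the real file.
cat > /tmp/LocalSettings.php <<'PHP'
<?php
$wgDBname = "wikidb";
$wgDBuser = "root";
$wgDBserver = "localhost";
PHP

# Extract the value of a $wg... assignment from LocalSettings.php.
getcfg() { sed -n "s/^\\\$$1 *= *[\"']\\([^\"']*\\)[\"'];/\\1/p" /tmp/LocalSettings.php; }

db="$(getcfg wgDBname)"; user="$(getcfg wgDBuser)"; host="$(getcfg wgDBserver)"
echo "$user@$host/$db"
# then try exactly these values:  mysql -u "$user" -h "$host" -p "$db"
```

If the mysql login with exactly these extracted values fails, the wiki's configuration is what is wrong, not the database.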

When that works, run update.php again and post its output here.

203.25.255.17 (talkcontribs)

Output from update.php:

# php update.php
MediaWiki 1.22.0 Updater

Going to run database updates for wikidb
Depending on the size of your database this may take a while!
Abort with control-c in the next five seconds (skip this countdown with --quick) ... 0
...have ipb_id field in ipblocks table.
...have ipb_expiry field in ipblocks table.
...already have interwiki table
...indexes seem up to 20031107 standards.
...hitcounter table already exists.
...have rc_type field in recentchanges table.
...have user_real_name field in user table.
...querycache table already exists.
...objectcache table already exists.
...categorylinks table already exists.
...have pagelinks; skipping old links table updates
...il_from OK
...have rc_ip field in recentchanges table.
...index PRIMARY already set on image table.
...have rc_id field in recentchanges table.
...have rc_patrolled field in recentchanges table.
...logging table already exists.
...have user_token field in user table.
...have wl_notificationtimestamp field in watchlist table.
Adding missing watchlist talk page rows... done.
...user table does not contain user_emailauthenticationtimestamp field.
...page table already exists.
...have log_params field in logging table.
...logging table has correct log_title encoding.
...have ar_rev_id field in archive table.
...have page_len field in page table.
...revision table does not contain inverse_timestamp field.
...have rev_text_id field in revision table.
...have rev_deleted field in revision table.
...have img_width field in image table.
...have img_metadata field in image table.
...have user_email_token field in user table.
...have ar_text_id field in archive table.
...page_namespace is already a full int (int(11)).
...ar_namespace is already a full int (int(11)).
...rc_namespace is already a full int (int(11)).
...wl_namespace is already a full int (int(11)).
...qc_namespace is already a full int (int(11)).
...log_namespace is already a full int (int(11)).
...have img_media_type field in image table.
...already have pagelinks table.
...image table does not contain img_type field.
...already have unique user_name index.
...user_groups table exists and is in current format.
...have ss_total_pages field in site_stats table.
...user_newtalk table already exists.
...transcache table already exists.
...have iw_trans field in interwiki table.
...wl_notificationtimestamp is already nullable.
...index times already set on logging table.
...have ipb_range_start field in ipblocks table.
...no page_random rows needed to be set
...have user_registration field in user table.
...templatelinks table already exists
...externallinks table already exists.
...job table already exists.
...have ss_images field in site_stats table.
...langlinks table already exists.
...querycache_info table already exists.
...filearchive table already exists.
...have ipb_anon_only field in ipblocks table.
...index rc_ns_usertext already set on recentchanges table.
...index rc_user_text already set on recentchanges table.
...have user_newpass_time field in user table.
...redirect table already exists.
...querycachetwo table already exists.
...have ipb_enable_autoblock field in ipblocks table.
...index pl_namespace on table pagelinks includes field pl_from.
...index tl_namespace on table templatelinks includes field tl_from.
...index il_to on table imagelinks includes field il_from.
...have rc_old_len field in recentchanges table.
...have user_editcount field in user table.
...page_restrictions table already exists.
...have log_id field in logging table.
...have rev_parent_id field in revision table.
...have pr_id field in page_restrictions table.
...have rev_len field in revision table.
...have rc_deleted field in recentchanges table.
...have log_deleted field in logging table.
...have ar_deleted field in archive table.
...have ipb_deleted field in ipblocks table.
...have fa_deleted field in filearchive table.
...have ar_len field in archive table.
...have ipb_block_email field in ipblocks table.
...index cl_sortkey on table categorylinks includes field cl_from.
...have oi_metadata field in oldimage table.
...index usertext_timestamp already set on archive table.
...index img_usertext_timestamp already set on image table.
...index oi_usertext_timestamp already set on oldimage table.
...have ar_page_id field in archive table.
...have img_sha1 field in image table.
...protected_titles table already exists.
...have ipb_by_text field in ipblocks table.
...page_props table already exists.
...updatelog table already exists.
...category table already exists.
Populating category table, printing progress markers. For large databases, you
may want to hit Ctrl-C and do this manually with maintenance/
populateCategory.php.
Category population complete.
Done populating category table.
...have ar_parent_id field in archive table.
...have user_last_timestamp field in user_newtalk table.
Populating rev_parent_id fields, printing progress markers. For large
databases, you may want to hit Ctrl-C and do this manually with
maintenance/populateParentId.php.
Populating rev_parent_id column
...doing rev_id from 1 to 200
...doing rev_id from 201 to 400
...doing rev_id from 401 to 600
...doing rev_id from 601 to 800
...doing rev_id from 801 to 1000
...doing rev_id from 1001 to 1200
...doing rev_id from 1201 to 1400
...doing rev_id from 1401 to 1600
...doing rev_id from 1601 to 1800
...doing rev_id from 1801 to 2000
...doing rev_id from 2001 to 2200
...doing rev_id from 2201 to 2400
...doing rev_id from 2401 to 2600
...doing rev_id from 2601 to 2800
...doing rev_id from 2801 to 3000
...doing rev_id from 3001 to 3200
...doing rev_id from 3201 to 3400
...doing rev_id from 3401 to 3600
...doing rev_id from 3601 to 3800
...doing rev_id from 3801 to 4000
...doing rev_id from 4001 to 4200
...doing rev_id from 4201 to 4400
...doing rev_id from 4401 to 4600
...doing rev_id from 4601 to 4800
...doing rev_id from 4801 to 5000
...doing rev_id from 5001 to 5200
...doing rev_id from 5201 to 5400
...doing rev_id from 5401 to 5600
...doing rev_id from 5601 to 5800
...doing rev_id from 5801 to 6000
...doing rev_id from 6001 to 6200
...doing rev_id from 6201 to 6400
...doing rev_id from 6401 to 6600
...doing rev_id from 6601 to 6800
...doing rev_id from 6801 to 7000
...doing rev_id from 7001 to 7200
...doing rev_id from 7201 to 7400
...doing rev_id from 7401 to 7600
...doing rev_id from 7601 to 7800
...doing rev_id from 7801 to 8000
...doing rev_id from 8001 to 8200
...doing rev_id from 8201 to 8400
...doing rev_id from 8401 to 8600
...doing rev_id from 8601 to 8800
...doing rev_id from 8801 to 9000
...doing rev_id from 9001 to 9200
...doing rev_id from 9201 to 9400
...doing rev_id from 9401 to 9600
...doing rev_id from 9601 to 9800
...doing rev_id from 9801 to 10000
...doing rev_id from 10001 to 10200
...doing rev_id from 10201 to 10400
...doing rev_id from 10401 to 10600
...doing rev_id from 10601 to 10800
...doing rev_id from 10801 to 11000
...doing rev_id from 11001 to 11200
...doing rev_id from 11201 to 11400
...doing rev_id from 11401 to 11600
...doing rev_id from 11601 to 11800
...doing rev_id from 11801 to 12000
...doing rev_id from 12001 to 12200
...doing rev_id from 12201 to 12400
...doing rev_id from 12401 to 12600
...doing rev_id from 12601 to 12800
...doing rev_id from 12801 to 13000
...doing rev_id from 13001 to 13200
...doing rev_id from 13201 to 13400
...doing rev_id from 13401 to 13600
...doing rev_id from 13601 to 13800
...doing rev_id from 13801 to 14000
...doing rev_id from 14001 to 14200
...doing rev_id from 14201 to 14400
...doing rev_id from 14401 to 14600
...doing rev_id from 14601 to 14800
rev_parent_id population complete ... 0 rows [0 changed]
...protected_titles table has correct pt_title encoding.
...have ss_active_users field in site_stats table.
...ss_active_users user count set...
...have ipb_allow_usertalk field in ipblocks table.
...pl_namespace, tl_namespace, il_to indices are already UNIQUE.
...change_tag table already exists.
...tag_summary table already exists.
...valid_tag table already exists.
...user_properties table already exists.
...log_search table already exists.
...have log_user_text field in logging table.
Populating log_user_text field, printing progress markers. For large
databases, you may want to hit Ctrl-C and do this manually with
maintenance/populateLogUsertext.php.
...doing log_id from 1 to 100
...doing log_id from 101 to 200
...doing log_id from 201 to 300
...doing log_id from 301 to 400
...doing log_id from 401 to 500
...doing log_id from 501 to 600
...doing log_id from 601 to 700
...doing log_id from 701 to 800
...doing log_id from 801 to 900
...doing log_id from 901 to 1000
...doing log_id from 1001 to 1100
...doing log_id from 1101 to 1200
...doing log_id from 1201 to 1300
...doing log_id from 1301 to 1400
...doing log_id from 1401 to 1500
...doing log_id from 1501 to 1600
...doing log_id from 1601 to 1700
...doing log_id from 1701 to 1800
...doing log_id from 1801 to 1900
...doing log_id from 1901 to 2000
...doing log_id from 2001 to 2100
...doing log_id from 2101 to 2200
...doing log_id from 2201 to 2300
...doing log_id from 2301 to 2400
...doing log_id from 2401 to 2500
...doing log_id from 2501 to 2600
...doing log_id from 2601 to 2700
...doing log_id from 2701 to 2800
...doing log_id from 2801 to 2900
...doing log_id from 2901 to 3000
...doing log_id from 3001 to 3100
...doing log_id from 3101 to 3200
...doing log_id from 3201 to 3300
...doing log_id from 3301 to 3400
...doing log_id from 3401 to 3500
...doing log_id from 3501 to 3600
...doing log_id from 3601 to 3700
...doing log_id from 3701 to 3800
...doing log_id from 3801 to 3900
...doing log_id from 3901 to 4000
...doing log_id from 4001 to 4100
...doing log_id from 4101 to 4200
...doing log_id from 4201 to 4300
...doing log_id from 4301 to 4400
...doing log_id from 4401 to 4500
...doing log_id from 4501 to 4600
...doing log_id from 4601 to 4700
...doing log_id from 4701 to 4800
...doing log_id from 4801 to 4900
...doing log_id from 4901 to 5000
...doing log_id from 5001 to 5100
...doing log_id from 5101 to 5200
Done populating log_user_text field.
done.
Populating log_search table, printing progress markers. For large
databases, you may want to hit Ctrl-C and do this manually with
maintenance/populateLogSearch.php.
...doing log_id from 1 to 100
...doing log_id from 101 to 200
...doing log_id from 201 to 300
...doing log_id from 301 to 400
...doing log_id from 401 to 500
...doing log_id from 501 to 600
...doing log_id from 601 to 700
...doing log_id from 701 to 800
...doing log_id from 801 to 900
...doing log_id from 901 to 1000
...doing log_id from 1001 to 1100
...doing log_id from 1101 to 1200
...doing log_id from 1201 to 1300
...doing log_id from 1301 to 1400
...doing log_id from 1401 to 1500
...doing log_id from 1501 to 1600
...doing log_id from 1601 to 1700
...doing log_id from 1701 to 1800
...doing log_id from 1801 to 1900
...doing log_id from 1901 to 2000
...doing log_id from 2001 to 2100
...doing log_id from 2101 to 2200
...doing log_id from 2201 to 2300
...doing log_id from 2301 to 2400
...doing log_id from 2401 to 2500
...doing log_id from 2501 to 2600
...doing log_id from 2601 to 2700
...doing log_id from 2701 to 2800
...doing log_id from 2801 to 2900
...doing log_id from 2901 to 3000
...doing log_id from 3001 to 3100
...doing log_id from 3101 to 3200
...doing log_id from 3201 to 3300
...doing log_id from 3301 to 3400
...doing log_id from 3401 to 3500
...doing log_id from 3501 to 3600
...doing log_id from 3601 to 3700
...doing log_id from 3701 to 3800
...doing log_id from 3801 to 3900
...doing log_id from 3901 to 4000
...doing log_id from 4001 to 4100
...doing log_id from 4101 to 4200
...doing log_id from 4201 to 4300
...doing log_id from 4301 to 4400
...doing log_id from 4401 to 4500
...doing log_id from 4501 to 4600
...doing log_id from 4601 to 4700
...doing log_id from 4701 to 4800
...doing log_id from 4801 to 4900
...doing log_id from 4901 to 5000
...doing log_id from 5001 to 5100
...doing log_id from 5101 to 5200
Done populating log_search table.
done.
...l10n_cache table already exists.
...index ls_field_val already set on log_search table.
...index change_tag_rc_tag already set on change_tag table.
...have rd_interwiki field in redirect table.
Converting tc_time from UNIX epoch to MediaWiki timestamp ...done.
Altering all *_mime_minor fields to 100 bytes in size ...done.
Creating iwlinks table ...done.
...index iwl_prefix_title_from already set on iwlinks table.
Adding ul_value field to table updatelog ...done.
Adding iw_api field to table interwiki ...done.
...iwl_prefix key doesn't exist.
Adding cl_collation field to table categorylinks ...done.
...categorylinks up-to-date.
Updating category collations...Fixing collation for 5927 rows.
Selecting next 10000 rows... processing...5927 done.
5927 rows processed
...done.
Creating msg_resource table ...done.
Creating module_deps table ...done.
...ar_page_revid key doesn't exist.
Adding index ar_revid to table archive ...done.
...ll_lang is up-to-date.
Making user_last_timestamp nullable ...done.
Adding index user_email to table user ...done.
Modifying up_property field of table user_properties ...done.
Creating uploadstash table ...done.
Creating user_former_groups table ...done.
Adding index type_action to table logging ...done.
Adding rev_sha1 field to table revision ...done.
...batch conversion of user_options: done. Converted 0 user records.
done.
Table user contains user_options field. Dropping ...done.
Adding ar_sha1 field to table archive ...done.
Adding index page_redirect_namespace_len to table page ...done.
Adding us_chunk_inx field to table uploadstash ...done.
Adding job_timestamp field to table job ...done.
Adding index page_user_timestamp to table revision ...done.
Adding ipb_parent_block_id field to table ipblocks ...done.
Adding index ipb_parent_block_id to table ipblocks ...done.
Table category contains cat_hidden field. Dropping ...done.
Adding rev_content_format field to table revision ...done.
Adding rev_content_model field to table revision ...done.
Adding ar_content_format field to table archive ...done.
Adding ar_content_model field to table archive ...done.
Adding page_content_model field to table page ...done.
Table site_stats contains ss_admins field. Dropping ...done.
Table recentchanges contains rc_moved_to_title field. Dropping ...done.
Creating sites table ...done.
Adding fa_sha1 field to table filearchive ...done.
Adding job_token field to table job ...done.
Adding job_attempts field to table job ...done.
Adding us_props field to table uploadstash ...done.
Modifying ug_group field of table user_groups ...done.
Modifying ufg_group field of table user_former_groups ...done.
Adding index pp_propname_page to table page_props ...done.
Adding index img_media_mime to table image ...done.
Making iwl_prefix_title_from index non-UNIQUE ...done.
Adding index iwl_prefix_from_title to table iwlinks ...done.
Adding ar_id field to table archive ...done.
Adding el_id field to table externallinks ...done.
...site_stats is populated...done.
Checking existence of old default messages...done.
Populating rev_len column
...doing rev_id from 1 to 200
...doing rev_id from 201 to 400
...doing rev_id from 401 to 600
...doing rev_id from 601 to 800
...doing rev_id from 801 to 1000
...doing rev_id from 1001 to 1200
...doing rev_id from 1201 to 1400
...doing rev_id from 1401 to 1600
...doing rev_id from 1601 to 1800
...doing rev_id from 1801 to 2000
...doing rev_id from 2001 to 2200
...doing rev_id from 2201 to 2400
...doing rev_id from 2401 to 2600
...doing rev_id from 2601 to 2800
...doing rev_id from 2801 to 3000
...doing rev_id from 3001 to 3200
...doing rev_id from 3201 to 3400
...doing rev_id from 3401 to 3600
...doing rev_id from 3601 to 3800
...doing rev_id from 3801 to 4000
...doing rev_id from 4001 to 4200
...doing rev_id from 4201 to 4400
...doing rev_id from 4401 to 4600
...doing rev_id from 4601 to 4800
...doing rev_id from 4801 to 5000
...doing rev_id from 5001 to 5200
...doing rev_id from 5201 to 5400
...doing rev_id from 5401 to 5600
...doing rev_id from 5601 to 5800
...doing rev_id from 5801 to 6000
...doing rev_id from 6001 to 6200
...doing rev_id from 6201 to 6400
...doing rev_id from 6401 to 6600
...doing rev_id from 6601 to 6800
...doing rev_id from 6801 to 7000
...doing rev_id from 7001 to 7200
...doing rev_id from 7201 to 7400
...doing rev_id from 7401 to 7600
...doing rev_id from 7601 to 7800
...doing rev_id from 7801 to 8000
...doing rev_id from 8001 to 8200
...doing rev_id from 8201 to 8400
...doing rev_id from 8401 to 8600
...doing rev_id from 8601 to 8800
...doing rev_id from 8801 to 9000
...doing rev_id from 9001 to 9200
...doing rev_id from 9201 to 9400
...doing rev_id from 9401 to 9600
...doing rev_id from 9601 to 9800
...doing rev_id from 9801 to 10000
...doing rev_id from 10001 to 10200
...doing rev_id from 10201 to 10400
...doing rev_id from 10401 to 10600
...doing rev_id from 10601 to 10800
...doing rev_id from 10801 to 11000
...doing rev_id from 11001 to 11200
...doing rev_id from 11201 to 11400
...doing rev_id from 11401 to 11600
...doing rev_id from 11601 to 11800
...doing rev_id from 11801 to 12000
...doing rev_id from 12001 to 12200
...doing rev_id from 12201 to 12400
...doing rev_id from 12401 to 12600
...doing rev_id from 12601 to 12800
...doing rev_id from 12801 to 13000
...doing rev_id from 13001 to 13200
...doing rev_id from 13201 to 13400
...doing rev_id from 13401 to 13600
...doing rev_id from 13601 to 13800
...doing rev_id from 13801 to 14000
...doing rev_id from 14001 to 14200
...doing rev_id from 14201 to 14400
...doing rev_id from 14401 to 14600
...doing rev_id from 14601 to 14800
rev_len population complete ... 80 rows changed (0 missing)
Populating rev_sha1 column
...doing rev_id from 1 to 200
...doing rev_id from 201 to 400
...doing rev_id from 401 to 600
...doing rev_id from 601 to 800
...doing rev_id from 801 to 1000
...doing rev_id from 1001 to 1200
...doing rev_id from 1201 to 1400
...doing rev_id from 1401 to 1600
...doing rev_id from 1601 to 1800
...doing rev_id from 1801 to 2000
...doing rev_id from 2001 to 2200
...doing rev_id from 2201 to 2400
...doing rev_id from 2401 to 2600
...doing rev_id from 2601 to 2800
...doing rev_id from 2801 to 3000
...doing rev_id from 3001 to 3200
...doing rev_id from 3201 to 3400
...doing rev_id from 3401 to 3600
...doing rev_id from 3601 to 3800
...doing rev_id from 3801 to 4000
...doing rev_id from 4001 to 4200
...doing rev_id from 4201 to 4400
...doing rev_id from 4401 to 4600
...doing rev_id from 4601 to 4800
...doing rev_id from 4801 to 5000
...doing rev_id from 5001 to 5200
...doing rev_id from 5201 to 5400
...doing rev_id from 5401 to 5600
...doing rev_id from 5601 to 5800
...doing rev_id from 5801 to 6000
...doing rev_id from 6001 to 6200
...doing rev_id from 6201 to 6400
...doing rev_id from 6401 to 6600
...doing rev_id from 6601 to 6800
...doing rev_id from 6801 to 7000
...doing rev_id from 7001 to 7200
...doing rev_id from 7201 to 7400
...doing rev_id from 7401 to 7600
...doing rev_id from 7601 to 7800
...doing rev_id from 7801 to 8000
...doing rev_id from 8001 to 8200
...doing rev_id from 8201 to 8400
...doing rev_id from 8401 to 8600
...doing rev_id from 8601 to 8800
...doing rev_id from 8801 to 9000
...doing rev_id from 9001 to 9200
...doing rev_id from 9201 to 9400
...doing rev_id from 9401 to 9600
...doing rev_id from 9601 to 9800
...doing rev_id from 9801 to 10000
...doing rev_id from 10001 to 10200
...doing rev_id from 10201 to 10400
...doing rev_id from 10401 to 10600
...doing rev_id from 10601 to 10800
...doing rev_id from 10801 to 11000
...doing rev_id from 11001 to 11200
...doing rev_id from 11201 to 11400
...doing rev_id from 11401 to 11600
...doing rev_id from 11601 to 11800
...doing rev_id from 11801 to 12000
...doing rev_id from 12001 to 12200
...doing rev_id from 12201 to 12400
...doing rev_id from 12401 to 12600
...doing rev_id from 12601 to 12800
...doing rev_id from 12801 to 13000
...doing rev_id from 13001 to 13200
...doing rev_id from 13201 to 13400
...doing rev_id from 13401 to 13600
...doing rev_id from 13601 to 13800
...doing rev_id from 13801 to 14000
...doing rev_id from 14001 to 14200
...doing rev_id from 14201 to 14400
...doing rev_id from 14401 to 14600
...doing rev_id from 14601 to 14800
Populating ar_sha1 column
...doing ar_rev_id from 261 to 460
...doing ar_rev_id from 461 to 660
...doing ar_rev_id from 661 to 860
...doing ar_rev_id from 861 to 1060
...doing ar_rev_id from 1061 to 1260
...doing ar_rev_id from 1261 to 1460
...doing ar_rev_id from 1461 to 1660
...doing ar_rev_id from 1661 to 1860
...doing ar_rev_id from 1861 to 2060
...doing ar_rev_id from 2061 to 2260
...doing ar_rev_id from 2261 to 2460
...doing ar_rev_id from 2461 to 2660
...doing ar_rev_id from 2661 to 2860
...doing ar_rev_id from 2861 to 3060
...doing ar_rev_id from 3061 to 3260
...doing ar_rev_id from 3261 to 3460
...doing ar_rev_id from 3461 to 3660
...doing ar_rev_id from 3661 to 3860
...doing ar_rev_id from 3861 to 4060
...doing ar_rev_id from 4061 to 4260
...doing ar_rev_id from 4261 to 4460
...doing ar_rev_id from 4461 to 4660
...doing ar_rev_id from 4661 to 4860
...doing ar_rev_id from 4861 to 5060
...doing ar_rev_id from 5061 to 5260
...doing ar_rev_id from 5261 to 5460
...doing ar_rev_id from 5461 to 5660
...doing ar_rev_id from 5661 to 5860
...doing ar_rev_id from 5861 to 6060
...doing ar_rev_id from 6061 to 6260
...doing ar_rev_id from 6261 to 6460
...doing ar_rev_id from 6461 to 6660
...doing ar_rev_id from 6661 to 6860
...doing ar_rev_id from 6861 to 7060
...doing ar_rev_id from 7061 to 7260
...doing ar_rev_id from 7261 to 7460
...doing ar_rev_id from 7461 to 7660
...doing ar_rev_id from 7661 to 7860
...doing ar_rev_id from 7861 to 8060
...doing ar_rev_id from 8061 to 8260
...doing ar_rev_id from 8261 to 8460
...doing ar_rev_id from 8461 to 8660
...doing ar_rev_id from 8661 to 8860
...doing ar_rev_id from 8861 to 9060
...doing ar_rev_id from 9061 to 9260
...doing ar_rev_id from 9261 to 9460
...doing ar_rev_id from 9461 to 9660
...doing ar_rev_id from 9661 to 9860
...doing ar_rev_id from 9861 to 10060
...doing ar_rev_id from 10061 to 10260
...doing ar_rev_id from 10261 to 10460
...doing ar_rev_id from 10461 to 10660
...doing ar_rev_id from 10661 to 10860
...doing ar_rev_id from 10861 to 11060
...doing ar_rev_id from 11061 to 11260
...doing ar_rev_id from 11261 to 11460
...doing ar_rev_id from 11461 to 11660
...doing ar_rev_id from 11661 to 11860
...doing ar_rev_id from 11861 to 12060
...doing ar_rev_id from 12061 to 12260
...doing ar_rev_id from 12261 to 12460
...doing ar_rev_id from 12461 to 12660
...doing ar_rev_id from 12661 to 12860
...doing ar_rev_id from 12861 to 13060
...doing ar_rev_id from 13061 to 13260
...doing ar_rev_id from 13261 to 13460
...doing ar_rev_id from 13461 to 13660
...doing ar_rev_id from 13661 to 13860
Populating ar_sha1 column legacy rows
rev_sha1 and ar_sha1 population complete [14576 revision rows, 212 archive rows].
Populating img_sha1 field

Done 0 files in 0.0 seconds
Fixing protocol-relative entries in the externallinks table...
Done, 0 rows updated.
Populating fa_sha1 field from fa_storage_key

Done 11 files in 0.0 seconds
Purging caches...done.

Done.
88.130.64.74 (talkcontribs)

That looks very good. The database update finished successfully.

Now try opening your wiki in a web browser. Does it display? Do you see any error messages?

Which extensions did you have in use before the upgrade? I know you have already removed FCKeditor and disabled CategoryWatch (the newest version from SVN might work). Did you use any other extensions? If so, which ones?

203.25.255.17 (talkcontribs)

Thank you for your patience and your help to this point.

It opens, but what displays is rather strange: it does not look like my old MediaWiki did. There are no images, and none of the links appear to work.

So, how do I get the images across and the links to work?

88.130.64.74 (talkcontribs)

Did you have an .htaccess file inside the old wiki's folder? This file is necessary when you use features like short URLs. Make sure you copy that file over as well.
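For reference, a typical short-URL .htaccess contains rules along these lines. This is only a sketch, and the /wiki prefix is an assumption; your old file's exact rules may differ:

```apache
# Rewrite /wiki/Page_title to index.php -- adjust the prefix to your setup
RewriteEngine On
RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/index.php [L]
```

Without these rules in place on the new server, every short URL from the old wiki will 404 even though the pages exist in the database.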

If that did not help, please upload a screenshot of what the main page now looks like! Is the wiki publicly accessible?

203.25.255.17 (talkcontribs)
88.130.92.11 (talkcontribs)

I see. You should take your old LocalSettings.php file and compare it to the new one. There might be settings in the old file which are missing from the new one, and those might cause some of these changes.

E.g. in the old version you used the skin "monobook"; in the new one you have apparently changed to "vector". The situation might improve when you set $wgDefaultSkin in LocalSettings.php back to monobook. If that does not fix the rendering problems (clear the caches between tests!), then you might want to have a look at the CSS styles. CSS styles can also be present on pages in your wiki, namely on MediaWiki:Common.css, MediaWiki:Monobook.css (only used with the monobook skin) or MediaWiki:Vector.css (only used with vector). These also influence how content is rendered. Possibly you had also added styles to the files in the skins/ folder in the filesystem.

I also see that $wgLogo is no longer set, so you only see the default logo.
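Taken together, the relevant LocalSettings.php lines would look something like this (the logo path is a hypothetical placeholder; point it at wherever your old logo file now lives):

```php
$wgDefaultSkin = 'monobook';                    // restore the skin the old wiki used
$wgLogo = "$wgScriptPath/images/wiki-logo.png"; // hypothetical path to your old logo file
```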

Ciencia Al Poder (talkcontribs)
203.25.255.17 (talkcontribs)

Thanks for your help again. Changing the skin and enabling the ParserFunction seems to have got the look and feel back to normal.

However, when I try to access any of the pages in the MediaWiki, they don't seem to exist. I cannot get any listings or anything like that on the various contents pages. This is most disconcerting, as the database claims all those pages are in it.

Ciencia Al Poder (talkcontribs)

But your previous screenshots show content pages on the wiki, so they exist. Can you please elaborate? Also, please make a new thread for this.

88.130.112.154 (talkcontribs)

Yes, the screenshots do show some custom content, so there must be pages in the wiki. And you have queried the DB and got around 3500 pages in it...

203.25.255.17 (talkcontribs)
Reply to "Easy way to upgrade from version 1.8 to 1.22?"