On this page, 3.x refers to the 'old' (current) version of the manual, and 3.z refers to the new version to be created...


#Use phpMyAdmin to back up all of the wiki tables!
#Locate all pages on the wiki that have 3.x in the title - from the wiki: toolbox, Special pages, All pages to get a full list of wiki pages.


#Copy/paste it into a text processor - it is in three columns separated by tabs. Use a regular expression search/replace to replace the tabs with carriage returns. In gedit, it is best to replace \t with \n rather than \r. You now have a list of all pages on the wiki, one title per line. Save it as a text file (I'll call it myfile.txt for this example).


#Use grep to find only the pages with 3.x in the title: ''grep "3\.x" myfile.txt > mynewfile.txt'' The backslash makes the full stop a literal character rather than part of a regular expression.


#Use a text editor to view the new file, and delete any pages that you don't want to roll over (that is, some pages may have 3.x in the title but are not pages that we want duplicated into 3.z...)


#Copy/paste the new list of page titles into the wiki export using toolbox, Special pages, Export pages, and create the XML to screen. Be sure the box for 'Include only the current revision, not the full history' is ticked.


#Copy/paste the XML into a text processor and use search/replace to replace 3.x with 3.z.


#Save the file (I'll call it export.xml for this example).


#Import the new pages into the wiki: toolbox, Special pages, Restricted special pages (available only to Sysop members of the wiki), Import pages. Browse to find the .xml file and Upload file.
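The list-processing steps above can also be run entirely from the command line rather than in a text processor. A minimal sketch, using invented page titles and the example file names from the steps - tr stands in for the gedit \t -> \n replace, and the final sed is the same 3.x -> 3.z search/replace that would really be applied to the exported XML (it is shown on the title list here only to keep the sketch self-contained):

```shell
# Invented sample of the tab-separated "All pages" listing:
printf 'Manual 3.x Intro\tAbout the wiki\tManual 3.x FAQ\n' > allpages.txt

# Replace tabs with newlines, giving one title per line:
tr '\t' '\n' < allpages.txt > myfile.txt

# Keep only titles containing "3.x" (backslash makes the dot literal):
grep "3\.x" myfile.txt > mynewfile.txt

# The 3.x -> 3.z rename, as done on the exported XML in the later step:
sed 's/3\.x/3.z/g' mynewfile.txt > renamed.txt

cat renamed.txt
```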


[NOTE: Splitting the update across multiple XML files is no longer necessary, as we now have command-line access to our hosting account.] There are serious restrictions on both file size and processing time for the scripts - I had to split the .xml file into 8 pieces to get it all uploaded...


For the change to 3.4, and then from 3.4 to 4.0, I used a script in the maintenance directory to import the XML from the command line. The GoDaddy hosting service uses an old version of PHP from the command line, but I was successful with:



'''-bash-3.2$ /usr/local/php5/bin/php importDump.php 34.xml'''


followed by:


'''-bash-3.2$ /usr/local/php5/bin/php rebuildrecentchanges.php'''


That should now have created all of the new pages.


New pages created by this sort of import do not automatically get added to the search index of the wiki. Use phpMyAdmin from the cPanel and run the repair tool on the searchindex table.


Finally, if you want to set all the previous version pages to be protected, so that only sysop MediaWiki users are able to make changes (forcing all changes into the next set of pages...), you can use a series of SQL statements in the form:

(You'll need to replace the spaces in the page title with underscores...)


I do that concatenation in a spreadsheet, creating the (long) SQL statements by combining a first part, the page names, and a last part...


That updates a blob field to force protection without having to edit the pages individually. You need to run the above SQL statement for each page that you want to protect...
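The spreadsheet concatenation can also be sketched as a single sed command: swap spaces for underscores in each title, then wrap the title in the first and last SQL fragments. The table and column names below (page, page_restrictions, the 'edit=sysop:move=sysop' value) are assumptions based on the old-style MediaWiki protection blob described above - verify them against your own schema before running any generated SQL:

```shell
# Invented filtered list of page titles to protect, one per line:
printf 'Manual 3.x Intro\nManual 3.x FAQ\n' > mynewfile.txt

# Spaces -> underscores, then wrap each title in an UPDATE statement.
# sed's "&" inserts the whole matched line into the replacement.
# Column/table names are assumptions - check your wiki's schema first.
sed -e 's/ /_/g' \
    -e "s/.*/UPDATE page SET page_restrictions = 'edit=sysop:move=sysop' WHERE page_title = '&';/" \
    mynewfile.txt > protect.sql

cat protect.sql
```

The generated file can then be pasted into phpMyAdmin's SQL tab, one statement per page.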


*Screenshots


As there is no [[Available_screenshots|naming scheme]], we are trying to follow this rule for manual upgrades:


filename-{number of version}-{lang}.extension


ex:


Edit-person-31-en.png


We need to decide whether we keep the {number of version} or use the new one on migration (3.1->3.2->4.0->15.0...).
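If the decision goes towards bumping {number of version} on migration, a bulk copy/rename of the screenshot files could be sketched as below (3.1 -> 3.2 shown; the filenames are invented, and the real files would still need to be re-uploaded through the wiki):

```shell
# Invented sample screenshots following the naming rule above:
touch Edit-person-31-en.png Edit-person-31-fr.png

# Copy each 3.1 screenshot to a 3.2 name (bash substring replacement):
for f in *-31-*.png; do
  cp "$f" "${f/-31-/-32-}"
done

ls Edit-person-32-*.png
```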

Revision as of 19:21, 19 December 2012
