We should have a script to import a MediaWiki database dump into the wiki tool. Maybe use SQLite to load the dump? But we should also have an option to connect to a live database, since we'll be able to take advantage of that in some situations.
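If the dump is loaded into SQLite, reading pages out could look roughly like this. The `page` table below is a cut-down stand-in for MediaWiki's real schema (which has many more columns), and the function names are just for illustration:

```python
import sqlite3

# Simplified stand-in for MediaWiki's `page` table; the real schema
# has many more columns (page_latest, page_touched, ...).
SCHEMA = """
CREATE TABLE page (
    page_id INTEGER PRIMARY KEY,
    page_namespace INTEGER NOT NULL,
    page_title TEXT NOT NULL
)
"""

def open_source(path):
    """Open the SQLite copy of the dump (':memory:' works for testing)."""
    return sqlite3.connect(path)

def iter_pages(conn, namespace=0):
    """Return (page_id, page_title) rows for pages in the given
    namespace (0 is the main article namespace in MediaWiki)."""
    cur = conn.execute(
        "SELECT page_id, page_title FROM page WHERE page_namespace = ?",
        (namespace,))
    return cur.fetchall()

# Demo on an in-memory database standing in for the loaded dump.
conn = open_source(":memory:")
conn.execute(SCHEMA)
conn.execute("INSERT INTO page VALUES (1, 0, 'Main_Page')")
conn.execute("INSERT INTO page VALUES (2, 1, 'Talk:Main_Page')")
print(iter_pages(conn))  # → [(1, 'Main_Page')]
```

For the live-database option, the same queries would run against a MySQL connection instead of SQLite.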
See scripts/teamforge-import.py for reference on how to create wiki pages programmatically. That script uses two phases: first it extracts the data and saves it, then it loads the saved data into Allura. We should do the same, but keep the data files free from MediaWiki details, so the loading script can be generic. Some day people can write extractors for other wiki software and reuse the same loading script.
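The two-phase shape might be sketched like this, with hypothetical helpers writing engine-neutral JSON in phase one and reading it back in phase two. A real loader would create Allura wiki pages from these dicts rather than returning them, and page titles are assumed filesystem-safe here:

```python
import json
import os
import tempfile

def save_pages(pages, out_dir):
    """Phase 1: write extracted pages as wiki-engine-neutral JSON files.
    Nothing MediaWiki-specific survives into these files."""
    for page in pages:
        path = os.path.join(out_dir, '%s.json' % page['title'])
        with open(path, 'w') as f:
            json.dump(page, f)

def load_pages(out_dir):
    """Phase 2: read the neutral files back.  A generic Allura loader
    would create wiki pages from these dicts without knowing which
    wiki engine produced them."""
    pages = []
    for name in sorted(os.listdir(out_dir)):
        with open(os.path.join(out_dir, name)) as f:
            pages.append(json.load(f))
    return pages

out_dir = tempfile.mkdtemp()
save_pages([{'title': 'Home', 'text': '# Home', 'history': []}], out_dir)
print(load_pages(out_dir))
```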
This code should be within the ForgeWiki tool, and ideally exposed as a paster command.
The import should handle as much as possible: wiki pages (including converting the format to Markdown; it's not clear whether any existing conversion libraries are available), attachments, the history of each page (our wiki pages are versioned too), permissions, "Talk" pages (these can go into the discussion for a page, though it's not clear how best to split them into separate comments), other config options, etc.
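Absent a ready-made library, a conversion layer would look something like the sketch below. It handles only headings, bold, and italics; a real importer would need far more (links, templates, tables, lists):

```python
import re

def wikitext_to_markdown(text):
    """Very partial wikitext -> Markdown conversion covering headings,
    bold, and italics only.  A dedicated conversion library would be
    preferable if one can be found."""
    def heading(m):
        # == Heading == -> ## Heading (level = number of '=' signs)
        level = len(m.group(1))
        return '%s %s' % ('#' * level, m.group(2).strip())
    text = re.sub(r"^(={1,6})\s*(.*?)\s*\1\s*$", heading, text, flags=re.M)
    # Bold must be handled before italics, since ''' contains ''.
    text = re.sub(r"'''(.+?)'''", r'**\1**', text)
    text = re.sub(r"''(.+?)''", r'*\1*', text)
    return text

print(wikitext_to_markdown("== Title ==\n'''bold''' and ''italic''"))
# → ## Title
#   **bold** and *italic*
```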
Closed #85; all changes are in 42cc_4186.
Run the script as follows:
Available options:
Created a new ticket, #96: [#4186] Error handling and small amendments (1cp)
Closed #96 and pushed changes into 42cc_4186.
It probably would have been better to use SQLAlchemy rather than depending on MySQL directly, but that's not a big deal, since MediaWiki always runs on MySQL.
I made a few changes to how imports are handled, so that the GPL'd MediaWiki library is optional when using Allura in general.
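The usual pattern for making a dependency like this optional is to defer the import and fail only when the feature is actually used, so plain Allura installs never need the library. `mediawiki_gpl_parser` below is a placeholder name, not the actual library the ticket refers to:

```python
# Optional-dependency pattern: try the import once, record whether it
# succeeded, and raise a clear error only when the feature is used.
try:
    import mediawiki_gpl_parser  # hypothetical GPL'd parsing library
    HAS_MEDIAWIKI = True
except ImportError:
    mediawiki_gpl_parser = None
    HAS_MEDIAWIKI = False

def convert(text):
    """Convert wikitext via the optional library, if it is installed."""
    if not HAS_MEDIAWIKI:
        raise RuntimeError(
            'MediaWiki import support requires the optional parser library')
    return mediawiki_gpl_parser.parse(text)

print(HAS_MEDIAWIKI)  # False unless the optional library is installed
```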
Merged to dev. Created [#4660] for some follow-up improvements.