https://sourceforge.net/apps/ideatorrent/sourceforge/ideatorrent/idea/824/
SF clearly says that the backup of the project data is left to project admins:
https://sourceforge.net/apps/trac/sourceforge/wiki/Backup%20your%20data
Fine.
However, there is currently no way to back up the SF2.0 tools, for instance the data from the ticket tool.
Backups would be the first step, then a way to restore (though we never offered that part in SF classic to my knowledge)?
And I realize that more enterprising users could probably figure something out themselves via the API (which we don't have for all tools anyway), but I think we also need a simpler solution, similar to the XML export tool that classic projects have: http://sourceforge.net/apps/trac/sourceforge/wiki/XML%20export
Originally by: olberger
May I suggest supporting a format along the lines of the ForgePlucker project (http://home.gna.org/forgeplucker/), which tries to address such needs in an interoperable manner across different forge systems?
Originally by: wohler
You mean the backups I've been performing (using a URL I saved years ago) have not been backing up the new stuff? Yikes. This item should be very high priority.
In my opinion, the ability to restore--while important--should not hold up the ability to back up.
Originally by: *anonymous
Here's a simple Python script to back up your wiki. It doesn't require any permissions, just the markdown Python module, and you'll get both the Markdown source and the HTML. The conversion assumes that you're not using SF-specific Markdown syntax.
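The script itself isn't preserved in this thread. The following is a minimal sketch of what such a backup might look like (Python 3, whereas the original apparently required Python 2); the URL shapes are assumptions modeled on Allura's JSON wiki endpoints, and the Markdown-to-HTML step is left out to keep it dependency-free:

```python
#!/usr/bin/env python3
"""Sketch of a wiki backup script.

Assumed endpoints, modeled on Allura's REST API:
    /rest/p/<project>/wiki/          -> {"pages": ["Home", ...]}
    /rest/p/<project>/wiki/<title>   -> {"text": "...markdown source..."}
"""
import json
import os
import sys
import urllib.parse
import urllib.request

BASE = "https://sourceforge.net/rest/p"

def page_index_url(project):
    """URL listing all wiki page titles for a project."""
    return "%s/%s/wiki/" % (BASE, project)

def page_url(project, title):
    """URL of one page's JSON (the title must be percent-encoded)."""
    return "%s/%s/wiki/%s" % (BASE, project, urllib.parse.quote(title))

def fetch_json(url):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def backup_wiki(project, dest="wiki-backup"):
    """Save every page's Markdown source under dest/."""
    os.makedirs(dest, exist_ok=True)
    for title in fetch_json(page_index_url(project))["pages"]:
        text = fetch_json(page_url(project, title))["text"]
        safe = title.replace("/", "_")  # flatten subpage paths
        with open(os.path.join(dest, safe + ".md"), "w", encoding="utf-8") as f:
            f.write(text)

if __name__ == "__main__" and len(sys.argv) == 2:
    # Allura project short names are lowercase, e.g. "whonix" not "Whonix"
    backup_wiki(sys.argv[1].lower())
```

Converting the saved Markdown to HTML, as the original script did, would just be one markdown.markdown(text) call per file, assuming no SF-specific Markdown extensions are in use.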
Originally by: ypeels
Thank you for the script, Anonymous! This was exactly what I was looking for, to fill the (as of this writing) glaring hole at https://sourceforge.net/apps/trac/sourceforge/wiki/Backup%20your%20data
Updated the script to accept the project name as a command-line argument, and display usage notes too.
Sorry if my updates are messy or not idiomatic; this was my first time writing any Python code.
Requires Python 2? (Could not get it to run in Portable Python 3.2.1.1)
Originally by: adrelanos
Thanks for the script. I needed to use lowercase "whonix" instead of "Whonix". The output is as follows:
Can you help please?
Design/implementation for backups, based on thread in allura-dev:
Add a bulk_export() method to Application which would be responsible for
generating json for all the artifacts in the tool. The format should match the
API format for artifacts so that we're consistent. Thus any tool that
implements bulk_export() would typically loop through all the artifacts for this
instance (matching app_config_id) and convert to json the same way the API json
is generated (e.g. call the __json__ method or RestController method; some
refactoring might be needed). Multiple types of artifacts/objects could be
listed out in groups, e.g. Tracker app could have a list of tickets, list of
saved search bins, list of milestones, and the tracker config data. Discussion
threads would need to be included too, ideally inline with the artifact they go
with. No permission checks would be done since this export would only be
available to admins (makes it faster & simpler).
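A rough sketch of the shape this could take; the class and attribute names here (TrackerApp, Ticket, etc.) are illustrative stand-ins, not Allura's real model layer:

```python
# Illustrative sketch only: TrackerApp and Ticket are stand-ins for the
# real Allura models; the dict shapes mimic what the REST API would emit.
class Ticket:
    def __init__(self, num, summary, thread):
        self.num, self.summary, self.thread = num, summary, thread

    def __json__(self):
        # Same shape as the API json for this ticket, with the discussion
        # thread inline with the artifact it belongs to.
        return {"ticket_num": self.num,
                "summary": self.summary,
                "discussion_thread": self.thread}

class TrackerApp:
    def __init__(self, tickets, bins, milestones, config):
        self.tickets, self.bins = tickets, bins
        self.milestones, self.config = milestones, config

    def bulk_export(self):
        # No permission checks: the export is only reachable by admins.
        return {"tickets": [t.__json__() for t in self.tickets],
                "saved_bins": self.bins,
                "milestones": self.milestones,
                "tracker_config": self.config}
```

json.dumps() of that return value would then be the tool's mount_point.json content.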
Provide a page on the Admin sidebar to generate a bulk export. Project admins
could choose individual tool instances, or all tools in the project (that
support it). That form would kick off a background task which goes through the
selected tools and runs their bulk_export() methods. Make sure we don't have multiple backups running for the same project (I think code snapshot has similar protection too). Save each tool's data as mount_point.json and zip them all together.
Store the zip files in a configurable directory. There should be an ini setting for this that is a tiny template (similar to 'short_url.url_pattern') so that the project name and other values can be interpolated.
When the task is complete, notify the user via email. The basic content of the email (telling them that it's done, timestamp, etc) can be standard. Below that, we need configurable instructions to go into the email (telling users how to SFTP or whatever to get to their file). I guess an ini setting will be best, but the value will be quite long. This should also support project name interpolation.
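Both settings above boil down to interpolating per-project values into an operator-supplied template. A minimal sketch, assuming str.format-style placeholders; the setting names and placeholder names below are assumptions, not Allura's actual configuration keys:

```python
# Hypothetical ini values -- names and placeholders are assumptions:
#   bulk_export_path = /var/tmp/exports/{project}/{filename}
#   bulk_export_email_instructions = SFTP to the project web host and
#       fetch {filename} from your project's export directory.
def interpolate(template, **values):
    """Fill an operator-supplied template with per-project values."""
    return template.format(**values)

# Same helper serves both the storage path and the email instructions.
path = interpolate("/var/tmp/exports/{project}/{filename}",
                   project="allura", filename="backup.zip")
```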
So that a giant json string doesn't have to be held in memory for each tool, the
export task should open a file handle for mount_point.json, call
bulk_export() with that open file handle, and each App can append to its file
incrementally.
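A sketch of that handshake, with a hypothetical streaming bulk_export(); StringIO stands in for the real file handle the task would open:

```python
import io
import json

def bulk_export(tickets, out):
    # Write one artifact at a time so the full JSON document is never
    # held in memory at once.
    out.write('{"tickets": [')
    for i, ticket in enumerate(tickets):
        if i:
            out.write(", ")
        json.dump(ticket, out)
    out.write("]}")

# The export task owns the handle; the app only appends to it.
buf = io.StringIO()
bulk_export([{"ticket_num": 1}, {"ticket_num": 2}], buf)
```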
If mongo performance is slow, some refactoring may be needed to avoid lots of
individual mongo calls and be more batch oriented. We can see how it goes.
For now, this will NOT include:
After the above, all functionality will be implemented, including wiki export. That will be the point where we'll get a review.
Then, implementing export for other tools:
Not sure about other tools. It seems they don't have any API, so we don't have a format to match. Maybe it's worth implementing
__json__
methods for their artifacts too, and using them during the export. What other tools do you want to be exported? I think it makes sense for:
Not sure about SCM tools; what can be exported there?
Also, I don't have any experience with the following tools, so I'm not sure whether they should/can be exported or not:
Related Tickets: #3154

Sounds like a good plan. ForgeBlog and ForgeLink will be good to build APIs for and include in the export. We don't use ForgeShortUrl extensively, so we don't have a need for api/export for that right now.
Activity, Chat, and UserStats don't need to be exported at this point.
We'll also need to export some top-level project metadata. http://sourceforge.net/rest/p/allura, for example, just has the project name & tools, but we can include metadata like categories, members, etc. The SF classic system has a DOAP API with a lot of that information (e.g. http://sourceforge.net/api/project/name/allura/doap). We'll want to include most of the fields in there. We'll most likely also want a version of the Allura project-info API in DOAP syntax. This will probably need to be specced out a bit more when you get to it.
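Something like the following could be the starting point for that project-level file; the field names are assumptions drawn from the DOAP fields discussed above, not a fixed Allura schema:

```python
# Sketch of a top-level project.json for the export; field names here are
# assumptions, not a settled schema.
def project_metadata(name, tools, categories, members):
    return {"shortname": name,
            "tools": tools,            # what /rest/p/<project> already has
            "categories": categories,  # extra metadata beyond the REST view
            "members": members}

meta = project_metadata("allura",
                        ["wiki", "tickets", "git"],
                        ["Topic :: Software Development"],
                        [{"username": "admin1", "roles": ["Admin"]}])
```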
For any new APIs that are created, we should also have new docs prepared at the same time. Currently, documentation for the APIs is at https://sourceforge.net/p/forge/documentation/Allura%20API/
Related Tickets: #3154

Originally by: fina
Is this script only for the wiki, or will it also work for "Bugs" (tickets)?
Closed #387.
je/42cc_3154
Originally by: *anonymous
As I get it, this will not really help for automated backups, will it?
With the old XML export I had a weekly running backup script that backed up source code, mail archives, and so on and also the XML export which was available through a simple URL.
Having an asynchronous backup generation with email notification will not help much with regular automated backups, will it?
Closed #386, #388.
je/42cc_3154
Implemented export functionality for the wiki only. Please do the initial review; after that we'll proceed with the rest of the tickets.
Looks great so far.