#3154 Users need a way to backup allura data [idt 824] NEEDS INI

v1.0.1
closed
nobody
General
nobody
2015-08-20
2011-11-02
Chris Tsai
No

https://sourceforge.net/apps/ideatorrent/sourceforge/ideatorrent/idea/824/

SF clearly says that the backup of the project data is left to project admins:

https://sourceforge.net/apps/trac/sourceforge/wiki/Backup%20your%20data

Fine.

However, there is currently no way to backup the SF2.0 tools, for instance the data from the ticket tool.

Backups would be the first step, then a way to restore (though to my knowledge we never offered that part in SF classic).

Related

Tickets: #3154
Tickets: #4213
Tickets: #4527

Discussion

(Page 1 of 3)
  • Chris Tsai - 2011-11-02

    And I realize that more enterprising users could probably figure something out themselves via the API (which we don't have for all tools anyway), but I think we need a simpler solution as well similar to the XML export tool that classic projects have: http://sourceforge.net/apps/trac/sourceforge/wiki/XML%20export

     
  • Anonymous - 2011-11-16

    Originally by: olberger

    May I suggest supporting a format along the lines of the ForgePlucker project (http://home.gna.org/forgeplucker/), which tries to address such needs in an interoperable manner across different forge systems?

     
  • Dave Brondsema

    Dave Brondsema - 2011-11-17
    • milestone: limbo --> forge-backlog
     
  • Dave Brondsema

    Dave Brondsema - 2012-09-05
    • labels: support --> support, feature-parity
     
  • Chris Tsai - 2012-09-26
    • labels: support, feature-parity --> support, feature-parity, p2
     
  • Anonymous - 2013-02-18

    Originally by: wohler

    You mean the backups I've been performing (using a URL I saved years ago) have not been backing up the new stuff? Yikes. This item should be very high priority.

    In my opinion, the ability to restore--while important--should not hold up the ability to back up.

     
  • Anonymous - 2013-02-19

    Originally by: *anonymous

    Here's a simple Python script to back up your wiki. It doesn't require any permissions, just the markdown Python module, and you'll get both the Markdown and the HTML. The conversion assumes you're not using SF-specific Markdown syntax.
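The script itself was attached to the comment and is not reproduced in this archive. As a rough sketch of the same approach (the REST endpoint layout and JSON field names below are assumptions based on Allura's public API; the helper names are hypothetical):

```python
#!/usr/bin/env python3
# Sketch of a wiki backup in the spirit of the script described above.
# Assumed endpoint layout: https://sourceforge.net/rest/p/<project>/wiki/
# returns {"pages": [...]}, and .../wiki/<title>/ returns {"text": ...}.
import json
import os
import re
import sys
import urllib.parse
import urllib.request

API_ROOT = "https://sourceforge.net/rest/p/{project}/wiki/"

def safe_filename(title):
    """Map a wiki page title to a filesystem-safe file name."""
    return re.sub(r"[^A-Za-z0-9._-]+", "_", title).strip("_")

def fetch_json(url):
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

def backup_wiki(project, dest="wiki-backup"):
    os.makedirs(dest, exist_ok=True)
    index = fetch_json(API_ROOT.format(project=project))
    for title in index.get("pages", []):
        page = fetch_json(API_ROOT.format(project=project)
                          + urllib.parse.quote(title) + "/")
        path = os.path.join(dest, safe_filename(title) + ".md")
        # Write UTF-8 explicitly so non-ASCII page content round-trips.
        with open(path, "w", encoding="utf-8") as f:
            f.write(page.get("text", ""))

if __name__ == "__main__":
    backup_wiki(sys.argv[1] if len(sys.argv) > 1 else "allura")
```

The original script also converted each page to HTML with the `markdown` module; that step is omitted here to keep the sketch standard-library only.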

     
  • Anonymous - 2013-02-27

    Originally by: ypeels

    Thank you for the script, Anonymous! This was exactly what I was looking for, to fill the (as of this writing) glaring hole at https://sourceforge.net/apps/trac/sourceforge/wiki/Backup%20your%20data

    Updated the script to accept the project name as a command-line argument, and display usage notes too.

    Sorry if my updates are messy or not idiomatic; this was my first time writing any Python code.

    Requires Python 2? (Could not get it to run in Portable Python 3.2.1.1)

     
  • Anonymous - 2013-03-29

    Originally by: adrelanos

    Thanks for the script. I needed to use lowercase "whonix" instead of "Whonix". The output is as follows:

    ~/pull-wiki $ ./pull-wiki-v2.py whonix
    About Computer (In)Security
    Traceback (most recent call last):
      File "./pull-wiki-v2.py", line 79, in <module>
        sys.exit(main())
      File "./pull-wiki-v2.py", line 72, in main
        markdown_file.write(source)
    UnicodeEncodeError: 'ascii' codec can't encode character u'\xb2' in position 6359: ordinal not in range(128)
    

    Can you help please?
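The traceback points at Python 2's default ASCII codec: `markdown_file.write(source)` is handed Unicode text containing a superscript two (U+00B2) that ASCII cannot represent. A likely fix, assuming the script opens the file with a plain `open()`, is to open it with an explicit UTF-8 encoding instead:

```python
import io

def write_markdown(path, source):
    # io.open accepts an encoding argument on both Python 2 and Python 3
    # (where it is the built-in open), so non-ASCII characters such as
    # U+00B2 are written as UTF-8 instead of raising UnicodeEncodeError.
    with io.open(path, "w", encoding="utf-8") as markdown_file:
        markdown_file.write(source)
```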

     
  • Dave Brondsema

    Dave Brondsema - 2013-06-27

    Design/implementation for backups, based on thread in allura-dev:

    Add a bulk_export() method to Application which would be responsible for
    generating json for all the artifacts in the tool. The format should match the
    API format for artifacts so that we're consistent. Thus any tool that
    implements bulk_export() would typically loop through all the artifacts for this
    instance (matching app_config_id) and convert to json the same way the API json
    is generated (e.g. call the __json__ method or RestController method; some
    refactoring might be needed). Multiple types of artifacts/objects could be
    listed out in groups, e.g. Tracker app could have a list of tickets, list of
    saved search bins, list of milestones, and the tracker config data. Discussion
    threads would need to be included too, ideally inline with the artifact they go
    with. No permission checks would be done since this export would only be
    available to admins (makes it faster & simpler).

    Provide a page on the Admin sidebar to generate a bulk export. Project admins
    could choose individual tool instances, or all tools in the project (that
    support it). That form would kick off a background task which goes through the
    selected tools and runs their bulk_export() methods. Make sure we don't have multiple backups running for the same project (I think code snapshot has similar protection too). Save each tool's data as mount_point.json and zip them all together.

    Store the zip files in a configurable directory. There should be an ini setting for this that is a tiny template (similar to 'short_url.url_pattern') so that the project name and other values can be interpolated.

    When the task is complete, notify the user via email. The basic content of the email (telling them that it's done, timestamp, etc) can be standard. Below that, we need configurable instructions to go into the email (telling users how to SFTP or whatever to get to their file). I guess an ini setting will be best, but the value will be quite long. This should also support project name interpolation.
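The two ini settings described above might look something like this. The setting names and the `{project}` placeholder syntax are illustrative only; the design text names 'short_url.url_pattern' merely as a precedent for tiny templates:

```ini
; hypothetical setting names -- the design only specifies that both
; settings exist and support project-name interpolation
bulk_export_path = /var/allura/exports/{project}
bulk_export_download_instructions = Your export of {project} is complete.
    Connect via SFTP and download your archive from /exports/{project}/.
```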

    So that a giant json string doesn't have to be held in memory for each tool, the
    export task should open a file handle for mount_point.json and call
    bulk_export() with that open file handle, so each App can append to its file
    incrementally.

    If mongo performance is slow, some refactoring may be needed to avoid lots of
    individual mongo calls and be more batch oriented. We can see how it goes.

    For now, this will NOT include:

    • attachments in the zip
    • an API for backups
    • performance optimizations, like parallelization of bulk_export() across all tools
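The design above can be sketched as follows. Apart from the bulk_export() method named in the description, the class names and data shapes are stand-ins, not Allura's actual implementation:

```python
import json

class Application(object):
    """Simplified base tool class; real Allura apps carry much more state."""

    def bulk_export(self, f):
        """Stream this tool's artifacts as JSON to the open file handle f.

        Subclasses loop over their artifacts (in Allura, filtered by
        app_config_id) and reuse the same __json__ conversion the REST API
        uses, appending incrementally so the whole export is never held
        in memory.
        """
        raise NotImplementedError

class TrackerApp(Application):
    def __init__(self, tickets, milestones):
        self.tickets = tickets          # stand-ins for mongo-backed artifacts
        self.milestones = milestones

    def bulk_export(self, f):
        # Write artifact groups incrementally rather than building one
        # giant structure and json.dumps()-ing it at the end.
        f.write('{"tickets": [')
        for i, t in enumerate(self.tickets):
            if i:
                f.write(', ')
            f.write(json.dumps(t.__json__() if hasattr(t, '__json__') else t))
        f.write('], "milestones": ')
        f.write(json.dumps(self.milestones))
        f.write('}')
```

The export task would call each selected tool's bulk_export() with a handle for its mount_point.json, then zip the results together.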
     
  • Dave Brondsema

    Dave Brondsema - 2013-06-27
    • labels: support, feature-parity, p2 --> support, feature-parity, p2, 42cc
     
  • Igor Bondarenko - 2013-06-28
    • status: open --> in-progress
     
  • Igor Bondarenko - 2013-06-28

    Created:

    • #387: [#3154] Bulk export: skeleton (2cp)
    • #386: [#3154] Bulk export: task to run export (4cp)
    • #388: [#3154] Bulk export: wiki (3cp)

    Once all of the above are done, the base functionality will be implemented, including wiki export. That will be the point where we get a review.

    Then, implementing export for other tools:

    • #389: [#3154] Bulk export: tracker (4cp)
    • #390: [#3154] Bulk export: discussion (3cp)

    I'm not sure about the other tools. It seems they don't have any API, so there's no format for us to match. Maybe it's worth implementing __json__ methods for their artifacts too, and using them during the export.

    What other tools do you want to be exported? I think it makes sense for:

    • ForgeBlog
    • ForgeLink
    • ForgeShortUrl

    I'm not sure about the SCM tools; what can be exported there?

    Also, I don't have any experience with the following tools, so I'm not sure whether they should or can be exported:

    • ForgeActivity
    • ForgeChat
    • ForgeUserStats
     


  • Dave Brondsema

    Dave Brondsema - 2013-06-28

    Sounds like a good plan. ForgeBlog and ForgeLink will be good to build APIs for and include in the export. We don't use ForgeShortUrl extensively, so we don't have a need for api/export for that right now.

    Activity, Chat, and UserStats don't need to be exported at this point.

    We'll also need to export some top-level project metadata. http://sourceforge.net/rest/p/allura for example just has the project name & tools, but we can include metadata like categories, members, etc. The SF classic system has a DOAP API with a lot of that information (e.g. http://sourceforge.net/api/project/name/allura/doap). We'll want to include most of the fields in there. We'll most likely want to have a version of Allura API for project info that is in the DOAP syntax. Probably will need to spec this out a bit more when you get to it.

     
  • Dave Brondsema

    Dave Brondsema - 2013-06-28

    For any new APIs that are created, we should also have new docs prepared at the same time. Currently documentation for APIs are at https://sourceforge.net/p/forge/documentation/Allura%20API/

     
  • Igor Bondarenko - 2013-07-01

    Created:

    • #391 [#3154] ForgeBlog REST API (2cp)
    • #392 [#3154] ForgeLink REST API (1cp)
    • #393 [#3154] Bulk export: ForgeBlog (2cp)
    • #394 [#3154] Bulk export: ForgeLink (1cp)
    • #395 [#3154] Bulk export: project metadata (4cp)
     


  • Anonymous - 2013-07-03

    Originally by: fina

    Is this script only for the wiki, or will it also work for "Bugs" (tickets)?

     
  • Igor Bondarenko - 2013-07-08

    Closed #387. je/42cc_3154

     
  • Anonymous - 2013-07-08

    Originally by: *anonymous

    As I understand it, this won't really help with automated backups, will it?
    With the old XML export I had a weekly backup script that backed up the source code, mail archives, and so on, plus the XML export, which was available via a simple URL.
    Asynchronous backup generation with email notification won't help much with regular automated backups, will it?

     
  • Igor Bondarenko - 2013-07-09

    Closed #386, #388. je/42cc_3154

    Implemented export functionality for the wiki only. Please do the initial review; after that we'll proceed with the rest of the tickets.

     
  • Igor Bondarenko - 2013-07-09
    • status: in-progress --> code-review
     
  • Cory Johns - 2013-07-10
    • QA: Cory Johns
     
  • Cory Johns - 2013-07-10
    • status: code-review --> open
    • QA: Cory Johns --> nobody
     
  • Cory Johns - 2013-07-10

    Looks great so far.

     
  • Igor Bondarenko - 2013-07-11
    • status: open --> in-progress
     
