From Mark's mail:
I know this is interrupt-driven work, but if you could take a look at this issue when trying to import this JSON dump to Allura, and let me know what you find, that would be great.
I'd really like to have a major open source project come onto newforge, and I think TG is a good first candidate: it's a reasonably active open source project where we already have good ties, so it should give us good feedback about the platform.
Also, just so everybody knows: if anybody else expresses interest in migrating, I think it's worth our while right now to provide them with some help.
--Mark Ramm
---------- Forwarded message ----------
From: Mark Ramm
Date: Tue, Mar 1, 2011 at 6:21 PM
Subject: Fwd: [tg-trunk] Trac Migration Status Update
---------- Forwarded message ----------
From: Christoph Zwerschke
Date: Sat, Feb 26, 2011 at 9:10 PM
Subject: [tg-trunk] Trac Migration Status Update
To: TurboGears Trunk &lt;turbogears-trunk@googlegroups.com&gt;
After the SF-hosted Trac option turned out to be unusable for our purposes (it's only Trac 0.11.2, no plugins installable, no API or database access, etc.), I tried the second (and preferred) option for hosting the TG tracker on SF today, the Allura tracker.
Scripts already exist to export from Trac as JSON and then import that JSON into Allura.
It turned out that the Trac export script does not work with Trac 0.10, which we are running on our old server, so as a first step I migrated the tracker to Trac 0.12 on my own server.
There were several more small problems with the export script, which I could fix easily. The script also stumbled over an oversized ticket containing only Chinese spam (#852), which I have already deleted. So I now have a ~10MB JSON file with our Trac database.
Btw, there are ~1905 tickets for TG1 and ~540 tickets for TG2.
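For reference, a rough sketch of what the export step boils down to, assuming Trac's default SQLite backend and its standard ticket table; the paths and output shape here are hypothetical, and the real export script also has to walk comments, attachments, and custom fields:

```python
# Hedged sketch: dump the core fields of Trac's "ticket" table to JSON.
# Assumes the default SQLite backend (db/trac.db inside the environment);
# the path and the output layout are placeholders, not the real script's.
import json
import sqlite3

conn = sqlite3.connect("/path/to/trac-env/db/trac.db")  # hypothetical path
rows = conn.execute(
    "SELECT id, summary, reporter, owner, status, description "
    "FROM ticket ORDER BY id"
)
artifacts = [
    {"id": r[0], "summary": r[1], "reporter": r[2],
     "owner": r[3], "status": r[4], "description": r[5]}
    for r in rows
]
with open("trac_export.json", "w") as f:
    json.dump({"artifacts": artifacts}, f)
```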
When I tried to import that into Allura, I got a "Request Entity Too Large" error, obviously because the import script tries to import everything with a single API call. So I modified the import script to read the data in chunks (slicing the artifacts list), but now I only get internal server errors with no further clue as to what's wrong. The error message says more info may be available in the server error log, but only SF staff has access to that log.
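For illustration, a minimal sketch of that chunking workaround, assuming the dump is a JSON object with an "artifacts" list; the endpoint URL, request shape, and authentication are hypothetical stand-ins for whatever the real import script uses:

```python
# Hypothetical sketch: send the export in slices of the artifacts list,
# one API call per slice, instead of one oversized request for the
# whole ~10MB dump. URL and parameter names are made up for illustration.
import json
import requests

CHUNK_SIZE = 100  # tickets per request; tune so each request stays small

with open("trac_export.json") as f:
    artifacts = json.load(f)["artifacts"]

for start in range(0, len(artifacts), CHUNK_SIZE):
    chunk = artifacts[start:start + CHUNK_SIZE]
    resp = requests.post(
        "https://sourceforge.net/rest/import",  # hypothetical endpoint
        data={"doc": json.dumps({"artifacts": chunk})},
    )
    resp.raise_for_status()  # a 500 here still only points at the server log
```

Smaller chunks keep each request under the size limit, but every slice is a separate call, so a failure partway through can leave a half-imported tracker.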
Another problem that needs to be solved if we go this way is the user mapping from our old Trac accounts to SF accounts. The import script has a usermap option, and maybe we will be able to map some users through their email addresses, but this is also something that can only be done by SF staff.
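As a hedged sketch of what such an email-based mapping could look like (both data sources below are placeholders, and the SF side is exactly the part that needs staff access):

```python
# Hypothetical sketch: build a Trac-login -> SF-login map by joining on
# email address, to feed the import script's usermap option.
import json

def load_trac_emails():
    # Trac keeps user emails in the session_attribute table,
    # e.g. SELECT sid, value FROM session_attribute WHERE name = 'email'
    return {"someuser": "someuser@example.org"}  # placeholder data

def load_sf_logins_by_email():
    # only SF staff could query the SF account database for this
    return {"someuser@example.org": "sf_someuser"}  # placeholder data

sf_by_email = load_sf_logins_by_email()
usermap = {
    trac_login: sf_by_email[email]
    for trac_login, email in load_trac_emails().items()
    if email in sf_by_email
}
with open("usermap.json", "w") as f:
    json.dump(usermap, f)
```

Unmatched Trac accounts simply stay out of the map; their tickets would presumably need some fallback attribution on import.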
@Mark: Any chance you will be able to solve these problems in the short run? If not, I suggest we migrate to Florent's server for the time being. However, since Trac cannot work with remote repositories, this also means we will have to move the repositories there. We can consider moving back to SF at a later time.
-- Christoph
The tasks within the scope of this ticket have been completed. The changes above were merged, and this ticket is now closed, with the remaining work ticketed as [#1766], [#1767], [#1768]. Note that delivering a generally usable import API also depends on resolving some related issues, like [#1650], and settling on OAuth usage.
Related Tickets: #1650, #1766, #1767, #1768