[forge:site-support:#3925]
Here's a wiki page (a fragment of my user manual) with exactly 40,000 characters:
https://sourceforge.net/p/waveshop/wiki/test_40000/
The HTML is correctly parsed: it has proper formatting, the links work, etc.
Here's the same page, but with one more character:
https://sourceforge.net/p/waveshop/wiki/test_40001/
Note that none of the HTML is parsed, presumably because the page exceeds a hard-coded buffer size somewhere in the markdown parser.
This is new behavior, as of May 2013. I know this because my user manual was always bigger than 40KB, but the HTML didn't stop working until this month, maybe even as recently as this week.
Opinion: This is a very silly and avoidable bug. In the age of 64-bit addressing and string containers, it's hard to see why any aspect of the markdown parser should be limited to 40KB or any other particular number.
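For illustration, here is a minimal sketch of the kind of guard that would produce exactly this symptom. The names and the threshold check are assumptions for the sake of the example, not Allura's actual code:

```python
import html
import markdown  # python-markdown

MAX_SOURCE_LEN = 40000  # hypothetical hard-coded limit, in characters

def render(text):
    """Render markdown to HTML, unless the source exceeds the limit."""
    if len(text) > MAX_SOURCE_LEN:
        # Past the threshold the source comes back as escaped plain
        # text, so no formatting or links survive.
        return html.escape(text)
    return markdown.markdown(text)
```

A guard like this trips on the whole page at once, which matches the observed behavior: 40,000 characters renders perfectly, 40,001 loses all HTML.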
While we're fixing this, any chance of doing something about the lame handling of definition lists in markdown? (issue #57, from February)
https://sourceforge.net/p/forge/feature-requests/57/
Looks like this is due to the behavior added in [#5607]. Cory said he has some ideas for an alternative implementation, though.
Cory's idea was to cache the markdown rendering result.
Caching for comments only was implemented in [#6735] and is generic enough to expand to other model classes, such as wiki pages. Cache invalidation will have to be considered, though, since dynamic content (e.g. macros) is much more common in wiki pages.
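As a rough sketch of that expansion, assuming Allura's [[macro]] syntax and a simple hash-keyed store (the class and the store are hypothetical, not the [#6735] implementation):

```python
import hashlib
import re

MACRO_RE = re.compile(r'\[\[.+?\]\]')  # Allura macros look like [[include ref=...]]

class RenderCache:
    """Cache rendered HTML keyed by a hash of the markdown source."""

    def __init__(self):
        self._store = {}

    def convert(self, text, converter):
        # Macros produce dynamic output, so a source-keyed cache would
        # serve stale HTML; bypass the cache whenever one is present.
        if MACRO_RE.search(text):
            return converter(text)
        key = hashlib.sha1(text.encode('utf-8')).hexdigest()
        if key not in self._store:
            self._store[key] = converter(text)
        return self._store[key]
```

Keying on a hash of the source means an edited page invalidates itself automatically; only macro output needs special handling.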
See discussion at http://mail-archives.apache.org/mod_mbox/incubator-allura-dev/201310.mbox/%3C52655F37.5060608%40brondsema.net%3E
Created #466: [#6207] expand markdown caching to all artifact types (4cp)
Closed #466.
je/42cc_6207
There are still calls to the non-cached convert from the repo's templates (e.g. to convert commit messages), but I assume commit messages are small enough that they wouldn't need caching anyway.
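In terms of the sketch above, the two paths would look something like this (a hedged illustration; the call sites and sample strings are hypothetical):

```python
import markdown

cache = RenderCache()

page_text = "# Manual\n\nSee [[include ref=Intro]] for details."
commit_msg = "Fix off-by-one in buffer size check"

# Wiki page bodies go through the cache (bypassed when macros appear):
page_html = cache.convert(page_text, markdown.markdown)

# Repo templates convert short commit messages directly, uncached:
msg_html = markdown.markdown(commit_msg)
```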