We still need to figure out how to reference multiple sitemap indexes from robots.txt, and to determine whether we need to generate them statically or can serve them live (check the performance of those URLs, and see how often the current sfpy-generated sitemap files are accessed in prod).
We'll need to generate these statically; they're too slow to serve dynamically. I ran some timing tests with curl against the prod sitemaps, and these new Allura sitemaps are about 10 times slower, even with the tiny sandbox dataset.
That said, this functionally works fine (although the URL is /allura_sitemap, not /_sitemap) - merging to dev.
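For anyone repeating the comparison, curl's built-in timing variables are enough to measure this; a minimal sketch, with placeholder URLs standing in for the real prod and sandbox endpoints:

    # time the existing sfpy-generated sitemap (prod) -- placeholder URL
    curl -s -o /dev/null -w '%{time_total}\n' http://example.com/sitemap.xml

    # time the new Allura-generated sitemap index (sandbox) -- placeholder URL
    curl -s -o /dev/null -w '%{time_total}\n' http://sandbox.example.com/allura_sitemap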
Code is on rc/2786. To QA locally: visit /_sitemap and verify there is a sitemap index, then visit /_sitemap/0 and verify there is a sitemap.
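Per the sitemaps.org protocol, the index at /_sitemap should look roughly like the following (the <loc> URLs here are made up; the real ones depend on the deployment), and each /_sitemap/N page should be an ordinary <urlset> sitemap:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://example.com/_sitemap/0</loc>
      </sitemap>
      <sitemap>
        <loc>http://example.com/_sitemap/1</loc>
      </sitemap>
    </sitemapindex>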
We can add multiple sitemap declarations to the robots.txt file: http://www.sitemaps.org/protocol.php#submit_robots
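That protocol page allows more than one Sitemap line per robots.txt, so referencing several indexes would just be a matter of listing them (hypothetical URLs):

    Sitemap: http://example.com/allura_sitemap_index_1.xml
    Sitemap: http://example.com/allura_sitemap_index_2.xml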