#6801 Options to parallelize last_commit_ids

v1.1.0
closed
General
2015-08-20
2013-10-26
No

For faster last_commit_ids processing, we should run N git log commands in parallel. We may want to require a minimum number of files/dirs before running in parallel, if spinning up new threads incurs significant overhead. (See runtests.py for an example of a threadpool running external commands.)

This will largely remove the benefit of the current iterative approach, which can short-circuit some commands when files/dirs share the same last commit. We can try to keep some of that, though, for cases where the degree of parallelism is less than the total number of files/dirs.
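A minimal sketch of the approach described above, using a thread pool to fan out one `git log` per path, with a sequential fallback below an assumed minimum-size threshold. The names (`last_commit_ids`, `MIN_PARALLEL`, `get_id`) and the threshold value are illustrative, not the actual Allura API; the `get_id` parameter is injectable so the git call can be stubbed out.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor


MIN_PARALLEL = 5  # assumed threshold; below this, thread spin-up isn't worth it


def git_last_commit_id(repo_path, path):
    """Return the id of the last commit touching `path`, via one `git log` run."""
    out = subprocess.check_output(
        ['git', '-C', repo_path, 'log', '-1', '--format=%H', '--', path])
    return out.decode().strip()


def last_commit_ids(repo_path, paths, workers=4, get_id=git_last_commit_id):
    """Map each path to its last commit id, in parallel when worthwhile."""
    if len(paths) < MIN_PARALLEL:
        # Small sets: run sequentially to avoid thread-pool overhead.
        return {p: get_id(repo_path, p) for p in paths}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker runs one external git command; the GIL is released
        # while waiting on the subprocess, so threads overlap the I/O.
        return dict(pool.map(lambda p: (p, get_id(repo_path, p)), paths))
```

Note that this drops the iterative short-circuiting entirely; a hybrid would hand each worker a batch of paths and let it skip paths already resolved by an earlier commit in its batch.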

Discussion

  • Dave Brondsema - 2013-11-01
    • Size: --> 2

  • Cory Johns - 2013-11-11
    • status: open --> in-progress
    • assigned_to: Cory Johns

  • Cory Johns - 2013-11-11
    • status: in-progress --> code-review

  • Cory Johns - 2013-11-11

    allura:cj/6801

  • Dave Brondsema - 2013-11-15
    • Milestone: forge-nov-15 --> forge-nov-29

  • Tim Van Steenburgh
    • QA: Tim Van Steenburgh

  • Tim Van Steenburgh
    • status: code-review --> closed