Incrementally repacking very large files creates more problems than it solves: the repack takes a long time to finish and eventually culminates in a single huge pack file. It is therefore preferable to skip files over a configured size during the incremental repack.
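The size filter described above can be sketched as a small helper that drops oversized packs from the incremental-repack candidate list. This is an illustrative sketch, not the actual remotefilelog implementation: the function name `filterpacks`, the constant name, and the 400 MB default (taken from the review discussion below) are assumptions for demonstration.

```python
import os

# Hypothetical default threshold; the review discussion below mentions
# 400MB as the proposed configuration-based cutoff.
DEFAULT_MAX_REPACK_SIZE = 400 * 1024 * 1024

def filterpacks(packpaths, maxsize=DEFAULT_MAX_REPACK_SIZE):
    """Return only the pack files small enough to include in an
    incremental repack; oversized packs are left untouched on disk."""
    small = []
    for path in packpaths:
        try:
            size = os.path.getsize(path)
        except OSError:
            # Pack vanished between listing and stat (e.g. a concurrent
            # repack removed it); just skip it.
            continue
        if size <= maxsize:
            small.append(path)
    return small
```

Leaving large packs out means the incremental repack stays fast, at the cost of a few oversized packs persisting until a full repack handles them.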
Details
- Reviewers: durham
- Group Reviewers: Restricted Project
- Commits: rFBHGX69ccc796d4ba: incremental-repack: do not repack files over a configuration based size
Added a test and ran all the tests.
Diff Detail
- Repository: rFBHGX Facebook Mercurial Extensions
Event Timeline
remotefilelog/repack.py, line 250:
"Why 400MB? Seems low. What kind of numbers are we seeing for history pack size maximums?"
remotefilelog/repack.py, line 250 (reply):
"I don't have a distribution to support the claim, but the history packs are usually not a problem. They are typically few in number (1 or 2), even when there are plenty of datapacks or huge datapacks, and max out at 1GB in all the instances I have seen. I am fairly confident that the number of > 400 MB history packs after this change would be quite small (~10-20)."