Change to use VLQ-encoded numbers for everything in the tree file. Block sizes
remain as u32s so that they can be read by the store in a single read
operation, but everything else is a VLQ as it is generally smaller and more
futureproof.
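For context on the format choice: a VLQ stores an unsigned integer in 7-bit groups with a continuation bit, so small values (the common case for counts and offsets) take a single byte, and wider integers can be added later without a format change. The sketch below shows one common little-endian variant; the helper names are illustrative, and the vlqencoding crate this diff actually uses may differ in byte-group order and API details.

```rust
use std::io::{self, Read, Write};

/// Illustrative VLQ encoder: emit 7 bits per byte, least-significant
/// group first, setting the high bit on every byte except the last.
fn write_vlq<W: Write>(w: &mut W, mut value: u64) -> io::Result<()> {
    loop {
        let byte = (value & 0x7f) as u8;
        value >>= 7;
        if value == 0 {
            return w.write_all(&[byte]);
        }
        w.write_all(&[byte | 0x80])?;
    }
}

/// Illustrative VLQ decoder: accumulate 7-bit groups until a byte
/// with the high bit clear terminates the sequence.
fn read_vlq<R: Read>(r: &mut R) -> io::Result<u64> {
    let mut value = 0u64;
    let mut shift = 0;
    loop {
        let mut buf = [0u8; 1];
        r.read_exact(&mut buf)?;
        value |= u64::from(buf[0] & 0x7f) << shift;
        if buf[0] & 0x80 == 0 {
            return Ok(value);
        }
        shift += 7;
    }
}

fn main() -> io::Result<()> {
    let mut buf = Vec::new();
    write_vlq(&mut buf, 300)?;   // 300 encodes as two bytes: 0xac 0x02
    assert_eq!(buf, [0xac, 0x02]);
    assert_eq!(read_vlq(&mut buf.as_slice())?, 300);
    Ok(())
}
```

Fixed-width u32 block sizes, by contrast, let the store issue a single exact-length read without decoding bytes one at a time, which matches the rationale in the summary.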
Details
- Reviewers: durham
- Group Reviewers: Restricted Project
- Commits: rFBHGXa636c026aa4f: treedirstate: use vlqencoding for numbers

Diff Detail
- Repository: rFBHGX Facebook Mercurial Extensions
- Lint: Automatic diff as part of commit; lint not applicable.
- Unit: Automatic diff as part of commit; unit tests not applicable.
Event Timeline
rust/treedirstate/src/tree.rs:86
noob question: why did this change?
rust/treedirstate/src/tree.rs:86
I'm actually not sure. I think it has something to do with the way read_vlq() expects the reference for the Read trait object, but in trying to explain it I realized I don't understand it. I know the compiler requires it, though. Would any Rust experts care to weigh in?
rust/treedirstate/src/tree.rs:86
My understanding is that &mut Read as a whole implements Read and is not a typical reference despite having the & sigil. So &mut &mut Read makes it a reference and works. I guess &mut Read itself can also be implicitly converted to a reference (how?). If the binding itself is not marked as mut, it cannot be converted to a &mut reference. Hence the need for mut.
rust/treedirstate/src/tree.rs:86
Yeah, there are some subtle things going on here. It's because .read_vlq() is a method on the VLQDecode trait, not Read, but VLQDecode has a blanket implementation for all types which implement Read. In this case the object which implements Read is r, which has type &mut Read.

The .read_vlq() method takes &mut self as its parameter, so it's getting &mut r as the parameter (hence the requirement for mut r). But &mut &mut Read doesn't implement Read (and therefore VLQDecode) itself, so the Rust compiler's deref coercion kicks in: effectively, it keeps adding * on the front until either the type doesn't implement Deref (or DerefMut in this case), or it gets a type that matches the signature. (Alternatively it will also add at most one &/&mut if needed, which is why you don't need to keep typing (&foo).bar() for a method that takes &self.) This means the end result is an effective method call of:

```rust
VLQDecode::read_vlq(*&mut &mut r)
```

(Though TBH I've probably got some details wrong here.)

But echoing a comment I made earlier in the series, I think it would be better to express this as:

```rust
fn read(r: &mut Read) -> Result<(Key, NodeEntry<T>)> {
    let mut r = r;
```

to keep the confusing extra mut out of the function definition.
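To make the discussion above concrete, here is a self-contained sketch of why the extra mut is needed. read_exact_generic is a hypothetical stand-in for read_vlq's blanket-impl pattern (a generic function bounded on Read), not the real vlqencoding API, and dyn Read is just the modern spelling of the bare Read trait object used in this diff.

```rust
use std::io::Read;

// Hypothetical stand-in for read_vlq's blanket-impl pattern: a generic
// function bounded on Read, so R must be Sized.
fn read_exact_generic<R: Read>(r: &mut R, buf: &mut [u8]) -> std::io::Result<()> {
    r.read_exact(buf)
}

// Variant 1: mark the parameter binding itself as mut, as in the diff.
fn first_byte(mut r: &mut dyn Read) -> std::io::Result<u8> {
    let mut buf = [0u8; 1];
    // Passing `r` directly would require R = dyn Read, which is unsized.
    // `&mut r` has type `&mut &mut dyn Read`, so R = `&mut dyn Read`,
    // which implements Read via the std blanket
    // `impl<R: Read + ?Sized> Read for &mut R`. Taking `&mut r` is what
    // forces `r` to be a mut binding.
    read_exact_generic(&mut r, &mut buf)?;
    Ok(buf[0])
}

// Variant 2: the rebinding suggested in the review, keeping the extra
// mut out of the function signature.
fn first_byte_rebound(r: &mut dyn Read) -> std::io::Result<u8> {
    let mut r = r;
    let mut buf = [0u8; 1];
    read_exact_generic(&mut r, &mut buf)?;
    Ok(buf[0])
}
```

Variant 2 matches the suggestion in the comment above: the shadowed mut binding keeps the signature clean while still allowing &mut r wherever a sized Read implementor is needed.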