Message ID: cover.1731253918.git.dev@icepic.de
Series: Fix abi mismatch error on boot for cross-compiled images
Hi Christoph,

Christoph Buck <dev@icepic.de> skribis:

> During compilation, Guix stores a hash of the record field names in the
> compiled .go files.  At run time this hash is recalculated and checked
> against the stored hash to verify that no ABI mismatch occurred.  As
> described in [1], this hash differs if the corresponding record was
> compiled in a cross-compilation context.  Guile internally uses an
> `unsigned long` to store the hash, which results in hashes of different
> sizes depending on the platform the Guile compiler runs on.  Guix
> already tries to work around this problem by limiting the size of the
> hash in a cross-compilation context to the target's most positive
> fixnum, but this is insufficient because, as one can see in the Guile
> source code, the size is applied by a modulo operation *after* the hash
> has already been calculated for an 8-byte unsigned long.  Therefore the
> hashes computed at compilation time and at run time differ, and an ABI
> mismatch error is erroneously reported at run time.
>
> An easy workaround is documented in the Guile source, namely in a
> comment on `JENKINS_LOOKUP3_HASHWORD2`, which is used to calculate the
> hash:
>
>> Scheme can access symbol-hash, which exposes this value.  For
>> cross-compilation reasons, we ensure that the high 32 bits of the hash
>> on a 64-bit system are equal to the hash on a 32-bit system.  The low
>> 32 bits just add more entropy.
>
> This suggests the following workaround: always limit the hash size to
> 32 bits, even when running on a 64-bit platform (or, more specifically,
> a platform where unsigned long is 8 bytes).  Do this by shifting the
> hash value right by 32 bits instead of relying on the size parameter of
> the `string-hash` function.  This is what this patch tries to
> accomplish.

Woow, thanks for the investigation & explanation!

(I would say that the ‘scm_ihash’ implementation as a mere modulo is
dubious, but that’s hard to change anyway.)

> Imho this approach has two drawbacks: lost entropy on 64-bit machines,
> and the ABI break, because on recompilation the hash values on 64-bit
> platforms will change.  The lost entropy is irrelevant because the hash
> is not used in a cryptographically relevant context.  For the ABI
> break, I am not sure how severe this change is.

Capping at 32 bits means that potentially some ABI changes could go
unnoticed, but that’s extremely unlikely if the hash function is good
enough.

I believe the ABI break is fine too: developers will have to
“make clean-go && make”, but that’s okay.

Thoughts?  Opinions?

Ludo’.
Ludovic Courtès <ludo@gnu.org> writes:

> Woow, thanks for the investigation & explanation!

You are welcome :)  I usually keep notes while investigating bugs and
append them to my patches.  This keeps my train of thought transparent
and makes it easier for others to follow along or spot obvious errors
on my side.  However, it can get a little noisy, so let me know if I
should "keep it down" a bit.

> Capping at 32-bits means that potentially some ABI changes could go
> unnoticed, but that’s extremely unlikely if the hash function is good
> enough.

Yes, but this problem exists for 32-bit builds in general.

> I believe the ABI break is fine too: developers will have to
> “make clean-go && make”, but that’s okay.

Good to know.