There's a bug in the use_hint function: it adds 1 instead of subtracting 1 when the decomposed low bits r0 are exactly zero. FIPS 204 Algorithm 40 is explicit that the add-1 branch requires r0 > 0, i.e. strictly positive, but the current code treats zero as positive. As a result, a completely valid signature can fail verification whenever this edge case is hit.
The issue is in ml-dsa/src/hint.rs in the use_hint function. Here's what FIPS 204 Algorithm 40 says:
```
3: if h = 1 and r0 > 0 return (r1 + 1) mod m
4: if h = 1 and r0 <= 0 return (r1 − 1) mod m
```
Line 3 uses r0 > 0 (strictly greater than zero), and line 4 uses r0 <= 0 (less than or equal, which includes zero). So when r0 = 0, the spec says to subtract 1.
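Keeping r0 in its signed (centered) representation, the two spec lines can be sketched directly. This is a standalone illustration of FIPS 204 Algorithms 36 (Decompose) and 40 (UseHint), not the crate's actual code; names and signatures are assumptions:

```rust
const Q: i64 = 8_380_417; // the ML-DSA modulus q

/// Decompose r into (r1, r0) with r = r1 * (2 * gamma2) + r0 and
/// r0 in (-gamma2, gamma2] (FIPS 204 Algorithm 36, centered remainder).
fn decompose(r: i64, gamma2: i64) -> (i64, i64) {
    let r = r.rem_euclid(Q);
    let mut r0 = r % (2 * gamma2);
    if r0 > gamma2 {
        r0 -= 2 * gamma2; // center the remainder
    }
    if r - r0 == Q - 1 {
        // Special case from the spec: r1 = 0, r0 = r0 - 1
        return (0, r0 - 1);
    }
    ((r - r0) / (2 * gamma2), r0)
}

/// UseHint per FIPS 204 Algorithm 40. With r0 signed, the spec's
/// r0 > 0 / r0 <= 0 split can be written verbatim, so r0 = 0
/// naturally falls into the subtract-1 branch.
fn use_hint(h: bool, r: i64, gamma2: i64) -> i64 {
    let m = (Q - 1) / (2 * gamma2);
    let (r1, r0) = decompose(r, gamma2);
    if h && r0 > 0 {
        (r1 + 1).rem_euclid(m) // line 3: r0 strictly positive
    } else if h {
        (r1 - 1).rem_euclid(m) // line 4: r0 <= 0, including r0 = 0
    } else {
        r1
    }
}
```

With this shape the edge case cannot be miscategorized, because zero is simply not greater than zero.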
But the current implementation does this:
```rust
if h && r0.0 <= gamma2 {
    Elem::new((r1.0 + 1) % m)
} else if h && r0.0 >= BaseField::Q - gamma2 {
    Elem::new((r1.0 + m - 1) % m)
}
```
The problem is r0.0 <= gamma2 includes zero. When r0 = 0, this condition is true (since 0 <= gamma2), so it adds 1. But according to the spec, r0 = 0 should fall into the r0 <= 0 case and subtract 1 instead.
The result is +1 when it should be −1, an off-by-two error mod m.
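A minimal sketch of what a corrected branch could look like, using plain `u32` values in place of the crate's `Elem`/`BaseField` types (the actual upstream fix may be shaped differently). The key change is that the add-1 branch must exclude `r0.0 == 0`, which then falls through to the subtract-1 case:

```rust
/// Hedged, standalone analogue of the snippet above. In the unsigned
/// (mod q) representation, "r0 > 0" means 1 <= r0 <= gamma2, and
/// "r0 <= 0" means r0 == 0 or r0 >= q - gamma2.
fn use_hint_fixed(h: bool, r1: u32, r0: u32, gamma2: u32, m: u32) -> u32 {
    if h && r0 >= 1 && r0 <= gamma2 {
        // r0 strictly positive (FIPS 204 Alg. 40, line 3): add 1
        (r1 + 1) % m
    } else if h {
        // r0 == 0 or r0 >= q - gamma2, i.e. r0 <= 0 (line 4): subtract 1
        (r1 + m - 1) % m
    } else {
        r1
    }
}
```

For valid Decompose outputs, r0 is always in [0, γ2] ∪ [q − γ2, q − 1], so collapsing the second condition into a bare `else if h` covers exactly the spec's r0 ≤ 0 case.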
Take ML-DSA-44, where γ2 = 95,232 and m = (q − 1)/(2γ2) = 44.
If use_hint(true, 0) is called:

- Decompose(0) gives (r1 = 0, r0 = 0)
- r0.0 <= gamma2 is 0 <= 95232, which is true
- (0 + 1) % 44 = 1

But FIPS 204 says:

- r0 > 0 is 0 > 0, which is false
- r0 ≤ 0 is 0 ≤ 0, which is true
- (0 − 1) mod 44 = 43

The function returns 1 when it should return 43.
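The worked numbers can be checked mechanically. This hypothetical side-by-side comparison reuses the ML-DSA-44 parameters with plain integers in place of the crate's field types:

```rust
const GAMMA2: u32 = 95_232; // ML-DSA-44
const M: u32 = 44;

/// Mirrors the buggy condition: r0 = 0 satisfies r0 <= gamma2, so it adds 1.
fn buggy_hint(r1: u32, r0: u32) -> u32 {
    if r0 <= GAMMA2 { (r1 + 1) % M } else { (r1 + M - 1) % M }
}

/// Mirrors the spec: only 1 <= r0 <= gamma2 counts as strictly positive.
fn spec_hint(r1: u32, r0: u32) -> u32 {
    if (1..=GAMMA2).contains(&r0) { (r1 + 1) % M } else { (r1 + M - 1) % M }
}
```

At (r1, r0) = (0, 0) the two disagree, 1 versus 43; for any genuinely positive r0 they agree.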
This can happen in real signatures whenever any coefficient of the w' vector is an exact multiple of 2γ2, which makes its decomposed r0 equal zero. It's uncommon, but definitely possible, and when it hits, verification fails for a completely valid signature.
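To see why multiples of 2γ2 are exactly the trigger, here is a standalone sketch of the centered decomposition (FIPS 204 Algorithm 36) under ML-DSA-44 parameters; the names and layout are illustrative, not the crate's:

```rust
const Q: i64 = 8_380_417;   // the ML-DSA modulus q
const GAMMA2: i64 = 95_232; // ML-DSA-44

/// Centered decomposition: r = r1 * (2 * GAMMA2) + r0, r0 in (-gamma2, gamma2].
/// Any coefficient that is an exact multiple of 2 * gamma2 yields r0 = 0,
/// which is precisely the edge case the buggy condition misclassifies.
fn decompose(r: i64) -> (i64, i64) {
    let r = r.rem_euclid(Q);
    let mut r0 = r % (2 * GAMMA2);
    if r0 > GAMMA2 {
        r0 -= 2 * GAMMA2;
    }
    if r - r0 == Q - 1 {
        return (0, r0 - 1); // spec's special case at r = q - 1
    }
    ((r - r0) / (2 * GAMMA2), r0)
}
```

For example, a coefficient of 2γ2 = 190,464 decomposes to (r1 = 1, r0 = 0), so a set hint on that coefficient exercises the bug.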
This is a FIPS 204 compliance bug affecting version 0.1.0-rc.5.

CVSS 4.0 score: 5.5 (CVSS:4.0/AV:N/AC:L/AT:N/PR:N/UI:N/VC:N/VI:L/VA:N/SC:N/SI:N/SA:N/E:P)