verify_strict and weak-key forgery #259

Closed
rozbb opened this issue Jan 6, 2023 · 9 comments
Labels
do-for-2.0 This should be resolved before a 2.0 release

Comments

rozbb commented Jan 6, 2023

This issue comes after a long time thinking about what verify_strict does and whether we want to remove it. I argue (surprise!) that we should not.

The verify_strict function checks that the verifying key A and sig.R are not low-order points.

The original intent seems to have been a stronger check, namely checking torsion-freeness of those curve points, but that never happened afaict. These are very different checks.
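
For concreteness, here's a quick sketch (not the crate's code) of how the two checks differ, using the is_small_order and is_torsion_free predicates on curve25519-dalek's EdwardsPoint; the exact import paths and the EIGHT_TORSION constant are assumptions about the 4.x API.

    use curve25519_dalek::constants::{ED25519_BASEPOINT_POINT, EIGHT_TORSION};

    fn main() {
        // A point of order 8: caught by both a small-order check and a
        // torsion-freeness check.
        let small = EIGHT_TORSION[1];
        assert!(small.is_small_order());
        assert!(!small.is_torsion_free());

        // Basepoint plus a torsion point: order 8*l, so it passes the
        // small-order check that verify_strict does, but it is not torsion-free.
        let mixed = ED25519_BASEPOINT_POINT + EIGHT_TORSION[1];
        assert!(!mixed.is_small_order());
        assert!(!mixed.is_torsion_free());

        // The basepoint itself is torsion-free and passes both checks.
        assert!(!ED25519_BASEPOINT_POINT.is_small_order());
        assert!(ED25519_BASEPOINT_POINT.is_torsion_free());
    }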

Regardless, let's assume it is doing torsion-freeness checks. There is documentation in the README and in verify_strict claiming that permitting an R or A with a torsion component (which still have large order) yields signature malleability. I don't think this is true, though: both the R and A components are included in the hash H(R || A || M), and A still has large enough order that the change in the hash yields an entirely different challenge "in the exponent". The docs are discussed more here. There's a similar claim in this comment, but again I haven't seen an example yet.

However! I think verify_strict is still valuable.

The low-order check on A means verify_strict provides resistance to weak-key forgeries. In particular, if A is low-order, then H(R || A || M) A == H(R || A || M') A with 1/8 probability, for any distinct messages M and M'. We have a demonstration of this in our tests. In terms of impact, Scuttlebutt had a weak-key forgery vuln due to precisely this issue (see here, section 7.1).

The attack context is: the attacker can pick their pubkey and provide a signature, but the message is not known to them. The attacker uses a weak pubkey A, generates an s, lets R = sB, and sends (A, σ = (s, R)) to the verifier. With 1/8 probability, the secret message M known to the verifier satisfies sB == R + H(R || A || M) A. Note that the forgery method here can also be thought of as a malleability attack, since you can find plenty of (s, R = sB) pairs that pass verification on (A, M) by simple trial and error.
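
To make the attack concrete, here's a minimal sketch of the forgery written directly against curve25519-dalek primitives rather than ed25519-dalek's API (this is not the crate's actual test; the paths below assume curve25519-dalek 4.x with the digest feature). It uses the identity point as A, for which the forgery succeeds deterministically; using an order-8 torsion point instead gives the 1/8 success probability described above.

    use curve25519_dalek::constants::ED25519_BASEPOINT_POINT;
    use curve25519_dalek::edwards::EdwardsPoint;
    use curve25519_dalek::scalar::Scalar;
    use curve25519_dalek::traits::Identity;
    use sha2::{Digest, Sha512};

    fn main() {
        // Attacker picks a low-order point as their "public key" A. The identity
        // has order 1, so [h]A is the identity for every challenge scalar h.
        let a = EdwardsPoint::identity();

        // Attacker picks any scalar s and sets R = [s]B, without knowing M.
        let s = Scalar::from(42u64);
        let r = s * ED25519_BASEPOINT_POINT;

        // The verifier's message, never seen by the attacker.
        let msg = b"message the attacker never saw";

        // Ed25519 challenge scalar h = H(R || A || M) mod l.
        let mut hasher = Sha512::new();
        hasher.update(r.compress().as_bytes());
        hasher.update(a.compress().as_bytes());
        hasher.update(msg);
        let h = Scalar::from_hash(hasher);

        // Unbatched verification equation: [s]B == R + [h]A. With a low-order A
        // the [h]A term (almost) never matters, so (R, s) verifies for a message
        // the attacker never saw.
        assert_eq!(s * ED25519_BASEPOINT_POINT, r + h * a);
    }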

Some things that aren't clear to me are whether there really is a malleability issue on large-order pubkeys with torsion, and whether the low-order check on R does anything for us. @isislovecruft if you could explain that'd be awesome.

Going forward, we can do a few things:

  1. Keep verify_strict and update the docs around it, explaining weak-key forgery and removing previous claims about malleability. According to the paper, Go's NaCl impl, Cloudflare's CIRCL, and Libsodium have all implemented low-order pubkey checks.
  2. Delete it and put a warning about weak-key forgery. Recommend to anyone concerned about this that they use signatures over prime-order groups, like RedDSA, schnorrkel, or jq255e.
  3. Make verify_strict actually do the torsion-freeness checks. There's a reason you might want this: if you did it in batch verification as well, batch verification would agree perfectly with single verification, giving you consistent batch verification that rejects weak-key forgeries.

I like (1), especially since users can't really do a low-order check themselves with our current API. IMO verify_strict should be the default functionality, but I don't think we can do that now.

(3) sounds OK too, though allegedly the torsion-freeness check is pretty expensive and it might not make sense in a batch setting from the start.

cc @tarcieri

@rozbb rozbb added the do-for-2.0 This should be resolved before a 2.0 release label Jan 6, 2023

tarcieri commented Jan 6, 2023

@rozbb if verify_strict provides behavior that matches* libsodium and other libraries, I agree it's worth keeping around

*I know this has been exhaustively studied but I'm not sure what might've changed in the interim since those blog posts and papers were written

rozbb commented Jan 6, 2023

Libsodium does small-order checks on A and R. It also does canonicity checks, which we also do, but only accidentally.

But I don't think HACL does low-order checks. Go's ed25519 impl does not (and they also have a faulty scalar check). Go's NaCl just uses their ed25519 impl. And CIRCL doesn't do the checks.

So I guess no one actually fixed this. I'm still pro keeping verify_strict. What do you think?

tarcieri commented Jan 6, 2023

Ok, well it seems like some feature with roughly this shape is common in other libraries and thus probably worth supporting, though it's unfortunate that they don't agree on what the actual validation rules are.

rozbb commented Jan 6, 2023

Ok cool. There's a related thing I'll make a PR for: the legacy_compatibility feature disables scalar checks even for verify_strict. This is bad imo. We should just disable verify_strict if that feature is set.

rozbb commented Jan 7, 2023

Closing. We're keeping verify_strict and updating docs.

rozbb commented Jan 27, 2023

Documenting here:

I think verify_strict implicitly tests canonical encoding for R and A. The docs mention this in passing here.

My reasoning, previously stated here, is that you need to know dlog(R) in order to make a signature, and the only non-canonical points with known discrete log are the low-order points. This goes for A as well. Thus, checking for low order is sufficient to catch the non-canonical encodings that matter, assuming hardness of discrete log on those specific points.
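
For contrast, an explicit canonical-encoding check is also possible: decompress the 32 bytes and re-compress them, and reject if the round trip changes anything. A hedged sketch, assuming curve25519-dalek 4.x's CompressedEdwardsY API:

    use curve25519_dalek::edwards::CompressedEdwardsY;

    /// Returns true iff `bytes` is the canonical encoding of a curve point.
    fn is_canonical_point_encoding(bytes: &[u8; 32]) -> bool {
        match CompressedEdwardsY(*bytes).decompress() {
            // Canonical encodings survive a decompress/compress round trip.
            Some(point) => point.compress().as_bytes() == bytes,
            // Not a valid curve point at all.
            None => false,
        }
    }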

@FiloSottile

> Go's ed25519 impl does not (and they also have a faulty scalar check).

Uuuuh I'd like to hear more about the scalar check? :)

rozbb commented Mar 6, 2024

Hi Filippo! I just looked into it and I think I was mistaken. What I saw was this line:

if len(sig) != SignatureSize || sig[63] & 224 != 0 {
    return false
}

This is the old libsodium scalar check, which only checks that the top 3 bits are unset, and leads to possible signature malleability.

But I see now that you later call edwards25519.NewScalar().SetCanonicalBytes(sig[32:]), which properly checks that isReduced is true.

So nevermind, sorry about that!
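
For reference, here's a hedged sketch of the analogous pair of scalar checks on the Rust side (Scalar::from_canonical_bytes and its CtOption return type are assumptions about curve25519-dalek 4.x):

    use curve25519_dalek::scalar::Scalar;

    // The old libsodium-style check: reject only if any of the top 3 bits is set,
    // i.e. accept any s < 2^253. Non-canonical scalars in [l, 2^253) still pass,
    // which is the malleability gap mentioned above.
    fn weak_scalar_check(s_bytes: &[u8; 32]) -> bool {
        s_bytes[31] & 224 == 0
    }

    // The strict check: s must be fully reduced mod the group order l.
    fn strict_scalar_check(s_bytes: &[u8; 32]) -> bool {
        bool::from(Scalar::from_canonical_bytes(*s_bytes).is_some())
    }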

@FiloSottile

Sweet, thank you for checking!
