
I am having trouble verifying, with Java/BouncyCastle, an ECDSA signature that was created with client-side JavaScript.

The JavaScript signing function source:

sign: function (hash, priv) {
        var d = priv;
        var n = ecparams.getN();
        var e = BigInteger.fromByteArrayUnsigned(hash);

        do {
            var k = ECDSA.getBigRandom(n);
            var G = ecparams.getG();
            var Q = G.multiply(k);
            var r = Q.getX().toBigInteger().mod(n);
        } while (r.compareTo(BigInteger.ZERO) <= 0);

        var s = k.modInverse(n).multiply(e.add(d.multiply(r))).mod(n);

        return ECDSA.serializeSig(r, s);
},

serializeSig: function (r, s) {
        var rBa = r.toByteArrayUnsigned();
        var sBa = s.toByteArrayUnsigned();

        var sequence = [];
        sequence.push(0x02); // INTEGER
        sequence.push(rBa.length);
        sequence = sequence.concat(rBa);

        sequence.push(0x02); // INTEGER
        sequence.push(sBa.length);
        sequence = sequence.concat(sBa);

        sequence.unshift(sequence.length);
        sequence.unshift(0x30); // SEQUENCE

        return sequence;
},

The server-side verify function source:

    /**
     * Verifies the given ASN.1 encoded ECDSA signature against a hash using the public key.
     *
     * @param data      Hash of the data to verify.
     * @param signature ASN.1 encoded signature.
     * @param pub       The public key bytes to use.
     */
    public static boolean verify(byte[] data, byte[] signature, byte[] pub) {
        ECDSASigner signer = new ECDSASigner();
        ECPublicKeyParameters params = new ECPublicKeyParameters(ecParams.getCurve().decodePoint(pub), ecParams);
        signer.init(false, params);
        try {
            ASN1InputStream decoder = new ASN1InputStream(signature);
            DERSequence seq = (DERSequence) decoder.readObject();
            DERInteger r = (DERInteger) seq.getObjectAt(0);
            DERInteger s = (DERInteger) seq.getObjectAt(1);
            decoder.close();
            return signer.verifySignature(data, r.getValue(), s.getValue());
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

Does anyone know what format the JavaScript signature is in (DER?) and, if so, how to convert it to ASN.1?

I tried:

public static byte[] toASN1(byte[] data) {
    try {
        ByteArrayOutputStream baos = new ByteArrayOutputStream(400);
        ASN1OutputStream encoder = new ASN1OutputStream(baos);

        encoder.write(data);
        encoder.close();
        return baos.toByteArray();
    } catch (IOException e) {
        throw new RuntimeException(e);  // Cannot happen, writing to memory stream.
    }
}

but validation still fails.

Ben

2 Answers


Disclaimer: I don't know JavaScript and I do not practice BouncyCastle. However, I do know Java, and ASN.1.

ASN.1 is a notation for structured data, and DER is a set of rules for transforming a data structure (described in ASN.1) into a sequence of bytes, and back.

This is ASN.1, namely the description of the structure which an ECDSA signature exhibits:

ECDSASignature ::= SEQUENCE {
    r   INTEGER,
    s   INTEGER
}

When encoded in DER, this becomes the following sequence of bytes:

0x30 b1 0x02 b2 (vr) 0x02 b3 (vs)

where:

  • b1 is a single byte value, equal to the length, in bytes, of the remaining list of bytes (from the first 0x02 to the end of the encoding);
  • b2 is a single byte value, equal to the length, in bytes, of (vr);
  • b3 is a single byte value, equal to the length, in bytes, of (vs);
  • (vr) is the signed big-endian encoding of the value "$r$", of minimal length;
  • (vs) is the signed big-endian encoding of the value "$s$", of minimal length.

"Signed big-endian encoding of minimal length" means that the numerical value must be encoded as a sequence of bytes, such that the least significant byte comes last (that's what "big endian" means), the total length is the shortest possible to represent the value (that's "minimal length"), and the first bit of the first byte specifies the sign of the value (that's "signed"). For ECDSA, the $r$ and $s$ values are positive integers, so the first bit of the first byte must be a 0; i.e. the first byte of (vr) (respectively (vs)) must have a value between 0x00 and 0x7F.

For instance, if we were to encode the numerical value 117, it would use a single byte 0x75, and not a two-byte sequence 0x00 0x75 (because of minimality). However, the value 193 must be encoded as two bytes 0x00 0xC1, because a single byte 0xC1, by itself, would denote a negative integer, since the first (aka "leftmost") bit of 0xC1 is a 1 (a single byte of value 0xC1 represents the value -63).
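
As a quick sanity check: Java's BigInteger.toByteArray() produces exactly this signed, minimal, big-endian form, so the two examples above can be reproduced in a few lines (a small sketch, not BouncyCastle-specific):

import java.math.BigInteger;

public class MinimalSignedEncoding {
    public static void main(String[] args) {
        // toByteArray() yields the signed, minimal, big-endian encoding
        // that the content octets of a DER INTEGER must use.
        print(BigInteger.valueOf(117).toByteArray());  // prints: 75
        print(BigInteger.valueOf(193).toByteArray());  // prints: 00 c1
    }

    private static void print(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) sb.append(String.format("%02x ", x & 0xff));
        System.out.println(sb.toString().trim());
    }
}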

I insist on these details because my guess is that it is what the JavaScript code does incorrectly. The JavaScript code invokes a method called toByteArrayUnsigned; that name is evocative of conversion to an unsigned representation (i.e. always positive, even if the first bit is a 1), and that's wrong for DER. I invite you to save the raw signature in a file and decode it "by hand" to see if it matches the fine details of ASN.1 encoding, as explained above (the openssl asn1parse command may help, too).

(If my guess is right, you should have a probability of about 1/4 of getting a correct signature nonetheless -- when the values $r$ and $s$ happen to be short enough to have a high bit set to 0 when using unsigned encoding, i.e. a situation where the bug is benign.)
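
For completeness, here is roughly what a correct encoder looks like on the Java side, following the byte layout described above (a minimal sketch; the method name is mine, and it assumes the total content length stays below 128 bytes, which holds for the usual 256-bit curves):

import java.io.ByteArrayOutputStream;
import java.math.BigInteger;

public static byte[] derEncodeSignature(BigInteger r, BigInteger s) {
    byte[] vr = r.toByteArray();  // signed, minimal, big-endian (vr)
    byte[] vs = s.toByteArray();  // signed, minimal, big-endian (vs)
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    out.write(0x30);                              // SEQUENCE
    out.write(2 + vr.length + 2 + vs.length);     // b1
    out.write(0x02);                              // INTEGER
    out.write(vr.length);                         // b2
    out.write(vr, 0, vr.length);
    out.write(0x02);                              // INTEGER
    out.write(vs.length);                         // b3
    out.write(vs, 0, vs.length);
    return out.toByteArray();
}

Comparing its output byte for byte with what serializeSig emits for the same r and s should show exactly where the two encodings diverge.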

Thomas Pornin

I do not have enough reputation to comment but want to improve Thomas' answer:

  1. The ASN.1 declaration, SEQUENCE of two integers $R$ and $S$, is correct;
  2. ASN.1 ALWAYS encodes integers in Big Endian format (aka network order);
  3. ASN.1 ALWAYS encodes integers in two's complement, i.e. the most significant bit is the sign bit. There are no unsigned integers in ASN.1;
  4. Certain hardware crypto processors (smartcards, HSMs) and also some software libraries emit ECDSA signatures as exactly two bit strings of length $n$, where $n$ denotes the bit size of the used Elliptic Curve, i.e. an ECDSA signature is twice as big as the Elliptic Curve bit length;
  5. If the bit length is not divisible by eight (8), convert bit length to byte length as follows: byte_length := (bit_length + 7) / 8;
  6. Because $R$ and $S$ are ALWAYS unsigned integers, you have to convert both $R$ and $S$ from the two bit strings, which is often called 'canonicalization' or 'c14n' (a sketch follows right after this list):
  7. If the most significant bit is set, prepend another zero byte 0x00;
  8. Otherwise: skip all leading zero bytes until you reach the first non-zero byte whose most significant bit is not set; if this zero-byte prefix is not empty, cut it off;
  9. Thomas' statement that b1 is a single byte is not always correct. You have to perform a proper ASN.1 DER length encoding here. Lengths 0..127 are indeed a single byte. Lengths 128..255 take two bytes: 0x81, followed by the length byte 0x80..0xFF.
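
Items 6 to 8 in code, roughly (a sketch; the method name is mine and it expects one half of a raw R||S signature, i.e. a fixed-length unsigned big-endian byte string):

import java.util.Arrays;

public static byte[] c14n(byte[] raw) {
    int i = 0;
    while (i < raw.length - 1 && raw[i] == 0x00) {
        i++;                                             // item 8: count the zero-byte prefix
    }
    byte[] v = Arrays.copyOfRange(raw, i, raw.length);   // item 8: cut it off
    if ((v[0] & 0x80) != 0) {                            // item 7: sign bit would be set,
        byte[] padded = new byte[v.length + 1];          // so prepend a 0x00 byte
        System.arraycopy(v, 0, padded, 1, v.length);
        return padded;
    }
    return v;
}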

If you use e.g. the brainpoolP512R1 curve, then the bit length of the Elliptic Curve is 512 bits, i.e. 64 bytes. $R$ and $S$ both occupy 64 bytes (more or less, see c14n above). The enclosing ASN.1 sequence may exceed the length 0..127 in this case:

    tag 0x02 (INTEGER) + length byte 0x40 + 0x40 bytes = 0x42 bytes for R
    tag 0x02 (INTEGER) + length byte 0x40 + 0x40 bytes = 0x42 bytes for S
    ============================
    tag 0x30 (SEQUENCE): length is (0x42+0x42 = 0x84)

The length of the enclosing ASN.1 sequence is 0x84, which does not fit in a single length byte. The encoding has to be 0x81,0x84 in this case.
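
The length-octet rule in code, roughly (a sketch handling only lengths up to 65535; the method name is mine):

public static byte[] derLength(int len) {
    if (len <= 0x7F) {
        return new byte[] { (byte) len };                          // short form, e.g. 0x42
    } else if (len <= 0xFF) {
        return new byte[] { (byte) 0x81, (byte) len };             // e.g. 0x81, 0x84
    } else {
        return new byte[] { (byte) 0x82, (byte) (len >> 8), (byte) len };
    }
}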

Devvy