<br><br><div class="gmail_quote">On Thu, Jul 15, 2010 at 6:57 PM, Sean Mullan <span dir="ltr"><<a href="mailto:sean.mullan@oracle.com">sean.mullan@oracle.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
I would like to try to fix a long-standing XMLDSig issue with the current DSA and ECDSA signature bytes format.<br>
<br>
The format of the Signature bytes for these algorithms is an ASN.1 encoded sequence of the integers r and s:<br>
<br>
SEQUENCE ::= { r INTEGER, s INTEGER }<br>
<br>
Unfortunately, this is not compatible with XMLDSig (and other signature formats like .NET), which doesn't ASN.1 encode them and simply base64 encodes the raw bytes of r and s concatenated (the IEEE P1363 format).<br>
<br></blockquote><div><br>There are more standards that use the P1363 format. Personally, I'm involved with the EAC specification for ePassports and Java; you'll find this kind of signature if you look at the EAC certificates for the inspection systems (and their CAs).<br>
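For what it's worth, here is a minimal sketch (my own, not the JDK's or the EAC spec's actual code) of what the two encodings of the same (r, s) pair look like:<br>

    import java.io.ByteArrayOutputStream;
    import java.math.BigInteger;

    class SigFormats {
        // IEEE P1363 style: r and s left-padded to the field/subgroup size
        // and simply concatenated, no ASN.1 framing at all
        static byte[] toP1363(BigInteger r, BigInteger s, int fieldBytes) {
            byte[] out = new byte[2 * fieldBytes];
            pad(r, out, 0, fieldBytes);
            pad(s, out, fieldBytes, fieldBytes);
            return out;
        }

        // ASN.1 style: SEQUENCE { INTEGER r, INTEGER s }; uses the short
        // length form, which holds for DSA and ECDSA curves up to P-384
        static byte[] toDer(BigInteger r, BigInteger s) {
            byte[] rb = r.toByteArray(), sb = s.toByteArray();
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(0x30);                                  // SEQUENCE
            out.write(4 + rb.length + sb.length);
            out.write(0x02); out.write(rb.length); out.write(rb, 0, rb.length);
            out.write(0x02); out.write(sb.length); out.write(sb, 0, sb.length);
            return out.toByteArray();
        }

        // BigInteger.toByteArray() may carry a leading 00h sign byte or be
        // shorter than the field size, so strip or left-pad as needed
        private static void pad(BigInteger x, byte[] dst, int off, int len) {
            byte[] b = x.toByteArray();
            int start = Math.max(0, b.length - len);
            System.arraycopy(b, start, dst, off + len - (b.length - start), b.length - start);
        }
    }
<br>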
<br> </div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
So, our XMLDSig implementation always has to strip off, or decode the ASN.1 stuff after calling Signature.sign() when generating signatures, and ASN.1 encode the signature bytes before calling Signature.verify() when verifying signatures. I could live with this until now because it was limited to DSA which wasn't in wide use. But now the same problem comes up with ECDSA.<br>
<br></blockquote><div><br>That is a very well-known situation for me :). I don't remember directly, though, whether I also had to normalize the integers (stripping off 00h bytes at the front, or adding 00h bytes to get to the correct bit-size of the signature elements), or whether r and s were encoded as ASN.1 octet strings.<br>
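The normalization I mean is mostly about those sign/padding bytes; roughly something like this when reading the raw halves back (just a sketch, nothing official):<br>

    import java.math.BigInteger;
    import java.util.Arrays;

    class HalfDecode {
        // Read one raw half (r or s) of a P1363 signature back into a
        // non-negative BigInteger; the explicit signum of 1 prevents a high
        // 80h..FFh first byte being read as a negative number, and any
        // leading 00h padding is harmless here.
        static BigInteger decodeHalf(byte[] sig, int offset, int len) {
            return new BigInteger(1, Arrays.copyOfRange(sig, offset, offset + len));
        }
    }
<br>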
</div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
I would really like to clean this up. There seems to be a couple of ways we could fix this:<br>
<br>
1. Add new standard signature format strings that identify the format: ex:<br>
<br>
SHA1withDSAandP1363<br>
SHA1withECDSAandP1363<br>
SHA256withECDSAandP1363<br>
SHA384withECDSAandP1363<br>
SHA512withECDSAandP1363<br>
<br>
I like this the best, but one issue with this is that the "and" extended format is reserved for MGF functions, ex: MD5withRSAandMGF1 and this is not a mask generation function. My suggestion is that we use a keyword (ex: Format) that clearly distinguishes it from an MGF:<br>
<br>
<digest>with<encryption>and<format>Format<br>
<br>
ex:<br>
<br>
SHA256withECDSAandP1363Format<br>
<br></blockquote><div><br>I second this solution, since the new names would also be usable by other applications. I do see one serious problem with it, though: hardware providers may not support it, and if they don't you need to work around it. Fortunately, if I'm not mistaken, you can do that by creating a very simple provider that performs the wrapping/unwrapping of the signature (as you don't need to sign).<br>
<br>Of course, by now the build-up of the signature algorithm string is getting really complicated (you could say it is starting to imitate life). In the end it might be a good idea to replace it with something that can be selected and verified at compile time (e.g. a list of signature parameters). For now it might be a good idea to define constants somewhere for these kinds of strings.<br>
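From the application side, option 1 would reduce the whole thing to something like this (a sketch using the name proposed above, which of course does not exist in any provider yet):<br>

    import java.security.PublicKey;
    import java.security.Signature;

    class XmlDsigStyleVerify {
        // Verify a signature whose bytes are the raw r||s taken straight from
        // the document -- no ASN.1 re-encoding by the application anymore.
        static boolean verify(PublicKey key, byte[] signedData, byte[] rawRs) throws Exception {
            Signature sig = Signature.getInstance("SHA256withECDSAandP1363Format");
            sig.initVerify(key);
            sig.update(signedData);
            return sig.verify(rawRs);
        }
    }
<br>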
<br> </div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
2. Add a new AlgorithmParameterSpec subclass that specifies the format, and then call Signature.setParameter before generating/verifying the signature.<br>
<br>
I'm not thrilled by this option, because this isn't really a standard input parameter, and will cause problems if/when you want to use it with an algorithm that does require input parameters (like an RSA PSSParameterSpec)<br>
<br></blockquote><div><br>I can see these problems as well. I would try to leave the parameter specs alone, as they are more difficult to use. Maybe restrict their use to places where varying integer input is actually required, as opposed to simple choices for the algorithm.<br>
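Just to make the collision concrete, option 2 would presumably end up looking something like this (purely hypothetical, the spec class is made up):<br>

    import java.security.Signature;
    import java.security.spec.AlgorithmParameterSpec;

    // Hypothetical marker spec meaning "use raw r||s instead of ASN.1".
    class SignatureEncodingSpec implements AlgorithmParameterSpec { }

    class Option2Sketch {
        static Signature p1363Ecdsa() throws Exception {
            Signature sig = Signature.getInstance("SHA256withECDSA");
            // This generic hook is also where real algorithm parameters such
            // as RSA-PSS's PSSParameterSpec would have to go, hence the conflict.
            sig.setParameter(new SignatureEncodingSpec());
            return sig;
        }
    }
<br>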
</div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
3. Add a higher level DSA/ECDSA Signer API that returns the r and s as BigIntegers and leaves the encoding of those bytes to the application.<br>
<br>
This is a very clean solution, but is more of a significant API change as it would be introducing a new higher level API for generating/validating signatures.<br>
<br></blockquote><div><br>Would that not be a *lower* level API, since it does not do the encoding? I don't directly see the need for it. If people want to step outside the algorithms that are standardized or generally used, they could choose a lower-level API like Bouncy Castle.<br>
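Either way, the kind of API meant in 3) would presumably look something like this (completely hypothetical, none of these types exist):<br>

    import java.math.BigInteger;
    import java.security.InvalidKeyException;
    import java.security.PrivateKey;

    // Hypothetical shape of option 3: the caller gets r and s back as
    // BigIntegers and chooses the wire encoding (ASN.1, P1363, ...) itself.
    interface RawDsaSigner {
        void initSign(PrivateKey key) throws InvalidKeyException;
        void update(byte[] data);
        BigInteger[] sign();   // element 0 = r, element 1 = s
    }
<br>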
<br>Of course, in the end we might want to replace the current JCA with one that uses the factory principle and immutable Signer and Verifier classes, but that is an entirely different discussion :)<br><br> </div><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
4. Do nothing<br>
<br>
Live with it :(<br>
<br></blockquote><div><br>Nah, if you want to go for 1), then go for it. No current code would break, it's a standardized algorithm you are implementing and other people like me are using it.<br> <br>Regards,<br>Maarten<br>
</div></div><br>