From felix.yang at oracle.com Thu Jun 1 03:27:41 2017 From: felix.yang at oracle.com (Felix Yang) Date: Thu, 1 Jun 2017 11:27:41 +0800 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: Hi Chris and Daniel, here is a new webrev with a few explicit builds instead of the wildcard. http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ Thanks, Felix On 2017/5/31 18:20, Chris Hegarty wrote: >> On 31 May 2017, at 10:42, Felix Yang wrote: >> >> Hi there, >> >> please review the patch to various jdk tests to explicitly compile test libraries and the libs' dependencies. Though it could be a jtreg issue (I think so), it is necessary to get the tests running first. >> >> Bug: >> >> https://bugs.openjdk.java.net/browse/JDK-8181299 >> >> Webrev: >> >> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.00/ > This may be ok to get the tests running again, but explicit build targets > would be better. The contents, and module dependencies, of classes > in the test library are subject to change, so building all classes may > require more modules than in the @modules tags in the test. With the latest webrev, no new @modules are introduced by this change, though I fixed a missing @modules entry in the original tests. I prefer to keep "@build jdk.test.lib.process.*" here because, with the current test lib package layout, "@build jdk.test.lib.process.*" is equivalent to:

@build jdk.test.lib.process.OutputAnalyzer
       jdk.test.lib.process.OutputBuffer
       jdk.test.lib.process.ProcessTools
       jdk.test.lib.process.StreamPumper
       jdk.test.lib.process.ExitCode

It is a bit ugly and not productive to have to declare a bunch of @builds when I only use ProcessTools directly. That is why I think this is not a fix but a workaround.
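For readers outside the thread, the wildcard style under discussion lives in a jtreg tag comment like the sketch below. The class name BuildTagDemo and its body are hypothetical placeholders, not code from the webrev; the body compiles and runs on its own without jtreg.

```java
/*
 * A sketch of the wildcard style: the @build tag compiles every class in
 * jdk.test.lib.process, including ones the test never names directly
 * (such as StreamPumper), so none of them can be missing at run time.
 * jtreg ignores this prose because it precedes the first tag.
 *
 * @test
 * @library /test/lib
 * @build jdk.test.lib.process.*
 * @run main BuildTagDemo
 */
public class BuildTagDemo {
    // jtreg reads only the tag comment above; this body is a plain stub
    // standing in for a real test that would call ProcessTools.
    static String style() {
        return "wildcard @build";
    }

    public static void main(String[] args) {
        System.out.println(style());
    }
}
```

The explicit style Chris asks for simply replaces the wildcard line with fully-qualified names, e.g. `@build jdk.test.lib.process.ProcessTools jdk.test.lib.process.OutputAnalyzer`, one entry per dependency.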
Thanks, Felix > > I agree with Daniel, each test should be run separately in a clean > environment, to verify that it can build the necessary dependencies. This is actually not the case: I executed each test repeatedly, and each works well separately. The problem occurs as more and more tests use the same test libs. As stated in the bugs [1] and [2], if multiple tests use a lib, such as ProcessTools, collisions can occur on its dependencies. For ProcessTools, StreamPumper (and only StreamPumper) sometimes disappears. It looks like some dependency classes are treated by jtreg as somehow shared, and removed unexpectedly. [1] https://bugs.openjdk.java.net/browse/JDK-8181299 [2] https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 Thanks, Felix > This may be a straightforward way to identify explicit build dependencies > and avoid the wildcards. > > -Chris. From igor.ignatyev at oracle.com Thu Jun 1 03:32:26 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Wed, 31 May 2017 20:32:26 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: Hi Felix, I have suggested the exact opposite change[1-3] to fix the same problem. [1] https://bugs.openjdk.java.net/browse/JDK-8181391 [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html Thanks, -- Igor > On May 31, 2017, at 8:27 PM, Felix Yang wrote: > > Hi Chris and Daniel, > > here is a new webrev with a few explicit builds instead of the wildcard.
> > http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ > > Thanks, > Felix > On 2017/5/31 18:20, Chris Hegarty wrote: >>> On 31 May 2017, at 10:42, Felix Yang wrote: >>> >>> Hi there, >>> >>> please review the patch to various jdk tests to explicitly compiling test libraries and the lib's dependencies. Though it could be a jtreg issue (I think so), it is necessary to get the tests running firstly. >>> >>> Bug: >>> >>> https://bugs.openjdk.java.net/browse/JDK-8181299 >>> >>> Webrev: >>> >>> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.00/ >> This may be ok to get the tests running again, but explicit build targets >> would be better. The contents, and module dependencies, from classes >> in the test library are subject to change, so building all classes may >> require more modules than in the @modules tags in the test. > With latest webrev, no new @modules introduced by this change, though I fixed a missing from original tests. > > I prefer to keep "@build jdk.test.lib.process.*" here. Because, with current test lib package layout, "@build jdk.test.lib.process.*" equals with > /@build jdk.test.lib.process.OutputAnalyzer > //jdk.test.lib.process.OutputBuffer > //jdk.test.lib.process.ProcessTools > ////jdk.test.lib.process.//StreamPumper// > ///jdk.test.lib.process.ExitCode/ /" > > It is a bit ugly and not productive, when I only use ProcessTools directly but have to declare a bunch of @builds. That is why I think this is not a fix but a workaround. > > Thanks, > Felix >> >> I agree with Daniel, each test should be run separately in a clean >> environment, to verify that it can build the necessary dependencies. > This is actually not the case. I executed repeatedly each test works well separately. The problem occurs when there are more and more tests using the same test libs. > > As stated in the bugs [1] and [2], if there are multi tests using a lib, such as ProcessTools, there could be possible collision occurring on its dependencies. 
> For ProcessTools, StreamPumper (ONLY) will be disappear sometimes. It looks some dependency classes were treated by jtreg as some-how shared, and removed unexpectedly. > > [1] https://bugs.openjdk.java.net/browse/JDK-8181299 > [2] https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 . > > Thanks, > Felix >> This may be a straight forward way to identify explicit build dependencies >> and avoid the wildcards. >> >> -Chris. -------------- next part -------------- An HTML attachment was scrubbed... URL: From felix.yang at oracle.com Thu Jun 1 04:49:03 2017 From: felix.yang at oracle.com (Felix Yang) Date: Thu, 1 Jun 2017 12:49:03 +0800 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: <69795fc1-77a9-6ad4-bfb2-83f69a494549@oracle.com> Hi Igor, just noticed your change. It is indeed necessary to clarify what is the best practice here. Thanks, Felix On 2017/6/1 11:32, Igor Ignatyev wrote: > Hi Felix, > > I have suggested the exact opposite change[1-3] to fix the same problem. > > [1] https://bugs.openjdk.java.net/browse/JDK-8181391 > [2] > http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html > [3] > http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html > > > Thanks, > -- Igor >> On May 31, 2017, at 8:27 PM, Felix Yang > > wrote: >> >> Hi Chris and Daniel, >> >> new webrev with a few of explicit builds than wildcard. >> >> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ >> >> >> Thanks, >> Felix >> On 2017/5/31 18:20, Chris Hegarty wrote: >>>> On 31 May 2017, at 10:42, Felix Yang >>> > wrote: >>>> >>>> Hi there, >>>> >>>> please review the patch to various jdk tests to explicitly >>>> compiling test libraries and the lib's dependencies. Though it >>>> could be a jtreg issue (I think so), it is necessary to get the >>>> tests running firstly. 
>>>> >>>> Bug: >>>> >>>> https://bugs.openjdk.java.net/browse/JDK-8181299 >>>> >>>> Webrev: >>>> >>>> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.00/ >>>> >>> This may be ok to get the tests running again, but explicit build >>> targets >>> would be better. The contents, and module dependencies, from classes >>> in the test library are subject to change, so building all classes may >>> require more modules than in the @modules tags in the test. >> With latest webrev, no new @modules introduced by this change, though >> I fixed a missing from original tests. >> >> I prefer to keep "@build jdk.test.lib.process.*" here. Because, with >> current test lib package layout, "@build jdk.test.lib.process.*" >> equals with >> /@build jdk.test.lib.process.OutputAnalyzer >> //jdk.test.lib.process.OutputBuffer >> //jdk.test.lib.process.ProcessTools >> ////jdk.test.lib.process.//StreamPumper// >> ///jdk.test.lib.process.ExitCode/ /" >> >> It is a bit ugly and not productive, when I only use ProcessTools >> directly but have to declare a bunch of @builds. That is why I think >> this is not a fix but a workaround. >> >> Thanks, >> Felix >>> >>> I agree with Daniel, each test should be run separately in a clean >>> environment, to verify that it can build the necessary dependencies. >> This is actually not the case. I executed repeatedly each test works >> well separately. The problem occurs when there are more and more >> tests using the same test libs. >> >> As stated in the bugs [1] and [2], if there are multi tests using a >> lib, such as ProcessTools, there could be possible collision >> occurring on its dependencies. >> For ProcessTools, StreamPumper (ONLY) will be disappear sometimes. It >> looks some dependency classes were treated by jtreg as some-how >> shared, and removed unexpectedly. >> >> [1]https://bugs.openjdk.java.net/browse/JDK-8181299 >> [2]https://bugs.openjdk.java.net/browse/CODETOOLS-7901986. 
>> >> Thanks, >> Felix >>> This may be a straight forward way to identify explicit build >>> dependencies >>> and avoid the wildcards. >>> >>> -Chris. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chris.hegarty at oracle.com Thu Jun 1 08:20:07 2017 From: chris.hegarty at oracle.com (Chris Hegarty) Date: Thu, 1 Jun 2017 09:20:07 +0100 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: Igor, > On 1 Jun 2017, at 04:32, Igor Ignatyev wrote: > > Hi Felix, > > I have suggested the exact opposite change[1-3] to fix the same problem. I?m sorry, but this is all just too confusing. After your change, who, or what, is responsible for building/compiling the test library dependencies? Test library code has no @modules tags, so does not explicitly declare its module dependencies. Instead module dependencies, required by test library code, are declared in the test using the library. If we wildcard, or otherwise leave broad build dependencies, from tests then there is no way to know what new module dependencies may be added in the future. That is, one of, the reason(s) I asked Felix to be explicit about the build dependencies. -Chris. > [1] https://bugs.openjdk.java.net/browse/JDK-8181391 > [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html > [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html From daniel.fuchs at oracle.com Thu Jun 1 08:56:09 2017 From: daniel.fuchs at oracle.com (Daniel Fuchs) Date: Thu, 1 Jun 2017 09:56:09 +0100 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: Hi Felix, This looks much better to me. Thanks! 
-- daniel On 01/06/2017 04:27, Felix Yang wrote: > Hi Chris and Daniel, > > new webrev with a few of explicit builds than wildcard. > > http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ > > Thanks, > Felix > On 2017/5/31 18:20, Chris Hegarty wrote: >>> On 31 May 2017, at 10:42, Felix Yang wrote: >>> >>> Hi there, >>> >>> please review the patch to various jdk tests to explicitly compiling test libraries and the lib's dependencies. Though it could be a jtreg issue (I think so), it is necessary to get the tests running firstly. >>> >>> Bug: >>> >>> https://bugs.openjdk.java.net/browse/JDK-8181299 >>> >>> Webrev: >>> >>> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.00/ >> This may be ok to get the tests running again, but explicit build targets >> would be better. The contents, and module dependencies, from classes >> in the test library are subject to change, so building all classes may >> require more modules than in the @modules tags in the test. > With latest webrev, no new @modules introduced by this change, though I > fixed a missing from original tests. > > I prefer to keep "@build jdk.test.lib.process.*" here. Because, with > current test lib package layout, "@build jdk.test.lib.process.*" equals > with > /@build jdk.test.lib.process.OutputAnalyzer > //jdk.test.lib.process.OutputBuffer > //jdk.test.lib.process.ProcessTools > ////jdk.test.lib.process.//StreamPumper// > ///jdk.test.lib.process.ExitCode/ /" > > It is a bit ugly and not productive, when I only use ProcessTools > directly but have to declare a bunch of @builds. That is why I think > this is not a fix but a workaround. > > Thanks, > Felix >> >> I agree with Daniel, each test should be run separately in a clean >> environment, to verify that it can build the necessary dependencies. > This is actually not the case. I executed repeatedly each test works > well separately. The problem occurs when there are more and more tests > using the same test libs. 
> > As stated in the bugs [1] and [2], if there are multi tests using a lib, > such as ProcessTools, there could be possible collision occurring on its > dependencies. > For ProcessTools, StreamPumper (ONLY) will be disappear sometimes. It > looks some dependency classes were treated by jtreg as some-how shared, > and removed unexpectedly. > > [1] https://bugs.openjdk.java.net/browse/JDK-8181299 > [2] https://bugs.openjdk.java.net/browse/CODETOOLS-7901986. > > Thanks, > Felix >> This may be a straight forward way to identify explicit build dependencies >> and avoid the wildcards. >> >> -Chris. >> > From chris.hegarty at oracle.com Thu Jun 1 09:11:45 2017 From: chris.hegarty at oracle.com (Chris Hegarty) Date: Thu, 1 Jun 2017 10:11:45 +0100 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: <7927ADA3-303F-4E8E-9DCF-812CEBCED6E3@oracle.com> > On 1 Jun 2017, at 04:27, Felix Yang wrote: > > Hi Chris and Daniel, > > new webrev with a few of explicit builds than wildcard. > > http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ This seems very odd to me: http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/test/sun/security/tools/jarsigner/AltProvider.java.sdiff.html The test is using jdk.test.lib.util.JarUtils ( which comes from the top-level test library ), but the @build specifies the JarUtils ( in the unnamed package ) from the jdk test library. Also, shouldn?t it use jdk.test.lib.compiler.CompilerUtils, rather than CompilerUtils? -Chris. 
From felix.yang at oracle.com Thu Jun 1 09:41:26 2017 From: felix.yang at oracle.com (Felix Yang) Date: Thu, 1 Jun 2017 17:41:26 +0800 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <7927ADA3-303F-4E8E-9DCF-812CEBCED6E3@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <7927ADA3-303F-4E8E-9DCF-812CEBCED6E3@oracle.com> Message-ID: Hi Chris, updated webrev: http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ -Felix On 2017/6/1 17:11, Chris Hegarty wrote: >> On 1 Jun 2017, at 04:27, Felix Yang wrote: >> >> Hi Chris and Daniel, >> >> here is a new webrev with a few explicit builds instead of the wildcard. >> >> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ > This seems very odd to me: > > http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/test/sun/security/tools/jarsigner/AltProvider.java.sdiff.html > > The test is using jdk.test.lib.util.JarUtils ( which comes from the > top-level test library ), but the @build specifies the JarUtils ( in > the unnamed package ) from the jdk test library. Fixed > Also, shouldn't it use jdk.test.lib.compiler.CompilerUtils, rather than > CompilerUtils? Ok, I refactored the test to use the new CompilerUtils, so it no longer refers to /lib/testlibrary. I didn't touch other tests' usage of CompilerUtils, because Igor has a single-purpose task to unify the test libs, and this would complicate his clean-up work. -Felix > > -Chris. From sean.coffey at oracle.com Thu Jun 1 16:23:20 2017 From: sean.coffey at oracle.com (Seán Coffey) Date: Thu, 1 Jun 2017 17:23:20 +0100 Subject: RFR : 8181205: JRE fails to load/register security providers when started from UNC pathname Message-ID: <05e5ab76-ac7a-d38f-197a-f7d5ea8523a6@oracle.com> The recent JDK-8163528 fix caused a regression for JDK binaries launched with a UNC pathname. We can use the Paths class to create the required File.
I managed to put a test together which should test this code path. webrev : http://cr.openjdk.java.net/~coffeys/webrev.8181205/webrev/ JBS record : https://bugs.openjdk.java.net/browse/JDK-8181205 regards, Sean. From xuelei.fan at oracle.com Thu Jun 1 16:46:58 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Thu, 1 Jun 2017 09:46:58 -0700 Subject: Code Review Request, JDK-8181439 Test the jdk.tls.namedGroups System Property Message-ID: <15e96b15-0cf9-e776-339f-51c70e0a00b8@oracle.com> Hi Valerie, Please review the test update: http://cr.openjdk.java.net/~xuelei/8181439/webrev.00/ More test cases are added for FFDHE groups so as to check unavailable groups in some platforms or JDK releases. Thanks, Xuelei From igor.ignatyev at oracle.com Thu Jun 1 20:17:25 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Thu, 1 Jun 2017 13:17:25 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: > On Jun 1, 2017, at 1:20 AM, Chris Hegarty wrote: > > Igor, > >> On 1 Jun 2017, at 04:32, Igor Ignatyev wrote: >> >> Hi Felix, >> >> I have suggested the exact opposite change[1-3] to fix the same problem. > > I?m sorry, but this is all just too confusing. After your change, who, or what, is > responsible for building/compiling the test library dependencies? jtreg is responsible, there is an implicit build for each @run, and jtreg will analyze a test class to get transitive closure for static dependencies, hence you have to have @build only for classes which are not in constant pool, e.g. used only by reflection or whose classnames are only used to spawn a new java instance. > > > Test library code has no @modules tags, so does not explicitly declare its > module dependencies. Instead module dependencies, required by test > library code, are declared in the test using the library. 
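(As an aside, Igor's constant-pool remark above — that @build is only needed for classes jtreg cannot see as static dependencies, e.g. ones used purely by reflection — can be illustrated with a tiny sketch. Standard JDK classes stand in for test-library classes here; this is not code from either webrev.)

```java
import java.util.ArrayList;
import java.util.List;

public class ConstantPoolDemo {
    // Returns the simple names of one statically referenced class and one
    // reflectively referenced class (JDK classes stand in for library ones).
    static String names() {
        try {
            // Static reference: javac records a CONSTANT_Class entry for
            // ArrayList in this class file, so a tool walking the constant
            // pool -- as jtreg does -- can discover and build the dependency.
            List<String> visible = new ArrayList<>();

            // Reflective reference: only the string "java.util.HashMap" lands
            // in the class file. No class entry, so no implicit build; this is
            // the case that still needs an explicit @build tag.
            Object hidden = Class.forName("java.util.HashMap")
                    .getDeclaredConstructor().newInstance();

            return visible.getClass().getSimpleName() + "/"
                    + hidden.getClass().getSimpleName();
        } catch (ReflectiveOperationException e) {
            return "error";
        }
    }

    public static void main(String[] args) {
        System.out.println(names());  // prints "ArrayList/HashMap"
    }
}
```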
If we wildcard, or > otherwise leave broad build dependencies, from tests then there is no > way to know what new module dependencies may be added in the future. > That is, one of, the reason(s) I asked Felix to be explicit about the build > dependencies. having explicit builds does not really help w/ module dependency, if someone change a testlibrary class so it starts to depend on another testlibrary class, jtreg will implicitly build it and if this class has some module dependencies, you will have to reflect them in the test. generally speaking, I don't like having explicit build actions because build actions themselves are implicit, so they don't really help, it's still will be hard to spot missed explicit builds. not having (unneeded) explicit builds is an easy rule to follow and we can easily find all places which don't follow this rule by grep. -- Igor > > -Chris. > >> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >> [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html > From igor.ignatyev at oracle.com Thu Jun 1 23:58:49 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Thu, 1 Jun 2017 16:58:49 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> Message-ID: > For example: doing this may be enough for now: > > * @build jdk.test.lib.process.* > > But what if in the future, jdk.test.lib.process is restructured to have a private package jdk.test.lib.process.hidden? 
To work around CODETOOLS-7901986, all the test cases that must be modified to the following, which unnecessarily exposes library implementation details to the library users: > > * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* and in fact, there is already similar problem and http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ does not address it. jdk/test/lib/process/ProcessTools depends on jdk/test/lib/Utils so all tests which have '@build jdk.test.lib.process.ProcessTools' will have to have '@build jdk.test.lib.Utils'. then we have OutputAnalyzer which depends on ProcessTools so all tests which @build jdk.test.lib.process.OutputAnalyzer will @build ProcessTools and Utils explicitly. many testlibrary classes which on jdk.test.lib.process.OutputAnalyzer, so one will have to specify OutputAnalyzer ProcessTools and Utils in the tests which depends on other testlibrary classes. to make things even worse, Utils depends on OutputAnalyzer and there are lots of tests and test library classes which depend on Utils, so all of them will have to have at least '@build jdk.test.lib.Utils jdk.test.lib.process.OutputAnalyzer jdk.test.lib.process.ProcessTools'. and they will work stable till someone refactors them and extract some new classes. that is to say, it's nearly impossible to have all explicit @build actions. Cheers, -- Igor > On Jun 1, 2017, at 3:37 PM, Ioi Lam wrote: > > > > On 6/1/17 1:17 PM, Igor Ignatyev wrote: >>> On Jun 1, 2017, at 1:20 AM, Chris Hegarty wrote: >>> >>> Igor, >>> >>>> On 1 Jun 2017, at 04:32, Igor Ignatyev wrote: >>>> >>>> Hi Felix, >>>> >>>> I have suggested the exact opposite change[1-3] to fix the same problem. >>> I?m sorry, but this is all just too confusing. After your change, who, or what, is >>> responsible for building/compiling the test library dependencies? 
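To see why Igor calls a complete explicit list "nearly impossible", the chain he names above (OutputAnalyzer depends on ProcessTools, ProcessTools on Utils, Utils back on OutputAnalyzer) can be walked mechanically. A sketch of that closure computation, with the thread's class names hard-coded as assumed edges (the real jdk.test.lib classes have more dependencies than these):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class BuildClosure {
    // Assumed dependency edges, taken from the relationships named in
    // this thread; illustrative only.
    static final Map<String, List<String>> DEPS = Map.of(
            "OutputAnalyzer", List.of("ProcessTools"),
            "ProcessTools",   List.of("Utils"),
            "Utils",          List.of("OutputAnalyzer"),  // the cycle back
            "StreamPumper",   List.of());

    // Everything an explicit @build list would have to name for one class.
    static Set<String> closure(String start) {
        Set<String> seen = new LinkedHashSet<>();
        Deque<String> work = new ArrayDeque<>(List.of(start));
        while (!work.isEmpty()) {
            String c = work.pop();
            if (seen.add(c)) work.addAll(DEPS.getOrDefault(c, List.of()));
        }
        return seen;
    }

    public static void main(String[] args) {
        // Using only OutputAnalyzer still drags in the whole cycle.
        System.out.println(closure("OutputAnalyzer"));
    }
}
```

Any refactoring that adds an edge silently grows every such hand-written list, which is the maintenance problem Igor is pointing at.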
>> jtreg is responsible, there is an implicit build for each @run, and jtreg will analyze a test class to get transitive closure for static dependencies, hence you have to have @build only for classes which are not in constant pool, e.g. used only by reflection or whose classnames are only used to spawn a new java instance. > > > I suspect the problem is caused by a long standing bug in jtreg that results in library classes being partially compiled. Please see my evaluation in > > https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 > > In the bug report, there is test case that can reliably reproduce the NoClassDefFoundError problem. > > I think adding all the @build commands in the tests are just band-aids. Things will break unless every test explicitly uses @build to build every class in every library that they use, including all the private classes that are not directly accessible by the test cases. > > For example: doing this may be enough for now: > > * @build jdk.test.lib.process.* > > But what if in the future, jdk.test.lib.process is restructured to have a private package jdk.test.lib.process.hidden? To work around CODETOOLS-7901986, all the test cases that must be modified to the following, which unnecessarily exposes library implementation details to the library users: > > * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* > > Just imagine this -- "in order to use malloc() you must explicitly build not only malloc(), but also sbrk() ... and every other function in libc". That seems unreasonable to me. > > By the way, we made a fix in the HotSpot tests (see https://bugs.openjdk.java.net/browse/JDK-8157957 ) that got rid of many (but not all) of the NoClassDefFoundErrors by *removing* the @build lines ..... > > My proposal is, instead of just adding @build for band-aid, we should fix CODETOOLS-7901986 instead. > > Thanks > - Ioi > > >>> >>> Test library code has no @modules tags, so does not explicitly declare its >>> module dependencies. 
Instead module dependencies, required by test >>> library code, are declared in the test using the library. If we wildcard, or >>> otherwise leave broad build dependencies, from tests then there is no >>> way to know what new module dependencies may be added in the future. >>> That is, one of, the reason(s) I asked Felix to be explicit about the build >>> dependencies. >> having explicit builds does not really help w/ module dependency, if someone change a testlibrary class so it starts to depend on another testlibrary class, jtreg will implicitly build it and if this class has some module dependencies, you will have to reflect them in the test. >> >> generally speaking, I don't like having explicit build actions because build actions themselves are implicit, so they don't really help, it's still will be hard to spot missed explicit builds. not having (unneeded) explicit builds is an easy rule to follow and we can easily find all places which don't follow this rule by grep. >> >> -- Igor >>> -Chris. >>> >>>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>>> [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>>> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From ioi.lam at oracle.com Thu Jun 1 22:37:58 2017 From: ioi.lam at oracle.com (Ioi Lam) Date: Thu, 1 Jun 2017 15:37:58 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> On 6/1/17 1:17 PM, Igor Ignatyev wrote: >> On Jun 1, 2017, at 1:20 AM, Chris Hegarty wrote: >> >> Igor, >> >>> On 1 Jun 2017, at 04:32, Igor Ignatyev wrote: >>> >>> Hi Felix, >>> >>> I have suggested the exact opposite change[1-3] to fix the same problem. 
>> I?m sorry, but this is all just too confusing. After your change, who, or what, is >> responsible for building/compiling the test library dependencies? > jtreg is responsible, there is an implicit build for each @run, and jtreg will analyze a test class to get transitive closure for static dependencies, hence you have to have @build only for classes which are not in constant pool, e.g. used only by reflection or whose classnames are only used to spawn a new java instance. I suspect the problem is caused by a long standing bug in jtreg that results in library classes being partially compiled. Please see my evaluation in https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 In the bug report, there is test case that can reliably reproduce the NoClassDefFoundError problem. I think adding all the @build commands in the tests are just band-aids. Things will break unless every test explicitly uses @build to build every class in every library that they use, including all the private classes that are not directly accessible by the test cases. For example: doing this may be enough for now: * @build jdk.test.lib.process.* But what if in the future, jdk.test.lib.process is restructured to have a private package jdk.test.lib.process.hidden? To work around CODETOOLS-7901986, all the test cases that must be modified to the following, which unnecessarily exposes library implementation details to the library users: * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* Just imagine this -- "in order to use malloc() you must explicitly build not only malloc(), but also sbrk() ... and every other function in libc". That seems unreasonable to me. By the way, we made a fix in the HotSpot tests (see https://bugs.openjdk.java.net/browse/JDK-8157957) that got rid of many (but not all) of the NoClassDefFoundErrors by *removing* the @build lines ..... My proposal is, instead of just adding @build for band-aid, we should fix CODETOOLS-7901986 instead. 
Thanks - Ioi >> >> Test library code has no @modules tags, so does not explicitly declare its >> module dependencies. Instead module dependencies, required by test >> library code, are declared in the test using the library. If we wildcard, or >> otherwise leave broad build dependencies, from tests then there is no >> way to know what new module dependencies may be added in the future. >> That is, one of, the reason(s) I asked Felix to be explicit about the build >> dependencies. > having explicit builds does not really help w/ module dependency, if someone change a testlibrary class so it starts to depend on another testlibrary class, jtreg will implicitly build it and if this class has some module dependencies, you will have to reflect them in the test. > > generally speaking, I don't like having explicit build actions because build actions themselves are implicit, so they don't really help, it's still will be hard to spot missed explicit builds. not having (unneeded) explicit builds is an easy rule to follow and we can easily find all places which don't follow this rule by grep. > > -- Igor >> -Chris. >> >>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>> [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html From ioi.lam at oracle.com Thu Jun 1 23:14:48 2017 From: ioi.lam at oracle.com (Ioi Lam) Date: Thu, 1 Jun 2017 16:14:48 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> Message-ID: <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> On 6/1/17 1:17 PM, Igor Ignatyev wrote: >> On Jun 1, 2017, at 1:20 AM, Chris Hegarty wrote: >> >> Igor, >> >>> On 1 Jun 2017, at 04:32, Igor Ignatyev wrote: >>> >>> Hi Felix, >>> >>> I have suggested the exact opposite change[1-3] to fix the same problem. 
>> I?m sorry, but this is all just too confusing. After your change, who, or what, is >> responsible for building/compiling the test library dependencies? > jtreg is responsible, there is an implicit build for each @run, and jtreg will analyze a test class to get transitive closure for static dependencies, hence you have to have @build only for classes which are not in constant pool, e.g. used only by reflection or whose classnames are only used to spawn a new java instance. Just to add to what Igor said: In fact, what JTREG does is fairly simple, but kind of hidden so it's not obvious what it does with the @library tags. Let's say your test uses "@library /test/lib" After your test completes, open the .jtr file (JTREG Report). It should have a command-line for javac, like this: DISPLAY=:2 \\ HOME=/home/iklam \\ JTREG_COMPILEJDK=/home/iklam/jdk/bld/foobar/images/jdk \\ LANG=en_US.UTF-8 \\ LD_LIBRARY_PATH=/home/iklam/jdk/bld/foobar/images/jdk/../../images/test/hotspot/jtreg/native \\ PATH=/bin:/usr/bin \\ /home/iklam/jdk/bld/foobar/images/jdk/bin/javac \\ -J-Dtest.src=/jdk/foobar/hotspot/test/runtime/SharedArchiveFile \\ -J-Dtest.src.path=/jdk/foobar/hotspot/test/runtime/SharedArchiveFile:/jdk/foobar/test/lib \\ -J-Dtest.classes=/jdk/tmp/jtreg/work/classes/14/runtime/SharedArchiveFile \\ -J-Dtest.class.path=/jdk/tmp/jtreg/work/classes/14/runtime/SharedArchiveFile:/jdk/tmp/jtreg/work/classes/14/test/lib \\ -J-Dtest.vm.opts= \\ -J-Dtest.tool.vm.opts= \\ -J-Dtest.compiler.opts= \\ -J-Dtest.java.opts= \\ -J-Dtest.jdk=/home/iklam/jdk/bld/foobar-fastdebug/images/jdk \\ -J-Dcompile.jdk=/home/iklam/jdk/bld/foobar/images/jdk \\ -J-Dtest.timeout.factor=4.0 \\ -J-Dtest.modules='java.base/jdk.internal.misc java.management' \\ -J-Dtest.nativepath=/home/iklam/jdk/bld/foobar/images/jdk/../../images/test/hotspot/jtreg/native \\ @/jdk/tmp/jtreg/work/runtime/SharedArchiveFile/SpaceUtilizationCheck.d/compile.0.jta The gem is hidden in the compile.0.jta file. 
It contains something like: -sourcepath :/jdk/foobar/test/lib: So if my test refers to a class under /test/lib, such as jdk.test.lib.process.ProcessTools, javac will be able to locate it under /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will build it automatically. So really, there's no reason why the test must explicitly do an @build of the library classes that it uses. - Ioi >> >> Test library code has no @modules tags, so does not explicitly declare its >> module dependencies. Instead module dependencies, required by test >> library code, are declared in the test using the library. If we wildcard, or >> otherwise leave broad build dependencies, from tests then there is no >> way to know what new module dependencies may be added in the future. >> That is, one of, the reason(s) I asked Felix to be explicit about the build >> dependencies. > having explicit builds does not really help w/ module dependency, if someone change a testlibrary class so it starts to depend on another testlibrary class, jtreg will implicitly build it and if this class has some module dependencies, you will have to reflect them in the test. > > generally speaking, I don't like having explicit build actions because build actions themselves are implicit, so they don't really help, it's still will be hard to spot missed explicit builds. not having (unneeded) explicit builds is an easy rule to follow and we can easily find all places which don't follow this rule by grep. > > -- Igor >> -Chris. 
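To make the convention being discussed concrete, here is a hypothetical jtreg test header (the class name and @summary are invented for illustration; /test/lib and jdk.test.lib.process.ProcessTools are the real library paths from this thread, and the particular ProcessTools/OutputAnalyzer calls are only a sketch of a typical call site). Under the implicit-compilation behaviour Ioi describes, no @build tag is needed for the library classes, because javac finds ProcessTools.java through -sourcepath and compiles its static dependencies (such as StreamPumper) transitively:

```java
/*
 * @test
 * @summary Hypothetical example: @library with no explicit @build tag
 * @library /test/lib
 * @run main ImplicitBuildExample
 */
import jdk.test.lib.process.OutputAnalyzer;
import jdk.test.lib.process.ProcessTools;

public class ImplicitBuildExample {
    public static void main(String[] args) throws Exception {
        // Referencing ProcessTools here is enough: jtreg's implicit build
        // compiles it, and javac pulls in its static dependencies
        // (OutputBuffer, StreamPumper, Utils, ...) from the -sourcepath.
        OutputAnalyzer out = ProcessTools.executeTestJava("-version");
        out.shouldHaveExitValue(0);
    }
}
```

The competing style in webrev.01 would add an explicit "@build jdk.test.lib.process.*" line before @run; the point of contention is whether that line is required, and whether it is even sufficient once the library's own transitive dependencies are counted.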
>> >>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>> [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html From felix.yang at oracle.com Fri Jun 2 01:52:27 2017 From: felix.yang at oracle.com (Felix Yang) Date: Fri, 2 Jun 2017 09:52:27 +0800 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> Message-ID: <7a62f2ee-69bc-0281-3e2f-85a25d917e51@oracle.com> Hi Igor and Ioi, I partially agree with you. As initially stated in the proposal and bug(JDK-8181299 ), I don't think this patch is a fix but a quick workaround to make them runnable. "explicit" is reasonable for me. But "explicit" should not be restricted as "explicit all, including dependencies", as it is not productive or even realistic in the long term. Thanks, Felix On 2017/6/2 7:58, Igor Ignatyev wrote: >> For example: doing this may be enough for now: >> >> * @build jdk.test.lib.process.* >> >> But what if in the future, jdk.test.lib.process is restructured to >> have a private package jdk.test.lib.process.hidden? To work around >> CODETOOLS-7901986, all the test cases that must be modified to the >> following, which unnecessarily exposes library implementation details >> to the library users: >> >> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* > > and in fact, there is already similar problem and > http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ > does not > address it. > jdk/test/lib/process/ProcessTools depends on jdk/test/lib/Utils so all > tests which have '@build jdk.test.lib.process.ProcessTools' will have > to have '@build jdk.test.lib.Utils'. 
then we have OutputAnalyzer > which depends on ProcessTools so all tests which > @build jdk.test.lib.process.OutputAnalyzer will @build ProcessTools > and Utils explicitly. many testlibrary classes depend on > jdk.test.lib.process.OutputAnalyzer, so one will have to > specify OutputAnalyzer ProcessTools and Utils in the tests which > depend on other testlibrary classes. to make things even worse, > Utils depends on OutputAnalyzer and there are lots of tests and test > library classes which depend on Utils, so all of them will have to > have at least '@build jdk.test.lib.Utils > jdk.test.lib.process.OutputAnalyzer jdk.test.lib.process.ProcessTools'. > and they will work stably until someone refactors them and extracts some > new classes. that is to say, it's nearly impossible to have all > explicit @build actions. > > Cheers, > -- Igor > >> On Jun 1, 2017, at 3:37 PM, Ioi Lam > > wrote: >> >> >> >> On 6/1/17 1:17 PM, Igor Ignatyev wrote: >>>>> On Jun 1, 2017, at 1:20 AM, Chris Hegarty >>>> > wrote: >>>>> >>>>> Igor, >>>>> >>>>>> On 1 Jun 2017, at 04:32, Igor Ignatyev >>>>> > wrote: >>>>>> >>>>>> Hi Felix, >>>>>> >>>>>> I have suggested the exact opposite change[1-3] to fix the same >>>>>> problem. >>>>> I'm sorry, but this is all just too confusing. After your change, >>>>> who, or what, is >>>>> responsible for building/compiling the test library dependencies? >>>> jtreg is responsible, there is an implicit build for each @run, >>>> and jtreg will analyze a test class to get transitive closure for >>>> static dependencies, hence you have to have @build only for >>>> classes which are not in constant pool, e.g. used only by >>>> reflection or whose classnames are only used to spawn a new java >>>> instance. >>> >>> >>> I suspect the problem is caused by a long standing bug in jtreg >>> that results in library classes being partially compiled.
Please see my >> evaluation in >> >> https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 >> >> In the bug report, there is a test case that can reliably reproduce the >> NoClassDefFoundError problem. >> >> I think adding all the @build commands in the tests is just a >> band-aid. Things will break unless every test explicitly uses @build >> to build every class in every library that they use, including all >> the private classes that are not directly accessible by the test cases. >> >> For example: doing this may be enough for now: >> >> * @build jdk.test.lib.process.* >> >> But what if in the future, jdk.test.lib.process is restructured to >> have a private package jdk.test.lib.process.hidden? To work around >> CODETOOLS-7901986, all the test cases must be modified to the >> following, which unnecessarily exposes library implementation details >> to the library users: >> >> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* >> >> Just imagine this -- "in order to use malloc() you must explicitly >> build not only malloc(), but also sbrk() ... and every other function >> in libc". That seems unreasonable to me. >> >> By the way, we made a fix in the HotSpot tests >> (see https://bugs.openjdk.java.net/browse/JDK-8157957) that got rid of >> many (but not all) of the NoClassDefFoundErrors by *removing* the >> @build lines ..... >> >> My proposal is, instead of just adding @build as a band-aid, we should >> fix CODETOOLS-7901986 instead. >> >> Thanks >> - Ioi >> >> >>>> >>>> Test library code has no @modules tags, so does not explicitly >>>> declare its >>>> module dependencies. Instead module dependencies, required by test >>>> library code, are declared in the test using the library. If we >>>> wildcard, or >>>> otherwise leave broad build dependencies, from tests then there is no >>>> way to know what new module dependencies may be added in the future.
>>>> That is one of the reasons I asked Felix to be explicit about >>>> the build >>>> dependencies. >>> having explicit builds does not really help w/ module dependency, if >>> someone changes a testlibrary class so it starts to depend on another >>> testlibrary class, jtreg will implicitly build it and if this class >>> has some module dependencies, you will have to reflect them in the test. >>> >>> generally speaking, I don't like having explicit build actions >>> because build actions themselves are implicit, so they don't really >>> help, it will still be hard to spot missed explicit builds. not >>> having (unneeded) explicit builds is an easy rule to follow and we can easily >>> find all places which don't follow this rule by grep. >>> >>> -- Igor >>>> -Chris. >>>> >>>>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>>>> [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>>>> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From igor.ignatyev at oracle.com Fri Jun 2 02:11:31 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Thu, 1 Jun 2017 19:11:31 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <7a62f2ee-69bc-0281-3e2f-85a25d917e51@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> <7a62f2ee-69bc-0281-3e2f-85a25d917e51@oracle.com> Message-ID: <4B7BC543-7A5A-4128-9433-DEDC2BCF27DB@oracle.com> Hi Felix, none of the jdk tests which fail w/ NCDFE: jdk/test/lib/process/StreamPumper depend on StreamPumper directly, they get this dependency transitively from jdk/test/lib/process/ProcessTools, so I don't see how you would arrive at a good definition of "explicit" even for the failures at hand.
I recommend working around this the same way we did in hotspot, which reliably removed almost all of our NCDFE failures: remove the explicit @build actions -- if not for all classes, then at least for jdk/test/lib/** classes. -- Igor > On Jun 1, 2017, at 6:52 PM, Felix Yang wrote: > > Hi Igor and Ioi, > > I partially agree with you. As initially stated in the proposal and bug (JDK-8181299), I don't think this patch is a fix but a quick workaround to make them runnable. > > "explicit" is reasonable for me. But "explicit" should not be restricted as "explicit all, including dependencies", as it is not productive or even realistic in the long term. > Thanks, > Felix > On 2017/6/2 7:58, Igor Ignatyev wrote: >>> For example: doing this may be enough for now: >>> >>> * @build jdk.test.lib.process.* >>> >>> But what if in the future, jdk.test.lib.process is restructured to have a private package jdk.test.lib.process.hidden? To work around CODETOOLS-7901986, all the test cases must be modified to the following, which unnecessarily exposes library implementation details to the library users: >>> >>> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* >> >> and in fact, there is already a similar problem and http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ does not address it. >> jdk/test/lib/process/ProcessTools depends on jdk/test/lib/Utils so all tests which have '@build jdk.test.lib.process.ProcessTools' will have to have '@build jdk.test.lib.Utils'. then we have OutputAnalyzer which depends on ProcessTools so all tests which @build jdk.test.lib.process.OutputAnalyzer will @build ProcessTools and Utils explicitly. many testlibrary classes depend on jdk.test.lib.process.OutputAnalyzer, so one will have to specify OutputAnalyzer ProcessTools and Utils in the tests which depend on other testlibrary classes.
to make things even worse, Utils depends on OutputAnalyzer and there are lots of tests and test library classes which depend on Utils, so all of them will have to have at least '@build jdk.test.lib.Utils jdk.test.lib.process.OutputAnalyzer jdk.test.lib.process.ProcessTools'. and they will work stable till someone refactors them and extract some new classes. that is to say, it's nearly impossible to have all explicit @build actions. >> >> Cheers, >> -- Igor >> >>> On Jun 1, 2017, at 3:37 PM, Ioi Lam > wrote: >>> >>> >>> >>> On 6/1/17 1:17 PM, Igor Ignatyev wrote: >>>>> On Jun 1, 2017, at 1:20 AM, Chris Hegarty > wrote: >>>>> >>>>> Igor, >>>>> >>>>>> On 1 Jun 2017, at 04:32, Igor Ignatyev > wrote: >>>>>> >>>>>> Hi Felix, >>>>>> >>>>>> I have suggested the exact opposite change[1-3] to fix the same problem. >>>>> I?m sorry, but this is all just too confusing. After your change, who, or what, is >>>>> responsible for building/compiling the test library dependencies? >>>> jtreg is responsible, there is an implicit build for each @run, and jtreg will analyze a test class to get transitive closure for static dependencies, hence you have to have @build only for classes which are not in constant pool, e.g. used only by reflection or whose classnames are only used to spawn a new java instance. >>> >>> >>> I suspect the problem is caused by a long standing bug in jtreg that results in library classes being partially compiled. Please see my evaluation in >>> >>> https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 >>> >>> In the bug report, there is test case that can reliably reproduce the NoClassDefFoundError problem. >>> >>> I think adding all the @build commands in the tests are just band-aids. Things will break unless every test explicitly uses @build to build every class in every library that they use, including all the private classes that are not directly accessible by the test cases. 
>>> >>> For example: doing this may be enough for now: >>> >>> * @build jdk.test.lib.process.* >>> >>> But what if in the future, jdk.test.lib.process is restructured to have a private package jdk.test.lib.process.hidden? To work around CODETOOLS-7901986, all the test cases that must be modified to the following, which unnecessarily exposes library implementation details to the library users: >>> >>> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* >>> >>> Just imagine this -- "in order to use malloc() you must explicitly build not only malloc(), but also sbrk() ... and every other function in libc". That seems unreasonable to me. >>> >>> By the way, we made a fix in the HotSpot tests (see https://bugs.openjdk.java.net/browse/JDK-8157957 ) that got rid of many (but not all) of the NoClassDefFoundErrors by *removing* the @build lines ..... >>> >>> My proposal is, instead of just adding @build for band-aid, we should fix CODETOOLS-7901986 instead. >>> >>> Thanks >>> - Ioi >>> >>> >>>>> >>>>> Test library code has no @modules tags, so does not explicitly declare its >>>>> module dependencies. Instead module dependencies, required by test >>>>> library code, are declared in the test using the library. If we wildcard, or >>>>> otherwise leave broad build dependencies, from tests then there is no >>>>> way to know what new module dependencies may be added in the future. >>>>> That is, one of, the reason(s) I asked Felix to be explicit about the build >>>>> dependencies. >>>> having explicit builds does not really help w/ module dependency, if someone change a testlibrary class so it starts to depend on another testlibrary class, jtreg will implicitly build it and if this class has some module dependencies, you will have to reflect them in the test. >>>> >>>> generally speaking, I don't like having explicit build actions because build actions themselves are implicit, so they don't really help, it's still will be hard to spot missed explicit builds. 
not having (unneeded) explicit builds is an easy rule to follow and we can easily find all places which don't follow this rule by grep. >>>> >>>> -- Igor >>>>> -Chris. >>>>> >>>>>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>>>>> [2] http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>>>>> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html > -------------- next part -------------- An HTML attachment was scrubbed... URL: From felix.yang at oracle.com Fri Jun 2 05:13:28 2017 From: felix.yang at oracle.com (Felix Yang) Date: Fri, 2 Jun 2017 13:13:28 +0800 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <4B7BC543-7A5A-4128-9433-DEDC2BCF27DB@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> <7a62f2ee-69bc-0281-3e2f-85a25d917e51@oracle.com> <4B7BC543-7A5A-4128-9433-DEDC2BCF27DB@oracle.com> Message-ID: Igor, On 2017/6/2 10:11, Igor Ignatyev wrote: > Hi Felix, > > none of the jdk tests which fail w/ NCDFE: > jdk/test/lib/process/StreamPumper depend on StreamPumper directly, > they get this dependency transitively > from jdk/test/lib/process/ProcessTools, That is why I think it is a bug too. > so I don't see how you will find this good definition of "explicit" > even for the failures at hand. Just meant "expected behavior", as it makes test code clear for me. Of course it fails, otherwise there would be no such discussion at all. -Felix > > I recommend to work around this the same way we did it in hotspot, > which reliably removed almost all our NCDFE failures, -- remove > explicit @build, if not all for all classes, then at least for > jdk/test/lib/** classes. > > -- Igor > >> On Jun 1, 2017, at 6:52 PM, Felix Yang > > wrote: >> >> Hi Igor and Ioi, >> >> I partially agree with you.
As initially stated in the proposal >> and bug(JDK-8181299 >> ), I don't think >> this patch is a fix but a quick workaround to make them runnable. >> >> "explicit" is reasonable for me. But "explicit" should not be >> restricted as "explicit all, including dependencies", as it is not >> productive or even realistic in the long term. >> >> Thanks, >> Felix >> On 2017/6/2 7:58, Igor Ignatyev wrote: >>>> For example: doing this may be enough for now: >>>> >>>> * @build jdk.test.lib.process.* >>>> >>>> But what if in the future, jdk.test.lib.process is restructured to >>>> have a private package jdk.test.lib.process.hidden? To work around >>>> CODETOOLS-7901986, all the test cases that must be modified to the >>>> following, which unnecessarily exposes library implementation >>>> details to the library users: >>>> >>>> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* >>> >>> and in fact, there is already similar problem and >>> http://cr.openjdk.java.net/~xiaofeya/8181299/webrev.01/ >>> does not >>> address it. >>> jdk/test/lib/process/ProcessTools depends on jdk/test/lib/Utils so >>> all tests which have '@build jdk.test.lib.process.ProcessTools' will >>> have to have '@build jdk.test.lib.Utils'. then we >>> have OutputAnalyzer which depends on ProcessTools so all tests which >>> @build jdk.test.lib.process.OutputAnalyzer will @build ProcessTools >>> and Utils explicitly. many testlibrary classes which on >>> jdk.test.lib.process.OutputAnalyzer, so one will have to >>> specify OutputAnalyzer ProcessTools and Utils in the tests which >>> depends on other testlibrary classes. to make things even worse, >>> Utils depends on OutputAnalyzer and there are lots of tests and test >>> library classes which depend on Utils, so all of them will have to >>> have at least '@build jdk.test.lib.Utils >>> jdk.test.lib.process.OutputAnalyzer jdk.test.lib.process.ProcessTools'. >>> and they will work stable till someone refactors them and extract >>> some new classes. 
that is to say, it's nearly impossible to have all >>> explicit @build actions. >>> >>> Cheers, >>> -- Igor >>> >>>> On Jun 1, 2017, at 3:37 PM, Ioi Lam >>> > wrote: >>>> >>>> >>>> >>>> On 6/1/17 1:17 PM, Igor Ignatyev wrote: >>>>>> On Jun 1, 2017, at 1:20 AM, Chris Hegarty >>>>>> > wrote: >>>>>> >>>>>> Igor, >>>>>> >>>>>>> On 1 Jun 2017, at 04:32, Igor Ignatyev >>>>>> > wrote: >>>>>>> >>>>>>> Hi Felix, >>>>>>> >>>>>>> I have suggested the exact opposite change[1-3] to fix the same >>>>>>> problem. >>>>>> I?m sorry, but this is all just too confusing. After your change, >>>>>> who, or what, is >>>>>> responsible for building/compiling the test library dependencies? >>>>> jtreg is responsible, there is an implicit build for each @run, >>>>> and jtreg will analyze a test class to get transitive closure for >>>>> static dependencies, hence you have to have @build only for >>>>> classes which are not in constant pool, e.g. used only by >>>>> reflection or whose classnames are only used to spawn a new java >>>>> instance. >>>> >>>> >>>> I suspect the problem is caused by a long standing bug in jtreg >>>> that results in library classes being partially compiled. Please >>>> see my evaluation in >>>> >>>> https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 >>>> >>>> In the bug report, there is test case that can reliably reproduce >>>> the NoClassDefFoundError problem. >>>> >>>> I think adding all the @build commands in the tests are just >>>> band-aids. Things will break unless every test explicitly uses >>>> @build to build every class in every library that they use, >>>> including all the private classes that are not directly accessible >>>> by the test cases. >>>> >>>> For example: doing this may be enough for now: >>>> >>>> * @build jdk.test.lib.process.* >>>> >>>> But what if in the future, jdk.test.lib.process is restructured to >>>> have a private package jdk.test.lib.process.hidden? 
To work around >>>> CODETOOLS-7901986, all the test cases that must be modified to the >>>> following, which unnecessarily exposes library implementation >>>> details to the library users: >>>> >>>> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* >>>> >>>> Just imagine this -- "in order to use malloc() you must explicitly >>>> build not only malloc(), but also sbrk() ... and every other >>>> function in libc". That seems unreasonable to me. >>>> >>>> By the way, we made a fix in the HotSpot tests >>>> (seehttps://bugs.openjdk.java.net/browse/JDK-8157957) that got rid >>>> of many (but not all) of the NoClassDefFoundErrors by *removing* >>>> the @build lines ..... >>>> >>>> My proposal is, instead of just adding @build for band-aid, we >>>> should fix CODETOOLS-7901986 instead. >>>> >>>> Thanks >>>> - Ioi >>>> >>>> >>>>>> >>>>>> Test library code has no @modules tags, so does not explicitly >>>>>> declare its >>>>>> module dependencies. Instead module dependencies, required by test >>>>>> library code, are declared in the test using the library. If we >>>>>> wildcard, or >>>>>> otherwise leave broad build dependencies, from tests then there is no >>>>>> way to know what new module dependencies may be added in the future. >>>>>> That is, one of, the reason(s) I asked Felix to be explicit about >>>>>> the build >>>>>> dependencies. >>>>> having explicit builds does not really help w/ module dependency, >>>>> if someone change a testlibrary class so it starts to depend on >>>>> another testlibrary class, jtreg will implicitly build it and if >>>>> this class has some module dependencies, you will have to reflect >>>>> them in the test. >>>>> >>>>> generally speaking, I don't like having explicit build actions >>>>> because build actions themselves are implicit, so they don't >>>>> really help, it's still will be hard to spot missed explicit >>>>> builds. 
not having (unneeded) explicit builds is an easy rule to >>>>> follow and we can easily find all places which don't follow this >>>>> rule by grep. >>>>> >>>>> -- Igor >>>>>> -Chris. >>>>>> >>>>>>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>>>>>> [2] >>>>>>> http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>>>>>> [3] >>>>>>> http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html >>>>>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From volker.simonis at gmail.com Fri Jun 2 08:41:13 2017 From: volker.simonis at gmail.com (Volker Simonis) Date: Fri, 2 Jun 2017 10:41:13 +0200 Subject: JPMS Access Checks, Verification and the Security Manager In-Reply-To: <5930AD97.3090406@oracle.com> References: <13bb0dc2-9518-b72c-385f-1db95a8edab9@oracle.com> <59249529.3030103@oracle.com> <5930AD97.3090406@oracle.com> Message-ID: Thanks Alex, that makes sense. Regards, Volker On Fri, Jun 2, 2017 at 2:13 AM, Alex Buckley wrote: > On 5/24/2017 12:13 AM, Volker Simonis wrote: >> >> OK, so from what you say I understand that the verification errors I >> see with the Security Manager enabled are an implementation detail of >> HotSpot (because verification uses the same class loading mechanism >> like the runtime) which is not required but still acceptable under the >> JVMS. Is that correct? > > > The JVMS is precise about which exceptions are allowed to be thrown by a JVM > implementation during verification, and AccessControlException is not one of > them. However, the JVMS is only one part of the Java SE Platform > Specification. It is quite proper if another part specifies an > AccessControlException when a class in a restricted package is referenced by > a class without permission. > > I'm thinking in particular of the API specification for > SecurityManager::checkPackageAccess. It states, "This method is called by > the loadClass method of class loaders." 
Plainly, the intention is that a > class (Tricky) which initiates the loading of another class > (com.sun.crypto.provider.SunJCE) can do so only if it has permission to > reference the other class. Unfortunately, the statement as written is only > guaranteed to be true for the built-in class loaders of the Java SE Platform > and not for user-defined class loaders. Accordingly, we will update the API > specification to clarify how a JVM implementation may support the Security > Manager in checking permissions when classes are loaded and resolved. But to > answer your original question, an application CAN fail because the verifier > can't load classes due to Security Manager restrictions; you may have to > grant additional permissions if application classes wish to reference > certain JDK 9 packages. > > Alex From daniel.fuchs at oracle.com Fri Jun 2 09:19:58 2017 From: daniel.fuchs at oracle.com (Daniel Fuchs) Date: Fri, 2 Jun 2017 10:19:58 +0100 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> Message-ID: <5f94d6e8-0a0d-bcf5-c5ed-9a522af3e213@oracle.com> Hi guys, The jtreg bug really needs to be fixed. What I hear is that adding an explicit @build in one test can make an unrelated test that depends on the same library but doesn't have the explicit @build fail (and possibly randomly and intermittently depending on the order in which tests are run). This is very unintuitive, and the 'obvious' (though maybe wrong) fix for anyone stumbling on the issue would be to fix the failing test by adding the explicit @build - not grep the whole test base in search of a test that might have an explicit @build, which, as pointed out elsewhere, might well be legitimate if the test is using reflection.
So until the jtreg bug is fixed, I am not at all sure that removing all the explicit @build is the correct thing to do, as it's still bound to make existing unrelated tests fail randomly if new tests with an explicit @build are added later on... my2c -- daniel On 01/06/2017 23:37, Ioi Lam wrote: > > > On 6/1/17 1:17 PM, Igor Ignatyev wrote: >>> On Jun 1, 2017, at 1:20 AM, Chris Hegarty >>> wrote: >>> >>> Igor, >>> >>>> On 1 Jun 2017, at 04:32, Igor Ignatyev >>>> wrote: >>>> >>>> Hi Felix, >>>> >>>> I have suggested the exact opposite change[1-3] to fix the same >>>> problem. >>> I?m sorry, but this is all just too confusing. After your change, >>> who, or what, is >>> responsible for building/compiling the test library dependencies? >> jtreg is responsible, there is an implicit build for each @run, and >> jtreg will analyze a test class to get transitive closure for static >> dependencies, hence you have to have @build only for classes which are >> not in constant pool, e.g. used only by reflection or whose classnames >> are only used to spawn a new java instance. > > > I suspect the problem is caused by a long standing bug in jtreg that > results in library classes being partially compiled. Please see my > evaluation in > > https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 > > In the bug report, there is test case that can reliably reproduce the > NoClassDefFoundError problem. > > I think adding all the @build commands in the tests are just band-aids. > Things will break unless every test explicitly uses @build to build > every class in every library that they use, including all the private > classes that are not directly accessible by the test cases. > > For example: doing this may be enough for now: > > * @build jdk.test.lib.process.* > > But what if in the future, jdk.test.lib.process is restructured to have > a private package jdk.test.lib.process.hidden? 
To work around > CODETOOLS-7901986, all the test cases that must be modified to the > following, which unnecessarily exposes library implementation details to > the library users: > > * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* > > Just imagine this -- "in order to use malloc() you must explicitly build > not only malloc(), but also sbrk() ... and every other function in > libc". That seems unreasonable to me. > > By the way, we made a fix in the HotSpot tests (see > https://bugs.openjdk.java.net/browse/JDK-8157957) that got rid of many > (but not all) of the NoClassDefFoundErrors by *removing* the @build > lines ..... > > My proposal is, instead of just adding @build for band-aid, we should > fix CODETOOLS-7901986 instead. > > Thanks > - Ioi > > >>> >>> Test library code has no @modules tags, so does not explicitly >>> declare its >>> module dependencies. Instead module dependencies, required by test >>> library code, are declared in the test using the library. If we >>> wildcard, or >>> otherwise leave broad build dependencies, from tests then there is no >>> way to know what new module dependencies may be added in the future. >>> That is, one of, the reason(s) I asked Felix to be explicit about the >>> build >>> dependencies. >> having explicit builds does not really help w/ module dependency, if >> someone change a testlibrary class so it starts to depend on another >> testlibrary class, jtreg will implicitly build it and if this class >> has some module dependencies, you will have to reflect them in the test. >> >> generally speaking, I don't like having explicit build actions because >> build actions themselves are implicit, so they don't really help, it's >> still will be hard to spot missed explicit builds. not having >> (unneeded) explicit builds is an easy rule to follow and we can easily >> find all places which don't follow this rule by grep. >> >> -- Igor >>> -Chris. 
>>> >>>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>>> [2] >>>> http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>>> >>>> [3] http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html > From chris.hegarty at oracle.com Fri Jun 2 13:40:32 2017 From: chris.hegarty at oracle.com (Chris Hegarty) Date: Fri, 2 Jun 2017 14:40:32 +0100 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> Message-ID: <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> On 02/06/17 00:14, Ioi Lam wrote: > ... > > The gem is hidden in the compile.0.jta file. It contains something like: > > -sourcepath :/jdk/foobar/test/lib: > > So if my test refers to a class under /test/lib, such as > jdk.test.lib.process.ProcessTools, javac will be able to locate it under > /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will > build it automatically. > > So really, there's no reason why the test must explicitly do an @build > of the library classes that it uses. Sure, you're relying on the implicit compilation of dependencies by javac. Look at the output, where it compiles the library classes to. It is part of the classes directory for the individual test. That means that the library classes will need to be compiled many many times. The @build tag will compile the library classes to a common output directory, where they can be reused ( unless I'm missing something ). -Chris. 
From igor.ignatyev at oracle.com Fri Jun 2 19:10:16 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Fri, 2 Jun 2017 12:10:16 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> Message-ID: > On Jun 2, 2017, at 9:14 AM, Ioi Lam wrote: > > On 6/2/17 8:44 AM, Ioi Lam wrote: >> >> On 6/2/17 6:40 AM, Chris Hegarty wrote: >>> On 02/06/17 00:14, Ioi Lam wrote: >>>> ... >>>> >>>> The gem is hidden in the compile.0.jta file. It contains something like: >>>> >>>> -sourcepath :/jdk/foobar/test/lib: >>>> >>>> So if my test refers to a class under /test/lib, such as >>>> jdk.test.lib.process.ProcessTools, javac will be able to locate it under >>>> /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will >>>> build it automatically. >>>> >>>> So really, there's no reason why the test must explicitly do an @build >>>> of the library classes that it uses. >>> >>> Sure, you're relying on the implicit compilation of dependencies >>> by javac. Look at the output, where it compiles the library >>> classes to. It is part of the classes directory for the >>> individual test. That means that the library classes will need >>> to be compiled many many times. The @build tag will compile >>> the library classes to a common output directory, where they >>> can be reused ( unless I'm missing something ). >>> >>> -Chris. >> Yes, @build will compile classes so that they can be reused. But why should it be the responsibility of every test to do this? >> >> To reuse my malloc metaphore -- is it reasonable for every program that uses malloc to explicitly build libc? 
>> >> By the way, jtreg arranges the output directory of the test by the directory they sit in, so >> >> jdk/test/foo/bar/XTest.java >> jdk/test/foo/bar/YTest.java >> >> will all output their .class files to the same directory. Therefore, the amount of duplicated classes is not as bad as you might think. We've been omitting the @build tags in the hotspot tests and we haven't seen any problems. >> >> - Ioi > To avoid repeat compilation of the library classes, a more reasonable solution would be: > > [1] Before test execution -- scan all the selected tests to find all libraries specified by @library tags > > [2] Fully compile all the libraries into their own output directories > > [3] Then, start execution of the selected tests unfortunately, it is not that simple, there are at least 2 problems w/ that approach: 1. some of the library classes have extra module dependencies, e.g. jdk.test.lib.management.* depend on the jdk.management module, ExtendedRobot (from jdk/test/testlibrary) depends on java.desktop. so compiling the whole library will require extra module dependencies, which might be unneeded for the selected tests; as a result we won't be able to run these tests on configurations w/ a limited module set. 2. to make our tests packagefull, we had to add '@library /' to many hotspot/test/compiler tests, so we will have to compile all files from hotspot/test. my take on all of this is that the determination of the output directory for classes is buggy: it uses the directory of a @build or @run target to decide where to put all produced class files, but it should have a mapping between source and destination paths instead, so all classes from jdk/test/foo/bar/ will go to a test scratch directory and all classes from /test/lib/ (assuming they are declared as @library) and /jdk/test/lib/ to different common directories which will later be added to the classpath for the tests which use these libraries.
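The mapping described above could be sketched in plain java.nio terms like this (the directory names are illustrative, not jtreg's actual layout):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class OutputDirMapping {
    // Pick an output directory based on which source root a file lives
    // under: library sources go to a shared per-library output directory,
    // everything else goes to the test's scratch directory.
    static Path outputDirFor(Path source, Path libRoot,
                             Path libOut, Path scratchOut) {
        return source.startsWith(libRoot) ? libOut : scratchOut;
    }

    public static void main(String[] args) {
        Path libRoot = Paths.get("/jdk/test/lib");
        Path libOut = Paths.get("/work/lib-classes");
        Path scratch = Paths.get("/work/scratch");
        // A library class maps to the shared directory ...
        System.out.println(outputDirFor(
            Paths.get("/jdk/test/lib/jdk/test/lib/process/StreamPumper.java"),
            libRoot, libOut, scratch));
        // ... while a test class maps to its scratch directory.
        System.out.println(outputDirFor(
            Paths.get("/jdk/test/foo/bar/XTest.java"),
            libRoot, libOut, scratch));
    }
}
```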
Thanks, -- Igor From valerie.peng at oracle.com Fri Jun 2 22:47:06 2017 From: valerie.peng at oracle.com (Valerie Peng) Date: Fri, 2 Jun 2017 15:47:06 -0700 Subject: Code Review Request, JDK-8181439 Test the jdk.tls.namedGroups System Property In-Reply-To: <15e96b15-0cf9-e776-339f-51c70e0a00b8@oracle.com> References: <15e96b15-0cf9-e776-339f-51c70e0a00b8@oracle.com> Message-ID: Looks fine. Valerie On 6/1/2017 9:46 AM, Xuelei Fan wrote: > Hi Valerie, > > Please review the test update: > http://cr.openjdk.java.net/~xuelei/8181439/webrev.00/ > > More test cases are added for FFDHE groups so as to check unavailable > groups in some platforms or JDK releases. > > Thanks, > Xuelei From david.holmes at oracle.com Sat Jun 3 01:43:42 2017 From: david.holmes at oracle.com (David Holmes) Date: Sat, 3 Jun 2017 11:43:42 +1000 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> Message-ID: <891dd5ce-9158-549c-4e1d-02534b3f08c2@oracle.com> On 3/06/2017 5:10 AM, Igor Ignatyev wrote: > >> On Jun 2, 2017, at 9:14 AM, Ioi Lam wrote: >> >> On 6/2/17 8:44 AM, Ioi Lam wrote: >>> >>> On 6/2/17 6:40 AM, Chris Hegarty wrote: >>>> On 02/06/17 00:14, Ioi Lam wrote: >>>>> ... >>>>> >>>>> The gem is hidden in the compile.0.jta file. It contains something like: >>>>> >>>>> -sourcepath :/jdk/foobar/test/lib: >>>>> >>>>> So if my test refers to a class under /test/lib, such as >>>>> jdk.test.lib.process.ProcessTools, javac will be able to locate it under >>>>> /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will >>>>> build it automatically. >>>>> >>>>> So really, there's no reason why the test must explicitly do an @build >>>>> of the library classes that it uses. 
>>>> >>>> Sure, you're relying on the implicit compilation of dependencies >>>> by javac. Look at the output, where it compiles the library >>>> classes to. It is part of the classes directory for the >>>> individual test. That means that the library classes will need >>>> to be compiled many many times. The @build tag will compile >>>> the library classes to a common output directory, where they >>>> can be reused ( unless I'm missing something ). >>>> >>>> -Chris. >>> Yes, @build will compile classes so that they can be reused. But why should it be the responsibility of every test to do this? >>> >>> To reuse my malloc metaphore -- is it reasonable for every program that uses malloc to explicitly build libc? >>> >>> By the way, jtreg arranges the output directory of the test by the directory they sit in, so >>> >>> jdk/test/foo/bar/XTest.java >>> jdk/test/foo/bar/YTest.java >>> >>> will all output their .class files to the same directory. Therefore, the amount of duplicated classes is not as bad as you might think. We've been omitting the @build tags in the hotspot tests and we haven't seen any problems. >>> >>> - Ioi >> To avoid repeat compilation of the library classes, a more reasonable solution would be: >> >> [1] Before test execution -- scan all the selected test to find all libraries specified by @library tags >> >> [2] Fully compile all the libraries into their own output directories >> >> [3] Then, start execution of the selected tests > > unfortunately, it is not that simple, there are at least 2 problems w/ that approach: > 1. some of library classes have extra module dependency, e.g. jdk.test.lib.management.* depend on jdk.management module, ExtendedRobot (from jdk/test/testlibrary) depends on java.desktop. so compiling the whole library will require extra module dependency, which might be unneeded for the selected tests, as a result we won't be able to run these tests on configurations w/ limited module set. > 2. 
to make our tests packagefull, we had to add '@library /' to many hotspot/test/compiler tests, so we will have to compile all files from hotspot/test. > > my take on all of this is that the determination of the output directory for classes is buggy: it uses the directory of a @build or @run target to decide where to put all produced class files, but it should have a mapping between source and destination paths instead, so all classes from jdk/test/foo/bar/ will go to a test scratch directory and all classes from /test/lib/ (assuming they are declared as @library) and /jdk/test/lib/ to different common directories which will later be added to the classpath for the tests which use these libraries. But unless you explicitly compile the library classes you can't control where the class files are placed. The tests are compiled with a "-d" directive, so all classes, directly and implicitly compiled, will be placed relative to that directory based on their package. If every test were declared in a package based on the source arrangement then jtreg would be able to use a common output directory. Neither suggested approach seems a great solution to me. Implicit compilation wastes effort rebuilding the libraries. Explicit compilation is intrusive and difficult to get right - and I have no idea how to get the module dependency stuff sorted out. Maybe the design flaw here is attempting to combine a test library with the tests that use it. You either want it to be a binary library jtreg can be pointed at, or you need a way to tell jtreg to build the library first and then use it. But IIUC jtreg isn't set up to handle that - but it could be handled by running jtreg via make. My 2c.
David > Thanks, > -- Igor > > > From igor.ignatyev at oracle.com Sat Jun 3 01:45:09 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Fri, 2 Jun 2017 18:45:09 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> Message-ID: <5CAA67DE-1B70-431C-A15A-C898A8B608CE@oracle.com> I have measured how much time it takes to run :tier1 w/ and w/o the fix which removes @build for jdk.test.lib.** classes[1-2]: - w/o 8181391, i.e. w/ @build: real 33m4.624s, user 111m56.758s, sys 6m24.022s. [3] is a breakdown for jtreg actions - w/ 8181391, i.e. w/o @build: real 32m17.259s, user 109m18.236s, sys 6m9.669s. [4] is a breakdown for jtreg actions as you can see there is not much difference in execution time, and the run w/o @build action was even a bit faster. the total time spent on build was lower. hence I'd say removing @build actions does not impact overall execution time. Even if it did, I don't think I'd prefer us to choose small performance improvements over isolation and determinism. as Ioi and I stated before, removing @build actions did not help in all cases in hotspot. the root cause of this is having @run actions whose target is a class from a library; this is identical to having an explicit @build action for this class. if this class has dependencies on other classes from the testlibrary, you can get the testlibrary split into different locations and as a result NCDFE at runtime due to CODETOOLS-7901986. Fortunately, it is not the case for jdk tests; the only test library class which is used in @run is ClassFileInstaller, which does not have any dependencies.
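The problematic pattern reads roughly like this in a test header (a typical hotspot-style example; the test itself is hypothetical):

```java
/*
 * @test
 * @library /test/lib
 * @run main ClassFileInstaller sun.hotspot.WhiteBox
 */
```

Using a library class as the @run target acts like an implicit @build of that class, so any dependencies it has can end up compiled into a different location than the rest of the library (the CODETOOLS-7901986 split).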
Therefore I think removing explicit @build is a more reliable and clearer way to work around current problems and it does not have a big drawback if any. PS measurements were done on my mac 3.1 GHz Intel Core i7, 16 GB 1867 MHz DDR3, jtreg was run w/ "-conc:8 -agentvm" [1] http://scaab055.us.oracle.com:9502/vmsqe/home/iignatye/webrev//8181391/webrev.00/index.html [2] https://bugs.openjdk.java.net/browse/JDK-8181391 [3] compile: 826.206 build: 776.955 testng: 5362.58 junit: 640.705 shell: 861.206 main: 6823.19 clean: 0.004 driver: 6.578 [4] compile: 829.317 build: 774.904 testng: 5251 junit: 648.888 shell: 852.658 main: 6686.99 clean: 0.002 driver: 5.973 Thanks, -- Igor > On Jun 2, 2017, at 8:44 AM, Ioi Lam wrote: > > On 6/2/17 6:40 AM, Chris Hegarty wrote: >> On 02/06/17 00:14, Ioi Lam wrote: >>> ... >>> >>> The gem is hidden in the compile.0.jta file. It contains something like: >>> >>> -sourcepath :/jdk/foobar/test/lib: >>> >>> So if my test refers to a class under /test/lib, such as >>> jdk.test.lib.process.ProcessTools, javac will be able to locate it under >>> /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will >>> build it automatically. >>> >>> So really, there's no reason why the test must explicitly do an @build >>> of the library classes that it uses. >> >> Sure, you're relying on the implicit compilation of dependencies >> by javac. Look at the output, where it compiles the library >> classes to. It is part of the classes directory for the >> individual test. That means that the library classes will need >> to be compiled many many times. The @build tag will compile >> the library classes to a common output directory, where they >> can be reused ( unless I'm missing something ). >> >> -Chris. > Yes, @build will compile classes so that they can be reused. But why should it be the responsibility of every test to do this? > > To reuse my malloc metaphore -- is it reasonable for every program that uses malloc to explicitly build libc? 
> > By the way, jtreg arranges the output directory of the test by the directory they sit in, so > > jdk/test/foo/bar/XTest.java > jdk/test/foo/bar/YTest.java > > will all output their .class files to the same directory. Therefore, the amount of duplicated classes is not as bad as you might think. We've been omitting the @build tags in the hotspot tests and we haven't seen any problems. > > - Ioi -------------- next part -------------- An HTML attachment was scrubbed... URL: From sha.jiang at oracle.com Mon Jun 5 10:31:54 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Mon, 5 Jun 2017 18:31:54 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases Message-ID: Hi, Please review this manual test for checking if a jar, which is signed and timestamped by a JDK build, could be verified by other JDK builds. It also can be used to check if the default timestamp digest algorithm on signing is SHA-256. For more details, please look through the test summary. Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ Best regards, John Jiang From xuelei.fan at oracle.com Mon Jun 5 21:15:23 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Mon, 5 Jun 2017 14:15:23 -0700 Subject: Code Review Request, JDK-8178728 Check the AlgorithmParameters in algorithm constraints Message-ID: Hi, Please review the JDK 10 update: http://cr.openjdk.java.net/~xuelei/8178728/webrev.00/ This update extends the DisabledAlgorithmConstraints implementation by checking the AlgorithmParameters, which is ignored at present. 
Thanks, Xuelei From igor.ignatyev at oracle.com Mon Jun 5 22:20:49 2017 From: igor.ignatyev at oracle.com (Igor Ignatyev) Date: Mon, 5 Jun 2017 15:20:49 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <5CAA67DE-1B70-431C-A15A-C898A8B608CE@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> <5CAA67DE-1B70-431C-A15A-C898A8B608CE@oracle.com> Message-ID: <0A7DAAB5-E467-4675-A1B6-7915C052A682@oracle.com> just out of curiosity I have removed @build for all jdk.testlibrary classes as well and run :tier1, it took approximately the same amount of time[1], the breakdown[2] shows that we spent 70 seconds more on build actions, which is not that big compared to the total execution time. -- Igor [1] real 33m39.697s user 112m5.722s sys 6m26.973s [2] compile: 903.9 build: 846.53 testng: 5515.98 junit: 674.429 shell: 885.103 main: 6973.12 clean: 0.001 driver: 9.125 > On Jun 2, 2017, at 6:45 PM, Igor Ignatyev wrote: > > I have measured how much time it takes to run :tier1 w/ and w/o the fix which removes @build for jdk.test.lib.** classes[1-2]: > - w/o 8181391, i.e. w/ @build: real 33m4.624s, user 111m56.758s, sys 6m24.022s. [3] is a breakdown for jtreg actions > - w/ 8181391, i.e. w/o @build: real 32m17.259s, user 109m18.236s, sys 6m9.669s. [4] is a breakdown for jtreg actions > as you can see there is not much difference in execution time, and the run w/o @build action was even a bit faster. the total time spent on build was lower. hence I'd say removing @build actions does not impact overall execution time. Even if it did, I don't think I'd prefer us to choose small performance improvements over isolation and determinism. > > as Ioi and I stated before, removing @build actions did not help in all cases in hotspot.
the root cause of this is having @run actions whose target is a class from library, this is identical to have explicit @build action for this class. if this class has dependency on other classes from testlibrary, you can get a testlibrary split into different locations and as a results NCDFE in runtime due to CODETOOLS-7901986. Fortunately, it is not the case for jdk tests, the only test library class which is used in @run is ClassFileInstaller, which does not have any dependencies. Therefore I think removing explicit @build is a more reliable and clearer way to work around current problems and it does not have a big drawback if any. > > PS measurements were done on my mac 3.1 GHz Intel Core i7, 16 GB 1867 MHz DDR3, jtreg was run w/ "-conc:8 -agentvm" > > [1] http://scaab055.us.oracle.com:9502/vmsqe/home/iignatye/webrev//8181391/webrev.00/index.html > [2] https://bugs.openjdk.java.net/browse/JDK-8181391 > [3] > compile: 826.206 > build: 776.955 > testng: 5362.58 > junit: 640.705 > shell: 861.206 > main: 6823.19 > clean: 0.004 > driver: 6.578 > [4] > compile: 829.317 > build: 774.904 > testng: 5251 > junit: 648.888 > shell: 852.658 > main: 6686.99 > clean: 0.002 > driver: 5.973 > > Thanks, > -- Igor > >> On Jun 2, 2017, at 8:44 AM, Ioi Lam > wrote: >> >> On 6/2/17 6:40 AM, Chris Hegarty wrote: >>> On 02/06/17 00:14, Ioi Lam wrote: >>>> ... >>>> >>>> The gem is hidden in the compile.0.jta file. It contains something like: >>>> >>>> -sourcepath :/jdk/foobar/test/lib: >>>> >>>> So if my test refers to a class under /test/lib, such as >>>> jdk.test.lib.process.ProcessTools, javac will be able to locate it under >>>> /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will >>>> build it automatically. >>>> >>>> So really, there's no reason why the test must explicitly do an @build >>>> of the library classes that it uses. >>> >>> Sure, you're relying on the implicit compilation of dependencies >>> by javac. 
Look at the output, where it compiles the library >>> classes to. It is part of the classes directory for the >>> individual test. That means that the library classes will need >>> to be compiled many many times. The @build tag will compile >>> the library classes to a common output directory, where they >>> can be reused ( unless I'm missing something ). >>> >>> -Chris. >> Yes, @build will compile classes so that they can be reused. But why should it be the responsibility of every test to do this? >> >> To reuse my malloc metaphore -- is it reasonable for every program that uses malloc to explicitly build libc? >> >> By the way, jtreg arranges the output directory of the test by the directory they sit in, so >> >> jdk/test/foo/bar/XTest.java >> jdk/test/foo/bar/YTest.java >> >> will all output their .class files to the same directory. Therefore, the amount of duplicated classes is not as bad as you might think. We've been omitting the @build tags in the hotspot tests and we haven't seen any problems. >> >> - Ioi > -------------- next part -------------- An HTML attachment was scrubbed... URL: From weijun.wang at oracle.com Tue Jun 6 05:53:23 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Tue, 6 Jun 2017 13:53:23 +0800 Subject: RFR 8181461: sun/security/krb5/auto/KdcPolicy.java fails with java.lang.Exception: Does not match Message-ID: <0c0009c5-d62b-bbd9-53c8-f41be2c6426d@oracle.com> Please take a review on this change: http://cr.openjdk.java.net/~weijun/8181461/webrev.00/ This is a test bug and the fix is simply: // 1. Default policy is tryLast .... writeConf(1, 3000, p1, p3); - test("a3000c3000c3000|a3000c3000-|a3000c3000c3000-"); + test("a3000c3000c3000|a3000c3000-|a3000c3000c3000a3000-"); Here, max_retries is 1 and timeout is 3000ms. A is a KDC that never replies, and C is one that usually replies in time. Here the test client might send out 2 AS_REQs, the initial one and the one with preauth. We should observe these possible results: (1). 
C always replies in time:

1. Initial AS_REQ sent to A, timeout (a3000)
2. Initial AS_REQ sent to C, succeed (c3000)
3. AS_REQ with preauth sent to C (try last good), succeed (c3000)

(2). C fails the 1st time:

1. Initial AS_REQ sent to A, timeout (a3000)
2. Initial AS_REQ sent to C, timeout (c3000)
3. Final result is failure (-)

(3). C succeeds for the 1st time but fails later:

1. Initial AS_REQ sent to A, timeout (a3000)
2. Initial AS_REQ sent to C, succeed (c3000)
3. AS_REQ with preauth sent to C (try last good), timeout (c3000)
4. AS_REQ with preauth sent to A, timeout (a3000)
5. Final result is failure (-)

The original test code has a bug with case (3), where it assumes #4 above is not sent, which is wrong. AS_REQ with preauth is a new request different from the initial AS_REQ. The order of preference is changed according to the policy (set to tryLast) but all KDCs will still be tried. Thanks Max From sibabrata.sahoo at oracle.com Tue Jun 6 06:57:43 2017 From: sibabrata.sahoo at oracle.com (Sibabrata Sahoo) Date: Mon, 5 Jun 2017 23:57:43 -0700 (PDT) Subject: RFR 8181461: sun/security/krb5/auto/KdcPolicy.java fails with java.lang.Exception: Does not match In-Reply-To: <0c0009c5-d62b-bbd9-53c8-f41be2c6426d@oracle.com> References: <0c0009c5-d62b-bbd9-53c8-f41be2c6426d@oracle.com> Message-ID: Change looks fine to me. But I am not the reviewer yet. Thanks, Siba -----Original Message----- From: Weijun Wang Sent: Tuesday, June 06, 2017 11:23 AM To: Security Dev OpenJDK Cc: Gustavo Galimberti; Sibabrata Sahoo Subject: RFR 8181461: sun/security/krb5/auto/KdcPolicy.java fails with java.lang.Exception: Does not match Please take a review on this change: http://cr.openjdk.java.net/~weijun/8181461/webrev.00/ This is a test bug and the fix is simply: // 1. Default policy is tryLast .... writeConf(1, 3000, p1, p3); - test("a3000c3000c3000|a3000c3000-|a3000c3000c3000-"); + test("a3000c3000c3000|a3000c3000-|a3000c3000c3000a3000-"); Here, max_retries is 1 and timeout is 3000ms.
A is a KDC that never replies, and C is one that usually replies in time. Here the test client might send out 2 AS_REQs, the initial one and the one with preauth. We should observe these possible results: (1). C always replies in time: 1. Initial AS_REQ sent to A, timeout (a3000) 2. Initial AS_REQ sent to C, succeed (c3000) 3. AS_REQ with preauth sent to C (try last good), succeed (c3000) (2). C fails the 1st time: 1. Initial AS_REQ sent to A, timeout (a3000) 2. Initial AS_REQ sent to C, timeout (c3000) 3. Final result is failure (-) (3). C succeeds for the 1st time but fails later: 1. Initial AS_REQ sent to A, timeout (a3000) 2. Initial AS_REQ sent to C, succeed (c3000) 3. AS_REQ with preauth sent to C (try last good), timeout (c3000) 4. AS_REQ with preauth sent to A, timeout (a3000) 5. Final result is failure (-) The original test code has a bug with case (3), where it assumes #4 above is not sent, this is wrong. AS_REQ with preauth is a new request different from the initial AS_REQ. The order of preference is changed according to the policy (set to tryLast) but all KDCs will still be tried. Thanks Max From xuelei.fan at oracle.com Tue Jun 6 07:25:06 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Tue, 6 Jun 2017 00:25:06 -0700 Subject: RFR 8181461: sun/security/krb5/auto/KdcPolicy.java fails with java.lang.Exception: Does not match In-Reply-To: <0c0009c5-d62b-bbd9-53c8-f41be2c6426d@oracle.com> References: <0c0009c5-d62b-bbd9-53c8-f41be2c6426d@oracle.com> Message-ID: <7bd98c9f-da58-ac13-b52a-d66519d3f189@oracle.com> Please update the copyright year. Otherwise, looks fine to me. Xuelei On 6/5/2017 10:53 PM, Weijun Wang wrote: > Please take a review on this change: > > http://cr.openjdk.java.net/~weijun/8181461/webrev.00/ > > This is a test bug and the fix is simply: > > // 1. Default policy is tryLast > .... 
> writeConf(1, 3000, p1, p3); > - test("a3000c3000c3000|a3000c3000-|a3000c3000c3000-"); > + test("a3000c3000c3000|a3000c3000-|a3000c3000c3000a3000-"); > > Here, max_retries is 1 and timeout is 3000ms. A is a KDC that never > replies, and C is one that usually replies in time. > > Here the test client might send out 2 AS_REQs, the initial one and the > one with preauth. We should observe these possible results: > > (1). C always replies in time: > > 1. Initial AS_REQ sent to A, timeout (a3000) > 2. Initial AS_REQ sent to C, succeed (c3000) > 3. AS_REQ with preauth sent to C (try last good), succeed (c3000) > > (2). C fails the 1st time: > > 1. Initial AS_REQ sent to A, timeout (a3000) > 2. Initial AS_REQ sent to C, timeout (c3000) > 3. Final result is failure (-) > > (3). C succeeds for the 1st time but fails later: > > 1. Initial AS_REQ sent to A, timeout (a3000) > 2. Initial AS_REQ sent to C, succeed (c3000) > 3. AS_REQ with preauth sent to C (try last good), timeout (c3000) > 4. AS_REQ with preauth sent to A, timeout (a3000) > 5. Final result is failure (-) > > The original test code has a bug with case (3), where it assumes #4 > above is not sent, this is wrong. AS_REQ with preauth is a new request > different from the initial AS_REQ. The order of preference is changed > according to the policy (set to tryLast) but all KDCs will still be tried. > > Thanks > Max From sean.coffey at oracle.com Tue Jun 6 08:40:35 2017 From: sean.coffey at oracle.com (=?UTF-8?Q?Se=c3=a1n_Coffey?=) Date: Tue, 6 Jun 2017 09:40:35 +0100 Subject: RFR : 8181205:JRE fails to load/register security providers when started from UNC pathname In-Reply-To: <05e5ab76-ac7a-d38f-197a-f7d5ea8523a6@oracle.com> References: <05e5ab76-ac7a-d38f-197a-f7d5ea8523a6@oracle.com> Message-ID: <1f0f16d4-be5f-48a2-e5e7-bf19437eb64a@oracle.com> ping. Can I get a review for this please ? regards, Sean. 
On 01/06/2017 17:23, Seán Coffey wrote: > The recent JDK-8163528 fix caused a regression for JDK binaries > launched with a UNC pathname. We can use the Paths class to create the > required File. I managed to put a test together which should test > this code path. > > webrev : http://cr.openjdk.java.net/~coffeys/webrev.8181205/webrev/ > JBS record : https://bugs.openjdk.java.net/browse/JDK-8181205 > > regards, > Sean. > From xuelei.fan at oracle.com Tue Jun 6 15:28:46 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Tue, 6 Jun 2017 08:28:46 -0700 Subject: RFR : 8181205:JRE fails to load/register security providers when started from UNC pathname In-Reply-To: <1f0f16d4-be5f-48a2-e5e7-bf19437eb64a@oracle.com> References: <05e5ab76-ac7a-d38f-197a-f7d5ea8523a6@oracle.com> <1f0f16d4-be5f-48a2-e5e7-bf19437eb64a@oracle.com> Message-ID: <0b9dc30d-30d4-3ef7-30f6-16c5ecefe9b4@oracle.com> Looks fine to me. There is no clear reason why Paths.get(uri).toFile().getParentFile() works but not File(uri).getParentFile(). Would you mind adding a comment so that the code will not be modified back to use File(uri) later? Thanks, Xuelei On 6/6/2017 1:40 AM, Seán Coffey wrote: > ping. Can I get a review for this please? > > regards, > Sean. > > > On 01/06/2017 17:23, Seán Coffey wrote: >> The recent JDK-8163528 fix caused a regression for JDK binaries >> launched with a UNC pathname. We can use the Paths class to create the >> required File. I managed to put a test together which should test >> this code path. >> >> webrev : http://cr.openjdk.java.net/~coffeys/webrev.8181205/webrev/ >> JBS record : https://bugs.openjdk.java.net/browse/JDK-8181205 >> >> regards, >> Sean.
>> > From sean.coffey at oracle.com Tue Jun 6 17:21:13 2017 From: sean.coffey at oracle.com (=?UTF-8?Q?Se=c3=a1n_Coffey?=) Date: Tue, 6 Jun 2017 18:21:13 +0100 Subject: RFR : 8181205:JRE fails to load/register security providers when started from UNC pathname In-Reply-To: <0b9dc30d-30d4-3ef7-30f6-16c5ecefe9b4@oracle.com> References: <05e5ab76-ac7a-d38f-197a-f7d5ea8523a6@oracle.com> <1f0f16d4-be5f-48a2-e5e7-bf19437eb64a@oracle.com> <0b9dc30d-30d4-3ef7-30f6-16c5ecefe9b4@oracle.com> Message-ID: <333e1aa7-c370-6829-b724-09601e296ad5@oracle.com> On 06/06/2017 16:28, Xuelei Fan wrote: > Looks fine to me. > > There is no clear reason why Paths.get(uri).toFile().getParentFile() > works but not File(uri).getParentFile(). Would you mind adding a comment > so that the code will not be modified back to use File(uri) later? Thanks for the review. It seems to be a long-standing issue with URL/URI interoperability. If the URL contained extra slashes, the conversion works seamlessly. i.e. works : new URL("file:////MyComputer/c/Java/jre1.8.0_131/lib/ext/access-bridge-32.jar"); buggy : new URL("file://MyComputer/c/Java/jre1.8.0_131/lib/ext/access-bridge-32.jar"); The Paths.get(URI) method handles the UNC path in the correct fashion. I'll add this comment to the code : // Use the Paths.get(uri) call in order to handle UNC based file name conversion correctly. regards, Sean. > Thanks, > Xuelei > > On 6/6/2017 1:40 AM, Seán Coffey wrote: >> ping. Can I get a review for this please? >> >> regards, >> Sean. >> >> >> On 01/06/2017 17:23, Seán Coffey wrote: >>> The recent JDK-8163528 fix caused a regression for JDK binaries >>> launched with a UNC pathname. We can use the Paths class to create >>> the required File. I managed to put a test together which should >>> test this code path. >>> >>> webrev : http://cr.openjdk.java.net/~coffeys/webrev.8181205/webrev/ >>> JBS record : https://bugs.openjdk.java.net/browse/JDK-8181205 >>> >>> regards, >>> Sean.
>>> >> From sean.mullan at oracle.com Tue Jun 6 20:27:18 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Tue, 6 Jun 2017 16:27:18 -0400 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: References: Message-ID: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> Hi John, This looks like a very useful test. I have not gone through all of the code, but here are a few comments for now until I have more time: - add tests for EC keys - add tests for SHA-512 variants of the signature algorithms - add tests for larger key sizes (ex: 2048 for DSA/RSA) - you can use the diamond operator <> in various places - might be more compact if jdkList() used Files.lines() to parse the file into a stream then an array - did you consider using the jarsigner API (jdk.security.jarsigner) instead of the command-line? I think this would be better (if possible) and it would give us some more tests of that API. --Sean On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: > Hi, > Please review this manual test for checking if a jar, which is signed > and timestamped by a JDK build, could be verified by other JDK builds. > It also can be used to check if the default timestamp digest algorithm > on signing is SHA-256. > For more details, please look through the test summary. 
> > Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 > Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ > > Best regards, > John Jiang > From anthony.scarpino at oracle.com Tue Jun 6 20:45:25 2017 From: anthony.scarpino at oracle.com (Anthony Scarpino) Date: Tue, 6 Jun 2017 13:45:25 -0700 Subject: Code Review Request, JDK-8178728 Check the AlgorithmParameters in algorithm constraints In-Reply-To: References: Message-ID: On 06/05/2017 02:15 PM, Xuelei Fan wrote: > Hi, > > Please review the JDK 10 update: > http://cr.openjdk.java.net/~xuelei/8178728/webrev.00/ > > This update extends the DisabledAlgorithmConstraints implementation by > checking the AlgorithmParameters, which is ignored at present. > > Thanks, > Xuelei I'm fine with the change, but I have an organizational request. DisabledAlgorithmConstraints.java:253-264: Can you move DH/DiffieHellman string value checking into a method in AlgorithmDecomposer? All the algorithm name details are handled in there. Just to be consistent in keeping them in one place. Tony From valerie.peng at oracle.com Tue Jun 6 22:53:19 2017 From: valerie.peng at oracle.com (Valerie Peng) Date: Tue, 6 Jun 2017 15:53:19 -0700 Subject: Code Review Request, JDK-8178728 Check the AlgorithmParameters in algorithm constraints In-Reply-To: References: Message-ID: Looks fine to me. Thanks, Valerie On 6/5/2017 2:15 PM, Xuelei Fan wrote: > Hi, > > Please review the JDK 10 update: > http://cr.openjdk.java.net/~xuelei/8178728/webrev.00/ > > This update extends the DisabledAlgorithmConstraints implementation by > checking the AlgorithmParameters, which is ignored at present.
> > Thanks, > Xuelei From xuelei.fan at oracle.com Tue Jun 6 23:04:14 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Tue, 6 Jun 2017 16:04:14 -0700 Subject: Code Review Request, JDK-8178728 Check the AlgorithmParameters in algorithm constraints In-Reply-To: References: Message-ID: New webrev: http://cr.openjdk.java.net/~xuelei/8178728/webrev.01/ On 6/6/2017 1:45 PM, Anthony Scarpino wrote: > On 06/05/2017 02:15 PM, Xuelei Fan wrote: >> Hi, >> >> Please review the JDK 10 update: >> http://cr.openjdk.java.net/~xuelei/8178728/webrev.00/ >> >> This update extends the DisabledAlgorithmConstraints implementation by >> checking the AlgorithmParameters, which is ignored at present. >> >> Thanks, >> Xuelei > > I'm fine with the change, but I have an organizational request > > DisabledAlgorithmConstraints.java:253-264: > Can you move DH/DiffieHellman string value checking into a method in > AlgorithmDecomposer? All the algorithm name details are handled in > there. Just to be consistent in keeping them in one place. > Good points. Updated accordingly. I'm not very sure of the impact of decomposing the general algorithm names yet, so I just added one more method (getAliases()) and did not touch the decompose() method.
Thanks, Xuelei

From anthony.scarpino at oracle.com Wed Jun 7 01:03:09 2017
From: anthony.scarpino at oracle.com (Anthony Scarpino)
Date: Tue, 6 Jun 2017 18:03:09 -0700
Subject: Code Review Request, JDK-8178728 Check the AlgorithmParameters in algorithm constraints
In-Reply-To: References: Message-ID: <99e12ffb-20a2-79cd-7a69-5e953202c642@oracle.com>

On 06/06/2017 04:04 PM, Xuelei Fan wrote: > New webrev: > http://cr.openjdk.java.net/~xuelei/8178728/webrev.01/ > > On 6/6/2017 1:45 PM, Anthony Scarpino wrote: >> On 06/05/2017 02:15 PM, Xuelei Fan wrote: >>> Hi, >>> >>> Please review the JDK 10 update: >>> http://cr.openjdk.java.net/~xuelei/8178728/webrev.00/ >>> >>> This update extends the DisabledAlgorithmConstraints implementation by >>> checking the AlgorithmParameters, which is ignored at present. >>> >>> Thanks, >>> Xuelei >> >> I'm find with the change, but I have an organizational requests >> >> DisabledAlgorithmConstraints.java:253-264: >> Can you move DH/DiffieHellman string value checking into a method in >> AlgorithmDecomposer? All the algorithm name details are handling in >> there. Just to be consistent in keeping them in one place. >> > Good points. Updated accordingly. > > I'm not very sure of the impact to decompose the general algorithm > names yet. So I just add a more method (getAliases()), and not touch > on the decomposes() method.

While I was reviewing this earlier today, I was thinking about changes to the aliases, including the hashes, that could make this faster. The changes look fine. 
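The kind of speed-up alluded to above could look roughly like the sketch below: normalize the disabled names once into a hash set so each check is a single lookup. The class and method names are invented for illustration and are not the actual DisabledAlgorithmConstraints code:

```java
import java.util.Locale;
import java.util.Set;
import java.util.stream.Collectors;

// Sketch only: pre-compute a case-normalized hash set of disabled
// algorithm names so that each permits() call is an O(1) lookup
// rather than repeated case-insensitive string comparisons.
public class DisabledNames {
    private final Set<String> disabled;

    public DisabledNames(Set<String> names) {
        // Normalize case once, at construction time.
        this.disabled = names.stream()
                .map(n -> n.toUpperCase(Locale.ENGLISH))
                .collect(Collectors.toSet());
    }

    public boolean permits(String algorithm) {
        return !disabled.contains(algorithm.toUpperCase(Locale.ENGLISH));
    }
}
```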
Tony > > Thanks, > Xuelei

From sha.jiang at oracle.com Wed Jun 7 01:14:50 2017
From: sha.jiang at oracle.com (sha.jiang at oracle.com)
Date: Wed, 7 Jun 2017 09:14:50 +0800
Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases
In-Reply-To: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> Message-ID:

Hi Sean,

On 07/06/2017 04:27, Sean Mullan wrote: > Hi John, > > This looks like a very useful test. I have not gone through all of the > code, but here are a few comments for now until I have more time: > > - add tests for EC keys > - add tests for SHA-512 variants of the signature algorithms > - add tests for larger key sizes (ex: 2048 for DSA/RSA) > - you can use the diamond operator <> in various places > - might be more compact if jdkList() used Files.lines() to parse the > file into a stream then an array

I did consider the above two points. Because the test will be backported to JDK 6, I only used features supported by JDK 6. I supposed that would make the backport easier. Does it make sense?

Best regards,
John Jiang

> - did you consider using the jarsigner API (jdk.security.jarsigner) > instead of the command-line? I think this would be better (if > possible) and it would give us some more tests of that API. > > --Sean > > On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >> Hi, >> Please review this manual test for checking if a jar, which is signed >> and timestamped by a JDK build, could be verified by other JDK builds. >> It also can be used to check if the default timestamp digest >> algorithm on signing is SHA-256. >> For more details, please look through the test summary. 
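For reference, the Files.lines() variant suggested above might look like the sketch below. The file format (one JDK path per line, with blanks and '#' comments skipped) is an assumption for the example:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

// Sketch of a more compact jdkList(): stream the list file's lines,
// drop blanks and comments, and collect into an array. This needs
// JDK 8+, so it would not survive a backport to JDK 6 as-is.
public class JdkListSketch {
    public static String[] jdkList(Path jdkListFile) throws IOException {
        try (Stream<String> lines = Files.lines(jdkListFile)) {
            return lines.map(String::trim)
                        .filter(s -> !s.isEmpty() && !s.startsWith("#"))
                        .toArray(String[]::new);
        }
    }
}
```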
>> >> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >> >> Best regards, >> John Jiang >> > From weijun.wang at oracle.com Wed Jun 7 01:55:24 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Wed, 7 Jun 2017 09:55:24 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> Message-ID: On 06/07/2017 09:14 AM, sha.jiang at oracle.com wrote: > Hi Sean, > > On 07/06/2017 04:27, Sean Mullan wrote: >> Hi John, >> >> This looks like a very useful test. I have not gone through all of the >> code, but here are a few comments for now until I have more time: >> >> - add tests for EC keys >> - add tests for SHA-512 variants of the signature algorithms >> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >> - you can use the diamond operator <> in various places >> - might be more compact if jdkList() used Files.lines() to parse the >> file into a stream then an array > I did consider about the above two points. Because the test will be > backported to JDK 6, so I only used the features those supported by JDK 6. > I supposed that would make the backport easier. Does it make sense? I think this depends on how one plans to run this test. If the "neither of jdkListFile and jdkList is specified" case is very useful, then it makes sense to backport it to a quite old release. Otherwise, you can just use a new JDK to launch the test itself. > > Best regards, > John Jiang >> - did you consider using the jarsigner API (jdk.security.jarsigner) >> instead of the command-line? I think this would be better (if >> possible) and it would give us some more tests of that API. jarsigner can show warnings but JarSigner cannot, and maybe a user wants to compare "Status of Signing" and "Status of Verifying". Also, 1. 
Would you like to make JAVA_SECURITY configurable on the jtreg command line? Maybe someone wants to try out different java.security files.

2. Sometimes a system might lack enough entropy to generate random numbers. It will be safe to add -Djava.security.egd=file:/dev/./urandom to both the keytool and jarsigner commands.

3. I have been rethinking unsupportedSigAlgs. Is it possible to detect them with a separate program that just calls Signature.getInstance()? Otherwise, if jarsigner fails for another reason (say, the TSA cannot be reached), we won't be able to notice it.

4. About javaVersion(), if you test with an OpenJDK build, there will be no "Java SE" there. Maybe you can read a system property instead?

Thanks
Max

>> >> --Sean >> >> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>> Hi, >>> Please review this manual test for checking if a jar, which is signed >>> and timestamped by a JDK build, could be verified by other JDK builds. >>> It also can be used to check if the default timestamp digest >>> algorithm on signing is SHA-256. >>> For more details, please look through the test summary. 
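Point 3 above, detecting unsupported signature algorithms directly, could be done with a tiny standalone probe along these lines (the class name is invented for the sketch):

```java
import java.security.NoSuchAlgorithmException;
import java.security.Signature;

// Standalone probe: ask the JCA directly whether a signature
// algorithm is available, instead of inferring it from a jarsigner
// failure that may have an unrelated cause (e.g. an unreachable TSA).
public class SigAlgProbe {
    public static boolean isSupported(String sigAlg) {
        try {
            Signature.getInstance(sigAlg);
            return true;
        } catch (NoSuchAlgorithmException e) {
            return false;
        }
    }
}
```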
>>> >>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>> >>> Best regards, >>> John Jiang >>> >> >

From xuelei.fan at oracle.com Wed Jun 7 03:13:44 2017
From: xuelei.fan at oracle.com (Xuelei Fan)
Date: Tue, 6 Jun 2017 20:13:44 -0700
Subject: Code Review Request, JDK-8178728 Check the AlgorithmParameters in algorithm constraints
In-Reply-To: <99e12ffb-20a2-79cd-7a69-5e953202c642@oracle.com> References: <99e12ffb-20a2-79cd-7a69-5e953202c642@oracle.com> Message-ID:

On 6/6/2017 6:03 PM, Anthony Scarpino wrote: > On 06/06/2017 04:04 PM, Xuelei Fan wrote: >> New webrev: >> http://cr.openjdk.java.net/~xuelei/8178728/webrev.01/ >> >> On 6/6/2017 1:45 PM, Anthony Scarpino wrote: >>> On 06/05/2017 02:15 PM, Xuelei Fan wrote: >>>> Hi, >>>> >>>> Please review the JDK 10 update: >>>> http://cr.openjdk.java.net/~xuelei/8178728/webrev.00/ >>>> >>>> This update extends the DisabledAlgorithmConstraints implementation by >>>> checking the AlgorithmParameters, which is ignored at present. >>>> >>>> Thanks, >>>> Xuelei >>> >>> I'm find with the change, but I have an organizational requests >>> >>> DisabledAlgorithmConstraints.java:253-264: >>> Can you move DH/DiffieHellman string value checking into a method in >>> AlgorithmDecomposer? All the algorithm name details are handling in >>> there. Just to be consistent in keeping them in one place. >>> >> Good points. Updated accordingly. >> >> I'm not very sure of the impact to decompose the general algorithm >> names yet. So I just add a more method (getAliases()), and not touch >> on the decomposes() method. > > While I was review this earlier today, I was thinking about changes to > aliases, including the hashes, that could make this faster. >

I thought of that option when I made the update. It's a better position, but I'm not confident in my update. So let's do it in JDK 10 later. 
Since we may backport this update, I would like to keep the impact as minimal as possible.

> The changes look fine. >

Thanks for the review!
Xuelei

From mbalao at redhat.com Wed Jun 7 13:37:53 2017
From: mbalao at redhat.com (Martin Balao)
Date: Wed, 7 Jun 2017 10:37:53 -0300
Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension
Message-ID:

Hi,

I'd like to propose a patch for bug ID JDK-8046295 - Support Trusted CA Indication extension [1] and ask for reviews. I have the OCA signed since I'm a Red Hat employee and this is part of my work [2].

Webrev (jdk10 jdk10+10 based):

* http://cr.openjdk.java.net/~sgehwolf/webrevs/mbalaoal/JDK-8046295/webrev.00/ (browse online)
* http://cr.openjdk.java.net/~sgehwolf/webrevs/mbalaoal/JDK-8046295/webrev.00/8046295.webrev.00.zip (zip download)

The Trusted CA Indication TLS extension (for TLS v1.2+) has been implemented on both client and server sides, according to RFC 6066 [3]. The implementation assumes the use of X.509 certificates.

Client side

* The extension can be enabled by invoking the 'setUseTrustedCAIndication' method on the 'X509TrustManager' instance used for establishing a TLS channel.
* When enabled, a SHA-1 hash for each certificate managed by the TrustManager instance is sent to the server as a "Trusted CA Indication" data element. This happens during the Client Hello stage of the TLS Handshake.
* Note: the SHA-1 key hash, Distinguished Name and pre-agreed methods specified by RFC 6066 to identify a certificate were not implemented on the client side.

Server side

* The extension is always enabled on the server side.
* When a client sends Trusted CA Indication data elements during the Client Hello stage (TLS Handshake), the server tries to choose a certificate from its 'X509KeyManager' instance based on that information. If a certificate is not found, the TLS channel cannot be established.
* A certificate chain on a 'X509KeyManager' instance can be set as 'pre-agreed' trusted (see RFC 6066) by invoking the 'setPreAgreedCertificate' method
* This is the procedure through which the server chooses a certificate:
  * Cipher suites are iterated as usual (in preferred order)
  * If the client has sent Trusted CA Indication data elements:
    * All the certificate chains for the chosen cipher suite algorithm are retrieved from the 'X509KeyManager' instance and iterated
    * For each certificate on a chain (starting from the root):
      * For each Trusted CA Indication data element:
        * If there is a match between the Trusted CA Indication data element and the certificate in the server's chain, the certificate chain is chosen.
        * If the Trusted CA Indication data element is of "pre-agreed" type and the certificate chain was set as "pre-agreed", the certificate chain is chosen.
* As a consequence of the previous procedure, a client may trust an intermediate certificate and the server will be able to choose a certificate chain that contains that intermediate certificate.
* The SHA-1 certificate hash, SHA-1 key hash, Distinguished Name and pre-agreed methods specified by RFC 6066 are supported.

Test cases implemented for both client and server sides.

Thanks in advance,
Martin Balao.-

--
[1] - https://bugs.openjdk.java.net/browse/JDK-8046295
[2] - http://www.oracle.com/technetwork/community/oca-486395.html#r
[3] - https://tools.ietf.org/html/rfc6066#page-12
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From sean.mullan at oracle.com Wed Jun 7 15:11:28 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Wed, 7 Jun 2017 11:11:28 -0400 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> Message-ID: <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: > Hi Sean, > > On 07/06/2017 04:27, Sean Mullan wrote: >> Hi John, >> >> This looks like a very useful test. I have not gone through all of the >> code, but here are a few comments for now until I have more time: >> >> - add tests for EC keys >> - add tests for SHA-512 variants of the signature algorithms >> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >> - you can use the diamond operator <> in various places >> - might be more compact if jdkList() used Files.lines() to parse the >> file into a stream then an array > I did consider about the above two points. Because the test will be > backported to JDK 6, so I only used the features those supported by JDK 6. > I supposed that would make the backport easier. Does it make sense? Yes, that makes sense. --Sean > > Best regards, > John Jiang >> - did you consider using the jarsigner API (jdk.security.jarsigner) >> instead of the command-line? I think this would be better (if >> possible) and it would give us some more tests of that API. >> >> --Sean >> >> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>> Hi, >>> Please review this manual test for checking if a jar, which is signed >>> and timestamped by a JDK build, could be verified by other JDK builds. >>> It also can be used to check if the default timestamp digest >>> algorithm on signing is SHA-256. >>> For more details, please look through the test summary. 
>>> >>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>> >>> Best regards, >>> John Jiang >>> >> >

From xuelei.fan at oracle.com Thu Jun 8 16:35:03 2017
From: xuelei.fan at oracle.com (Xuelei Fan)
Date: Thu, 8 Jun 2017 09:35:03 -0700
Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension
In-Reply-To: References: Message-ID:

Hi Martin,

Thanks for the contribution of the extension implementation.

Do you know the real requirements or use cases for this extension?

There is a vague point for this extension (Section 6, RFC 6066):

    Servers that receive a client hello containing the "trusted_ca_keys"
    extension MAY use the information contained in the extension to guide
    their selection of an appropriate certificate chain to return to the
    client.

For better interop, a vendor may not use this extension if no appropriate certificate matches the trusted CA indication. This could discount the benefits of this extension a lot.

I have some concerns about the design:

1. The trust manager and key manager should be immutable, otherwise there are multiple threading issues. It implies that the set methods should not be used for the trust/key managers.

2. The trusted CA indication is a connection-oriented attribute, while trust/key managers are context-wide configuration. So the trust/key manager may not be the right place to configure the trusted CA indication. For example, every connection may have its own pre-agreed indication, while the pre-agreed configuration in the key manager means all connections in the context have the same pre-agreed indication.

3. In the implementation, if the extension is enabled, the client will use all trusted certificates as the trusted CA indication. As all trusted CAs are indicated, the extension is actually redundant. Used or not, it has no impact on the final handshaking result.
I would think twice about using this extension if there are no significant benefits.

Thanks & Regards,
Xuelei

On 6/7/2017 6:37 AM, Martin Balao wrote: > Hi, > > I'd like to propose a patch for bug ID JDK-8046295 - Support > Trusted CA Indication extension [1] and ask for reviews. I have > the OCA signed since I'm a Red Hat employee and this is part of > my work [2]. 
> * A certificate chain on a 'X509KeyManager' instance can be set as > 'pre-agreed' trusted (see RFC 6066) invoking the > 'setPreAgreedCertificate' method > * This is the procedure through which the server chooses a certificate: > * Cipher suites iterated as usual (in preferred order) > * If the client has sent Trusted CA Indication data elements: > * All the certificate chains for the chosen cipher suite algorithm > are retrieved from the 'X509KeyManager' instance and iterated > * For each certificate on a chain (starting from root): > * For each Trusted CA Indication data element: > * If there is a match between the Trusted CA Indication data > element and the certificate in the server's chain, the certificate chain > is chosen. > * If Trusted CA Indication data element is of "pre-agreed" type > and the certificate chain was set as "pre-agreed", the certificate chain > is chosen. > * As a consequence of the previous procedure, a client may trust in an > intermediate certificate and the server will be able to choose a > certificate chain that contains that intermediate certificate. > * SHA-1 certificate hash, SHA-1 key hash, Distinguished Name and > pre-agreed methods specified by RFC 6066 are supported. > Test cases implemented for both client and server sides. > > Thanks in advanced, > Martin Balao.- > > -- > [1] - https://bugs.openjdk.java.net/browse/JDK-8046295 > [2] - http://www.oracle.com/technetwork/community/oca-486395.html#r > [3] - https://tools.ietf.org/html/rfc6066#page-12 From mbalao at redhat.com Thu Jun 8 22:09:15 2017 From: mbalao at redhat.com (Martin Balao) Date: Thu, 8 Jun 2017 19:09:15 -0300 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: References: Message-ID: Hi Xuelei, Thanks for your time reviewing this implementation. You have a point in interoperability considerations. 
A client may decide not to send this extension to avoid an error on the server: either because it does not know the server implementation and assumes that it won't support the extension, or because it knows that the server does not support this extension. Given that the extension is not widely implemented (as far as I know, OpenSSL does not support it [1]), I imagine that the first use cases are going to be in scenarios where the user has control of both the client and the server. In fact, this may be a good reason to choose Java on both the server and the client -and thus the value of implementing both sides at the same time-.

Your design concerns were also mine. I'll try to explain why I went that way, and I'm open to discussing and re-working all of them.

1.

I understand your point and agree that the TrustManager and KeyManager are better to remain immutable. The reason why I decided to go for a change in the X509KeyManager/X509TrustManager API (instead of a change in the factory API/constructor) was to minimize the scope of the change, as this is X509 oriented. I agree with you that if multiple threads use the same trust objects, any change will impact the others. If they require isolation, they would need to instantiate different factories and get different trust objects.

2.

As implemented now, it'd be necessary to instantiate a new KeyManagerFactory/TrustManagerFactory to have a different configuration regarding pre-agreed certificates or the use of the Trusted CA Indication extension. Once the factory is instantiated, it will return the same TrustManager/KeyManager object.

My first implementation was aligned to have this information more tightly tied to the connection. But I faced a few issues with that:

1. When you create an SSLSocket from the client side, the socket automatically connects to the server. So, the use of Trusted CA Indication needs to be there at the very beginning -if not, the extension won't be used-. 
Getting a socket and setting SSLParameters is not useful in this case.

2. Setting this information through the SSLContext would be a hard interface change -init method?-, but we can explore that path.

The fact that the information that is going to be used (when the extension is enabled) is inside the TrustManager is what made me decide to go for the option I implemented. Both the "switch" to use it and the information are together. This is related to #3.

3.

The client sends all the certificates in the TrustManager as trusted CA indication data because those are the certificates which it trusts to validate the connection. These certificates are not necessarily the same as the certificates that the server has in its KeyManager. If the client previously knows that the server has the same certificates, I agree with you that it makes no sense to use this extension. That is going to depend on each use-case.

A different alternative for #1 and #2 would be to set this information through SSLServerSocketFactory (SSLSocketImpl) and SSLEngine (SSLEngineImpl).

Do you have any preference or any other idea regarding the interface design for enabling the extension?

BTW, there is an additional issue I've recently noticed related to the alias retrieval on the server side that I'll fix in the next webrev. To briefly summarize, the getServerAliases method (invoked from ServerHandshaker.java) is not enough when the KeyManager is implemented by X509KeyManagerImpl.java (instead of SunX509KeyManagerImpl.java). The reason is that chooseServerAlias does a lot more stuff in this implementation than just getting the alias from a cache (i.e.: use of server name indications).

Kind regards,
Martin.-

--
[1] - https://github.com/openssl/openssl/issues/3029

On Thu, Jun 8, 2017 at 1:35 PM, Xuelei Fan wrote: > Hi Martin, > > Thanks for the contribution of the extension implementation. > > Do you know the real requirements or user cases for this extension?
> > > There is a vague point for this extension (Section 6, RFC 6066): > > Servers that receive a client hello containing the "trusted_ca_keys" > extension MAY use the information contained in the extension to guide > their selection of an appropriate certificate chain to return to the > client. > > For better interop, a vendor may not use this extension if no appropriate > certificate matching the trusted CA indication. This could make the > benefits of this extension discounted a lot. > > I have some concerns about the design: > 1. The trust manager and key manager should be immutable, otherwise there > are multiple threading issues. It implies that the set methods should not > be used for the trust/key managers. > > 2. The trusted ca indication is connection-oriented attribute, while > trust/key managers are context wild configuration. So it may be not the > right place to configure the trusted ca indication in trust/key manager. > For example, every connection may have its own pre-agreed indication. While > the pre-agreed configuration in key manager means all connections in the > context have the same pre-agreed indication. > > 3. In the implementation, if the extension is enabled, the client will use > all trusted certificates as the trusted ca indication. As all trusted CA > are indicated, the extension is actually redundant. Use it or not, no > impact on the final handshaking result. I would think it twice to use this > extension if there is no significant benefits. > > Thanks & Regards, > Xuelei > > > On 6/7/2017 6:37 AM, Martin Balao wrote: > >> Hi, >> >> I'd like to propose a patch for bug ID JDK-8046295 - Support Trusted CA >> Indication extension [1] and ask for reviews. I have the OCA signed since >> I'm a Red Hat employee and this is part of my work [2]. 
>> >> Webrev (jdk10 jdk10+10 based): >> >> * http://cr.openjdk.java.net/~sgehwolf/webrevs/mbalaoal/JDK-80 >> 46295/webrev.00/ (browse online) >> * http://cr.openjdk.java.net/~sgehwolf/webrevs/mbalaoal/JDK-80 >> 46295/webrev.00/8046295.webrev.00.zip (zip download) >> >> Trusted CA Indication TLS extension (for TLS v1.2+) has been implemented >> on both client and server sides, according to RFC 6066 [3]. Implementation >> assumes the use of X.509 certificates. >> >> Client side >> >> * The extension can be enabled by invoking 'setUseTrustedCAIndication' >> method on the 'X509TrustManager' instance used for establishing a TLS >> channel. >> * When enabled, a SHA-1 hash for each certificate managed by the >> TrustManager instance is sent to the server as a "Trusted CA Indication" >> data element. This happens during the Client Hello stage of the TLS >> Handshake. >> * Note: SHA-1 key hash, Distinguished Name and pre-agreed methods >> specified by RFC 6066 to identify a certificate were not implemented on the >> client side. >> >> Server side >> >> * The extension is always enabled on the server side. >> * When a client sends Trusted CA Indication data elements during the >> Client Hello stage (TLS Handshake), the server tries to choose a >> certificate from its 'X509KeyManager' instance based on that information. >> If a certificate is not found, the TLS channel cannot be established. 
>> * A certificate chain on a 'X509KeyManager' instance can be set as >> 'pre-agreed' trusted (see RFC 6066) invoking the 'setPreAgreedCertificate' >> method >> * This is the procedure through which the server chooses a certificate: >> * Cipher suites iterated as usual (in preferred order) >> * If the client has sent Trusted CA Indication data elements: >> * All the certificate chains for the chosen cipher suite algorithm >> are retrieved from the 'X509KeyManager' instance and iterated >> * For each certificate on a chain (starting from root): >> * For each Trusted CA Indication data element: >> * If there is a match between the Trusted CA Indication data >> element and the certificate in the server's chain, the certificate chain is >> chosen. >> * If Trusted CA Indication data element is of "pre-agreed" type >> and the certificate chain was set as "pre-agreed", the certificate chain is >> chosen. >> * As a consequence of the previous procedure, a client may trust in an >> intermediate certificate and the server will be able to choose a >> certificate chain that contains that intermediate certificate. >> * SHA-1 certificate hash, SHA-1 key hash, Distinguished Name and >> pre-agreed methods specified by RFC 6066 are supported. >> Test cases implemented for both client and server sides. >> >> Thanks in advanced, >> Martin Balao.- >> >> -- >> [1] - https://bugs.openjdk.java.net/browse/JDK-8046295 >> [2] - http://www.oracle.com/technetwork/community/oca-486395.html#r >> [3] - https://tools.ietf.org/html/rfc6066#page-12 >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From xuelei.fan at oracle.com Fri Jun 9 03:36:19 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Thu, 8 Jun 2017 20:36:19 -0700 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: References: Message-ID: On 6/8/2017 3:09 PM, Martin Balao wrote: > Hi Xuelei, > > Thanks for your time reviewing this implementation. 
> > You have a point in interoperability considerations. A client may decide > not to send this extension to avoid an error on the server: either > because it does not know the server implementation and assumes that it > won't support the extension or because it knows that server does not > support this extension. Given that the extension is not widely > implemented (as far as I know OpenSSL does not support it [1]), I > imagine that the first use cases are going to be in scenarios where the > user has control of both the client and the server. In fact, this may be > a good reason to choose Java on both the server and the client -and thus > the value of implementing both sides at the same time-. > > Your design concerns were also mine. I'll try to explain why I went that > way, and I'm open to discuss and re-work all of them. > > 1. > > I understand your point and agree that TrustedManager and KeyManager are > better to remain immutable. The reason why I decided to go for a change > in X509KeyManager/X509TrustManager API (instead of a change in the > factory API/constructor) was to minimize the scope of the change as this > is X509 oriented. I agree with you that if multiple threads use the same > trust objects, any change will impact the others. If they require > isolation, they would need to instantiate different factories and get > different trust objects. > > 2. > > As it was implemented now, it'd be necessesary to instantiate a new > KeyManagerFactory/TrustedManagerFactory to have a different > configuration regarding pre-agreed certificates or the use of Trusted CA > Indication extension. Once the factory is instantiated, it will return > the same TrustManager/KeyManager object. > > My first implementation was aligned to have this information more > tighted to the connection. But I faced a few issues with that: > > 1. When you create a SSLSocket from the client side, the socket > automatically connects to the server. 
So, the use of Trusted CA > Indication needs to be there at the very beginning -if not, the > extension won't be used-. Getting a socket and setting SSLParameters is > not useful in this case. >

I did not get the point of why setting SSLParameters is not useful. When an SSLSocket is created, the SSL connection has not been established yet. Creating an SSLSocket, then setting the SSLParameters, and then negotiating the SSL connection (handshake): that's the general routine for using SSLParameters.

    // connect
    SSLSocket sslSocket = sslSocketFactory.createSocket(...);

    // configure
    SSLParameters sslParameters = ...
    sslSocket.setSSLParameters(sslParameters);

    // negotiate
    sslSocket.startHandshake()/sslSocket I/O.

I think if the trusted CA indication is configured with SSLParameters, it can be used.

> 2. Setting this information through the SSLContext would be a hard > interface change -init method?-, but we can explore that path. > > The fact that the information that is going to be used (when the > extension is enabled) is inside the TrustManager is what made me decide > to go for the option I implemented. Both the "switch" to use it and the > information are together. This is related to #3. > > 3. > > The client sends all the certificates in the TrustManager as trusted CA > indication data because those are the certificates in which it trusts in > order to validate the connection. These certificates are not necessarily > the same that the certificates that the server has in its KeyManager. If > the client previously knows that the server has the same certificates, I > agree with you that it makes no sense to use this extension. That is > going to depend on each use-case. > > A different alternative for #1 and #2 would be to set this information > through SSLServerSocketFactory (SSLSocketImpl) and SSLEngine > (SSLEngineImpl). > > Do you have any preference or any other idea regarding the interface > design for enabling the extension? 
> Per my understanding, the significant benefit of this extension is to help the server choose the right server cert when the server supports multiple certificates. For example, the server has 10 RSA certs issued by 8 CAs, while the client only trusts one CA. I may add a pair of SSLParameters methods:

public boolean getUseTrustedIndication();
public void setUseTrustedIndication(boolean useTrustedIndication);

or define a system property:

jdk.tls.useTrustedIndication = true/false

or use a system property:

jdk.tls.extensions = +/-trusted_ca_keys

As this is a mandatory TLS extension in NIST SP 800-52, more flexibility may not be desired. I would prefer to use one system property only. The trusted authorities can be obtained from the client trust manager. The server can choose the server certificate that best matches the client-requested trusted authorities. In JSSE, the benefits of the pre_agreed option can be obtained by customizing the key/trust manager, so I do not see much benefit in supporting this option in the JDK. The values of the other three options can be obtained from the key/trust manager certificates. What do you think? Can it meet your requirements? > BTW, there is an additional issue I've recently noticed related to the > aliases retrieval in the server side that I'll fix in the next Webrev. > To briefly summarize, the getServerAliases method (invoked from > ServerHandshaker.java) is not enough when the KeyManager is implemented > by X509KeyManagerImpl.java (instead of SunX509KeyManagerImpl.java). The > reason is that chooseServerAlias does a lot more stuff in this > implementation than just getting the alias from a cache (i.e.: use of > server name indications). > Please feel free to file a bug or submit the change. Thanks & Regards, Xuelei > Kind regards, > Martin.- > -- > [1] - https://github.com/openssl/openssl/issues/3029 > On Thu, Jun 8, 2017 at 1:35 PM, Xuelei Fan > wrote: > Hi Martin, > Thanks for the contribution of the extension implementation.
> > Do you know the real requirements or use cases for this extension? > > There is a vague point for this extension (Section 6, RFC 6066): > > Servers that receive a client hello containing the "trusted_ca_keys" > extension MAY use the information contained in the extension to > guide > their selection of an appropriate certificate chain to return to the > client. > > For better interop, a vendor may not use this extension if no > appropriate certificate matches the trusted CA indication. This > could discount the benefits of this extension a lot. > > I have some concerns about the design: > 1. The trust manager and key manager should be immutable, otherwise > there are multi-threading issues. It implies that the set > methods should not be used for the trust/key managers. > > 2. The trusted CA indication is a connection-oriented attribute, while > trust/key managers are context-wide configuration. So the trust/key > manager may not be the right place to configure the trusted CA > indication. For example, every connection may have its own pre-agreed > indication, while the pre-agreed configuration in the key manager means > all connections in the context have the same pre-agreed indication. > > 3. In the implementation, if the extension is enabled, the client > will use all trusted certificates as the trusted CA indication. As > all trusted CAs are indicated, the extension is actually redundant. > Used or not, it has no impact on the final handshake result. I would > think twice about using this extension if there is no significant > benefit. > > Thanks & Regards, > Xuelei > > On 6/7/2017 6:37 AM, Martin Balao wrote: > > Hi, > > I'd like to propose a patch for bug ID JDK-8046295 - Support > Trusted CA Indication extension [1] and ask for reviews. I have > the OCA signed since I'm a Red Hat employee and this is part of > my work [2].
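As a side note on the matching a server would do here: the identifier types in RFC 6066 reduce to byte comparisons; for example, the cert_sha1_hash type compares SHA-1 hashes of DER-encoded certificates. A minimal illustrative sketch of that step (not code from the webrev; the class and method names are made up):

```java
import java.security.MessageDigest;
import java.util.Arrays;
import java.util.List;

public class TrustedCaIndicationMatch {
    // cert_sha1_hash identifier type from RFC 6066: the client sends the
    // SHA-1 hash of each trusted certificate's DER encoding, and the server
    // checks whether one of its chain certificates hashes to the same value.
    public static boolean matches(byte[] serverCertDer,
                                  List<byte[]> clientTrustedHashes)
            throws Exception {
        byte[] hash = MessageDigest.getInstance("SHA-1").digest(serverCertDer);
        for (byte[] trusted : clientTrustedHashes) {
            if (Arrays.equals(hash, trusted)) {
                return true;
            }
        }
        return false;
    }
}
```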
> > Webrev (jdk10 jdk10+10 based): > > * > http://cr.openjdk.java.net/~sgehwolf/webrevs/mbalaoal/JDK-8046295/webrev.00/ > > (browse online) > * > http://cr.openjdk.java.net/~sgehwolf/webrevs/mbalaoal/JDK-8046295/webrev.00/8046295.webrev.00.zip > > (zip download) > > Trusted CA Indication TLS extension (for TLS v1.2+) has been > implemented on both client and server sides, according to RFC > 6066 [3]. Implementation assumes the use of X.509 certificates. > > Client side > > * The extension can be enabled by invoking > 'setUseTrustedCAIndication' method on the 'X509TrustManager' > instance used for establishing a TLS channel. > * When enabled, a SHA-1 hash for each certificate managed by > the TrustManager instance is sent to the server as a "Trusted CA > Indication" data element. This happens during the Client Hello > stage of the TLS Handshake. > * Note: SHA-1 key hash, Distinguished Name and pre-agreed > methods specified by RFC 6066 to identify a certificate were not > implemented on the client side. > > Server side > > * The extension is always enabled on the server side. > * When a client sends Trusted CA Indication data elements > during the Client Hello stage (TLS Handshake), the server tries > to choose a certificate from its 'X509KeyManager' instance based > on that information. If a certificate is not found, the TLS > channel cannot be established. 
> * A certificate chain on an 'X509KeyManager' instance can be > set as 'pre-agreed' trusted (see RFC 6066) by invoking the > 'setPreAgreedCertificate' method > * This is the procedure through which the server chooses a > certificate: > * Cipher suites are iterated as usual (in preferred order) > * If the client has sent Trusted CA Indication data elements: > * All the certificate chains for the chosen cipher suite > algorithm are retrieved from the 'X509KeyManager' instance and > iterated > * For each certificate on a chain (starting from root): > * For each Trusted CA Indication data element: > * If there is a match between the Trusted CA Indication > data element and the certificate in the server's chain, the > certificate chain is chosen. > * If the Trusted CA Indication data element is of > "pre-agreed" type and the certificate chain was set as > "pre-agreed", the certificate chain is chosen. > * As a consequence of the previous procedure, a client may > trust an intermediate certificate and the server will be able > to choose a certificate chain that contains that intermediate > certificate. > * SHA-1 certificate hash, SHA-1 key hash, Distinguished Name > and pre-agreed methods specified by RFC 6066 are supported. > Test cases are implemented for both client and server sides. > > Thanks in advance, > Martin Balao.- > > -- > [1] - https://bugs.openjdk.java.net/browse/JDK-8046295 > > [2] - > http://www.oracle.com/technetwork/community/oca-486395.html#r > > [3] - https://tools.ietf.org/html/rfc6066#page-12 > > > From xuelei.fan at oracle.com Fri Jun 9 04:37:58 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Thu, 8 Jun 2017 21:37:58 -0700 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: References: Message-ID: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> On 6/8/2017 8:36 PM, Xuelei Fan wrote: > The trusted authorities can be obtained from the client trust manager.
Server > can choose the server certificate that best matches the client > requested trusted authorities. > I missed the point that the key manager needs to know the client requested trusted authorities for the choosing. So we may need a new SSLSession attribute (see the similar method in ExtendedSSLSession). Xuelei From sha.jiang at oracle.com Fri Jun 9 08:44:08 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Fri, 9 Jun 2017 16:44:08 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> Message-ID: Hi Sean and Max, Thanks for your comments. Please review the updated webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ The test has been modified significantly. The main points are:
1. Adds cases on EC. Now the test supports key algorithms RSA, DSA and EC.
2. Adds cases on SHA-512. Now the test supports digest algorithms SHA-1, SHA-256 and SHA-512.
3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, 2048] for RSA and DSA.
4. Adds cases on the default signature algorithm. Now the test report can display the default algorithm at column [Signature Algorithm].
5. Adds property -Djava.security.egd=file:/dev/./urandom for keytool and jarsigner commands.
6. Creates a separate application, JdkUtils.java, to determine the JDK build version (java.runtime.version) and check if a signature algorithm is supported by a JDK.
7. Introduces a new property, named javaSecurityFile, to allow users to specify an alternative java security properties file.
8. Renames report column [Cert Type] to [Certificate]. This column displays the certificate identifiers, which is a combination of key algorithm, digest algorithm, key size and expired mark (if any).
9. The test summary is also updated accordingly.
Best regards, John Jiang On 07/06/2017 23:11, Sean Mullan wrote: > On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >> Hi Sean, >> >> On 07/06/2017 04:27, Sean Mullan wrote: >>> Hi John, >>> >>> This looks like a very useful test. I have not gone through all of >>> the code, but here are a few comments for now until I have more time: >>> >>> - add tests for EC keys >>> - add tests for SHA-512 variants of the signature algorithms >>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>> - you can use the diamond operator <> in various places >>> - might be more compact if jdkList() used Files.lines() to parse the >>> file into a stream then an array >> I did consider about the above two points. Because the test will be >> backported to JDK 6, so I only used the features those supported by >> JDK 6. >> I supposed that would make the backport easier. Does it make sense? > > Yes, that makes sense. > > --Sean > >> >> Best regards, >> John Jiang >>> - did you consider using the jarsigner API (jdk.security.jarsigner) >>> instead of the command-line? I think this would be better (if >>> possible) and it would give us some more tests of that API. >>> >>> --Sean >>> >>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>> Hi, >>>> Please review this manual test for checking if a jar, which is >>>> signed and timestamped by a JDK build, could be verified by other >>>> JDK builds. >>>> It also can be used to check if the default timestamp digest >>>> algorithm on signing is SHA-256. >>>> For more details, please look through the test summary. >>>> >>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>> >>>> Best regards, >>>> John Jiang >>>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ecki at zusammenkunft.net Fri Jun 9 09:19:18 2017 From: ecki at zusammenkunft.net (Bernd Eckenfels) Date: Fri, 9 Jun 2017 09:19:18 +0000 Subject: RSASSA_PSS (for Certificates) Message-ID: Hello, Are there any plans to support RSA PSS as a Signature algorithm? https://bugs.openjdk.java.net/browse/JDK-8146293 In the German energy market RSA PSS is used for signing messages, and authorities demand to use it also for certificate signatures (RFC 4055) starting with 2018. This is somewhat paranoid but hey, it's a field requirement. At the moment BouncyCastle can be used as a Signature provider and if also used to create X509Certificate objects it can even verify the Signature. BTW: when the BC provider is registered, the JDK X509Certificate.verify() finds the RSA PSS OID and uses the BC implementation, however the verification fails for non-standard parameters (which is not uncommon since people try to avoid SHA1 in MGF1), as it does not parse and set the appropriate parameters. I wonder if the modularity of X509Certificate could be enhanced to allow that? Having an option to extract a ParameterSpec from a random signature block would certainly be a nice feature (similar to looking up the algorithm itself by OID). BTW there was some discussion on PKCS#11 supporting it - I think the Athena PKCS11 lib with their JCOS based IDProtect tokens supports RSAPSS as a mechanism. But I guess those are three different topics: JCE Signature, X509CertExtension and PKCS11 mechanism. Gruss Bernd -- http://bernd.eckenfels.net -------------- next part -------------- An HTML attachment was scrubbed...
URL: From weijun.wang at oracle.com Fri Jun 9 12:05:59 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Fri, 9 Jun 2017 20:05:59 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> Message-ID: <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> The test can be more friendly with default values. For example, in createCertificates(), you can generate certs that use default sigalg and keysize (i.e. without specifying -siglag and -keysize), and give them aliases with "default" or "null" inside. And in jar signing when signing with one -sigalg you can also choose cert generated with different or default sigalgs. BTW, I remember certain pairs of -keysize and -sigalg do not work together. For example, 1024 bit of DSA key cannot be used with SHA512withDSA signature algorithm. Have you noticed it? Thanks Max On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: > Hi Sean and Max, > Thanks for your comments. > Please review the updated webrev: > http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ > > The test has been modified significantly. The main points are: > 1. Adds cases on EC. Now the test supports key algorithms RSA, DSA and EC. > 2. Adds cases on SHA-512. Now the test supports digest algorithms SHA-1, > SHA-256 and SHA-512. > 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, 2048] for > RSA and DSA. > 4. Adds cases on default signature algorithm. Now the test report can > display the default algorithmat column [Signature Algorithm]. > 5. Adds property -Djava.security.egd=file:/dev/./urandom for keytool and > jarsigner commands. > 6. Create a separated application, JdkUtils.java, to determine the JDK > build version (java.runtime.version) and check if a signature algorithm > is supported by a JDK. > 7. 
Introduces a new property, named javaSecurityFile, for allowing users > to specify alternative java security properties file. > 8. Renames report column [Cert Type] to [Certificate]. This column > displays the certificate identifiers, which is a combination of key > algorithm, digest algorithm, key size and expired mark (if any). > 9. The test summary also be updated accordingly. > > Best regards, > John Jiang > > > On 07/06/2017 23:11, Sean Mullan wrote: >> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>> Hi Sean, >>> >>> On 07/06/2017 04:27, Sean Mullan wrote: >>>> Hi John, >>>> >>>> This looks like a very useful test. I have not gone through all of >>>> the code, but here are a few comments for now until I have more time: >>>> >>>> - add tests for EC keys >>>> - add tests for SHA-512 variants of the signature algorithms >>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>> - you can use the diamond operator <> in various places >>>> - might be more compact if jdkList() used Files.lines() to parse the >>>> file into a stream then an array >>> I did consider about the above two points. Because the test will be >>> backported to JDK 6, so I only used the features those supported by >>> JDK 6. >>> I supposed that would make the backport easier. Does it make sense? >> >> Yes, that makes sense. >> >> --Sean >> >>> >>> Best regards, >>> John Jiang >>>> - did you consider using the jarsigner API (jdk.security.jarsigner) >>>> instead of the command-line? I think this would be better (if >>>> possible) and it would give us some more tests of that API. >>>> >>>> --Sean >>>> >>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>> Hi, >>>>> Please review this manual test for checking if a jar, which is >>>>> signed and timestamped by a JDK build, could be verified by other >>>>> JDK builds. >>>>> It also can be used to check if the default timestamp digest >>>>> algorithm on signing is SHA-256. 
>>>>> For more details, please look through the test summary. >>>>> >>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>> >>>>> Best regards, >>>>> John Jiang >>>>> >>>> >>> >> > From sha.jiang at oracle.com Fri Jun 9 13:25:34 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Fri, 9 Jun 2017 21:25:34 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> Message-ID: <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> Hi Max, On 09/06/2017 20:05, Weijun Wang wrote: > The test can be more friendly with default values. > > For example, in createCertificates(), you can generate certs that use > default sigalg and keysize (i.e. without specifying -siglag and > -keysize), and give them aliases with "default" or "null" inside. > > And in jar signing when signing with one -sigalg you can also choose > cert generated with different or default sigalgs. I suppose this test just focuses on signed jar verifying, not on certificate creation and jar signing. So, I'm not sure such cases are necessary. > > BTW, I remember certain pairs of -keysize and -sigalg do not work > together. For example, 1024 bit of DSA key cannot be used with > SHA512withDSA signature algorithm. Have you noticed it? It looks like SHA512withDSA is not supported yet. I was using JDK10 build 10.
When the test tried to create a certificate with -keyalg DSA -sigalg SHA512withDSA -keysize 1024, the below error was raised:

keytool error: java.security.NoSuchAlgorithmException: unrecognized algorithm name: SHA512withDSA

If -keyalg DSA -sigalg SHA1withDSA -keysize 2048 was used, the error was:

keytool error: java.security.InvalidKeyException: The security strength of SHA-1 digest algorithm is not sufficient for this key size

Again, this test focuses on signed jar verifying. If problems arise during certificate creation or jar signing, the associated verifying cases will be ignored. Best regards, John Jiang > > Thanks > Max > > > On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: >> Hi Sean and Max, >> Thanks for your comments. >> Please review the updated webrev: >> http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ >> >> The test has been modified significantly. The main points are: >> 1. Adds cases on EC. Now the test supports key algorithms RSA, DSA >> and EC. >> 2. Adds cases on SHA-512. Now the test supports digest algorithms >> SHA-1, SHA-256 and SHA-512. >> 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, 2048] >> for RSA and DSA. >> 4. Adds cases on default signature algorithm. Now the test report can >> display the default algorithmat column [Signature Algorithm]. >> 5. Adds property -Djava.security.egd=file:/dev/./urandom for keytool >> and jarsigner commands. >> 6. Create a separated application, JdkUtils.java, to determine the >> JDK build version (java.runtime.version) and check if a signature >> algorithm is supported by a JDK. >> 7. Introduces a new property, named javaSecurityFile, for allowing >> users to specify alternative java security properties file. >> 8. Renames report column [Cert Type] to [Certificate]. This column >> displays the certificate identifiers, which is a combination of key >> algorithm, digest algorithm, key size and expired mark (if any). >> 9. The test summary also be updated accordingly.
>> >> Best regards, >> John Jiang >> >> >> On 07/06/2017 23:11, Sean Mullan wrote: >>> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>>> Hi Sean, >>>> >>>> On 07/06/2017 04:27, Sean Mullan wrote: >>>>> Hi John, >>>>> >>>>> This looks like a very useful test. I have not gone through all of >>>>> the code, but here are a few comments for now until I have more time: >>>>> >>>>> - add tests for EC keys >>>>> - add tests for SHA-512 variants of the signature algorithms >>>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>>> - you can use the diamond operator <> in various places >>>>> - might be more compact if jdkList() used Files.lines() to parse >>>>> the file into a stream then an array >>>> I did consider about the above two points. Because the test will be >>>> backported to JDK 6, so I only used the features those supported by >>>> JDK 6. >>>> I supposed that would make the backport easier. Does it make sense? >>> >>> Yes, that makes sense. >>> >>> --Sean >>> >>>> >>>> Best regards, >>>> John Jiang >>>>> - did you consider using the jarsigner API >>>>> (jdk.security.jarsigner) instead of the command-line? I think this >>>>> would be better (if possible) and it would give us some more tests >>>>> of that API. >>>>> >>>>> --Sean >>>>> >>>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>>> Hi, >>>>>> Please review this manual test for checking if a jar, which is >>>>>> signed and timestamped by a JDK build, could be verified by other >>>>>> JDK builds. >>>>>> It also can be used to check if the default timestamp digest >>>>>> algorithm on signing is SHA-256. >>>>>> For more details, please look through the test summary. 
>>>>>> >>>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>>> >>>>>> Best regards, >>>>>> John Jiang >>>>>> >>>>> >>>> >>> >> > From weijun.wang at oracle.com Fri Jun 9 14:04:05 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Fri, 9 Jun 2017 22:04:05 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> Message-ID: <0f0ce594-6ec3-0315-6627-a7cbe7cde294@oracle.com> On 06/09/2017 09:25 PM, sha.jiang at oracle.com wrote: > Hi Max, > > On 09/06/2017 20:05, Weijun Wang wrote: >> The test can be more friendly with default values. >> >> For example, in createCertificates(), you can generate certs that use >> default sigalg and keysize (i.e. without specifying -siglag and >> -keysize), and give them aliases with "default" or "null" inside. >> >> And in jar signing when signing with one -sigalg you can also choose >> cert generated with different or default sigalgs. > I supposed this test just focus on signed jar verifying, but not > certificate creating and jar signing. So, I'm not sure such cases are > necessary. Well sometimes a test can do many things. If you only care about jar verification, why bother creating certs with different digest algorithms? On the other hand, if you do care about more, then in 338 // If the digest algorithm is not specified, then it 339 // uses certificate with SHA256 digest and 1024 key 340 // size. 341 if (digestAlgorithm == DEFAULT) { 342 certDigest = SHA256; 343 certKeySize = 1024; 344 } it seems a little awkward to hardcode the algorithm and keysize. 
If signing is using a default algorithm, it seems natural to use the cert that was generated with a default algorithm. In fact, this test case is quite useful that it ensures our different tools are using the same (or at least interoperable) default algorithms. --Max >> >> BTW, I remember certain pairs of -keysize and -sigalg do not work >> together. For example, 1024 bit of DSA key cannot be used with >> SHA512withDSA signature algorithm. Have you noticed it? > It looks SHA512withDSA is not supported yet. > I was using JDK10 build 10. When the test tried to create certificate > with -keyalg DSA -sigalg SHA512withDSA -keysize 1024, the below error > raised: > keytool error: java.security.NoSuchAlgorithmException: unrecognized > algorithm name: SHA512withDSA > > If used -keyalg DSA -sigalg SHA1withDSA -keysize 2048, the error was: > keytool error: java.security.InvalidKeyException: The security strength > of SHA-1 digest algorithm is not sufficient for this key size > > Again, this test focus on signed jar verifying. If some problems are > raised on certificate creating or jar signing, the associated verifying > cases will be ignored. > > Best regards, > John Jiang >> >> Thanks >> Max >> >> >> On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: >>> Hi Sean and Max, >>> Thanks for your comments. >>> Please review the updated webrev: >>> http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ >>> >>> The test has been modified significantly. The main points are: >>> 1. Adds cases on EC. Now the test supports key algorithms RSA, DSA >>> and EC. >>> 2. Adds cases on SHA-512. Now the test supports digest algorithms >>> SHA-1, SHA-256 and SHA-512. >>> 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, 2048] >>> for RSA and DSA. >>> 4. Adds cases on default signature algorithm. Now the test report can >>> display the default algorithmat column [Signature Algorithm]. >>> 5. 
Adds property -Djava.security.egd=file:/dev/./urandom for keytool >>> and jarsigner commands. >>> 6. Create a separated application, JdkUtils.java, to determine the >>> JDK build version (java.runtime.version) and check if a signature >>> algorithm is supported by a JDK. >>> 7. Introduces a new property, named javaSecurityFile, for allowing >>> users to specify alternative java security properties file. >>> 8. Renames report column [Cert Type] to [Certificate]. This column >>> displays the certificate identifiers, which is a combination of key >>> algorithm, digest algorithm, key size and expired mark (if any). >>> 9. The test summary also be updated accordingly. >>> >>> Best regards, >>> John Jiang >>> >>> >>> On 07/06/2017 23:11, Sean Mullan wrote: >>>> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>>>> Hi Sean, >>>>> >>>>> On 07/06/2017 04:27, Sean Mullan wrote: >>>>>> Hi John, >>>>>> >>>>>> This looks like a very useful test. I have not gone through all of >>>>>> the code, but here are a few comments for now until I have more time: >>>>>> >>>>>> - add tests for EC keys >>>>>> - add tests for SHA-512 variants of the signature algorithms >>>>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>>>> - you can use the diamond operator <> in various places >>>>>> - might be more compact if jdkList() used Files.lines() to parse >>>>>> the file into a stream then an array >>>>> I did consider about the above two points. Because the test will be >>>>> backported to JDK 6, so I only used the features those supported by >>>>> JDK 6. >>>>> I supposed that would make the backport easier. Does it make sense? >>>> >>>> Yes, that makes sense. >>>> >>>> --Sean >>>> >>>>> >>>>> Best regards, >>>>> John Jiang >>>>>> - did you consider using the jarsigner API >>>>>> (jdk.security.jarsigner) instead of the command-line? I think this >>>>>> would be better (if possible) and it would give us some more tests >>>>>> of that API. 
>>>>>> >>>>>> --Sean >>>>>> >>>>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>>>> Hi, >>>>>>> Please review this manual test for checking if a jar, which is >>>>>>> signed and timestamped by a JDK build, could be verified by other >>>>>>> JDK builds. >>>>>>> It also can be used to check if the default timestamp digest >>>>>>> algorithm on signing is SHA-256. >>>>>>> For more details, please look through the test summary. >>>>>>> >>>>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>>>> >>>>>>> Best regards, >>>>>>> John Jiang >>>>>>> >>>>>> >>>>> >>>> >>> >> > From mbalao at redhat.com Fri Jun 9 20:10:18 2017 From: mbalao at redhat.com (Martin Balao) Date: Fri, 9 Jun 2017 17:10:18 -0300 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> Message-ID: Hi Xuelei, I didn't notice that some of the SSLSocket constructors did not establish the connection, so SSLParameters can be effective for Trusted CA Indication. This was an invalid argument on my side, sorry. As for the configuration to enable the extension, it's probably not necessary on the Server side because -as you mentioned- it is mandatory and there is no harm in supporting it. However, it has to be configurable on the Client side because -as we previously discussed- the client may cause a handshake failure if the server does not support the extension. I'd prefer the Client configuring the SSLSocket through SSLParameters instead of a system-wide property -which has even more impact than the TrustManager approach-. Would this work for you? > In JSSE, the benefits pre_agreed option can get by customizing the key/trust manager, so I did not see too much benefits to support this option in JDK I understand your point and will remove support for "pre_agreed".
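For illustration, the client-side switch being converged on here could take a bean-style shape like the following. This is a hypothetical sketch only -- neither this method pair nor a jdk.tls.useTrustedIndication property existed in the JDK at the time of this thread:

```java
// Hypothetical sketch of the proposed client-side switch, modeled after
// existing SSLParameters bean-style properties. NOT a real JDK API.
public class TrustedCaIndicationParameters {
    private boolean useTrustedCAIndication;  // disabled by default

    // Whether the client should send the trusted_ca_keys hello extension.
    public boolean getUseTrustedCAIndication() {
        return useTrustedCAIndication;
    }

    public void setUseTrustedCAIndication(boolean use) {
        this.useTrustedCAIndication = use;
    }
}
```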
On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan wrote: > > > On 6/8/2017 8:36 PM, Xuelei Fan wrote: > >> The trusted authorities can be get from client trust manager. Server can >> choose the best matching of server certificate of the client requested >> trusted authorities. >> > > > I missed the point that the key manager need to know the client requested > trusted authorities for the choosing. So may need a new SSLSession > attribute (See similar method in ExtendedSSLSession). > > Xuelei > Yes, an attribute on SSLSession may do the job (both when Key Manager receives a SSLSocket and a SSLEngine). Kind regards, Martin.- -------------- next part -------------- An HTML attachment was scrubbed... URL: From xuelei.fan at oracle.com Fri Jun 9 21:15:13 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Fri, 9 Jun 2017 14:15:13 -0700 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> Message-ID: <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> I'm OK to use SSLParameters. Thank you very much for considering a new design. Xuelei On 6/9/2017 1:10 PM, Martin Balao wrote: > Hi Xuelei, > > I didn't notice that some of the SSLSocket contructors did not establish > the connection, so SSLParameters can be effective for Trusted CA > Indication. This was an invalid argument on my side, sorry. > > As for the configuration to enable the extension, it's probably not > necessary on the Server side because -as you mentioned- it is mandatory > and there is no harm in supporting it. However, it has to be > configurable on the Client side because -as we previously discussed- the > client may cause a handshake failure if the server does not support the > extension. I'd prefer the Client configuring the SSLSocket through > SSLParameters instead of a system-wide property -which has even more > impact than the TrustManager approach-. Would this work for you? 
> > > In JSSE, the benefits pre_agreed option can get by customizing the > key/trust manager, so I did not see too much benefits to support this > option in JDK > > I understand your point and will remove support for "pre_agreed". > > > On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan > wrote: > > > > On 6/8/2017 8:36 PM, Xuelei Fan wrote: > > The trusted authorities can be get from client trust manager. > Server can choose the best matching of server certificate of the > client requested trusted authorities. > > > > I missed the point that the key manager need to know the client > requested trusted authorities for the choosing. So may need a new > SSLSession attribute (See similar method in ExtendedSSLSession). > > Xuelei > > > > Yes, an attribute on SSLSession may do the job (both when Key Manager > receives a SSLSocket and a SSLEngine). > > Kind regards, > Martin.- > From bradford.wetmore at oracle.com Fri Jun 9 22:11:58 2017 From: bradford.wetmore at oracle.com (Bradford Wetmore) Date: Fri, 9 Jun 2017 15:11:58 -0700 Subject: RSASSA_PSS (for Certificates) In-Reply-To: References: Message-ID: <0a95f90f-4752-127d-76bb-99aa834ee607@oracle.com> Don't know if you've noticed, but JDK-8146293 is marked as "In Progress". The companion JSSE bug is: JDK-8166595. Brad On 6/9/2017 2:19 AM, Bernd Eckenfels wrote: > Hello, > > Are there any plans to support RSA PSS as a Signature algorithm? > https://bugs.openjdk.java.net/browse/JDK-8146293 > > In the german energy market RSA PSS is used for signing messages, and > authorities demand to use it also for certificate signatures (RFC 4055) > starting with 2018. This is somewhat paranoid but hey, it's a field > requirement. > > At the moment BouncyCastle can be used as a Signature provider and if > also used to create X509Certificate objects it can even verify the > Signature. 
> > BTW: when the BC provider is registered, the JDK X509Certificate.verify() > finds the RSA PSS OID and uses the BC implementation; however, the > verification fails for non-standard parameters (which is not uncommon, > since people try to avoid SHA1 in MGF1), as it does not parse and set the > appropriate parameters. > > I wonder if the modularity of X509Certificate could be enhanced to allow > that? Having an option to extract a ParameterSpec from a random signature > block would certainly be a nice feature (similar to looking up the > algorithm itself by OID). > > BTW there was some discussion on PKCS#11 supporting it - I think the > Athena PKCS11 lib with their JCOS based IDProtect tokens supports RSAPSS > as a mechanism. > > But I guess those are three different topics: JCE Signature, > X509CertExtension and the PKCS11 mechanism. > > Gruss > Bernd > -- > http://bernd.eckenfels.net > > From alex.buckley at oracle.com Fri Jun 2 00:13:11 2017 From: alex.buckley at oracle.com (Alex Buckley) Date: Thu, 01 Jun 2017 17:13:11 -0700 Subject: JPMS Access Checks, Verification and the Security Manager In-Reply-To: References: <13bb0dc2-9518-b72c-385f-1db95a8edab9@oracle.com> <59249529.3030103@oracle.com> Message-ID: <5930AD97.3090406@oracle.com> On 5/24/2017 12:13 AM, Volker Simonis wrote: > OK, so from what you say I understand that the verification errors I > see with the Security Manager enabled are an implementation detail of > HotSpot (because verification uses the same class loading mechanism > as the runtime) which is not required but still acceptable under the > JVMS. Is that correct? The JVMS is precise about which exceptions are allowed to be thrown by a JVM implementation during verification, and AccessControlException is not one of them. However, the JVMS is only one part of the Java SE Platform Specification. It is quite proper if another part specifies an AccessControlException when a class in a restricted package is referenced by a class without permission. 
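Concretely, the check in question is the one a class loader makes before loading a class from a restricted package. A minimal sketch of the documented SecurityManager.checkPackageAccess pattern for a user-defined loader (an illustration, not actual JDK code; the relevant APIs are deprecated in current JDKs):

```java
// A user-defined loader that consults the Security Manager (when one is
// installed) before delegating, mirroring what the built-in loaders do.
public class CheckingLoader extends ClassLoader {

    @Override
    @SuppressWarnings("removal")
    protected Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        SecurityManager sm = System.getSecurityManager();
        if (sm != null) {
            int i = name.lastIndexOf('.');
            if (i != -1) {
                // May throw AccessControlException for restricted packages.
                sm.checkPackageAccess(name.substring(0, i));
            }
        }
        return super.loadClass(name, resolve);
    }

    public static void main(String[] args) throws Exception {
        // With no security manager installed, loading proceeds normally.
        Class<?> c = new CheckingLoader().loadClass("java.lang.String");
        System.out.println(c.getName());
    }
}
```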
I'm thinking in particular of the API specification for SecurityManager::checkPackageAccess. It states, "This method is called by the loadClass method of class loaders." Plainly, the intention is that a class (Tricky) which initiates the loading of another class (com.sun.crypto.provider.SunJCE) can do so only if it has permission to reference the other class. Unfortunately, the statement as written is only guaranteed to be true for the built-in class loaders of the Java SE Platform and not for user-defined class loaders. Accordingly, we will update the API specification to clarify how a JVM implementation may support the Security Manager in checking permissions when classes are loaded and resolved. But to answer your original question, an application CAN fail because the verifier can't load classes due to Security Manager restrictions; you may have to grant additional permissions if application classes wish to reference certain JDK 9 packages. Alex From ioi.lam at oracle.com Fri Jun 2 15:44:36 2017 From: ioi.lam at oracle.com (Ioi Lam) Date: Fri, 2 Jun 2017 08:44:36 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> Message-ID: <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> On 6/2/17 6:40 AM, Chris Hegarty wrote: > On 02/06/17 00:14, Ioi Lam wrote: >> ... >> >> The gem is hidden in the compile.0.jta file. It contains something like: >> >> -sourcepath :/jdk/foobar/test/lib: >> >> So if my test refers to a class under /test/lib, such as >> jdk.test.lib.process.ProcessTools, javac will be able to locate it under >> /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will >> build it automatically. 
>> >> So really, there's no reason why the test must explicitly do an @build >> of the library classes that it uses. > > Sure, you're relying on the implicit compilation of dependencies > by javac. Look at the output, where it compiles the library > classes to. It is part of the classes directory for the > individual test. That means that the library classes will need > to be compiled many many times. The @build tag will compile > the library classes to a common output directory, where they > can be reused ( unless I'm missing something ). > > -Chris. Yes, @build will compile classes so that they can be reused. But why should it be the responsibility of every test to do this? To reuse my malloc metaphor -- is it reasonable for every program that uses malloc to explicitly build libc? By the way, jtreg arranges the output directory of the test by the directory they sit in, so jdk/test/foo/bar/XTest.java jdk/test/foo/bar/YTest.java will all output their .class files to the same directory. Therefore, the amount of duplicated classes is not as bad as you might think. We've been omitting the @build tags in the hotspot tests and we haven't seen any problems. - Ioi From ioi.lam at oracle.com Fri Jun 2 16:04:52 2017 From: ioi.lam at oracle.com (Ioi Lam) Date: Fri, 2 Jun 2017 09:04:52 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <5f94d6e8-0a0d-bcf5-c5ed-9a522af3e213@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <87049e76-f95c-50c1-f901-e7e4919e6b12@oracle.com> <5f94d6e8-0a0d-bcf5-c5ed-9a522af3e213@oracle.com> Message-ID: <0ec9930c-02b3-a2de-8aed-de6e6d8c71a5@oracle.com> I agree with what Daniel said. 
Even without explicit @build tags (as in the reproducer in CODETOOLS-7901986), if you use something like @run main RedefineClassHelper, that would cause an implicit invocation of "@build test/lib", because RedefineClassHelper.java is part of test/lib. So it's not possible to avoid @build altogether (even if you're not using reflection). ===== More explanation of the jtreg bug: In the CODETOOLS-7901986 reproducer's case, another test (ModifyAnonymous.java) uses jdk.test.lib.compiler.InMemoryJavaCompiler without an explicit @build. Later, when RedefineRunningMethodsWithResolutionErrors.java is executed and runs "@run main RedefineClassHelper", the jtreg bug causes classes/test/lib to be partially compiled -- RedefineClassHelper.class is there, but InMemoryJavaCompiler.class is missing. Sure, according to the jtreg docs, "@build jdk.test.lib.compiler.InMemoryJavaCompiler" should be added to ModifyAnonymous.java. However, when your test fails because ANOTHER TEST forgot to add an @build, and you are looking at a sea of over 1000 test cases, you're completely lost. So what we have is a jtreg rule that says "you should ...", but there's no enforcement (every test runs perfectly fine by itself), and when failure happens there's no diagnostic that tells you who's to blame. Seems like a perfect recipe for anarchy. - Ioi On 6/2/17 2:19 AM, Daniel Fuchs wrote: > Hi guys, > > The jtreg bug really needs to be fixed. > What I hear is that adding an explicit @build in one test > can make an unrelated test that depends on the same library > but doesn't have the explicit @build fail (and possibly > randomly and intermittently, depending on the order in > which tests are run). 
> > This is very unintuitive, and the 'obvious' (though maybe > wrong) fix for anyone stumbling on the issue would be to fix > the failing test by adding the explicit @build - not grep > the whole test base in search of a test that might have an > explicit @build, which as pointed out elsewhere might well be > legitimate if the test is using reflection. > > So until the jtreg bug is fixed, I am not at all sure that > removing all the explicit @build is the correct thing to do, > as it's still bound to make existing unrelated tests fail > randomly if new tests with an explicit @build are added > later on... > > my2c > > -- daniel > > On 01/06/2017 23:37, Ioi Lam wrote: >> >> >> On 6/1/17 1:17 PM, Igor Ignatyev wrote: >>>> On Jun 1, 2017, at 1:20 AM, Chris Hegarty >>>> wrote: >>>> >>>> Igor, >>>> >>>>> On 1 Jun 2017, at 04:32, Igor Ignatyev >>>>> wrote: >>>>> >>>>> Hi Felix, >>>>> >>>>> I have suggested the exact opposite change[1-3] to fix the same >>>>> problem. >>>> I'm sorry, but this is all just too confusing. After your change, >>>> who, or what, is >>>> responsible for building/compiling the test library dependencies? >>> jtreg is responsible; there is an implicit build for each @run, and >>> jtreg will analyze a test class to get the transitive closure of its static >>> dependencies, hence you have to have @build only for classes which >>> are not in the constant pool, e.g. used only by reflection or whose >>> classnames are only used to spawn a new java instance. >> >> >> I suspect the problem is caused by a long-standing bug in jtreg that >> results in library classes being partially compiled. Please see my >> evaluation in >> >> https://bugs.openjdk.java.net/browse/CODETOOLS-7901986 >> >> In the bug report, there is a test case that can reliably reproduce the >> NoClassDefFoundError problem. >> >> I think adding all the @build commands in the tests is just >> a band-aid. 
Things will break unless every test explicitly uses @build >> to build every class in every library that it uses, including all >> the private classes that are not directly accessible by the test cases. >> >> For example: doing this may be enough for now: >> >> * @build jdk.test.lib.process.* >> >> But what if in the future, jdk.test.lib.process is restructured to >> have a private package jdk.test.lib.process.hidden? To work around >> CODETOOLS-7901986, all the test cases must be modified as >> follows, which unnecessarily exposes library implementation details >> to the library users: >> >> * @build jdk.test.lib.process.* jdk.test.lib.process.hidden.* >> >> Just imagine this -- "in order to use malloc() you must explicitly >> build not only malloc(), but also sbrk() ... and every other function >> in libc". That seems unreasonable to me. >> >> By the way, we made a fix in the HotSpot tests (see >> https://bugs.openjdk.java.net/browse/JDK-8157957) that got rid of >> many (but not all) of the NoClassDefFoundErrors by *removing* the >> @build lines ..... >> >> My proposal is, instead of just adding @build as a band-aid, we should >> fix CODETOOLS-7901986 instead. >> >> Thanks >> - Ioi >> >> >>>> >>>> Test library code has no @modules tags, so does not explicitly >>>> declare its >>>> module dependencies. Instead module dependencies, required by test >>>> library code, are declared in the test using the library. If we >>>> wildcard, or >>>> otherwise leave broad build dependencies in tests, then there is no >>>> way to know what new module dependencies may be added in the future. >>>> That is one of the reasons I asked Felix to be explicit about >>>> the build >>>> dependencies. 
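For concreteness, the explicit form being debated looks like this in a jtreg test header. The tag block is the interesting part; the class body is kept trivial here so the sketch compiles on its own (a real test would call into jdk.test.lib.process instead):

```java
/*
 * @test
 * @library /test/lib
 * @build jdk.test.lib.process.ProcessTools
 *        jdk.test.lib.process.OutputAnalyzer
 * @run main ExplicitBuildExample
 */
public class ExplicitBuildExample {

    static String status() {
        return "ok";
    }

    public static void main(String[] args) {
        // In a real test this would launch a process via ProcessTools;
        // the jtreg tags above are what force the library classes to be
        // compiled into the shared output directory before @run executes.
        System.out.println(status());
    }
}
```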
>>> having explicit builds does not really help w/ module dependencies; if >>> someone changes a testlibrary class so it starts to depend on another >>> testlibrary class, jtreg will implicitly build it, and if this class >>> has some module dependencies, you will have to reflect them in the >>> test. >>> >>> generally speaking, I don't like having explicit build actions >>> because build actions themselves are implicit, so they don't really >>> help; it will still be hard to spot missed explicit builds. not >>> having (unneeded) explicit builds is an easy rule to follow, and we >>> can easily find all places which don't follow this rule by grep. >>> >>> -- Igor >>>> -Chris. >>>> >>>>> [1] https://bugs.openjdk.java.net/browse/JDK-8181391 >>>>> [2] >>>>> http://mail.openjdk.java.net/pipermail/core-libs-dev/2017-June/048012.html >>>>> >>>>> [3] >>>>> http://cr.openjdk.java.net/~iignatyev//8181391/webrev.00/index.html >> > From ioi.lam at oracle.com Fri Jun 2 16:14:19 2017 From: ioi.lam at oracle.com (Ioi Lam) Date: Fri, 2 Jun 2017 09:14:19 -0700 Subject: RFR 8181299/10, Several jdk tests fail with java.lang.NoClassDefFoundError: jdk/test/lib/process/StreamPumper In-Reply-To: <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> References: <6aed9ea5-3bf9-5c51-8c87-bab37fc835f1@oracle.com> <9e03e223-73df-959d-41fe-2a0b9cccd98d@oracle.com> <2e7ae109-2458-1540-069f-cb489a6a4ff3@oracle.com> <82b8b490-41c4-3450-a32c-a830a906d43c@oracle.com> Message-ID: On 6/2/17 8:44 AM, Ioi Lam wrote: > > > On 6/2/17 6:40 AM, Chris Hegarty wrote: >> On 02/06/17 00:14, Ioi Lam wrote: >>> ... >>> >>> The gem is hidden in the compile.0.jta file. It contains something >>> like: >>> >>> -sourcepath :/jdk/foobar/test/lib: >>> >>> So if my test refers to a class under /test/lib, such as >>> jdk.test.lib.process.ProcessTools, javac will be able to locate it >>> under >>> /jdk/foobar/test/lib/jdk/test/lib/process/ProcessTools.java, and will >>> build it automatically. 
>>> So really, there's no reason why the test must explicitly do an @build >>> of the library classes that it uses. >> >> Sure, you're relying on the implicit compilation of dependencies >> by javac. Look at the output, where it compiles the library >> classes to. It is part of the classes directory for the >> individual test. That means that the library classes will need >> to be compiled many many times. The @build tag will compile >> the library classes to a common output directory, where they >> can be reused ( unless I'm missing something ). >> >> -Chris. > Yes, @build will compile classes so that they can be reused. But why > should it be the responsibility of every test to do this? > > To reuse my malloc metaphor -- is it reasonable for every program > that uses malloc to explicitly build libc? > > By the way, jtreg arranges the output directory of the test by the > directory they sit in, so > > jdk/test/foo/bar/XTest.java > jdk/test/foo/bar/YTest.java > > will all output their .class files to the same directory. Therefore, > the amount of duplicated classes is not as bad as you might think. > We've been omitting the @build tags in the hotspot tests and we > haven't seen any problems. > > - Ioi To avoid repeated compilation of the library classes, a more reasonable solution would be: [1] Before test execution -- scan all the selected tests to find all libraries specified by @library tags [2] Fully compile all the libraries into their own output directories [3] Then, start execution of the selected tests From ecki at zusammenkunft.net Sat Jun 10 02:18:59 2017 From: ecki at zusammenkunft.net (Bernd) Date: Sat, 10 Jun 2017 04:18:59 +0200 Subject: Stricter Public Key checking corrupts JKS Message-ID: I noticed there is a bug (8177657, etc.) about stricter DER checking in the JDK Certificate code. I have a JKS keystore which can no longer be opened because of that. 
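The stricter check at issue is DER's minimal-encoding rule for INTEGER values, which is what the "redundant leading 0s" rejection in this thread's stack traces comes down to. A small sketch of the rule (an illustration, not the actual sun.security.util code):

```java
public class DerCheck {

    // DER requires the minimal two's-complement encoding of an INTEGER:
    // a leading 0x00 octet is only legal when the next octet has its high
    // bit set (i.e. the 0x00 is needed to keep a positive value from
    // reading as negative). Anything else is a "redundant leading 0".
    static boolean hasRedundantLeadingZeros(byte[] contents) {
        return contents.length > 1
                && contents[0] == 0x00
                && (contents[1] & 0x80) == 0;
    }

    public static void main(String[] args) {
        // true: 0x00 0x7F encodes 127, the 0x00 is redundant
        System.out.println(hasRedundantLeadingZeros(new byte[] {0x00, 0x7F}));
        // false: 0x00 0x81 needs the 0x00 to stay positive
        System.out.println(hasRedundantLeadingZeros(new byte[] {0x00, (byte) 0x81}));
    }
}
```

Older encoders sometimes emitted such non-minimal integers in RSA keys, which is why keystores that loaded before the stricter parsing now fail.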
I understand that the strict parsing has to stay for public keys, however I wonder if anything can be done about loading the other keys from the keystore or at least reporting the alias of the unparseable entry. The Problem was introduced with 8u121, 8u112 can open the file and it exists in 7u131 as well. Exception in thread "main" java.security.cert.CertificateParsingException: java.io.IOException: subject key, java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509CertInfo.(X509CertInfo.java:169) at sun.security.x509.X509CertImpl.parse(X509CertImpl.java:1804) at sun.security.x509.X509CertImpl.(X509CertImpl.java:195) at sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:102) at java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:339) at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:755) at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56) at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224) at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70) at java.security.KeyStore.load(KeyStore.java:1445) at net.eckenfels.test.certpath.KeystoreImport.main(KeystoreImport.java:29) Caused by: java.io.IOException: subject key, java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509Key.parse(X509Key.java:174) at sun.security.x509.CertificateX509Key.(CertificateX509Key.java:75) at sun.security.x509.X509CertInfo.parse(X509CertInfo.java:667) at sun.security.x509.X509CertInfo.(X509CertInfo.java:167) ... 10 more Caused by: java.security.InvalidKeyException: java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509Key.buildX509Key(X509Key.java:227) at sun.security.x509.X509Key.parse(X509Key.java:170) ... 
13 more Caused by: java.security.spec.InvalidKeySpecException: java.security.InvalidKeyException: Invalid RSA public key at sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:205) at java.security.KeyFactory.generatePublic(KeyFactory.java:334) at sun.security.x509.X509Key.buildX509Key(X509Key.java:223) ... 14 more Caused by: java.security.InvalidKeyException: Invalid RSA public key at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:120) at sun.security.x509.X509Key.decode(X509Key.java:391) at sun.security.x509.X509Key.decode(X509Key.java:403) at sun.security.rsa.RSAPublicKeyImpl.(RSAPublicKeyImpl.java:84) at sun.security.rsa.RSAKeyFactory.generatePublic(RSAKeyFactory.java:298) at sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:201) ... 16 more Caused by: java.io.IOException: Invalid encoding: redundant leading 0s at sun.security.util.DerInputBuffer.getBigInteger(DerInputBuffer.java:152) at sun.security.util.DerInputStream.getBigInteger(DerInputStream.java:207) at sun.security.rsa.RSAPrivateCrtKeyImpl.getBigInteger(RSAPrivateCrtKeyImpl.java:214) at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:115) ... 21 more -------------- next part -------------- An HTML attachment was scrubbed... URL: From ecki at zusammenkunft.net Sat Jun 10 17:45:14 2017 From: ecki at zusammenkunft.net (Bernd) Date: Sat, 10 Jun 2017 19:45:14 +0200 Subject: Untranslated common (ZIPCode OID.2.5.4.17) attribute Message-ID: Hello, when printing out the DN of an X509 Certificate Subject as issued by many public CAs, I noticed it often contains OID.2.5.4.17, which is the postalCode (ZIP) attribute. It would be cool if we could add it (however, it is not a trivial type). http://hg.openjdk.java.net/jdk10/jdk10/jdk/file/53142e39bfa7/src/java.base/share/classes/sun/security/x509/AVA.java Gruss Bernd -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From weijun.wang at oracle.com Mon Jun 12 04:22:37 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 12 Jun 2017 12:22:37 +0800 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds Message-ID: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> Please review this fix at http://cr.openjdk.java.net/~weijun/8181841/webrev.00 So I just ignore the extra digits. Do you think this is OK? It does mean different encodings might equal to each other. Thanks Max From sha.jiang at oracle.com Mon Jun 12 07:20:17 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Mon, 12 Jun 2017 15:20:17 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: <0f0ce594-6ec3-0315-6627-a7cbe7cde294@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> <0f0ce594-6ec3-0315-6627-a7cbe7cde294@oracle.com> Message-ID: <02569226-c63e-52ce-0519-e88cabde8358@oracle.com> Hi Max, Would you like to review the updated webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.02/ It can create certificate without -sigalg and -keysize, and jar signing also can use this certificate. Best regards, John Jiang On 09/06/2017 22:04, Weijun Wang wrote: > > On 06/09/2017 09:25 PM, sha.jiang at oracle.com wrote: >> Hi Max, >> >> On 09/06/2017 20:05, Weijun Wang wrote: >>> The test can be more friendly with default values. >>> >>> For example, in createCertificates(), you can generate certs that >>> use default sigalg and keysize (i.e. without specifying -siglag and >>> -keysize), and give them aliases with "default" or "null" inside. >>> >>> And in jar signing when signing with one -sigalg you can also choose >>> cert generated with different or default sigalgs. 
>> I supposed this test just focus on signed jar verifying, but not >> certificate creating and jar signing. So, I'm not sure such cases are >> necessary. > > Well sometimes a test can do many things. If you only care about jar > verification, why bother creating certs with different digest algorithms? > > On the other hand, if you do care about more, then in > > 338 // If the digest algorithm is not specified, then it > 339 // uses certificate with SHA256 digest and 1024 key > 340 // size. > 341 if (digestAlgorithm == DEFAULT) { > 342 certDigest = SHA256; > 343 certKeySize = 1024; > 344 } > > it seems a little awkward to hardcode the algorithm and keysize. If > signing is using a default algorithm, it seems natural to use the cert > that was generated with a default algorithm. In fact, this test case > is quite useful that it ensures our different tools are using the same > (or at least interoperable) default algorithms. > > --Max > >>> >>> BTW, I remember certain pairs of -keysize and -sigalg do not work >>> together. For example, 1024 bit of DSA key cannot be used with >>> SHA512withDSA signature algorithm. Have you noticed it? >> It looks SHA512withDSA is not supported yet. >> I was using JDK10 build 10. When the test tried to create certificate >> with -keyalg DSA -sigalg SHA512withDSA -keysize 1024, the below error >> raised: >> keytool error: java.security.NoSuchAlgorithmException: unrecognized >> algorithm name: SHA512withDSA >> >> If used -keyalg DSA -sigalg SHA1withDSA -keysize 2048, the error was: >> keytool error: java.security.InvalidKeyException: The security >> strength of SHA-1 digest algorithm is not sufficient for this key size >> >> Again, this test focus on signed jar verifying. If some problems are >> raised on certificate creating or jar signing, the associated >> verifying cases will be ignored. 
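The keytool failures quoted above suggest probing algorithm availability on the running JDK before generating test data. A sketch of such a probe (a hypothetical helper, similar in spirit to what the JdkUtils class in the webrev is described as doing; keytool's "unrecognized algorithm name" corresponds to the NoSuchAlgorithmException caught here):

```java
import java.security.NoSuchAlgorithmException;
import java.security.Signature;

public class SigalgProbe {

    // Ask the current JDK whether it recognizes a signature algorithm name
    // at all; availability varies across the JDK releases this test spans.
    static boolean supported(String sigalg) {
        try {
            Signature.getInstance(sigalg);
            return true;
        } catch (NoSuchAlgorithmException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(supported("SHA256withRSA"));
        System.out.println(supported("NoSuchSigAlg"));
    }
}
```

Note that this only catches unknown algorithm names; constraint violations such as a key size too small for the digest strength still surface later, at signing time.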
>> >> Best regards, >> John Jiang >>> >>> Thanks >>> Max >>> >>> >>> On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: >>>> Hi Sean and Max, >>>> Thanks for your comments. >>>> Please review the updated webrev: >>>> http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ >>>> >>>> The test has been modified significantly. The main points are: >>>> 1. Adds cases on EC. Now the test supports key algorithms RSA, DSA >>>> and EC. >>>> 2. Adds cases on SHA-512. Now the test supports digest algorithms >>>> SHA-1, SHA-256 and SHA-512. >>>> 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, 2048] >>>> for RSA and DSA. >>>> 4. Adds cases on default signature algorithm. Now the test report >>>> can display the default algorithmat column [Signature Algorithm]. >>>> 5. Adds property -Djava.security.egd=file:/dev/./urandom for >>>> keytool and jarsigner commands. >>>> 6. Create a separated application, JdkUtils.java, to determine the >>>> JDK build version (java.runtime.version) and check if a signature >>>> algorithm is supported by a JDK. >>>> 7. Introduces a new property, named javaSecurityFile, for allowing >>>> users to specify alternative java security properties file. >>>> 8. Renames report column [Cert Type] to [Certificate]. This column >>>> displays the certificate identifiers, which is a combination of key >>>> algorithm, digest algorithm, key size and expired mark (if any). >>>> 9. The test summary also be updated accordingly. >>>> >>>> Best regards, >>>> John Jiang >>>> >>>> >>>> On 07/06/2017 23:11, Sean Mullan wrote: >>>>> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>>>>> Hi Sean, >>>>>> >>>>>> On 07/06/2017 04:27, Sean Mullan wrote: >>>>>>> Hi John, >>>>>>> >>>>>>> This looks like a very useful test. 
I have not gone through all >>>>>>> of the code, but here are a few comments for now until I have >>>>>>> more time: >>>>>>> >>>>>>> - add tests for EC keys >>>>>>> - add tests for SHA-512 variants of the signature algorithms >>>>>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>>>>> - you can use the diamond operator <> in various places >>>>>>> - might be more compact if jdkList() used Files.lines() to parse >>>>>>> the file into a stream then an array >>>>>> I did consider about the above two points. Because the test will >>>>>> be backported to JDK 6, so I only used the features those >>>>>> supported by JDK 6. >>>>>> I supposed that would make the backport easier. Does it make sense? >>>>> >>>>> Yes, that makes sense. >>>>> >>>>> --Sean >>>>> >>>>>> >>>>>> Best regards, >>>>>> John Jiang >>>>>>> - did you consider using the jarsigner API >>>>>>> (jdk.security.jarsigner) instead of the command-line? I think >>>>>>> this would be better (if possible) and it would give us some >>>>>>> more tests of that API. >>>>>>> >>>>>>> --Sean >>>>>>> >>>>>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>>>>> Hi, >>>>>>>> Please review this manual test for checking if a jar, which is >>>>>>>> signed and timestamped by a JDK build, could be verified by >>>>>>>> other JDK builds. >>>>>>>> It also can be used to check if the default timestamp digest >>>>>>>> algorithm on signing is SHA-256. >>>>>>>> For more details, please look through the test summary. 
>>>>>>>> >>>>>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>>>>> >>>>>>>> Best regards, >>>>>>>> John Jiang >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From weijun.wang at oracle.com Mon Jun 12 09:29:46 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 12 Jun 2017 17:29:46 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: <02569226-c63e-52ce-0519-e88cabde8358@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> <0f0ce594-6ec3-0315-6627-a7cbe7cde294@oracle.com> <02569226-c63e-52ce-0519-e88cabde8358@oracle.com> Message-ID: <85acb154-534d-daa7-d200-c3bea4607de9@oracle.com> Great. Only 2 questions: 459 // Return key sizes according to the specified key algorithm. 460 private static int[] keySizes(String digestAlgorithm, String keyAlgorithm) { 461 if (digestAlgorithm == DEFAULT) { 462 return new int[] { 0 }; 463 } 464 465 if (keyAlgorithm == RSA || keyAlgorithm == DSA) { 466 return new int[] { 1024, 2048 }; 467 } else if (keyAlgorithm == EC) { 468 return new int[] { 384, 571 }; 469 } 470 471 return null; 472 } Why is keysize dependent on digestalg? I mean, is it possible to always return {1024,2048,0} and {384,571,0}? 379 // If signing fails, the following verifying has to 380 // be ignored. 381 if (signingStatus == STATUS.ERROR) { 382 continue; 383 } Now that you've already checked sigalg support earlier in what cases it could go wrong here? 
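The simplification behind the first question can be sketched as follows: make the key-size list a function of the key algorithm alone, with 0 standing in for "use the tool's default size" so the default is always covered too (a hypothetical rewrite for illustration, not the webrev code):

```java
public class KeySizes {

    // Key sizes to exercise per key algorithm; 0 means "let keytool pick
    // its default", so every algorithm gets a default-size case for free.
    static int[] keySizes(String keyAlgorithm) {
        switch (keyAlgorithm) {
            case "RSA":
            case "DSA":
                return new int[] { 1024, 2048, 0 };
            case "EC":
                return new int[] { 384, 571, 0 };
            default:
                return new int[] { 0 };
        }
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(keySizes("EC")));
    }
}
```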
Thanks Max On 06/12/2017 03:20 PM, sha.jiang at oracle.com wrote: > Hi Max, > Would you like to review the updated webrev: > http://cr.openjdk.java.net/~jjiang/8179614/webrev.02/ > It can create certificate without -sigalg and -keysize, and jar signing > also can use this certificate. > > Best regards, > John Jiang > > On 09/06/2017 22:04, Weijun Wang wrote: >> >> On 06/09/2017 09:25 PM, sha.jiang at oracle.com wrote: >>> Hi Max, >>> >>> On 09/06/2017 20:05, Weijun Wang wrote: >>>> The test can be more friendly with default values. >>>> >>>> For example, in createCertificates(), you can generate certs that >>>> use default sigalg and keysize (i.e. without specifying -siglag and >>>> -keysize), and give them aliases with "default" or "null" inside. >>>> >>>> And in jar signing when signing with one -sigalg you can also choose >>>> cert generated with different or default sigalgs. >>> I supposed this test just focus on signed jar verifying, but not >>> certificate creating and jar signing. So, I'm not sure such cases are >>> necessary. >> >> Well sometimes a test can do many things. If you only care about jar >> verification, why bother creating certs with different digest algorithms? >> >> On the other hand, if you do care about more, then in >> >> 338 // If the digest algorithm is not specified, then it >> 339 // uses certificate with SHA256 digest and 1024 key >> 340 // size. >> 341 if (digestAlgorithm == DEFAULT) { >> 342 certDigest = SHA256; >> 343 certKeySize = 1024; >> 344 } >> >> it seems a little awkward to hardcode the algorithm and keysize. If >> signing is using a default algorithm, it seems natural to use the cert >> that was generated with a default algorithm. In fact, this test case >> is quite useful that it ensures our different tools are using the same >> (or at least interoperable) default algorithms. >> >> --Max >> >>>> >>>> BTW, I remember certain pairs of -keysize and -sigalg do not work >>>> together. 
For example, 1024 bit of DSA key cannot be used with >>>> SHA512withDSA signature algorithm. Have you noticed it? >>> It looks SHA512withDSA is not supported yet. >>> I was using JDK10 build 10. When the test tried to create certificate >>> with -keyalg DSA -sigalg SHA512withDSA -keysize 1024, the below error >>> raised: >>> keytool error: java.security.NoSuchAlgorithmException: unrecognized >>> algorithm name: SHA512withDSA >>> >>> If used -keyalg DSA -sigalg SHA1withDSA -keysize 2048, the error was: >>> keytool error: java.security.InvalidKeyException: The security >>> strength of SHA-1 digest algorithm is not sufficient for this key size >>> >>> Again, this test focus on signed jar verifying. If some problems are >>> raised on certificate creating or jar signing, the associated >>> verifying cases will be ignored. >>> >>> Best regards, >>> John Jiang >>>> >>>> Thanks >>>> Max >>>> >>>> >>>> On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: >>>>> Hi Sean and Max, >>>>> Thanks for your comments. >>>>> Please review the updated webrev: >>>>> http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ >>>>> >>>>> The test has been modified significantly. The main points are: >>>>> 1. Adds cases on EC. Now the test supports key algorithms RSA, DSA >>>>> and EC. >>>>> 2. Adds cases on SHA-512. Now the test supports digest algorithms >>>>> SHA-1, SHA-256 and SHA-512. >>>>> 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, 2048] >>>>> for RSA and DSA. >>>>> 4. Adds cases on default signature algorithm. Now the test report >>>>> can display the default algorithmat column [Signature Algorithm]. >>>>> 5. Adds property -Djava.security.egd=file:/dev/./urandom for >>>>> keytool and jarsigner commands. >>>>> 6. Create a separated application, JdkUtils.java, to determine the >>>>> JDK build version (java.runtime.version) and check if a signature >>>>> algorithm is supported by a JDK. >>>>> 7. 
Introduces a new property, named javaSecurityFile, for allowing >>>>> users to specify alternative java security properties file. >>>>> 8. Renames report column [Cert Type] to [Certificate]. This column >>>>> displays the certificate identifiers, which is a combination of key >>>>> algorithm, digest algorithm, key size and expired mark (if any). >>>>> 9. The test summary also be updated accordingly. >>>>> >>>>> Best regards, >>>>> John Jiang >>>>> >>>>> >>>>> On 07/06/2017 23:11, Sean Mullan wrote: >>>>>> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>>>>>> Hi Sean, >>>>>>> >>>>>>> On 07/06/2017 04:27, Sean Mullan wrote: >>>>>>>> Hi John, >>>>>>>> >>>>>>>> This looks like a very useful test. I have not gone through all >>>>>>>> of the code, but here are a few comments for now until I have >>>>>>>> more time: >>>>>>>> >>>>>>>> - add tests for EC keys >>>>>>>> - add tests for SHA-512 variants of the signature algorithms >>>>>>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>>>>>> - you can use the diamond operator <> in various places >>>>>>>> - might be more compact if jdkList() used Files.lines() to parse >>>>>>>> the file into a stream then an array >>>>>>> I did consider about the above two points. Because the test will >>>>>>> be backported to JDK 6, so I only used the features those >>>>>>> supported by JDK 6. >>>>>>> I supposed that would make the backport easier. Does it make sense? >>>>>> >>>>>> Yes, that makes sense. >>>>>> >>>>>> --Sean >>>>>> >>>>>>> >>>>>>> Best regards, >>>>>>> John Jiang >>>>>>>> - did you consider using the jarsigner API >>>>>>>> (jdk.security.jarsigner) instead of the command-line? I think >>>>>>>> this would be better (if possible) and it would give us some >>>>>>>> more tests of that API. 
>>>>>>>> >>>>>>>> --Sean >>>>>>>> >>>>>>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>>>>>> Hi, >>>>>>>>> Please review this manual test for checking if a jar, which is >>>>>>>>> signed and timestamped by a JDK build, could be verified by >>>>>>>>> other JDK builds. >>>>>>>>> It also can be used to check if the default timestamp digest >>>>>>>>> algorithm on signing is SHA-256. >>>>>>>>> For more details, please look through the test summary. >>>>>>>>> >>>>>>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>>>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>>>>>> >>>>>>>>> Best regards, >>>>>>>>> John Jiang >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From sean.mullan at oracle.com Mon Jun 12 11:29:15 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Mon, 12 Jun 2017 07:29:15 -0400 Subject: Stricter Public Key checking corrupts JKS In-Reply-To: References: Message-ID: <4c80e1cf-17f9-335e-e80e-91b23ee44e19@oracle.com> Hi Bernd, This issue should be fixed in 8u131. Can you try that and let us know? --Sean On 6/9/17 10:18 PM, Bernd wrote: > I noticed there is a bug (8177657,etc) about stricter DER checking on > JDK Certificate code. I have an JKS Keystore which no longer can be > opened because of that. > > I understand that the strict parsing has to stay for public keys, > however I wonder if anything can be done about loading the other keys > from the keystore or at least reporting the alias of the unparseable entry. > > The Problem was introduced with 8u121, 8u112 can open the file and it > exists in 7u131 as well. 
> > Exception in thread "main" > java.security.cert.CertificateParsingException: java.io.IOException: > subject key, java.security.InvalidKeyException: Invalid RSA public key > at sun.security.x509.X509CertInfo.(X509CertInfo.java:169) > at sun.security.x509.X509CertImpl.parse(X509CertImpl.java:1804) > at sun.security.x509.X509CertImpl.(X509CertImpl.java:195) > at > sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:102) > at > java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:339) > at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:755) > at > sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56) > at > sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224) > at > sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70) > at java.security.KeyStore.load(KeyStore.java:1445) > at > net.eckenfels.test.certpath.KeystoreImport.main(KeystoreImport.java:29) > Caused by: java.io.IOException: subject key, > java.security.InvalidKeyException: Invalid RSA public key > at sun.security.x509.X509Key.parse(X509Key.java:174) > at > sun.security.x509.CertificateX509Key.(CertificateX509Key.java:75) > at sun.security.x509.X509CertInfo.parse(X509CertInfo.java:667) > at sun.security.x509.X509CertInfo.(X509CertInfo.java:167) > ... 10 more > Caused by: java.security.InvalidKeyException: > java.security.InvalidKeyException: Invalid RSA public key > at sun.security.x509.X509Key.buildX509Key(X509Key.java:227) > at sun.security.x509.X509Key.parse(X509Key.java:170) > ... 13 more > Caused by: java.security.spec.InvalidKeySpecException: > java.security.InvalidKeyException: Invalid RSA public key > at > sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:205) > at java.security.KeyFactory.generatePublic(KeyFactory.java:334) > at sun.security.x509.X509Key.buildX509Key(X509Key.java:223) > ... 
14 more > Caused by: java.security.InvalidKeyException: Invalid RSA public key > at > sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:120) > at sun.security.x509.X509Key.decode(X509Key.java:391) > at sun.security.x509.X509Key.decode(X509Key.java:403) > at sun.security.rsa.RSAPublicKeyImpl.(RSAPublicKeyImpl.java:84) > at > sun.security.rsa.RSAKeyFactory.generatePublic(RSAKeyFactory.java:298) > at > sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:201) > ... 16 more > Caused by: java.io.IOException: Invalid encoding: redundant leading 0s > at > sun.security.util.DerInputBuffer.getBigInteger(DerInputBuffer.java:152) > at > sun.security.util.DerInputStream.getBigInteger(DerInputStream.java:207) > at > sun.security.rsa.RSAPrivateCrtKeyImpl.getBigInteger(RSAPrivateCrtKeyImpl.java:214) > at > sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:115) > ... 21 more > From vincent.x.ryan at oracle.com Mon Jun 12 12:47:47 2017 From: vincent.x.ryan at oracle.com (Vincent Ryan) Date: Mon, 12 Jun 2017 13:47:47 +0100 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds In-Reply-To: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> References: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> Message-ID: This approach looks fine to me given the limitation on the precision of Date. Just one issue: why remove the upper bound at l.277 in DerInputBuffer.java Thanks. > On 12 Jun 2017, at 05:22, Weijun Wang wrote: > > Please review this fix at > > http://cr.openjdk.java.net/~weijun/8181841/webrev.00 > > So I just ignore the extra digits. Do you think this is OK? It does mean different encodings might equal to each other. 
> > Thanks > Max From weijun.wang at oracle.com Mon Jun 12 13:18:06 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 12 Jun 2017 21:18:06 +0800 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds In-Reply-To: References: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> Message-ID: On 06/12/2017 08:47 PM, Vincent Ryan wrote: > This approach looks fine to me given the limitation on the precision of Date. > Just one issue: why remove the upper bound at l.277 in DerInputBuffer.java Before this fix, the maximum length of a GeneralizedTime is 20170101235555.123+0800 0---------1---------2-- Now that we allow arbitrary length of fractional part, there will be no upper bound. --Max > > Thanks. > > >> On 12 Jun 2017, at 05:22, Weijun Wang wrote: >> >> Please review this fix at >> >> http://cr.openjdk.java.net/~weijun/8181841/webrev.00 >> >> So I just ignore the extra digits. Do you think this is OK? It does mean different encodings might equal to each other. >> >> Thanks >> Max > From mstjohns at comcast.net Mon Jun 12 17:07:54 2017 From: mstjohns at comcast.net (Michael StJohns) Date: Mon, 12 Jun 2017 13:07:54 -0400 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds In-Reply-To: References: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> Message-ID: <092e08a3-901b-0f9e-ba14-c8987b61d5cd@comcast.net> On 6/12/2017 9:18 AM, Weijun Wang wrote: > > > On 06/12/2017 08:47 PM, Vincent Ryan wrote: >> This approach looks fine to me given the limitation on the precision >> of Date. >> Just one issue: why remove the upper bound at l.277 in >> DerInputBuffer.java > > Before this fix, the maximum length of a GeneralizedTime is > > 20170101235555.123+0800 > 0---------1---------2-- The actual bound in GeneralizedTime is 6 digits of fractional time (according to ISO 8601) or 25 characters. That should still continue to be enforced. 
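The "ignore the extra digits" behavior under review can be illustrated standalone. This is only a sketch, not the actual DerInputBuffer change: it handles just the Zulu form, assumes a well-formed digit string, and (as discussed) enforces no upper bound on the fractional part:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class GeneralizedTimeSketch {
    // Parses "yyyyMMddHHmmss[.fff...]Z", keeping at most millisecond
    // precision (java.util.Date's limit) and silently ignoring any
    // further fractional digits, as the fix describes.
    static Date parse(String s) throws ParseException {
        if (!s.endsWith("Z")) {
            throw new ParseException("only the Zulu form is handled here", 0);
        }
        String body = s.substring(0, s.length() - 1);
        int dot = body.indexOf('.');
        String base = (dot < 0) ? body : body.substring(0, dot);
        String frac = (dot < 0) ? "" : body.substring(dot + 1);
        // pad or truncate the fraction to exactly 3 digits (milliseconds)
        String millis = (frac + "000").substring(0, 3);
        SimpleDateFormat fmt = new SimpleDateFormat("yyyyMMddHHmmss.SSS");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.parse(base + "." + millis);
    }
}
```

With this truncation, `20170101235555.123Z` and `20170101235555.1234567Z` decode to the same Date — exactly the "different encodings might equal to each other" effect Max notes.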
> > Now that we allow arbitrary length of fractional part, there will be > no upper bound. > > --Max > >> >> Thanks. >> >> >>> On 12 Jun 2017, at 05:22, Weijun Wang wrote: >>> >>> Please review this fix at >>> >>> http://cr.openjdk.java.net/~weijun/8181841/webrev.00 >>> >>> So I just ignore the extra digits. Do you think this is OK? It does >>> mean different encodings might equal to each other. >>> >>> Thanks >>> Max >> From sha.jiang at oracle.com Mon Jun 12 22:51:40 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Tue, 13 Jun 2017 06:51:40 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: <85acb154-534d-daa7-d200-c3bea4607de9@oracle.com> References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> <0f0ce594-6ec3-0315-6627-a7cbe7cde294@oracle.com> <02569226-c63e-52ce-0519-e88cabde8358@oracle.com> <85acb154-534d-daa7-d200-c3bea4607de9@oracle.com> Message-ID: Hi Max, On 12/06/2017 17:29, Weijun Wang wrote: > Great. Only 2 questions: > > 459 // Return key sizes according to the specified key algorithm. > 460 private static int[] keySizes(String digestAlgorithm, String > keyAlgorithm) { > 461 if (digestAlgorithm == DEFAULT) { > 462 return new int[] { 0 }; > 463 } > 464 > 465 if (keyAlgorithm == RSA || keyAlgorithm == DSA) { > 466 return new int[] { 1024, 2048 }; > 467 } else if (keyAlgorithm == EC) { > 468 return new int[] { 384, 571 }; > 469 } > 470 > 471 return null; > 472 } > > Why is keysize dependent on digestalg? I mean, is it possible to > always return {1024,2048,0} and {384,571,0}? Get it, thanks! > > 379 // If signing fails, the following verifying has to > 380 // be ignored. 
> 381 if (signingStatus == STATUS.ERROR) { > 382 continue; > 383 } > > Now that you've already checked sigalg support earlier in what cases > it could go wrong here? Jar signing still could fail. For example, TSA service is unavailable. Best regards, John Jiang > > Thanks > Max > > On 06/12/2017 03:20 PM, sha.jiang at oracle.com wrote: >> Hi Max, >> Would you like to review the updated webrev: >> http://cr.openjdk.java.net/~jjiang/8179614/webrev.02/ >> It can create certificate without -sigalg and -keysize, and jar >> signing also can use this certificate. >> >> Best regards, >> John Jiang >> >> On 09/06/2017 22:04, Weijun Wang wrote: >>> >>> On 06/09/2017 09:25 PM, sha.jiang at oracle.com wrote: >>>> Hi Max, >>>> >>>> On 09/06/2017 20:05, Weijun Wang wrote: >>>>> The test can be more friendly with default values. >>>>> >>>>> For example, in createCertificates(), you can generate certs that >>>>> use default sigalg and keysize (i.e. without specifying -siglag >>>>> and -keysize), and give them aliases with "default" or "null" inside. >>>>> >>>>> And in jar signing when signing with one -sigalg you can also >>>>> choose cert generated with different or default sigalgs. >>>> I supposed this test just focus on signed jar verifying, but not >>>> certificate creating and jar signing. So, I'm not sure such cases >>>> are necessary. >>> >>> Well sometimes a test can do many things. If you only care about jar >>> verification, why bother creating certs with different digest >>> algorithms? >>> >>> On the other hand, if you do care about more, then in >>> >>> 338 // If the digest algorithm is not specified, then it >>> 339 // uses certificate with SHA256 digest and 1024 key >>> 340 // size. >>> 341 if (digestAlgorithm == DEFAULT) { >>> 342 certDigest = SHA256; >>> 343 certKeySize = 1024; >>> 344 } >>> >>> it seems a little awkward to hardcode the algorithm and keysize. 
If >>> signing is using a default algorithm, it seems natural to use the >>> cert that was generated with a default algorithm. In fact, this test >>> case is quite useful that it ensures our different tools are using >>> the same (or at least interoperable) default algorithms. >>> >>> --Max >>> >>>>> >>>>> BTW, I remember certain pairs of -keysize and -sigalg do not work >>>>> together. For example, 1024 bit of DSA key cannot be used with >>>>> SHA512withDSA signature algorithm. Have you noticed it? >>>> It looks SHA512withDSA is not supported yet. >>>> I was using JDK10 build 10. When the test tried to create >>>> certificate with -keyalg DSA -sigalg SHA512withDSA -keysize 1024, >>>> the below error raised: >>>> keytool error: java.security.NoSuchAlgorithmException: unrecognized >>>> algorithm name: SHA512withDSA >>>> >>>> If used -keyalg DSA -sigalg SHA1withDSA -keysize 2048, the error was: >>>> keytool error: java.security.InvalidKeyException: The security >>>> strength of SHA-1 digest algorithm is not sufficient for this key size >>>> >>>> Again, this test focus on signed jar verifying. If some problems >>>> are raised on certificate creating or jar signing, the associated >>>> verifying cases will be ignored. >>>> >>>> Best regards, >>>> John Jiang >>>>> >>>>> Thanks >>>>> Max >>>>> >>>>> >>>>> On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: >>>>>> Hi Sean and Max, >>>>>> Thanks for your comments. >>>>>> Please review the updated webrev: >>>>>> http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ >>>>>> >>>>>> The test has been modified significantly. The main points are: >>>>>> 1. Adds cases on EC. Now the test supports key algorithms RSA, >>>>>> DSA and EC. >>>>>> 2. Adds cases on SHA-512. Now the test supports digest algorithms >>>>>> SHA-1, SHA-256 and SHA-512. >>>>>> 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, >>>>>> 2048] for RSA and DSA. >>>>>> 4. Adds cases on default signature algorithm. 
Now the test report >>>>>> can display the default algorithmat column [Signature Algorithm]. >>>>>> 5. Adds property -Djava.security.egd=file:/dev/./urandom for >>>>>> keytool and jarsigner commands. >>>>>> 6. Create a separated application, JdkUtils.java, to determine >>>>>> the JDK build version (java.runtime.version) and check if a >>>>>> signature algorithm is supported by a JDK. >>>>>> 7. Introduces a new property, named javaSecurityFile, for >>>>>> allowing users to specify alternative java security properties file. >>>>>> 8. Renames report column [Cert Type] to [Certificate]. This >>>>>> column displays the certificate identifiers, which is a >>>>>> combination of key algorithm, digest algorithm, key size and >>>>>> expired mark (if any). >>>>>> 9. The test summary also be updated accordingly. >>>>>> >>>>>> Best regards, >>>>>> John Jiang >>>>>> >>>>>> >>>>>> On 07/06/2017 23:11, Sean Mullan wrote: >>>>>>> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>>>>>>> Hi Sean, >>>>>>>> >>>>>>>> On 07/06/2017 04:27, Sean Mullan wrote: >>>>>>>>> Hi John, >>>>>>>>> >>>>>>>>> This looks like a very useful test. I have not gone through >>>>>>>>> all of the code, but here are a few comments for now until I >>>>>>>>> have more time: >>>>>>>>> >>>>>>>>> - add tests for EC keys >>>>>>>>> - add tests for SHA-512 variants of the signature algorithms >>>>>>>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>>>>>>> - you can use the diamond operator <> in various places >>>>>>>>> - might be more compact if jdkList() used Files.lines() to >>>>>>>>> parse the file into a stream then an array >>>>>>>> I did consider about the above two points. Because the test >>>>>>>> will be backported to JDK 6, so I only used the features those >>>>>>>> supported by JDK 6. >>>>>>>> I supposed that would make the backport easier. Does it make >>>>>>>> sense? >>>>>>> >>>>>>> Yes, that makes sense. 
>>>>>>> >>>>>>> --Sean >>>>>>> >>>>>>>> >>>>>>>> Best regards, >>>>>>>> John Jiang >>>>>>>>> - did you consider using the jarsigner API >>>>>>>>> (jdk.security.jarsigner) instead of the command-line? I think >>>>>>>>> this would be better (if possible) and it would give us some >>>>>>>>> more tests of that API. >>>>>>>>> >>>>>>>>> --Sean >>>>>>>>> >>>>>>>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>>>>>>> Hi, >>>>>>>>>> Please review this manual test for checking if a jar, which >>>>>>>>>> is signed and timestamped by a JDK build, could be verified >>>>>>>>>> by other JDK builds. >>>>>>>>>> It also can be used to check if the default timestamp digest >>>>>>>>>> algorithm on signing is SHA-256. >>>>>>>>>> For more details, please look through the test summary. >>>>>>>>>> >>>>>>>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>>>>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>>>>>>> >>>>>>>>>> Best regards, >>>>>>>>>> John Jiang >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From sha.jiang at oracle.com Tue Jun 13 01:28:00 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Tue, 13 Jun 2017 09:28:00 +0800 Subject: RFR[10] JDK-8179564: Missing @bug for tests added with JDK-8165367 Message-ID: <38446edd-b216-c63b-5f1a-2348a5100d5d@oracle.com> Hi, Please review the below patch for adding @bug tag for test sun/security/ssl/CertPathRestrictions/TLSRestrictions.java. diff -r 3801153e1036 test/sun/security/ssl/CertPathRestrictions/TLSRestrictions.java --- a/test/sun/security/ssl/CertPathRestrictions/TLSRestrictions.java Mon Jun 12 12:45:52 2017 -0700 +++ b/test/sun/security/ssl/CertPathRestrictions/TLSRestrictions.java Tue Jun 13 09:13:50 2017 +0800 @@ -48,6 +48,7 @@ /* * @test + * @bug 8165367 * @summary Verify the restrictions for certificate path on JSSE with custom trust store. 
* @library /test/lib * @build jdk.test.lib.Utils Best regards, John Jiang From weijun.wang at oracle.com Tue Jun 13 01:36:53 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Tue, 13 Jun 2017 09:36:53 +0800 Subject: RFR[10] JDK-8179564: Missing @bug for tests added with JDK-8165367 In-Reply-To: <38446edd-b216-c63b-5f1a-2348a5100d5d@oracle.com> References: <38446edd-b216-c63b-5f1a-2348a5100d5d@oracle.com> Message-ID: This looks good. Thanks Max On 06/13/2017 09:28 AM, sha.jiang at oracle.com wrote: > Hi, > Please review the below patch for adding @bug tag for test > sun/security/ssl/CertPathRestrictions/TLSRestrictions.java. > > diff -r 3801153e1036 > test/sun/security/ssl/CertPathRestrictions/TLSRestrictions.java > --- a/test/sun/security/ssl/CertPathRestrictions/TLSRestrictions.java > Mon Jun 12 12:45:52 2017 -0700 > +++ b/test/sun/security/ssl/CertPathRestrictions/TLSRestrictions.java > Tue Jun 13 09:13:50 2017 +0800 > @@ -48,6 +48,7 @@ > > /* > * @test > + * @bug 8165367 > * @summary Verify the restrictions for certificate path on JSSE with > custom trust store. > * @library /test/lib > * @build jdk.test.lib.Utils > > Best regards, > John Jiang > From weijun.wang at oracle.com Tue Jun 13 01:44:48 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Tue, 13 Jun 2017 09:44:48 +0800 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds In-Reply-To: <092e08a3-901b-0f9e-ba14-c8987b61d5cd@comcast.net> References: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> <092e08a3-901b-0f9e-ba14-c8987b61d5cd@comcast.net> Message-ID: <6ab15c43-e469-36ff-4fde-d9e3d333cfe5@oracle.com> Hi Michael I cannot access ISO 8601 but according to https://en.wikipedia.org/wiki/GeneralizedTime: > A GeneralizedTime is a time format in the ASN.1 notation. 
It consists > of a string value representing the calendar date, as defined in ISO > 8601, a time of day with an optional fractional seconds element and > the optional local time differential factor as defined in ISO 8601. > > In contrast to the UTCTime class of ASN.1 the GeneralizedTime uses a > four-digit representation of the year to avoid possible ambiguity. > Another difference is the possibility to encode time information of > any wanted precision via the fractional seconds element. So my understanding is that ISO 8601 is only for "the optional local time differential factor", and it does mention "any wanted precision". In fact, I tried to generate a DER encoding of a GeneralizedTime with a long fractional part and "openssl asn1parse" accepts it and displays all the digits. I can read X.680 but it does not mention any restriction. Thanks Max On 06/13/2017 01:07 AM, Michael StJohns wrote > The actual bound in GeneralizedTime is 6 digits of fractional time > (according to ISO 8601) or 25 characters. That should still continue > to be enforced. > From weijun.wang at oracle.com Tue Jun 13 01:48:23 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Tue, 13 Jun 2017 09:48:23 +0800 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds In-Reply-To: <6ab15c43-e469-36ff-4fde-d9e3d333cfe5@oracle.com> References: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> <092e08a3-901b-0f9e-ba14-c8987b61d5cd@comcast.net> <6ab15c43-e469-36ff-4fde-d9e3d333cfe5@oracle.com> Message-ID: On 06/13/2017 09:44 AM, Weijun Wang wrote: > Hi Michael > > I cannot access ISO 8601 but according to > https://en.wikipedia.org/wiki/GeneralizedTime: > >> A GeneralizedTime is a time format in the ASN.1 notation. It consists >> of a string value representing the calendar date, as defined in ISO >> 8601, a time of day with an optional fractional seconds element and >> the optional local time differential factor as defined in ISO 8601. 
>> >> In contrast to the UTCTime class of ASN.1 the GeneralizedTime uses a >> four-digit representation of the year to avoid possible ambiguity. >> Another difference is the possibility to encode time information of >> any wanted precision via the fractional seconds element. > > So my understanding is that ISO 8601 is only for "the optional local > time differential factor", and it does mention "any wanted precision". Oh, ISO 8601 is for "the calendar date" and "the optional local time differential factor", but not "a time of day with an optional fractional seconds element". --Max > > In fact, I tried to generate a DER encoding of a GeneralizedTime with a > long fractional part and "openssl asn1parse" accepts it and displays all > the digits. > > I can read X.680 but it does not mention any restriction. > > Thanks > Max > > On 06/13/2017 01:07 AM, Michael StJohns wrote >> The actual bound in GeneralizedTime is 6 digits of fractional time >> (according to ISO 8601) or 25 characters. That should still continue >> to be enforced. >> From vincent.x.ryan at oracle.com Tue Jun 13 11:31:57 2017 From: vincent.x.ryan at oracle.com (Vincent Ryan) Date: Tue, 13 Jun 2017 12:31:57 +0100 Subject: [9] RFR 8181978: Keystore probing mechanism fails for large PKCS12 keystores Message-ID: Martin has reported a serious regression involving PKCS12 keystores in JDK 9. It affects large PKCS12 keystores loaded using the new KeyStore.getInstance(File, xxx) method. The error is due to a typo in the masks used by the keystore type detection mechanism. Bug: https://bugs.openjdk.java.net/browse/JDK-8181978 Webrev: http://cr.openjdk.java.net/~vinnie/8181978/webrev.00/ From sean.mullan at oracle.com Tue Jun 13 12:13:12 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Tue, 13 Jun 2017 08:13:12 -0400 Subject: [9] RFR 8181978: Keystore probing mechanism fails for large PKCS12 keystores In-Reply-To: References: Message-ID: Looks fine to me. 
--Sean On 6/13/17 7:31 AM, Vincent Ryan wrote: > Martin has reported a serious regression involving PKCS12 keystores in JDK 9. > It affects large PKCS12 keystores loaded using the new KeyStore.getInstance(File, xxx) method. > The error is due to a typo in the masks used by the keystore type detection mechanism. > > Bug: https://bugs.openjdk.java.net/browse/JDK-8181978 > Webrev: http://cr.openjdk.java.net/~vinnie/8181978/webrev.00/ > From ecki at zusammenkunft.net Tue Jun 13 13:25:23 2017 From: ecki at zusammenkunft.net (Bernd Eckenfels) Date: Tue, 13 Jun 2017 13:25:23 +0000 Subject: Stricter Public Key checking corrupts JKS In-Reply-To: References: <4c80e1cf-17f9-335e-e80e-91b23ee44e19@oracle.com>, Message-ID: The keystore I have here (which has a leading 0 in the modulus of one cert and a 0 in the serial number of another) does not open in my test program or in keytool.exe with 8u131 (sorry, 7u131 in my last mail was a typo). This happens before the password query: C:\Users> "c:\Program Files\Java\jdk1.8.0_131\bin\keytool.exe" -list -keystore c:\temp\ks\broken.jks Keytool-Fehler: java.security.cert.CertificateParsingException: java.io.IOException: subject key, java.security.InvalidKeyException: Invalid RSA public key I think it is OK to barf when the data is not normalized, but not loading the whole keystore is a bit painful. NB: the extent of this problem seems small; so far we had one customer with two partners, though not all of them may be on the latest Java yet. The stack trace I posted (repeated here) is from JDK 8u131 (Win64) > "c:\Program Files\Java\jdk1.8.0_131\bin\java" -cp \ws\github\javacryptotest\target\classes net.eckenfels.test.certpath.KeystoreExploder c:\temp\ks\broken.jks Writing c:\temp\ks\broken.jks to C:\Users directory ... 
Exception in thread "main" java.security.cert.CertificateParsingException: java.io.IOException: subject key, java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509CertInfo.(X509CertInfo.java:169) at sun.security.x509.X509CertImpl.parse(X509CertImpl.java:1804) at sun.security.x509.X509CertImpl.(X509CertImpl.java:195) at sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:102) at java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:339) at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:755) at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56) at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224) at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70) at java.security.KeyStore.load(KeyStore.java:1445) at net.eckenfels.test.certpath.KeystoreExploder.main(KeystoreExploder.java:66) ... Caused by: java.io.IOException: Invalid encoding: redundant leading 0s at sun.security.util.DerInputBuffer.getBigInteger(DerInputBuffer.java:152) at sun.security.util.DerInputStream.getBigInteger(DerInputStream.java:207) at sun.security.rsa.RSAPrivateCrtKeyImpl.getBigInteger(RSAPrivateCrtKeyImpl.java:214) at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:115) ... 21 more I can provide you with the keystore offlist (it contains a few company names which should not be public). BTW: the trace reads "RSAPrivateCrtKeyImpl" but the cert is a TrustedCertEntry. Gruss Bernd 2017-06-12 13:29 GMT+02:00 Sean Mullan >: Hi Bernd, This issue should be fixed in 8u131. Can you try that and let us know? --Sean On 6/9/17 10:18 PM, Bernd wrote: I noticed there is a bug (8177657, etc.) about stricter DER checking in the JDK Certificate code. I have a JKS keystore which can no longer be opened because of that.
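The "Invalid encoding: redundant leading 0s" rejection in the trace above comes from DER's minimal-length rule for INTEGER contents. A standalone illustration of that rule — not the JDK's actual DerInputBuffer code — might look like this:

```java
import java.math.BigInteger;

public class DerIntSketch {
    // DER requires the shortest possible INTEGER contents octets: a 0x00
    // (or 0xFF) first octet is only legal when it is needed to preserve
    // the sign bit of the value that follows.
    static boolean hasRedundantLeadingBytes(byte[] content) {
        if (content.length < 2) {
            return false;
        }
        int first = content[0] & 0xFF;
        int second = content[1] & 0xFF;
        // 0x00 before a byte with its high bit clear adds nothing...
        if (first == 0x00 && second < 0x80) return true;
        // ...and likewise 0xFF before a byte with its high bit set.
        if (first == 0xFF && second >= 0x80) return true;
        return false;
    }

    // A lenient reader can still recover the numeric value, which is
    // roughly what opening such a keystore anyway would require.
    static BigInteger lenientValue(byte[] content) {
        return new BigInteger(content); // BigInteger tolerates the padding
    }
}
```

A keystore whose certificate modulus carries such a padded INTEGER fails the strict check even though the value itself is unambiguous, which is why the whole JKS file refuses to load.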
I understand that the strict parsing has to stay for public keys, however I wonder if anything can be done about loading the other keys from the keystore or at least reporting the alias of the unparseable entry. The Problem was introduced with 8u121, 8u112 can open the file and it exists in 7u131 as well. Exception in thread "main" java.security.cert.CertificateParsingException: java.io.IOException: subject key, java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509CertInfo.(X509CertInfo.java:169) at sun.security.x509.X509CertImpl.parse(X509CertImpl.java:1804) at sun.security.x509.X509CertImpl.(X509CertImpl.java:195) at sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:102) at java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:339) at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:755) at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56) at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224) at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70) at java.security.KeyStore.load(KeyStore.java:1445) at net.eckenfels.test.certpath.KeystoreImport.main(KeystoreImport.java:29) Caused by: java.io.IOException: subject key, java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509Key.parse(X509Key.java:174) at sun.security.x509.CertificateX509Key.(CertificateX509Key.java:75) at sun.security.x509.X509CertInfo.parse(X509CertInfo.java:667) at sun.security.x509.X509CertInfo.(X509CertInfo.java:167) ... 10 more Caused by: java.security.InvalidKeyException: java.security.InvalidKeyException: Invalid RSA public key at sun.security.x509.X509Key.buildX509Key(X509Key.java:227) at sun.security.x509.X509Key.parse(X509Key.java:170) ... 
13 more Caused by: java.security.spec.InvalidKeySpecException: java.security.InvalidKeyException: Invalid RSA public key at sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:205) at java.security.KeyFactory.generatePublic(KeyFactory.java:334) at sun.security.x509.X509Key.buildX509Key(X509Key.java:223) ... 14 more Caused by: java.security.InvalidKeyException: Invalid RSA public key at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:120) at sun.security.x509.X509Key.decode(X509Key.java:391) at sun.security.x509.X509Key.decode(X509Key.java:403) at sun.security.rsa.RSAPublicKeyImpl.(RSAPublicKeyImpl.java:84) at sun.security.rsa.RSAKeyFactory.generatePublic(RSAKeyFactory.java:298) at sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:201) ... 16 more Caused by: java.io.IOException: Invalid encoding: redundant leading 0s at sun.security.util.DerInputBuffer.getBigInteger(DerInputBuffer.java:152) at sun.security.util.DerInputStream.getBigInteger(DerInputStream.java:207) at sun.security.rsa.RSAPrivateCrtKeyImpl.getBigInteger(RSAPrivateCrtKeyImpl.java:214) at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:115) ... 21 more -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From sha.jiang at oracle.com Tue Jun 13 15:47:14 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Tue, 13 Jun 2017 23:47:14 +0800 Subject: RFR JDK-8179614: Test for jarsigner on verifying jars that are signed and timestamped by other JDK releases In-Reply-To: References: <5b75241c-1148-fe4f-bd6e-328f5737fed4@oracle.com> <7bf702ef-5ba4-e2b4-5b88-1c6ab222534f@oracle.com> <0fbe602f-6ba7-a05c-f163-9e15aeddeab4@oracle.com> <14c8a086-7e48-312b-3c15-917dcdc7061c@oracle.com> <0f0ce594-6ec3-0315-6627-a7cbe7cde294@oracle.com> <02569226-c63e-52ce-0519-e88cabde8358@oracle.com> <85acb154-534d-daa7-d200-c3bea4607de9@oracle.com> Message-ID: <4dae591e-12c5-c5de-fb6b-793d1b8b3d96@oracle.com> Sean and Max, Please review this updated webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.03/ The main changes are: 1. It provides two new properties, tsaList and tsaListFile, for specifying a list of TSA services, and a new report column [TSA] is introduced. This column displays only the TSA indices; all of the TSA services are listed at the top of the report. 2. If property strict is true, cases with failed signing are no longer ignored. They are still listed in the test report, and their verifying status is NONE. Best regards, John Jiang On 13/06/2017 06:51, sha.jiang at oracle.com wrote: > Hi Max, > > On 12/06/2017 17:29, Weijun Wang wrote: >> Great. Only 2 questions: >> >> 459 // Return key sizes according to the specified key algorithm. >> 460 private static int[] keySizes(String digestAlgorithm, String >> keyAlgorithm) { >> 461 if (digestAlgorithm == DEFAULT) { >> 462 return new int[] { 0 }; >> 463 } >> 464 >> 465 if (keyAlgorithm == RSA || keyAlgorithm == DSA) { >> 466 return new int[] { 1024, 2048 }; >> 467 } else if (keyAlgorithm == EC) { >> 468 return new int[] { 384, 571 }; >> 469 } >> 470 >> 471 return null; >> 472 } >> >> Why is keysize dependent on digestalg? I mean, is it possible to >> always return {1024,2048,0} and {384,571,0}? > Get it, thanks!
>> >> 379 // If signing fails, the following verifying has to >> 380 // be ignored. >> 381 if (signingStatus == STATUS.ERROR) { >> 382 continue; >> 383 } >> >> Now that you've already checked sigalg support earlier in what cases >> it could go wrong here? > Jar signing still could fail. For example, TSA service is unavailable. > > Best regards, > John Jiang >> >> Thanks >> Max >> >> On 06/12/2017 03:20 PM, sha.jiang at oracle.com wrote: >>> Hi Max, >>> Would you like to review the updated webrev: >>> http://cr.openjdk.java.net/~jjiang/8179614/webrev.02/ >>> It can create certificate without -sigalg and -keysize, and jar >>> signing also can use this certificate. >>> >>> Best regards, >>> John Jiang >>> >>> On 09/06/2017 22:04, Weijun Wang wrote: >>>> >>>> On 06/09/2017 09:25 PM, sha.jiang at oracle.com wrote: >>>>> Hi Max, >>>>> >>>>> On 09/06/2017 20:05, Weijun Wang wrote: >>>>>> The test can be more friendly with default values. >>>>>> >>>>>> For example, in createCertificates(), you can generate certs that >>>>>> use default sigalg and keysize (i.e. without specifying -siglag >>>>>> and -keysize), and give them aliases with "default" or "null" >>>>>> inside. >>>>>> >>>>>> And in jar signing when signing with one -sigalg you can also >>>>>> choose cert generated with different or default sigalgs. >>>>> I supposed this test just focus on signed jar verifying, but not >>>>> certificate creating and jar signing. So, I'm not sure such cases >>>>> are necessary. >>>> >>>> Well sometimes a test can do many things. If you only care about >>>> jar verification, why bother creating certs with different digest >>>> algorithms? >>>> >>>> On the other hand, if you do care about more, then in >>>> >>>> 338 // If the digest algorithm is not specified, then it >>>> 339 // uses certificate with SHA256 digest and 1024 key >>>> 340 // size. 
>>>> 341 if (digestAlgorithm == DEFAULT) { >>>> 342 certDigest = SHA256; >>>> 343 certKeySize = 1024; >>>> 344 } >>>> >>>> it seems a little awkward to hardcode the algorithm and keysize. If >>>> signing is using a default algorithm, it seems natural to use the >>>> cert that was generated with a default algorithm. In fact, this >>>> test case is quite useful that it ensures our different tools are >>>> using the same (or at least interoperable) default algorithms. >>>> >>>> --Max >>>> >>>>>> >>>>>> BTW, I remember certain pairs of -keysize and -sigalg do not work >>>>>> together. For example, 1024 bit of DSA key cannot be used with >>>>>> SHA512withDSA signature algorithm. Have you noticed it? >>>>> It looks SHA512withDSA is not supported yet. >>>>> I was using JDK10 build 10. When the test tried to create >>>>> certificate with -keyalg DSA -sigalg SHA512withDSA -keysize 1024, >>>>> the below error raised: >>>>> keytool error: java.security.NoSuchAlgorithmException: >>>>> unrecognized algorithm name: SHA512withDSA >>>>> >>>>> If used -keyalg DSA -sigalg SHA1withDSA -keysize 2048, the error was: >>>>> keytool error: java.security.InvalidKeyException: The security >>>>> strength of SHA-1 digest algorithm is not sufficient for this key >>>>> size >>>>> >>>>> Again, this test focus on signed jar verifying. If some problems >>>>> are raised on certificate creating or jar signing, the associated >>>>> verifying cases will be ignored. >>>>> >>>>> Best regards, >>>>> John Jiang >>>>>> >>>>>> Thanks >>>>>> Max >>>>>> >>>>>> >>>>>> On 06/09/2017 04:44 PM, sha.jiang at oracle.com wrote: >>>>>>> Hi Sean and Max, >>>>>>> Thanks for your comments. >>>>>>> Please review the updated webrev: >>>>>>> http://cr.openjdk.java.net/~jjiang/8179614/webrev.01/ >>>>>>> >>>>>>> The test has been modified significantly. The main points are: >>>>>>> 1. Adds cases on EC. Now the test supports key algorithms RSA, >>>>>>> DSA and EC. >>>>>>> 2. Adds cases on SHA-512. 
Now the test supports digest >>>>>>> algorithms SHA-1, SHA-256 and SHA-512. >>>>>>> 3. Adds cases on key size. Exactly, [384, 571] for EC, [1024, >>>>>>> 2048] for RSA and DSA. >>>>>>> 4. Adds cases on default signature algorithm. Now the test >>>>>>> report can display the default algorithm at column [Signature >>>>>>> Algorithm]. >>>>>>> 5. Adds property -Djava.security.egd=file:/dev/./urandom for >>>>>>> keytool and jarsigner commands. >>>>>>> 6. Creates a separate application, JdkUtils.java, to determine >>>>>>> the JDK build version (java.runtime.version) and check if a >>>>>>> signature algorithm is supported by a JDK. >>>>>>> 7. Introduces a new property, named javaSecurityFile, for >>>>>>> allowing users to specify an alternative java security properties >>>>>>> file. >>>>>>> 8. Renames report column [Cert Type] to [Certificate]. This >>>>>>> column displays the certificate identifiers, which is a >>>>>>> combination of key algorithm, digest algorithm, key size and >>>>>>> expired mark (if any). >>>>>>> 9. The test summary is also updated accordingly. >>>>>>> >>>>>>> Best regards, >>>>>>> John Jiang >>>>>>> >>>>>>> >>>>>>> On 07/06/2017 23:11, Sean Mullan wrote: >>>>>>>> On 6/6/17 9:14 PM, sha.jiang at oracle.com wrote: >>>>>>>>> Hi Sean, >>>>>>>>> >>>>>>>>> On 07/06/2017 04:27, Sean Mullan wrote: >>>>>>>>>> Hi John, >>>>>>>>>> >>>>>>>>>> This looks like a very useful test. I have not gone through >>>>>>>>>> all of the code, but here are a few comments for now until I >>>>>>>>>> have more time: >>>>>>>>>> >>>>>>>>>> - add tests for EC keys >>>>>>>>>> - add tests for SHA-512 variants of the signature algorithms >>>>>>>>>> - add tests for larger key sizes (ex: 2048 for DSA/RSA) >>>>>>>>>> - you can use the diamond operator <> in various places >>>>>>>>>> - might be more compact if jdkList() used Files.lines() to >>>>>>>>>> parse the file into a stream then an array >>>>>>>>> I did consider the above two points.
Because the test >>>>>>>>> will be backported to JDK 6, I only used the features >>>>>>>>> supported by JDK 6. >>>>>>>>> I supposed that would make the backport easier. Does it make >>>>>>>>> sense? >>>>>>>> >>>>>>>> Yes, that makes sense. >>>>>>>> >>>>>>>> --Sean >>>>>>>> >>>>>>>>> >>>>>>>>> Best regards, >>>>>>>>> John Jiang >>>>>>>>>> - did you consider using the jarsigner API >>>>>>>>>> (jdk.security.jarsigner) instead of the command-line? I think >>>>>>>>>> this would be better (if possible) and it would give us some >>>>>>>>>> more tests of that API. >>>>>>>>>> >>>>>>>>>> --Sean >>>>>>>>>> >>>>>>>>>> On 6/5/17 6:31 AM, sha.jiang at oracle.com wrote: >>>>>>>>>>> Hi, >>>>>>>>>>> Please review this manual test for checking if a jar, which >>>>>>>>>>> is signed and timestamped by a JDK build, could be verified >>>>>>>>>>> by other JDK builds. >>>>>>>>>>> It can also be used to check if the default timestamp digest >>>>>>>>>>> algorithm on signing is SHA-256. >>>>>>>>>>> For more details, please look through the test summary.
>>>>>>>>>>> >>>>>>>>>>> Issue: https://bugs.openjdk.java.net/browse/JDK-8179614 >>>>>>>>>>> Webrev: http://cr.openjdk.java.net/~jjiang/8179614/webrev.00/ >>>>>>>>>>> >>>>>>>>>>> Best regards, >>>>>>>>>>> John Jiang >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > > From mbalao at redhat.com Tue Jun 13 22:41:12 2017 From: mbalao at redhat.com (Martin Balao) Date: Tue, 13 Jun 2017 19:41:12 -0300 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> Message-ID: Hi Xuelei, The new webrev.01 is ready: * http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01/ (browse online) * http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01.zip (zip, download) The following changes have been implemented since the previous webrev.00: * Pre-agreed support removed from server-side * Unnecessary overhead and minimum benefits for JSSE. * Enabling the use of Trusted CA Indication extension for clients through TrustManager objects was reverted. Trusted CA Indication extension can now be enabled through: 1) SSLEngine, 2) SSLSocket, or 3) SSLParameters (which can be applied to both SSLEngine and SSLSocket objects). Trusted CA Indication extension is mandatory for servers. * SunX509KeyManagerImpl old key manager ("SunX509" algorithm) is now out of scope. This key manager does not support other TLS extensions such as Server Name Indication (SNI), which is far more relevant than Trusted CA Indication. The new X509KeyManagerImpl key manager ("PKIX" algorithm) is now in scope. * Client requested indications are now an ExtendedSSLSession attribute.
ServerHandshaker gets the information from the Client Hello message (now parsed by TrustedCAIndicationExtension class instead of TrustedAuthorityIndicator) and sets it in the ExtendedSSLSession (SSLSessionImpl object). The key manager (i.e.: X509KeyManagerImpl), when choosing a server alias, may now get the information from the ExtendedSSLSession object and guide the certificate selection based on it. * In order to allow multiple key managers to use Trusted Authority Indicators information and to allow multiple Trusted Authority Indicators implementations, TrustedAuthorityIndicator has now been split into an abstract class (TrustedAuthorityIndicator, located in javax.net.ssl) and an implementation class (TrustedAuthorityIndicatorImpl, located in sun.security.ssl). No coupling was added between javax.net.ssl and sun.security.ssl packages. * Documentation extended and improved. * Test cases (server and client) updated to reflect the new interface and supported key manager. Look forward to your new review! Kind regards, Martin.- On Fri, Jun 9, 2017 at 6:15 PM, Xuelei Fan wrote: > I'm OK to use SSLParameters. Thank you very much for considering a new > design. > > Xuelei > > On 6/9/2017 1:10 PM, Martin Balao wrote: >> Hi Xuelei, >> >> I didn't notice that some of the SSLSocket constructors did not establish >> the connection, so SSLParameters can be effective for Trusted CA >> Indication. This was an invalid argument on my side, sorry. >> >> As for the configuration to enable the extension, it's probably not >> necessary on the Server side because -as you mentioned- it is mandatory and >> there is no harm in supporting it. However, it has to be configurable on >> the Client side because -as we previously discussed- the client may cause a >> handshake failure if the server does not support the extension.
I'd prefer >> the Client configuring the SSLSocket through SSLParameters instead of a >> system-wide property -which has even more impact than the TrustManager >> approach-. Would this work for you? >> >> > In JSSE, the benefits pre_agreed option can get by customizing the >> key/trust manager, so I did not see too much benefits to support this >> option in JDK >> >> I understand your point and will remove support for "pre_agreed". >> >> >> On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan > > wrote: >> >> >> >> On 6/8/2017 8:36 PM, Xuelei Fan wrote: >> >> The trusted authorities can be get from client trust manager. >> Server can choose the best matching of server certificate of the >> client requested trusted authorities. >> >> > >> I missed the point that the key manager need to know the client >> requested trusted authorities for the choosing. So may need a new >> SSLSession attribute (See similar method in ExtendedSSLSession). >> >> Xuelei >> >> >> >> Yes, an attribute on SSLSession may do the job (both when Key Manager >> receives a SSLSocket and a SSLEngine). >> >> Kind regards, >> Martin.- >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From mstjohns at comcast.net Wed Jun 14 01:27:42 2017 From: mstjohns at comcast.net (Michael StJohns) Date: Tue, 13 Jun 2017 21:27:42 -0400 Subject: RFR 8181841: A TSA server returns timestamp with precision higher than milliseconds In-Reply-To: References: <2ed9f7f3-d6b6-cae9-7ad2-c8ba72a0aa2a@oracle.com> <092e08a3-901b-0f9e-ba14-c8987b61d5cd@comcast.net> <6ab15c43-e469-36ff-4fde-d9e3d333cfe5@oracle.com> Message-ID: Hi Max - I think I pushed the wrong button and sent out the wrong email. My original email had this (limit is 6 digits) comment in it, but I'd actually decided not to send it because I couldn't find a definitive source. X.680 used the phrase "a time of day, to any of the precisions defined in ISO 8601..." 
(clause 42.2 (b) ) and I'd found another reference that suggested that 6 digits was the maximum "defined" precision. But I couldn't find an actual version of 8601 so I wasn't sure if the reference was reasonably interpreting 8601 so I'd decided not to send. So what I'm saying is - ignore my email. Sorry. Mike On 6/12/2017 9:48 PM, Weijun Wang wrote: > > > On 06/13/2017 09:44 AM, Weijun Wang wrote: >> Hi Michael >> >> I cannot access ISO 8601 but according to >> https://en.wikipedia.org/wiki/GeneralizedTime: >> >>> A GeneralizedTime is a time format in the ASN.1 notation. It consists >>> of a string value representing the calendar date, as defined in ISO >>> 8601, a time of day with an optional fractional seconds element and >>> the optional local time differential factor as defined in ISO 8601. >>> >>> In contrast to the UTCTime class of ASN.1 the GeneralizedTime uses a >>> four-digit representation of the year to avoid possible ambiguity. >>> Another difference is the possibility to encode time information of >>> any wanted precision via the fractional seconds element. >> >> So my understanding is that ISO 8601 is only for "the optional local >> time differential factor", and it does mention "any wanted precision". > > Oh, ISO 8601 is for "the calendar date" and "the optional local time > differential factor", but not "a time of day with an optional > fractional seconds element". > > --Max > >> >> In fact, I tried to generate a DER encoding of a GeneralizedTime with >> a long fractional part and "openssl asn1parse" accepts it and >> displays all the digits. >> >> I can read X.680 but it does not mention any restriction. >> >> Thanks >> Max >> >> On 06/13/2017 01:07 AM, Michael StJohns wrote >>> The actual bound in GeneralizedTime is 6 digits of fractional time >>> (according to ISO 8601) or 25 characters. That should still continue >>> to be enforced. 
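Side note on the precision question above: java.time has no trouble with GeneralizedTime fractions beyond milliseconds, so tolerant parsing plus truncation is straightforward. The following is a minimal sketch under my own assumptions (the class name and formatter are mine, not the JDK's internal sun.security.util parser), accepting anything from zero to nine fractional digits and dropping everything below milliseconds:

```java
import java.time.Instant;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.temporal.ChronoField;
import java.time.temporal.ChronoUnit;

public class GeneralizedTimeDemo {

    // Accepts an ASN.1 GeneralizedTime with 0 to 9 fractional-second digits,
    // e.g. "20170613154714.123456789Z", with "Z" or a numeric UTC offset.
    private static final DateTimeFormatter GENERALIZED_TIME =
            new DateTimeFormatterBuilder()
                    .appendPattern("yyyyMMddHHmmss")
                    .appendFraction(ChronoField.NANO_OF_SECOND, 0, 9, true)
                    .appendOffset("+HHMM", "Z")
                    .toFormatter();

    // Parse, then truncate to milliseconds -- the precision a java.util.Date
    // (and hence the existing timestamp plumbing) can actually hold.
    static Instant parseTruncated(String generalizedTime) {
        return ZonedDateTime.parse(generalizedTime, GENERALIZED_TIME)
                .toInstant()
                .truncatedTo(ChronoUnit.MILLIS);
    }

    public static void main(String[] args) {
        System.out.println(parseTruncated("20170613154714.123456789Z"));
    }
}
```

Whether a strict DER consumer should additionally cap the fraction at six digits is exactly the open question in the thread; this sketch simply tolerates the longer fraction and discards the excess.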
From ecki at zusammenkunft.net Wed Jun 14 16:38:23 2017 From: ecki at zusammenkunft.net (Bernd) Date: Wed, 14 Jun 2017 18:38:23 +0200 Subject: Stricter Public Key checking corrupts JKS In-Reply-To: <4c80e1cf-17f9-335e-e80e-91b23ee44e19@oracle.com> References: <4c80e1cf-17f9-335e-e80e-91b23ee44e19@oracle.com> Message-ID: Hello Sean, I tried now 1.8.0_152ea b04 (May 2017), and using the keytool now works again to open the JKS with this broken certificate. It is also possible to use CertificateFactory.getInstance("X.509").generateCertificate(in) with the questionable certs. This is great! From the look of JDK-8175251 it seems it will also show up in the next CPU. The same bug also claims it's fixed in 131; however, it talks about b33 where the public version is b11. So maybe that's the reason 131 is still affected? I suspect the broken certificates will not be usable in a certificate chain; I will test that later on. Gruss Bernd 2017-06-12 13:29 GMT+02:00 Sean Mullan : > Hi Bernd, > > This issue should be fixed in 8u131. Can you try that and let us know? > > --Sean > > On 6/9/17 10:18 PM, Bernd wrote: > >> I noticed there is a bug (8177657,etc) about stricter DER checking on JDK >> Certificate code. I have a JKS Keystore which no longer can be opened >> because of that. >> >> I understand that the strict parsing has to stay for public keys, however >> I wonder if anything can be done about loading the other keys from the >> keystore or at least reporting the alias of the unparseable entry. >> >> The Problem was introduced with 8u121, 8u112 can open the file and it >> exists in 7u131 as well.
>> >> Exception in thread "main" java.security.cert.CertificateParsingException: >> java.io.IOException: subject key, java.security.InvalidKeyException: >> Invalid RSA public key >> at sun.security.x509.X509CertInfo.<init>(X509CertInfo.java:169) >> at sun.security.x509.X509CertImpl.parse(X509CertImpl.java:1804) >> at sun.security.x509.X509CertImpl.<init>(X509CertImpl.java:195) >> at sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:102) >> at java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:339) >> at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:755) >> at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56) >> at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:224) >> at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70) >> at java.security.KeyStore.load(KeyStore.java:1445) >> at net.eckenfels.test.certpath.KeystoreImport.main(KeystoreImport.java:29) >> Caused by: java.io.IOException: subject key, >> java.security.InvalidKeyException: Invalid RSA public key >> at sun.security.x509.X509Key.parse(X509Key.java:174) >> at sun.security.x509.CertificateX509Key.<init>(CertificateX509Key.java:75) >> at sun.security.x509.X509CertInfo.parse(X509CertInfo.java:667) >> at sun.security.x509.X509CertInfo.<init>(X509CertInfo.java:167) >> ... 10 more >> Caused by: java.security.InvalidKeyException: >> java.security.InvalidKeyException: Invalid RSA public key >> at sun.security.x509.X509Key.buildX509Key(X509Key.java:227) >> at sun.security.x509.X509Key.parse(X509Key.java:170) >> ... 13 more >> Caused by: java.security.spec.InvalidKeySpecException: >> java.security.InvalidKeyException: Invalid RSA public key >> at sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:205) >> at java.security.KeyFactory.generatePublic(KeyFactory.java:334) >> at sun.security.x509.X509Key.buildX509Key(X509Key.java:223) >> ... 14 more >> Caused by: java.security.InvalidKeyException: Invalid RSA public key >> at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:120) >> at sun.security.x509.X509Key.decode(X509Key.java:391) >> at sun.security.x509.X509Key.decode(X509Key.java:403) >> at sun.security.rsa.RSAPublicKeyImpl.<init>(RSAPublicKeyImpl.java:84) >> at sun.security.rsa.RSAKeyFactory.generatePublic(RSAKeyFactory.java:298) >> at sun.security.rsa.RSAKeyFactory.engineGeneratePublic(RSAKeyFactory.java:201) >> ... 16 more >> Caused by: java.io.IOException: Invalid encoding: redundant leading 0s >> at sun.security.util.DerInputBuffer.getBigInteger(DerInputBuffer.java:152) >> at sun.security.util.DerInputStream.getBigInteger(DerInputStream.java:207) >> at sun.security.rsa.RSAPrivateCrtKeyImpl.getBigInteger(RSAPrivateCrtKeyImpl.java:214) >> at sun.security.rsa.RSAPublicKeyImpl.parseKeyBits(RSAPublicKeyImpl.java:115) >> ... 21 more >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From xuelei.fan at oracle.com Wed Jun 14 22:17:20 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Wed, 14 Jun 2017 15:17:20 -0700 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> Message-ID: <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> Hi Martin, The big picture of the design looks pretty good to me, except a few comments about the JSSE conventions. I appreciate it very much.
By the way, I need more time to look into the details of the specification and implementation. In order to keep the APIs simple and small, SSLParameters is preferred as the only configuration port for common cases. I may suggest removing the s/getUseTrustedCAIndication() methods in SSLEngine/SSLSocket. The identifier type is defined as an enum TrustedAuthorityIndicator.IdentifierType. In the future, if more types are added, we need to update the specification by adding a new enum item. Enum is preferred in JDK, but for good extensibility, in general JSSE does not use enum in public APIs for extensible properties. I may suggest using String (or integer/byte; I prefer String) as the type. The standard trusted authority indicator algorithm (identifier) can be documented in the "Java Cryptography Architecture Standard Algorithm Name Documentation"[1]. In the TrustedAuthorityIndicator class, some methods, like getIdentifierTypeFromCode(), getCodeFromIdentifierType(), implies(), getLength(), equals() and hashCode(), look more like implementation logic. I may suggest removing them from the public APIs. I did not see the benefit of having X509Certificate in the TrustedAuthorityIndicator class. The class is mainly used for server side certificate selection. X509Certificate could be unknown for an indicator. I may suggest removing the related methods and properties. After that, as there is no requirement to instantiate the TrustedAuthorityIndicator class in application code, it looks like it may be enough to use an interface to represent a trusted authority: public interface TrustedAuthorityIndicator { // identifier type, standard algorithm name String/int/Byte getType(); // identifier byte[] getEncoded(); } What do you think?
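For concreteness, such minimal (type, encoded) pairs map directly onto the trusted_ca_keys extension body defined in RFC 6066, section 6. The decoder below is purely illustrative -- the class name and string type names are assumptions of this sketch, not code from the webrev:

```java
import java.nio.ByteBuffer;
import java.util.AbstractMap.SimpleImmutableEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical decoder for the trusted_ca_keys extension body (RFC 6066, sec. 6):
// a 2-byte TrustedAuthorities length, then per entry a 1-byte IdentifierType
// followed by a type-dependent identifier.
public final class TrustedCAKeysDecoder {

    private static final String[] TYPES = {
            "pre_agreed", "key_sha1_hash", "x509_name", "cert_sha1_hash" };

    static List<Map.Entry<String, byte[]>> decode(byte[] body) {
        ByteBuffer buf = ByteBuffer.wrap(body);
        int listLen = buf.getShort() & 0xFFFF;   // trusted_authorities_list length
        int end = buf.position() + listLen;
        List<Map.Entry<String, byte[]>> out = new ArrayList<>();
        while (buf.position() < end) {
            int type = buf.get() & 0xFF;         // IdentifierType
            byte[] data;
            switch (type) {
                case 0:                          // pre_agreed: empty identifier
                    data = new byte[0];
                    break;
                case 1:                          // key_sha1_hash: 20-byte SHA1Hash
                case 3:                          // cert_sha1_hash: 20-byte SHA1Hash
                    data = new byte[20];
                    buf.get(data);
                    break;
                case 2:                          // x509_name: 2-byte length + DER DN
                    data = new byte[buf.getShort() & 0xFFFF];
                    buf.get(data);
                    break;
                default:
                    throw new IllegalArgumentException("unknown type " + type);
            }
            out.add(new SimpleImmutableEntry<>(TYPES[type], data));
        }
        return out;
    }
}
```

Either the interface form above or plain Map.Entry pairs could sit on top of such a decoder; the wire format itself does not favor one over the other.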
Thanks & Regards, Xuelei [1] https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html On 6/13/2017 3:41 PM, Martin Balao wrote: > Hi Xuelei, > > The new webrev.01 is ready: > > * > http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01/ > (browse online) > * > http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01.zip > (zip, download) > > The following changes have been implemented since the previous webrev.00: > > * Pre-agreed support removed from server-side > * Unnecessary overhead and minium benefits for JSSE. > > * Enabling the use of Trusted CA Indication extension for clients > through TrustManager objects was reverted. Trusted CA Indication > extension can now be enabled through: 1) SSLEngine, 2) SSLSocket, or 3) > SSLParameters (which can be applied to both SSLEngine and SSLSocket > objects). Trusted CA Indication extension is mandatory for servers. > > * SunX509KeyManagerImpl old key manager ("SunX509" algorithm) is now > out of scope. This key manager does not support other TLS extensions as > Server Name Indication (SNI), which is far more relevant than Trusted CA > Indication. The new X509KeyManagerImpl key manager ("PKIX" algorithm) is > now in scope. > > * Client requested indications are now an ExtendedSSLSession > attribute. ServerHandshaker gets the information from the Client Hello > message (now parsed by TrustedCAIndicationExtension class instead of > TrustedAuthorityIndicator) and sets it in the ExtendedSSLSession > (SSLSessionImpl object). The key manager (i.e.: X509KeyManagerImpl), > when choosing a server alias, may now get the information from the > ExtendedSSLSession object and guide the certificate selection based on it. 
> * In order to allow multiple key managers to use Trusted Authority > Indicators information and to allow multiple Trusted Authority > Indicators implementations, TrustedAuthorityIndicator has now been split > in an abstract class (TrustedAuthorityIndicator, located in > javax.net.ssl) and an implementation class > (TrustedAuthorityIndicatorImpl, located in sun.security.ssl). No > coupling was added between javax.net.ssl and sun.security.ssl packages. > > * Documentation extended and improved. > * Test cases (server and client) updated to reflect the new interface > and supported key manager. > > Look forward to your new review! > > Kind regards, > Martin.- > > > > On Fri, Jun 9, 2017 at 6:15 PM, Xuelei Fan > wrote: > > I'm OK to use SSLParameters. Thank you very much for considering a > new design. > > Xuelei > > On 6/9/2017 1:10 PM, Martin Balao wrote: > > Hi Xuelei, > > I didn't notice that some of the SSLSocket contructors did not > establish the connection, so SSLParameters can be effective for > Trusted CA Indication. This was an invalid argument on my side, > sorry. > > As for the configuration to enable the extension, it's probably > not necessary on the Server side because -as you mentioned- it > is mandatory and there is no harm in supporting it. However, it > has to be configurable on the Client side because -as we > previously discussed- the client may cause a handshake failure > if the server does not support the extension. I'd prefer the > Client configuring the SSLSocket through SSLParameters instead > of a system-wide property -which has even more impact than the > TrustManager approach-. Would this work for you? > > > In JSSE, the benefits pre_agreed option can get by > customizing the key/trust manager, so I did not see too much > benefits to support this option in JDK > > I understand your point and will remove support for "pre_agreed". 
> > > On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan > > >> > wrote: > > > > On 6/8/2017 8:36 PM, Xuelei Fan wrote: > > The trusted authorities can be get from client trust > manager. Server can choose the best matching of server > certificate of the > client requested trusted authorities. > > > > I missed the point that the key manager need to know the client > requested trusted authorities for the choosing. So may > need a new > SSLSession attribute (See similar method in > ExtendedSSLSession). > > Xuelei > > > > Yes, an attribute on SSLSession may do the job (both when Key > Manager receives a SSLSocket and a SSLEngine). > > Kind regards, > Martin.- > > From xuelei.fan at oracle.com Thu Jun 15 18:09:55 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Thu, 15 Jun 2017 11:09:55 -0700 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> Message-ID: <9aedc534-36d6-c003-137c-33b760275eaa@oracle.com> Hi Martin, I think more about the new TrustedAuthorityIndicator class. Maybe, it can be replaced with the existing java.util.Map.Entry class (using java.util.AbstractMap.SimpleImmutableEntry for the implementation). ExtendedSSLSession.java List<Map.Entry<String, byte[]>> getRequestedTrustedCAs(); Xuelei On 6/14/2017 3:17 PM, Xuelei Fan wrote: > Hi Martin, > > The big picture of the design looks pretty good to me, except a few > comment about the JSSE conventions. I appreciate it very much. By the > way, I need more time to look into the details of the specification and > implementation. > > > In order to keep the APIs simple and small, SSLParameters is preferred > as the only configuration port for common cases. I may suggest to > remove the s/getUseTrustedCAIndication() methods in SSLEngine/SSLSocket.
> > The identify type is defined as an enum > TrustedAuthorityIndicator.IdentifierType. In the future, if more type > is added, we need to update the specification by adding a new enum item. > Enum is preferred in JDK, but for good extensibility, in general JSSE > does not use enum in public APIs for extensible properties. I may > suggest to use String (or integer/byte, I prefer to use String) as the > type. The standard trusted authority indicator algorithm (identifier) > can be documented in the "Java Cryptography Architecture Standard > Algorithm Name Documentation"[1]. > > In TrustedAuthorityIndicator class, some methods, like > getIdentifierTypeFromCode(), getCodeFromIdentifierType(), implies(), > getLength(), equals() and hashCode() look more like implementation > logic. I may suggest remove them from public APIs. > > I did not see the benefit to have X509Certificate in the > TrustedAuthorityIndicator class. The class is mainly used for server > side certificate selection. X509Certificate could be unknown for a > indicator. I may suggestion remove the related methods and properties. > > After that, as there is no requirement to instantiate > TrustedAuthorityIndicator class in application code, looks like it may > be enough to use an interface to represent a trusted authorities: > public interface TrustedAuthorityIndicator { > // identifier type, standard algorithm name > String/int/Byte getType(); > > // identifier > byte[] getEncoded(); > } > > What do you think? 
> > > Thanks & Regards, > Xuelei > > [1] > https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html > > > > On 6/13/2017 3:41 PM, Martin Balao wrote: >> Hi Xuelei, >> >> The new webrev.01 is ready: >> >> * >> http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01/ >> (browse online) >> * >> http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01.zip >> (zip, download) >> >> The following changes have been implemented since the previous webrev.00: >> >> * Pre-agreed support removed from server-side >> * Unnecessary overhead and minium benefits for JSSE. >> >> * Enabling the use of Trusted CA Indication extension for clients >> through TrustManager objects was reverted. Trusted CA Indication >> extension can now be enabled through: 1) SSLEngine, 2) SSLSocket, or >> 3) SSLParameters (which can be applied to both SSLEngine and SSLSocket >> objects). Trusted CA Indication extension is mandatory for servers. >> >> * SunX509KeyManagerImpl old key manager ("SunX509" algorithm) is now >> out of scope. This key manager does not support other TLS extensions >> as Server Name Indication (SNI), which is far more relevant than >> Trusted CA Indication. The new X509KeyManagerImpl key manager ("PKIX" >> algorithm) is now in scope. >> >> * Client requested indications are now an ExtendedSSLSession >> attribute. ServerHandshaker gets the information from the Client Hello >> message (now parsed by TrustedCAIndicationExtension class instead of >> TrustedAuthorityIndicator) and sets it in the ExtendedSSLSession >> (SSLSessionImpl object). The key manager (i.e.: X509KeyManagerImpl), >> when choosing a server alias, may now get the information from the >> ExtendedSSLSession object and guide the certificate selection based on >> it. 
>> * In order to allow multiple key managers to use Trusted Authority >> Indicators information and to allow multiple Trusted Authority >> Indicators implementations, TrustedAuthorityIndicator has now been >> split in an abstract class (TrustedAuthorityIndicator, located in >> javax.net.ssl) and an implementation class >> (TrustedAuthorityIndicatorImpl, located in sun.security.ssl). No >> coupling was added between javax.net.ssl and sun.security.ssl packages. >> >> * Documentation extended and improved. >> * Test cases (server and client) updated to reflect the new >> interface and supported key manager. >> >> Look forward to your new review! >> >> Kind regards, >> Martin.- >> >> >> >> On Fri, Jun 9, 2017 at 6:15 PM, Xuelei Fan > > wrote: >> >> I'm OK to use SSLParameters. Thank you very much for considering a >> new design. >> >> Xuelei >> >> On 6/9/2017 1:10 PM, Martin Balao wrote: >> >> Hi Xuelei, >> >> I didn't notice that some of the SSLSocket contructors did not >> establish the connection, so SSLParameters can be effective for >> Trusted CA Indication. This was an invalid argument on my side, >> sorry. >> >> As for the configuration to enable the extension, it's probably >> not necessary on the Server side because -as you mentioned- it >> is mandatory and there is no harm in supporting it. However, it >> has to be configurable on the Client side because -as we >> previously discussed- the client may cause a handshake failure >> if the server does not support the extension. I'd prefer the >> Client configuring the SSLSocket through SSLParameters instead >> of a system-wide property -which has even more impact than the >> TrustManager approach-. Would this work for you? >> >> > In JSSE, the benefits pre_agreed option can get by >> customizing the key/trust manager, so I did not see too much >> benefits to support this option in JDK >> >> I understand your point and will remove support for "pre_agreed". 
>> >> >> On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan >> >> >> >> wrote: >> >> >> >> On 6/8/2017 8:36 PM, Xuelei Fan wrote: >> >> The trusted authorities can be get from client trust >> manager. Server can choose the best matching of server >> certificate of the >> client requested trusted authorities. >> >> > >> I missed the point that the key manager need to know the >> client >> requested trusted authorities for the choosing. So may >> need a new >> SSLSession attribute (See similar method in >> ExtendedSSLSession). >> >> Xuelei >> >> >> >> Yes, an attribute on SSLSession may do the job (both when Key >> Manager receives a SSLSocket and a SSLEngine). >> >> Kind regards, >> Martin.- >> >> From mbalao at redhat.com Thu Jun 15 19:05:52 2017 From: mbalao at redhat.com (Martin Balao) Date: Thu, 15 Jun 2017 16:05:52 -0300 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> Message-ID: Hi Xuelei, The new webrev.02 is ready: * http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_15/8046295.webrev.02/ (browse online) * http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_15/8046295.webrev.02.zip (zip, download) The following changes have been implemented since the previous webrev.01: * s/getUseTrustedCAIndication() methods in SSLEngine/SSLSocket and in SSLEngineImpl/SSLSocketImpl removed. s/getSSLParameters is now the only way to set or get the use of the Trusted CA Indication extension. An exception is no longer thrown if trying to disable the extension for a server, but the change takes no effect as the extension is mandatory for servers. 
X509KeyManagerImpl modified to use SSLParameters to get information regarding whether Trusted CA Indication is enabled and should guide the certificate choice.

* TrustedAuthorityIndicator.IdentifierType has been moved from enum to String, to follow JSSE conventions. I understand how important it is to be consistent. However, I still believe that an enum is a better fit for this value and does not prevent future extension. We are choosing from a closed set (strictly defined by the RFC) and that's what an enum allows us to express. From the client point of view/API, it's very handy that the type gives you information regarding the allowed choices for the parameter. You don't necessarily have to look for implementation details or documentation but can just leverage the strongly typed language. It's also likely that enums are faster for comparisons than strings, but that's not the main point here.

* Removed X509Certificate from TrustedAuthorityIndicator class (method and property). It was there for informational purposes (when a TrustedAuthorityIndicator was built from a certificate by a client and the whole extension's indicators were converted to String).

* "equals" and "hashCode" methods moved from TrustedAuthorityIndicator to TrustedAuthorityIndicatorImpl class.

* "getLength" method removed from TrustedAuthorityIndicator class. It's possible to get the encoded buffer and the length from there.

* "getData" method renamed to "getEncoded" in TrustedAuthorityIndicator class.

* "trustedAuthorityEncodedData" renamed to "encodedData" in TrustedAuthorityIndicator and TrustedAuthorityIndicatorImpl classes.

* "identifier" and "encodedData" instance variables moved from TrustedAuthorityIndicator to TrustedAuthorityIndicatorImpl class.

* "getEncoded" and "getIdentifier" are now abstract methods in TrustedAuthorityIndicator, and their implementation is in TrustedAuthorityIndicatorImpl class.
* "getIdentifier" method renamed to "getType" in TrustedAuthorityIndicator and TrustedAuthorityIndicatorImpl classes ("identifier" instance variable and parameter in TrustedAuthorityIndicatorImpl class renamed to "type"). * Test cases (server and client) updated to reflect the new interface (enabling the use of the extension through SSLParameters) However, some changes are still not implemented and I have some concerns: 1) I still believe that identifier type information has to be on TrustedAuthorityIndicator class somehow, and implementations restricted on what they can return as part of "getType" method. This is strictly specified by the RFC TrustedAuthorityIndicator class represents, and I find desirable that any implementation is enforced to be compliant to that. If we remove all of that (including the enum), TrustedAuthorityIndicator looks too generic and does not reflect (in my opinion) what it really is. It'd also be chaotic if different implementations call pre-agreed type as "preagreed", "pre-agreed", "PRE_AGREED", etc. I prefer stricter and more explicit interfaces. 2) I agree that type mappings can be seen as part of an implementation, but they were in TrustedAuthorityIndicator (as protected) because every implementation is highly likely to need them and we can avoid the necessity for repeated code/mappings. The same for "type" and "encodedData" variables or even "hashCode" and "equals" methods. That's why I was thinking more of an abstract class and not an interface, as it happens (in example) with SNIServerName. 3) I think that "implies" method on TrustedAuthorityIndicator should be also part of the class/interface, because that's the whole point of a Trusted Authority Information: to allow queries for a given certificate. This is, in fact, the only thing a server wants from one of these objects. My concern is that if we remove this requirement for an implementation, the interface looks more like a byte buffer holder. 
I'd appreciate it if you could re-consider these items.

Thanks,
Martin.-

On Wed, Jun 14, 2017 at 7:17 PM, Xuelei Fan wrote:
> Hi Martin,
>
> The big picture of the design looks pretty good to me, except a few
> comments about the JSSE conventions. I appreciate it very much. By the
> way, I need more time to look into the details of the specification and
> implementation.
>
> In order to keep the APIs simple and small, SSLParameters is preferred as
> the only configuration port for common cases. I may suggest removing the
> set/getUseTrustedCAIndication() methods in SSLEngine/SSLSocket.
>
> The identifier type is defined as an enum TrustedAuthorityIndicator.IdentifierType.
> In the future, if more types are added, we need to update the specification
> by adding a new enum item. Enum is preferred in the JDK, but for good
> extensibility, in general JSSE does not use enums in public APIs for
> extensible properties. I may suggest using String (or integer/byte; I
> prefer String) as the type. The standard trusted authority
> indicator algorithm (identifier) can be documented in the "Java
> Cryptography Architecture Standard Algorithm Name Documentation"[1].
>
> In the TrustedAuthorityIndicator class, some methods, like
> getIdentifierTypeFromCode(), getCodeFromIdentifierType(), implies(),
> getLength(), equals() and hashCode(), look more like implementation logic.
> I may suggest removing them from the public APIs.
>
> I did not see the benefit of having X509Certificate in the
> TrustedAuthorityIndicator class. The class is mainly used for server-side
> certificate selection. X509Certificate could be unknown for an indicator.
> I may suggest removing the related methods and properties.
> > After that, as there is no requirement to instantiate the
> TrustedAuthorityIndicator class in application code, it looks like it may be
> enough to use an interface to represent a trusted authority:
>
>     public interface TrustedAuthorityIndicator {
>         // identifier type, standard algorithm name
>         String/int/Byte getType();
>
>         // identifier
>         byte[] getEncoded();
>     }
>
> What do you think?
>
> Thanks & Regards,
> Xuelei
>
> [1] https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html
>
> On 6/13/2017 3:41 PM, Martin Balao wrote:
>
>> Hi Xuelei,
>>
>> The new webrev.01 is ready:
>>
>> * http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01/ (browse online)
>> * http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01.zip (zip, download)
>>
>> The following changes have been implemented since the previous webrev.00:
>>
>> * Pre-agreed support removed from the server side
>>   * Unnecessary overhead and minimal benefits for JSSE.
>>
>> * Enabling the use of the Trusted CA Indication extension for clients
>> through TrustManager objects was reverted. The Trusted CA Indication extension
>> can now be enabled through: 1) SSLEngine, 2) SSLSocket, or 3) SSLParameters
>> (which can be applied to both SSLEngine and SSLSocket objects). The Trusted CA
>> Indication extension is mandatory for servers.
>>
>> * SunX509KeyManagerImpl, the old key manager ("SunX509" algorithm), is now
>> out of scope. This key manager does not support other TLS extensions such as
>> Server Name Indication (SNI), which is far more relevant than Trusted CA
>> Indication. The new X509KeyManagerImpl key manager ("PKIX" algorithm) is
>> now in scope.
>>
>> * Client requested indications are now an ExtendedSSLSession attribute.
>> ServerHandshaker gets the information from the Client Hello message (now >> parsed by TrustedCAIndicationExtension class instead of >> TrustedAuthorityIndicator) and sets it in the ExtendedSSLSession >> (SSLSessionImpl object). The key manager (i.e.: X509KeyManagerImpl), when >> choosing a server alias, may now get the information from the >> ExtendedSSLSession object and guide the certificate selection based on it. >> * In order to allow multiple key managers to use Trusted Authority >> Indicators information and to allow multiple Trusted Authority Indicators >> implementations, TrustedAuthorityIndicator has now been split in an >> abstract class (TrustedAuthorityIndicator, located in javax.net.ssl) and an >> implementation class (TrustedAuthorityIndicatorImpl, located in >> sun.security.ssl). No coupling was added between javax.net.ssl and >> sun.security.ssl packages. >> >> * Documentation extended and improved. >> * Test cases (server and client) updated to reflect the new interface >> and supported key manager. >> >> Look forward to your new review! >> >> Kind regards, >> Martin.- >> >> >> >> On Fri, Jun 9, 2017 at 6:15 PM, Xuelei Fan > > wrote: >> >> I'm OK to use SSLParameters. Thank you very much for considering a >> new design. >> >> Xuelei >> >> On 6/9/2017 1:10 PM, Martin Balao wrote: >> >> Hi Xuelei, >> >> I didn't notice that some of the SSLSocket contructors did not >> establish the connection, so SSLParameters can be effective for >> Trusted CA Indication. This was an invalid argument on my side, >> sorry. >> >> As for the configuration to enable the extension, it's probably >> not necessary on the Server side because -as you mentioned- it >> is mandatory and there is no harm in supporting it. However, it >> has to be configurable on the Client side because -as we >> previously discussed- the client may cause a handshake failure >> if the server does not support the extension. 
I'd prefer the >> Client configuring the SSLSocket through SSLParameters instead >> of a system-wide property -which has even more impact than the >> TrustManager approach-. Would this work for you? >> >> > In JSSE, the benefits pre_agreed option can get by >> customizing the key/trust manager, so I did not see too much >> benefits to support this option in JDK >> >> I understand your point and will remove support for "pre_agreed". >> >> >> On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan >> >> >> >> wrote: >> >> >> >> On 6/8/2017 8:36 PM, Xuelei Fan wrote: >> >> The trusted authorities can be get from client trust >> manager. Server can choose the best matching of server >> certificate of the >> client requested trusted authorities. >> >> > >> I missed the point that the key manager need to know the >> client >> requested trusted authorities for the choosing. So may >> need a new >> SSLSession attribute (See similar method in >> ExtendedSSLSession). >> >> Xuelei >> >> >> >> Yes, an attribute on SSLSession may do the job (both when Key >> Manager receives a SSLSocket and a SSLEngine). >> >> Kind regards, >> Martin.- >> >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From mbalao at redhat.com Thu Jun 15 19:13:55 2017 From: mbalao at redhat.com (Martin Balao) Date: Thu, 15 Jun 2017 16:13:55 -0300 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: <9aedc534-36d6-c003-137c-33b760275eaa@oracle.com> References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> <9aedc534-36d6-c003-137c-33b760275eaa@oracle.com> Message-ID: Hi Xuelei, On Thu, Jun 15, 2017 at 3:09 PM, Xuelei Fan wrote: > > > I think more about the new TrustedAuthorityIndicator class. Maybe, it can > be replaced with the existing java.util.Map.Entry class (using > java.util.AbstractMap.SimpleImmutableEntry for the implementation). 
> > ExtendedSSLSession.java > List<Map.Entry<String, byte[]>> getRequestedTrustedCAs(); > > This looks a bit implicit and cryptic to me. I'd go for a more explicit, extensible and OO interface... -------------- next part -------------- An HTML attachment was scrubbed... URL: From artem.smotrakov at oracle.com Thu Jun 15 20:57:00 2017 From: artem.smotrakov at oracle.com (Artem Smotrakov) Date: Thu, 15 Jun 2017 13:57:00 -0700 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows Message-ID: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> Hi Xuelei, Could you please take a look at this patch? It enables SHA224-based signature algorithms on Windows since they should be provided not only by SunMSCAPI provider. Please see details in the bug description. The test works fine on all supported platforms. Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ Artem From xuelei.fan at oracle.com Thu Jun 15 21:20:47 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Thu, 15 Jun 2017 14:20:47 -0700 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows In-Reply-To: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> References: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> Message-ID: <41195604-5ccc-19ee-3e6a-df0778c6518e@oracle.com> Looks fine to me. Thanks, Xuelei On 6/15/2017 1:57 PM, Artem Smotrakov wrote: > Hi Xuelei, > > Could you please take a look at this patch? > > It enables SHA224-based signature algorithms on Windows since they > should be provided not only by SunMSCAPI provider. Please see details in > the bug description. > > The test works fine on all supported platforms.
> > Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 > Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ > > Artem From ecki at zusammenkunft.net Thu Jun 15 23:06:30 2017 From: ecki at zusammenkunft.net (Bernd Eckenfels) Date: Thu, 15 Jun 2017 23:06:30 +0000 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows In-Reply-To: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> References: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> Message-ID: Hello, If I recall correctly the idea of disabling those algorithms if SunMSCAPI IS(!) present was to avoid agreeing on a Signature algorithm which could not be supported by RSA offloaded keys inside CryptoAPI. Having said that the suggested ciphers might need to be made dependent on the capabilities of the Signature provider for a given key type (especially if it is a key handle only). Has this changed and the signatures are supported now by MSCapi? Gruss Bernd -- http://bernd.eckenfels.net ________________________________ From: security-dev on behalf of Artem Smotrakov Sent: Thursday, June 15, 2017 10:57:00 PM To: Xuelei Fan; Security Dev OpenJDK Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows Hi Xuelei, Could you please take a look at this patch? It enables SHA224-based signature algorithms on Windows since they should be provided not only by SunMSCAPI provider. Please see details in the bug description. The test works fine on all supported platforms. Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ Artem -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From xuelei.fan at oracle.com Thu Jun 15 23:37:26 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Thu, 15 Jun 2017 16:37:26 -0700 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows In-Reply-To: References: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> Message-ID: <25705e1d-faef-85ea-de0f-d78330ed300c@oracle.com> Hi Bernd, Thanks for the correction. I really missed the point that there are issues with enabling SHA-224 for the SunMSCAPI provider. On 6/15/2017 4:06 PM, Bernd Eckenfels wrote: > Hello, > > If I recall correctly the idea of disabling those algorithms if > SunMSCAPI IS(!) present was to avoid agreeing on a Signature algorithm > which could not be supported by RSA offloaded keys inside CryptoAPI. > > Having said that the suggested ciphers might need to be made dependent > on the capabilities of the Signature provider for a given key type > (especially if it is a key handle only). > Agreed. Besides, we may check the availability of each signature and hash algorithm, rather than hard-coding them. I filed a new bug for the tracking: https://bugs.openjdk.java.net/browse/JDK-8182318 Thanks & Regards, Xuelei > Has this changed and the signatures are supported now by MSCapi? > > Gruss > Bernd > -- > http://bernd.eckenfels.net > ------------------------------------------------------------------------ > *From:* security-dev on behalf > of Artem Smotrakov > *Sent:* Thursday, June 15, 2017 10:57:00 PM > *To:* Xuelei Fan; Security Dev OpenJDK > *Subject:* [10] RFR: 8182143: SHA224-based signature algorithms are not > enabled for TLSv12 on Windows > Hi Xuelei, > > Could you please take a look at this patch? > > It enables SHA224-based signature algorithms on Windows since they > should be provided not only by SunMSCAPI provider. Please see details in > the bug description. > > The test works fine on all supported platforms.
> > Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 > Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ > > Artem From artem.smotrakov at oracle.com Fri Jun 16 01:13:36 2017 From: artem.smotrakov at oracle.com (Artem Smotrakov) Date: Thu, 15 Jun 2017 18:13:36 -0700 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows In-Reply-To: <25705e1d-faef-85ea-de0f-d78330ed300c@oracle.com> References: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> <25705e1d-faef-85ea-de0f-d78330ed300c@oracle.com> Message-ID: <08f15f4c-f439-fce3-e8f2-5ff8606a3e05@oracle.com> That sounds strange to me. I assume that if an algorithm is provided by a provider on all platforms, then it should work on all platforms no matter what. I am not sure that I really understand the problem, but probably it's about some problems that may occur if multiple providers are used together for a TLS connection. I may guess that the problem may be in incompatibility of key implementations for different providers. If so, this looks like an issue to me. Please correct me if I am wrong. Probably there may be some specific case which fails, but the SignatureAlgorithms.java test works fine now, and it seems like SHA224 can be successfully used for establishing a connection. I am okay to back out the fix, but it would be good to have a test case which shows the problem and why the fix should be backed out. Then, we can work on a solution for that. Artem On 06/15/2017 04:37 PM, Xuelei Fan wrote: > Hi Bernd, > > Thanks for the correction. I really missed the point that there are > issues to enabled SHA-224 for SunMSCAPI provider. > > On 6/15/2017 4:06 PM, Bernd Eckenfels wrote: >> Hello, >> >> If I recall correctly the idea of disabling those algorithms if >> SunMSCAPI IS(!) present was to avoid agreeing on a Signature >> algorithm which could not be supported by RSA offloaded keys inside >> CryptoAPI.
>> >> Having said that the suggested ciphers might need to be made >> dependent on the capabilities of the Signature provider for a given >> key type (especially if it is a key handle only). >> >> >> Agreed. Besides, we may check the availability of each signature and >> hash algorithms, rather than hard-coded them. I filed a new bug for >> the tracking: >> https://bugs.openjdk.java.net/browse/JDK-8182318 >> >> Thanks & Regards, >> Xuelei >> >>> Has this changed and the signatures are supported now by MSCapi? >>> >>> Gruss >>> Bernd >>> -- >>> http://bernd.eckenfels.net >>> ------------------------------------------------------------------------ >>> *From:* security-dev on >>> behalf of Artem Smotrakov >>> *Sent:* Thursday, June 15, 2017 10:57:00 PM >>> *To:* Xuelei Fan; Security Dev OpenJDK >>> *Subject:* [10] RFR: 8182143: SHA224-based signature algorithms are >>> not enabled for TLSv12 on Windows >>> Hi Xuelei, >>> >>> Could you please take a look at this patch? >>> >>> It enables SHA224-based signature algorithms on Windows since they >>> should be provided not only by SunMSCAPI provider. Please see details in >>> the bug description. >>> >>> The test works fine on all supported platforms. >>> >>> Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 >>> Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ >>> >>> Artem From Xuelei.Fan at Oracle.Com Fri Jun 16 02:32:24 2017 From: Xuelei.Fan at Oracle.Com (Xuelei Fan) Date: Thu, 15 Jun 2017 19:32:24 -0700 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows In-Reply-To: <08f15f4c-f439-fce3-e8f2-5ff8606a3e05@oracle.com> References: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> <25705e1d-faef-85ea-de0f-d78330ed300c@oracle.com> <08f15f4c-f439-fce3-e8f2-5ff8606a3e05@oracle.com> Message-ID: Hi Artem, If the key is generated in MSCAPI, the signature algorithm implemented in another provider cannot actually be used.
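The quoted suggestion above — checking the availability of each signature and hash algorithm at run time rather than hard-coding the list — can be sketched with the standard provider filter API. The class and method names below are invented for illustration:

```java
import java.security.Provider;
import java.security.Security;
import java.util.ArrayList;
import java.util.List;

public class SignatureAvailability {

    // Names of the installed providers that advertise the given
    // Signature algorithm, discovered at run time instead of being
    // hard-coded into a supported list.
    static List<String> providersFor(String algorithm) {
        List<String> names = new ArrayList<>();
        Provider[] found = Security.getProviders("Signature." + algorithm);
        if (found != null) { // null means no installed provider supports it
            for (Provider provider : found) {
                names.add(provider.getName());
            }
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println("SHA224withRSA: " + providersFor("SHA224withRSA"));
        System.out.println("SHA256withRSA: " + providersFor("SHA256withRSA"));
    }
}
```

A filter query like this answers "which providers can do it" but not "can this particular key use it" — which is exactly the gap Xuelei's MSCAPI remark points at.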
BTW, we also need to consider the case where only the MSCAPI provider is enabled in practice. In general, a signature algorithm is not included in the supported list unless all related providers support the signature algorithm. We're looking for a better solution, but do not yet have one in hand. Xuelei > On Jun 15, 2017, at 6:13 PM, Artem Smotrakov wrote: > > That sounds strange to me. I assume that if an algorithm is provided by a provider on all platforms, then it should work on all platforms no matter what. I am not sure that I really understand the problem, but probably it's about some problems that may occur if multiple providers are used together when for a TLS connection. I may guess that the problem may be in incompatibility of key implementations for different providers. If so, this looks like an issue to me. Please correct me if I am wrong. > > Probably there may be some specific case which fails, but SignatureAlgorithms.java test works fine now, and seems like SHA224 can be successfully used for establishing a connection. > > I am okay to back out the fix, but it would be good to have a testcase which shows the problem why the fix should be backed out. Then, we can work on a solution for that. > > Artem > > >> On 06/15/2017 04:37 PM, Xuelei Fan wrote: >> Hi Bernd, >> >> Thanks for the correction. I really missed the point that there are issues to enabled SHA-224 for SunMSCAPI provider. >> >>> On 6/15/2017 4:06 PM, Bernd Eckenfels wrote: >>> Hello, >>> >>> If I recall correctly the idea of disabling those algorithms if SunMSCAPI IS(!) present was to avoid agreeing on a Signature algorithm which could not be supported by RSA offloaded keys inside CryptoAPI.
I filed a new bug for the tracking: >> https://bugs.openjdk.java.net/browse/JDK-8182318 >> >> Thanks & Regards, >> Xuelei >> >>> Has this changed and the signatures are supported now by MSCapi? >>> >>> Gruss >>> Bernd >>> -- >>> http://bernd.eckenfels.net >>> ------------------------------------------------------------------------ >>> *From:* security-dev on behalf of Artem Smotrakov >>> *Sent:* Thursday, June 15, 2017 10:57:00 PM >>> *To:* Xuelei Fan; Security Dev OpenJDK >>> *Subject:* [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows >>> Hi Xuelei, >>> >>> Could you please take a look at this patch? >>> >>> It enables SHA224-based signature algorithms on Windows since they >>> should be provided not only by SunMSCAPI provider. Please see details in >>> the bug description. >>> >>> The test works fine on all supported platforms. >>> >>> Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 >>> Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ >>> >>> Artem > From sean.mullan at oracle.com Fri Jun 16 15:00:23 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Fri, 16 Jun 2017 11:00:23 -0400 Subject: RFR [9]: 8181295: Document that SecurityManager::checkPackageAccess may be called by the VM Message-ID: <5f72118f-0d18-7901-4317-ecd1626122f2@oracle.com> Please review this clarification to the SecurityManager::checkPackageAccess method to note that the method may be called by the Virtual Machine when loading classes: http://cr.openjdk.java.net/~mullan/webrevs/8181295/webrev.00/ A small correction was also made to the checkPackageDefinition method to note that it may be called by the defineClass (and not the loadClass) method of class loaders. 
--Sean From mandy.chung at oracle.com Fri Jun 16 15:13:51 2017 From: mandy.chung at oracle.com (Mandy Chung) Date: Fri, 16 Jun 2017 08:13:51 -0700 Subject: RFR [9]: 8181295: Document that SecurityManager::checkPackageAccess may be called by the VM In-Reply-To: <5f72118f-0d18-7901-4317-ecd1626122f2@oracle.com> References: <5f72118f-0d18-7901-4317-ecd1626122f2@oracle.com> Message-ID: > On Jun 16, 2017, at 8:00 AM, Sean Mullan wrote: > > Please review this clarification to the SecurityManager::checkPackageAccess method to note that the method may be called by the Virtual Machine when loading classes: > > http://cr.openjdk.java.net/~mullan/webrevs/8181295/webrev.00/ > > A small correction was also made to the checkPackageDefinition method to note that it may be called by the defineClass (and not the loadClass) method of class loaders. checkPackageDefinition is always a question for me and it's not called in the JDK implementation. Is there any test verifying that (i.e. called from defineClass)? I'm okay to change "is" to "may" in checkPackageDefinition in this patch. I can't validate this spec change. I suggest separating this from JDK-8181295 and following up in a future release. Mandy From artem.smotrakov at oracle.com Fri Jun 16 17:30:45 2017 From: artem.smotrakov at oracle.com (Artem Smotrakov) Date: Fri, 16 Jun 2017 10:30:45 -0700 Subject: [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows In-Reply-To: References: <0a9db8a7-d07a-97ca-ec8a-443e955c0b05@oracle.com> <25705e1d-faef-85ea-de0f-d78330ed300c@oracle.com> Message-ID: <34136981-c999-9836-7854-2f86518a4313@oracle.com> Hi Xuelei, Please see inline. On 06/15/2017 07:32 PM, Xuelei Fan wrote: > Hi Artem, > > If the key is generated in MSCAPI, the signature algorithm implemented in other provider cannot be actually used. Yes, that's what I meant by key implementation incompatibility.
Here is an example https://bugs.openjdk.java.net/browse/JDK-8176183 We updated the test to use the same provider for key generation and signatures. I'll file a bug for that. > > BTW, we also need to consider the case that only MSCAPI provider is enabled in practice. Agree. If a provider provides everything that's necessary for a TLS connection, it should work. Artem > > In general, a signature algorithm is not included in the supported list unless all related providers support the signature algorithm. We're looking for better solution, but not yet have one in hand. > > Xuelei > >> On Jun 15, 2017, at 6:13 PM, Artem Smotrakov wrote: >> >> That sounds strange to me. I assume that if an algorithm is provided by a provider on all platforms, then it should work on all platforms no matter what. I am not sure that I really understand the problem, but probably it's about some problems that may occur if multiple providers are used together when for a TLS connection. I may guess that the problem may be in incompatibility of key implementations for different providers. If so, this looks like an issue to me. Please correct me if I am wrong. >> >> Probably there may be some specific case which fails, but SignatureAlgorithms.java test works fine now, and seems like SHA224 can be successfully used for establishing a connection. >> >> I am okay to back out the fix, but it would be good to have a testcase which shows the problem why the fix should be backed out. Then, we can work on a solution for that. >> >> Artem >> >> >>> On 06/15/2017 04:37 PM, Xuelei Fan wrote: >>> Hi Bernd, >>> >>> Thanks for the correction. I really missed the point that there are issues to enabled SHA-224 for SunMSCAPI provider. >>> >>>> On 6/15/2017 4:06 PM, Bernd Eckenfels wrote: >>>> Hello, >>>> >>>> If I recall correctly the idea of disabling those algorithms if SunMSCAPI IS(!) present was to avoid agreeing on a Signature algorithm which could not be supported by RSA offloaded keys inside CryptoAPI.
>>>> >>>> Having said that the suggested ciphers might need to be made dependent on the capabilities of the Signature provider for a given key type (especially if it is a key handle only). >>>> >>> Agreed. Besides, we may check the availability of each signature and hash algorithms, rather than hard-coded them. I filed a new bug for the tracking: >>> https://bugs.openjdk.java.net/browse/JDK-8182318 >>> >>> Thanks & Regards, >>> Xuelei >>> >>>> Has this changed and the signatures are supported now by MSCapi? >>>> >>>> Gruss >>>> Bernd >>>> -- >>>> http://bernd.eckenfels.net >>>> ------------------------------------------------------------------------ >>>> *From:* security-dev on behalf of Artem Smotrakov >>>> *Sent:* Thursday, June 15, 2017 10:57:00 PM >>>> *To:* Xuelei Fan; Security Dev OpenJDK >>>> *Subject:* [10] RFR: 8182143: SHA224-based signature algorithms are not enabled for TLSv12 on Windows >>>> Hi Xuelei, >>>> >>>> Could you please take a look at this patch? >>>> >>>> It enables SHA224-based signature algorithms on Windows since they >>>> should be provided not only by SunMSCAPI provider. Please see details in >>>> the bug description. >>>> >>>> The test works fine on all supported platforms. 
>>>> >>>> Bug: https://bugs.openjdk.java.net/browse/JDK-8182143 >>>> Webrev: http://cr.openjdk.java.net/~asmotrak/8182143/webrev.00/ >>>> >>>> Artem From sean.mullan at oracle.com Fri Jun 16 20:25:05 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Fri, 16 Jun 2017 16:25:05 -0400 Subject: RFR [9]: 8181295: Document that SecurityManager::checkPackageAccess may be called by the VM In-Reply-To: References: <5f72118f-0d18-7901-4317-ecd1626122f2@oracle.com> Message-ID: <8e144a29-1877-a3a2-2851-86d6714e4def@oracle.com> On 6/16/17 11:13 AM, Mandy Chung wrote: > >> On Jun 16, 2017, at 8:00 AM, Sean Mullan wrote: >> >> Please review this clarification to the SecurityManager::checkPackageAccess method to note that the method may be called by the Virtual Machine when loading classes: >> >> http://cr.openjdk.java.net/~mullan/webrevs/8181295/webrev.00/ >> >> A small correction was also made to the checkPackageDefinition method to note that it may be called by the defineClass (and not the loadClass) method of class loaders. > > checkPackageDefinition is always a question for me and it?s not called in the JDK implementation. Is there any test verifying that (i.e. called from defineClass)? > > I?m okay to change ?is? to ?may? in checkPackageDefinition in this patch. I can?t validate this spec change. I suggest to separate this from JDK-8181295 and follow up in a future release. Ok, that's fine. Instead of changing the wording, I would prefer to revert the change to checkPackageDefinition and file a new issue to address that separately in a subsequent release as it is not as critical and not specifically related to this issue. 
Thanks, Sean From artem.smotrakov at oracle.com Fri Jun 16 21:17:38 2017 From: artem.smotrakov at oracle.com (Artem Smotrakov) Date: Fri, 16 Jun 2017 14:17:38 -0700 Subject: [10] RFR: 8182388: Backout 8182143 Message-ID: <1fcf9d6b-b3ea-b396-64c6-fb70673b5f2e@oracle.com> This patch backs out 8182143 because of possible issues on Windows even if we don't have a test to reproduce it. Checking if SunMSCAPI provider is enabled looks like a hack. I filed https://bugs.openjdk.java.net/browse/JDK-8182386 Bug: https://bugs.openjdk.java.net/browse/JDK-8182388 Webrev: http://cr.openjdk.java.net/~asmotrak/8182388/webrev.00/ Artem From mandy.chung at oracle.com Fri Jun 16 21:57:55 2017 From: mandy.chung at oracle.com (Mandy Chung) Date: Fri, 16 Jun 2017 14:57:55 -0700 Subject: RFR [9]: 8181295: Document that SecurityManager::checkPackageAccess may be called by the VM In-Reply-To: <8e144a29-1877-a3a2-2851-86d6714e4def@oracle.com> References: <5f72118f-0d18-7901-4317-ecd1626122f2@oracle.com> <8e144a29-1877-a3a2-2851-86d6714e4def@oracle.com> Message-ID: <1BE97D8C-245A-42EF-BF4F-94DD56C81B36@oracle.com> > On Jun 16, 2017, at 1:25 PM, Sean Mullan wrote: > > On 6/16/17 11:13 AM, Mandy Chung wrote: >>> On Jun 16, 2017, at 8:00 AM, Sean Mullan wrote: >>> >>> Please review this clarification to the SecurityManager::checkPackageAccess method to note that the method may be called by the Virtual Machine when loading classes: >>> >>> http://cr.openjdk.java.net/~mullan/webrevs/8181295/webrev.00/ >>> >>> A small correction was also made to the checkPackageDefinition method to note that it may be called by the defineClass (and not the loadClass) method of class loaders. >> checkPackageDefinition is always a question for me and it's not called in the JDK implementation. Is there any test verifying that (i.e. called from defineClass)? >> I'm okay to change "is" to "may" in checkPackageDefinition in this patch. I can't validate this spec change.
I suggest to separate this from JDK-8181295 and follow up in a future release. > > Ok, that's fine. Instead of changing the wording, I would prefer to revert the change to checkPackageDefinition and file a new issue to address that separately in a subsequent release as it is not as critical and not specifically related to this issue. That's fine with me. Approved. I don't need an updated webrev. Mandy From xuelei.fan at oracle.com Fri Jun 16 23:17:44 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Fri, 16 Jun 2017 16:17:44 -0700 Subject: [10] RFR: 8182388: Backout 8182143 In-Reply-To: <1fcf9d6b-b3ea-b396-64c6-fb70673b5f2e@oracle.com> References: <1fcf9d6b-b3ea-b396-64c6-fb70673b5f2e@oracle.com> Message-ID: <92449ead-f5b5-18c5-17b3-bbb72194e631@oracle.com> Looks fine to me. Xuelei On 6/16/2017 2:17 PM, Artem Smotrakov wrote: > This patch backs out 8182143 because of possible issues on Windows even > if we don't have a test to reproduce it. > > Checking if SunMSCAPI provider is enabled looks like a hack. I filed > https://bugs.openjdk.java.net/browse/JDK-8182386 > > Bug: https://bugs.openjdk.java.net/browse/JDK-8182388 > Webrev: http://cr.openjdk.java.net/~asmotrak/8182388/webrev.00/ > > Artem From ecki at zusammenkunft.net Fri Jun 16 23:46:24 2017 From: ecki at zusammenkunft.net (Bernd) Date: Sat, 17 Jun 2017 01:46:24 +0200 Subject: [10] RFR: 8182388: Backout 8182143 In-Reply-To: <1fcf9d6b-b3ea-b396-64c6-fb70673b5f2e@oracle.com> References: <1fcf9d6b-b3ea-b396-64c6-fb70673b5f2e@oracle.com> Message-ID: I think the new bug description is backward, as you cannot expect to implement all algorithms in all providers or use a key class from one provider in another (especially not if they use mechanisms in external APIs like PKCS11 or MSCAPI with HSM). 
"Crypto keys should be compatible between security providers" https://bugs.openjdk.java.net/browse/JDK-8182386 So the limiting of ciphers should be based on the actual provider used (or key selected) and not based on the subset of all providers present. Maybe something like "JSSE should adjust available ciphers based on effective provider". It's just a question how the current API can support that (this is also somewhat related to the point of key usage flags which also may restrict some ciphers, and which is only known once the actual Key instance can be examined). 2017-06-16 23:17 GMT+02:00 Artem Smotrakov : > This patch backs out 8182143 because of possible issues on Windows even if > we don't have a test to reproduce it. > > Checking if SunMSCAPI provider is enabled looks like a hack. I filed > https://bugs.openjdk.java.net/browse/JDK-8182386 > > Bug: https://bugs.openjdk.java.net/browse/JDK-8182388 > Webrev: http://cr.openjdk.java.net/~asmotrak/8182388/webrev.00/ > > Artem > From weijun.wang at oracle.com Mon Jun 19 03:33:31 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 19 Jun 2017 11:33:31 +0800 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module Message-ID: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Hi All Please take a review at http://cr.openjdk.java.net/~weijun/8182118/webrev.00/ Basically, a description line is added into package-info.java of each of these packages: - com/sun/security/auth: Contains the implementation of {@code java.security.Principal}. - com/sun/security/auth/module: Contains the implementation of {@code javax.security.auth.spi.LoginModule}. - com/sun/security/auth/login: Contains the implementation of {@code javax.security.auth.login.Configuration}. - com/sun/security/auth/callback: Contains the implementation of {@code javax.security.auth.callback.Callback}. with @since 1.4. 
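[Editor's note] For illustration only, one of the package-info.java files described above might look like the following sketch. The exact wording shown here is hypothetical; the thread below continues to debate "Contains" vs. "Provides" and {@code} vs. {@link}.

```java
/**
 * Provides implementations of the {@link java.security.Principal}
 * interface.
 *
 * @since 1.4
 */
package com.sun.security.auth;
```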
I thought about using {@link java.security.Principal} but seems it's not supported in package-info.java. BTW, is this bug meant for JDK 9? I just read the mail from Mark saying only P1 fixes will be allowed from now on. Thanks Max From mandy.chung at oracle.com Mon Jun 19 04:33:57 2017 From: mandy.chung at oracle.com (Mandy Chung) Date: Sun, 18 Jun 2017 21:33:57 -0700 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Message-ID: > On Jun 18, 2017, at 8:33 PM, Weijun Wang wrote: > > Hi All > > Please take a review at > > http://cr.openjdk.java.net/~weijun/8182118/webrev.00/ > > Basically, a description line is added into package-info.java of each of these packages: > > - com/sun/security/auth: > > Contains the implementation of {@code java.security.Principal}. > > - com/sun/security/auth/module: > > Contains the implementation of {@code javax.security.auth.spi.LoginModule}. > > - com/sun/security/auth/login: > > Contains the implementation of {@code javax.security.auth.login.Configuration}. > > - com/sun/security/auth/callback: > > Contains the implementation of {@code javax.security.auth.callback.Callback}. > What about "Provides the implementation of ...". I suggest to use @link to the type. > with @since 1.4. > > I thought about using {@link java.security.Principal} but seems it's not supported in package-info.java. java/lang/package-info.java and many package summary use @link. > > BTW, is this bug meant for JDK 9? I just read the mail from Mark saying only P1 fixes will be allowed from now on. If you push it your Monday, you should be able to make jdk-9+175 integration (6/22 GAC). Otherwise P1 fixes only. 
Mandy From weijun.wang at oracle.com Mon Jun 19 05:24:02 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 19 Jun 2017 13:24:02 +0800 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Message-ID: Updated at http://cr.openjdk.java.net/~weijun/8182118/webrev.01/. >> >> I thought about using {@link java.security.Principal} but seems it's not supported in package-info.java. > > java/lang/package-info.java and many package summary use @link. Looks like in my last try I only generated javadoc for jdk.security.auth. After adding java.base, the links show up. > >> >> BTW, is this bug meant for JDK 9? I just read the mail from Mark saying only P1 fixes will be allowed from now on. > > If you push it your Monday, you should be able to make jdk-9+175 integration (6/22 GAC). Otherwise P1 fixes only. I see. Thanks Max > > Mandy > From sean.mullan at oracle.com Mon Jun 19 11:52:42 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Mon, 19 Jun 2017 07:52:42 -0400 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Message-ID: On 6/19/17 12:33 AM, Mandy Chung wrote: > >> On Jun 18, 2017, at 8:33 PM, Weijun Wang wrote: >> >> Hi All >> >> Please take a review at >> >> http://cr.openjdk.java.net/~weijun/8182118/webrev.00/ >> >> Basically, a description line is added into package-info.java of each of these packages: >> >> - com/sun/security/auth: >> >> Contains the implementation of {@code java.security.Principal}. There is more than one, so this should be "implementations". >> - com/sun/security/auth/module: >> >> Contains the implementation of {@code javax.security.auth.spi.LoginModule}. >> >> - com/sun/security/auth/login: >> >> Contains the implementation of {@code javax.security.auth.login.Configuration}. 
There is more than one, so this should be "implementations". >> - com/sun/security/auth/callback: >> >> Contains the implementation of {@code javax.security.auth.callback.Callback}. Shouldn't this be "CallbackHandler"? > What about "Provides the implementation of ...". +1, but I prefer "Provides an implementation of ..." "the" sounds like this can be the one and only implementation. "an" sounds better. --Sean > I suggest to use @link to the type. > >> with @since 1.4. >> >> I thought about using {@link java.security.Principal} but seems it's not supported in package-info.java. > > java/lang/package-info.java and many package summary use @link. > >> >> BTW, is this bug meant for JDK 9? I just read the mail from Mark saying only P1 fixes will be allowed from now on. > > If you push it your Monday, you should be able to make jdk-9+175 integration (6/22 GAC). Otherwise P1 fixes only. > > Mandy > From weijun.wang at oracle.com Mon Jun 19 12:17:04 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 19 Jun 2017 20:17:04 +0800 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Message-ID: On 06/19/2017 07:52 PM, Sean Mullan wrote: > On 6/19/17 12:33 AM, Mandy Chung wrote: >> >>> On Jun 18, 2017, at 8:33 PM, Weijun Wang wrote: >>> >>> Hi All >>> >>> Please take a review at >>> >>> http://cr.openjdk.java.net/~weijun/8182118/webrev.00/ >>> >>> Basically, a description line is added into package-info.java of each >>> of these packages: >>> >>> - com/sun/security/auth: >>> >>> Contains the implementation of {@code java.security.Principal}. > > There is more than one, so this should be "implementations". In fact, I originally used "implementations" (without "the") and "an implementation", but then I saw the module-info.java for the module saying "Contains the implementation of the javax.security.auth.* interfaces" and thought "the implementation" is always correct. 
> >>> - com/sun/security/auth/module: >>> >>> Contains the implementation of {@code >>> javax.security.auth.spi.LoginModule}. >>> >>> - com/sun/security/auth/login: >>> >>> Contains the implementation of {@code >>> javax.security.auth.login.Configuration}. > > There is more than one, so this should be "implementations". > >>> - com/sun/security/auth/callback: >>> >>> Contains the implementation of {@code >>> javax.security.auth.callback.Callback}. > > Shouldn't this be "CallbackHandler"? Ah, yes. > >> What about "Provides the implementation of ...". > > +1, but I prefer "Provides an implementation of ..." > > "the" sounds like this can be the one and only implementation. "an" > sounds better. Thanks Max > > --Sean > >> I suggest to use @link to the type. >> >>> with @since 1.4. >>> >>> I thought about using {@link java.security.Principal} but seems it's >>> not supported in package-info.java. >> >> java/lang/package-info.java and many package summary use @link. >> >>> >>> BTW, is this bug meant for JDK 9? I just read the mail from Mark >>> saying only P1 fixes will be allowed from now on. >> >> If you push it your Monday, you should be able to make jdk-9+175 >> integration (6/22 GAC). Otherwise P1 fixes only. >> >> Mandy >> From sean.mullan at oracle.com Mon Jun 19 12:23:45 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Mon, 19 Jun 2017 08:23:45 -0400 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Message-ID: On 6/19/17 8:17 AM, Weijun Wang wrote: >> There is more than one, so this should be "implementations". > > In fact, I originally used "implementations" (without "the") and "an > implementation", but then I saw the module-info.java for the module > saying "Contains the implementation of the javax.security.auth.* > interfaces" and thought "the implementation" is always correct. I don't see where it uses the word "Contains". 
I would probably just tweak that to say "Provides implementations of the javax.security.auth.* interfaces and various authentication modules." This would make the wording more consistent in the module and packages. --Sean From weijun.wang at oracle.com Mon Jun 19 13:17:00 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Mon, 19 Jun 2017 21:17:00 +0800 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> Message-ID: <4739962d-cdcb-a34c-c60d-63af0e4fb19e@oracle.com> Updated at http://cr.openjdk.java.net/~weijun/8182118/webrev.02/. --Max On 06/19/2017 08:23 PM, Sean Mullan wrote: > On 6/19/17 8:17 AM, Weijun Wang wrote: >>> There is more than one, so this should be "implementations". >> >> In fact, I originally used "implementations" (without "the") and "an >> implementation", but then I saw the module-info.java for the module >> saying "Contains the implementation of the javax.security.auth.* >> interfaces" and thought "the implementation" is always correct. > > I don't see where it uses the word "Contains". > > I would probably just tweak that to say "Provides implementations of the > javax.security.auth.* interfaces and various authentication modules." > > This would make the wording more consistent in the module and packages. 
> > --Sean From sean.mullan at oracle.com Mon Jun 19 14:12:05 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Mon, 19 Jun 2017 10:12:05 -0400 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: <4739962d-cdcb-a34c-c60d-63af0e4fb19e@oracle.com> References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> <4739962d-cdcb-a34c-c60d-63af0e4fb19e@oracle.com> Message-ID: Your local workspace needs to be refreshed -- you are picking up an older version of src/jdk.security.auth/share/classes/module-info.java The module description after you apply your change should be: * Provides implementations of the {@code javax.security.auth.*} * interfaces and various authentication modules. * * @provides javax.security.auth.spi.LoginModule Note the {@code} around the package name and the @provides which is not in your webrev. Otherwise looks fine, just make the change above -- no need to post another update. --Sean On 6/19/17 9:17 AM, Weijun Wang wrote: > Updated at http://cr.openjdk.java.net/~weijun/8182118/webrev.02/. > > --Max > > On 06/19/2017 08:23 PM, Sean Mullan wrote: >> On 6/19/17 8:17 AM, Weijun Wang wrote: >>>> There is more than one, so this should be "implementations". >>> >>> In fact, I originally used "implementations" (without "the") and "an >>> implementation", but then I saw the module-info.java for the module >>> saying "Contains the implementation of the javax.security.auth.* >>> interfaces" and thought "the implementation" is always correct. >> >> I don't see where it uses the word "Contains". >> >> I would probably just tweak that to say "Provides implementations of >> the javax.security.auth.* interfaces and various authentication modules." >> >> This would make the wording more consistent in the module and packages. 
>> >> --Sean From mandy.chung at oracle.com Mon Jun 19 14:55:37 2017 From: mandy.chung at oracle.com (Mandy Chung) Date: Mon, 19 Jun 2017 07:55:37 -0700 Subject: RFR 8182118: Package summary is missing in jdk.security.auth module In-Reply-To: <4739962d-cdcb-a34c-c60d-63af0e4fb19e@oracle.com> References: <89713028-a886-ffb1-8f56-a194afc95811@oracle.com> <4739962d-cdcb-a34c-c60d-63af0e4fb19e@oracle.com> Message-ID: <83684E5B-ABFA-43AB-B370-5987D47E7972@oracle.com> Looks fine to me. Mandy > On Jun 19, 2017, at 6:17 AM, Weijun Wang wrote: > > Updated at http://cr.openjdk.java.net/~weijun/8182118/webrev.02/. > > --Max > > On 06/19/2017 08:23 PM, Sean Mullan wrote: >> On 6/19/17 8:17 AM, Weijun Wang wrote: >>>> There is more than one, so this should be "implementations". >>> >>> In fact, I originally used "implementations" (without "the") and "an implementation", but then I saw the module-info.java for the module saying "Contains the implementation of the javax.security.auth.* interfaces" and thought "the implementation" is always correct. >> I don't see where it uses the word "Contains". >> I would probably just tweak that to say "Provides implementations of the javax.security.auth.* interfaces and various authentication modules." >> This would make the wording more consistent in the module and packages. >> --Sean From anders.rundgren.net at gmail.com Tue Jun 20 20:32:36 2017 From: anders.rundgren.net at gmail.com (Anders Rundgren) Date: Tue, 20 Jun 2017 22:32:36 +0200 Subject: Support for CFRG (curve25519 etc) in Java/JCE Message-ID: Hi List, I'm a long-time user of Java and JCE. I've just begun looking into the recently standardized curve25519 crypto. Since I didn't find any JEP or JSR for this I took the liberty of creating an issue on Bouncycastle's GitHub: https://github.com/bcgit/bc-java/issues/193#issuecomment-309183825 WDYT? 
Thanx, Anders Rundgren, Co-inventor of signed JavaScript/JSON objects: https://cyberphone.github.io/doc/security/jcs.html From xuelei.fan at oracle.com Wed Jun 21 00:33:58 2017 From: xuelei.fan at oracle.com (Xuelei Fan) Date: Tue, 20 Jun 2017 17:33:58 -0700 Subject: Code review request: JDK-8046295 - Support Trusted CA Indication extension In-Reply-To: References: <619b0a8b-5be7-1d29-6d8b-acf3cdc34a8c@oracle.com> <66aaaaad-175b-89f9-ba18-254b7881b0d1@oracle.com> <4d15dae7-e1a1-93ab-6b15-e95044c8f2f2@oracle.com> Message-ID: Hi Martin, The TLS 1.3 spec is replacing the Trusted CA Indication (trusted_ca_keys) extension with a new Certificate Authorities (certificate_authorities) extension. See more details about the specification in the TLS 1.3 draft: https://tools.ietf.org/html/draft-ietf-tls-tls13-20#section-4.2.4 Both serve a similar purpose, but the trusted_ca_keys extension will not be used in TLS 1.3 any more. The "trusted_ca_keys" extension will only be used for legacy protocol versions (TLS 1.2/1.1/1.0). There are two options to me: 1. Support the certificate_authorities, but not trusted_ca_keys extension. It is acceptable to me as trusted_ca_keys is for legacy use only and the certificate_authorities extension is the future. Plus, the certificate_authorities extension can also be used for TLS 1.2 and previous versions. 2. Support both the certificate_authorities and trusted_ca_keys extensions. As far as I know, I did not see much benefit of this option unless the trusted_ca_keys extension is widely used in practice. If I did not miss something, the APIs you designed can still be used for the certificate_authorities extension, with a little bit of updating. What do you think? 
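[Editor's note] For concreteness, the minimal indicator shape discussed in this thread (an identifier type plus raw encoded bytes, enough to model either a trusted_ca_keys entry or a certificate_authorities entry) can be sketched as below. All names here are hypothetical and are not the actual webrev code.

```java
// Hypothetical sketch of the minimal API under discussion: a type name
// (e.g. a standard algorithm name) plus the raw encoded indicator bytes.
public class IndicatorSketch {

    interface TrustedAuthorityIndicator {
        String getType();     // e.g. "x509_name" or "cert_sha1_hash"
        byte[] getEncoded();  // raw encoded indicator bytes
    }

    // Simple factory returning an immutable indicator with defensive copies.
    static TrustedAuthorityIndicator of(String type, byte[] encoded) {
        final byte[] copy = encoded.clone();
        return new TrustedAuthorityIndicator() {
            public String getType() { return type; }
            public byte[] getEncoded() { return copy.clone(); }
        };
    }

    public static void main(String[] args) {
        TrustedAuthorityIndicator tai = of("x509_name", new byte[] {0x30, 0x00});
        System.out.println(tai.getType());
        System.out.println(tai.getEncoded().length);
    }
}
```

Either extension's entries could be carried this way, which is why the same API can plausibly serve both.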
Thanks & Regards, Xuelei On 6/15/2017 12:05 PM, Martin Balao wrote: > Hi Xuelei, > > The new webrev.02 is ready: > > * > http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_15/8046295.webrev.02/ > (browse online) > * > http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_15/8046295.webrev.02.zip > (zip, download) > > The following changes have been implemented since the previous webrev.01: > > * s/getUseTrustedCAIndication() methods in SSLEngine/SSLSocket and in > SSLEngineImpl/SSLSocketImpl removed. s/getSSLParameters is now the only > way to set or get the use of the Trusted CA Indication extension. An > exception is no longer thrown if trying to disable the extension for a > server, but the change has no effect as the extension is mandatory for > servers. X509KeyManagerImpl modified to use SSLParameters to get > information regarding whether Trusted CA Indication is enabled and should > guide the certificate choice. > > * TrustedAuthorityIndicator.IdentifierType has been moved from enum to > String, to follow JSSE conventions. I understand how important it is to be > consistent. However, I still believe that an enum is a better fit for > this value and does not prevent future extension. We are choosing > from a closed set (strictly defined by the RFC) and that's what enum > allows to express. From the client point of view/API, it's very handy > that the type gives you information regarding the allowed choices for > the parameter. You don't necessarily have to look for implementation > details or documentation but you can just leverage on the strongly typed > language. It's also likely that enums are faster for comparisons than > strings, but that's not the main point here. > > * Removed X509Certificate from TrustedAuthorityIndicator class (method > and property). 
It was there for informational purposes (when > TrustedAuthorityIndicator was built from a certificate by a client and > the whole extension indicators converted to String). > > * "equals" and "hashCode" methods moved from TrustedAuthorityIndicator > to TrustedAuthorityIndicatorImpl class. > > * "getLength" method removed from TrustedAuthorityIndicator class. > It's possible to get the encoded buffer and the length from there. > > * "getData" method renamed to "getEncoded" in > TrustedAuthorityIndicator class. > > * "trustedAuthorityEncodedData" renamed to "encodedData" in > TrustedAuthorityIndicator and TrustedAuthorityIndicatorImpl classes > > * "identifier" and "encodedData" instance variables moved from > TrustedAuthorityIndicator to TrustedAuthorityIndicatorImpl class. > > * "getEncoded" and "getIdentifier" are now abstract methods in > TrustedAuthorityIndicator, and their implementation is in > TrustedAuthorityIndicatorImpl class. > > * "getIdentifier" method renamed to "getType" in > TrustedAuthorityIndicator and TrustedAuthorityIndicatorImpl classes > ("identifier" instance variable and parameter in > TrustedAuthorityIndicatorImpl class renamed to "type"). > > * Test cases (server and client) updated to reflect the new interface > (enabling the use of the extension through SSLParameters) > > However, some changes are still not implemented and I have some concerns: > > 1) I still believe that identifier type information has to be on > TrustedAuthorityIndicator class somehow, and implementations restricted > on what they can return as part of "getType" method. This is strictly > specified by the RFC TrustedAuthorityIndicator class represents, and I > find it desirable that any implementation is enforced to be compliant with > that. If we remove all of that (including the enum), > TrustedAuthorityIndicator looks too generic and does not reflect (in my > opinion) what it really is. 
It'd also be chaotic if different > implementations call pre-agreed type as "preagreed", "pre-agreed", > "PRE_AGREED", etc. I prefer stricter and more explicit interfaces. > > 2) I agree that type mappings can be seen as part of an implementation, > but they were in TrustedAuthorityIndicator (as protected) because every > implementation is highly likely to need them and we can avoid the > necessity for repeated code/mappings. The same for "type" and > "encodedData" variables or even "hashCode" and "equals" methods. That's > why I was thinking more of an abstract class and not an interface, as it > happens (in example) with SNIServerName. > > 3) I think that "implies" method on TrustedAuthorityIndicator should be > also part of the class/interface, because that's the whole point of a > Trusted Authority Information: to allow queries for a given certificate. > This is, in fact, the only thing a server wants from one of these > objects. My concern is that if we remove this requirement for an > implementation, the interface looks more like a byte buffer holder. > > I'd appreciate it if you could reconsider these items. > > Thanks, > Martin.- > > On Wed, Jun 14, 2017 at 7:17 PM, Xuelei Fan > wrote: > > Hi Martin, > > The big picture of the design looks pretty good to me, except a few > comments about the JSSE conventions. I appreciate it very much. By > the way, I need more time to look into the details of the > specification and implementation. > > > In order to keep the APIs simple and small, SSLParameters is > preferred as the only configuration port for common cases. I may > suggest to remove the s/getUseTrustedCAIndication() methods in > SSLEngine/SSLSocket. > > The identifier type is defined as an enum > TrustedAuthorityIndicator.IdentifierType. In the future, if more > type is added, we need to update the specification by adding a new > enum item. 
Enum is preferred in JDK, but for good extensibility, in > general JSSE does not use enum in public APIs for extensible > properties. I may suggest to use String (or integer/byte, I prefer > to use String) as the type. The standard trusted authority > indicator algorithm (identifier) can be documented in the "Java > Cryptography Architecture Standard Algorithm Name Documentation"[1]. > > In TrustedAuthorityIndicator class, some methods, like > getIdentifierTypeFromCode(), getCodeFromIdentifierType(), implies(), > getLength(), equals() and hashCode() look more like implementation > logic. I may suggest removing them from public APIs. > > I did not see the benefit to have X509Certificate in the > TrustedAuthorityIndicator class. The class is mainly used for > server side certificate selection. X509Certificate could be unknown > for an indicator. I may suggest removing the related methods and > properties. > > After that, as there is no requirement to instantiate > TrustedAuthorityIndicator class in application code, looks like it > may be enough to use an interface to represent a trusted authority: > public interface TrustedAuthorityIndicator { > // identifier type, standard algorithm name > String/int/Byte getType(); > > // identifier > byte[] getEncoded(); > } > > What do you think? > > > Thanks & Regards, > Xuelei > > [1] > https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html > > > > On 6/13/2017 3:41 PM, Martin Balao wrote: > > Hi Xuelei, > > The new webrev.01 is ready: > > * > http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01/ > > (browse online) > * > http://people.redhat.com/mbalaoal/webrevs/jdk_8046295_trusted_ca/2017_06_13/8046295.webrev.01.zip > > (zip, download) > > The following changes have been implemented since the previous > webrev.00: > > * Pre-agreed support removed from server-side > * Unnecessary overhead and minimum benefits for JSSE. 
> > * Enabling the use of Trusted CA Indication extension for > clients through TrustManager objects was reverted. Trusted CA > Indication extension can now be enabled through: 1) SSLEngine, > 2) SSLSocket, or 3) SSLParameters (which can be applied to both > SSLEngine and SSLSocket objects). Trusted CA Indication > extension is mandatory for servers. > > * SunX509KeyManagerImpl old key manager ("SunX509" algorithm) > is now out of scope. This key manager does not support other TLS > extensions as Server Name Indication (SNI), which is far more > relevant than Trusted CA Indication. The new X509KeyManagerImpl > key manager ("PKIX" algorithm) is now in scope. > > * Client requested indications are now an ExtendedSSLSession > attribute. ServerHandshaker gets the information from the Client > Hello message (now parsed by TrustedCAIndicationExtension class > instead of TrustedAuthorityIndicator) and sets it in the > ExtendedSSLSession (SSLSessionImpl object). The key manager > (i.e.: X509KeyManagerImpl), when choosing a server alias, may > now get the information from the ExtendedSSLSession object and > guide the certificate selection based on it. > * In order to allow multiple key managers to use Trusted > Authority Indicators information and to allow multiple Trusted > Authority Indicators implementations, TrustedAuthorityIndicator > has now been split in an abstract class > (TrustedAuthorityIndicator, located in javax.net.ssl) and an > implementation class (TrustedAuthorityIndicatorImpl, located in > sun.security.ssl). No coupling was added between javax.net.ssl > and sun.security.ssl packages. > > * Documentation extended and improved. > * Test cases (server and client) updated to reflect the new > interface and supported key manager. > > Look forward to your new review! > > Kind regards, > Martin.- > > > > On Fri, Jun 9, 2017 at 6:15 PM, Xuelei Fan > > >> > wrote: > > I'm OK to use SSLParameters. Thank you very much for > considering a > new design. 
> > Xuelei > > On 6/9/2017 1:10 PM, Martin Balao wrote: > > Hi Xuelei, > > I didn't notice that some of the SSLSocket contructors > did not > establish the connection, so SSLParameters can be > effective for > Trusted CA Indication. This was an invalid argument on > my side, > sorry. > > As for the configuration to enable the extension, it's > probably > not necessary on the Server side because -as you > mentioned- it > is mandatory and there is no harm in supporting it. > However, it > has to be configurable on the Client side because -as we > previously discussed- the client may cause a handshake > failure > if the server does not support the extension. I'd > prefer the > Client configuring the SSLSocket through SSLParameters > instead > of a system-wide property -which has even more impact > than the > TrustManager approach-. Would this work for you? > > > In JSSE, the benefits pre_agreed option can get by > customizing the key/trust manager, so I did not see too > much > benefits to support this option in JDK > > I understand your point and will remove support for > "pre_agreed". > > > On Fri, Jun 9, 2017 at 1:37 AM, Xuelei Fan > > > > >>> > wrote: > > > > On 6/8/2017 8:36 PM, Xuelei Fan wrote: > > The trusted authorities can be get from client > trust > manager. Server can choose the best matching of > server > certificate of the > client requested trusted authorities. > > > > I missed the point that the key manager need to > know the client > requested trusted authorities for the choosing. > So may > need a new > SSLSession attribute (See similar method in > ExtendedSSLSession). > > Xuelei > > > > Yes, an attribute on SSLSession may do the job (both > when Key > Manager receives a SSLSocket and a SSLEngine). > > Kind regards, > Martin.- > > > From weijun.wang at oracle.com Wed Jun 21 07:05:42 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Wed, 21 Jun 2017 15:05:42 +0800 Subject: How do I know which granted permission is not needed? 
Message-ID: Suppose I have a Java program running with a security manager and a policy file. There are quite a lot of permissions granted in the policy file but maybe not all of them are necessary. Is there a way I can find out which one is not needed? I tried to write my own security manager to remember all permission objects checked and then compare it with the policy file, but if the policy file has permissions granted to different codebases, I cannot tell which one is for which. Thanks Max From bhanu.prakash.gopularam at oracle.com Wed Jun 21 11:27:07 2017 From: bhanu.prakash.gopularam at oracle.com (Bhanu Gopularam) Date: Wed, 21 Jun 2017 04:27:07 -0700 (PDT) Subject: RFR 8181975: Run sun/security/pkcs11 tests on Mac Message-ID: Hi all, Please review fix for following test bug: Bug Id - https://bugs.openjdk.java.net/browse/JDK-8181975 In test/sun/security/pkcs11/PKCS11Test.java updated path for nss-libs on MacOSX platform. Webrev - http://cr.openjdk.java.net/~bgopularam/bhanu/8181975/webrev.00/ Thanks, Bhanu From sean.mullan at oracle.com Wed Jun 21 11:53:23 2017 From: sean.mullan at oracle.com (Sean Mullan) Date: Wed, 21 Jun 2017 07:53:23 -0400 Subject: How do I know which granted permission is not needed? In-Reply-To: References: Message-ID: On 6/21/17 3:05 AM, Weijun Wang wrote: > Suppose I have a Java program running with a security manager and a > policy file. There are quite a lot of permissions granted in the policy > file but maybe not all of them are necessary. > > Is there a way I can find out which one is not needed? I don't know of any easy way to do that, other than code inspection and writing tests that exercise different code paths. --Sean > > I tried to write my own security manager to remember all permission > objects checked and then compare it with the policy file, but if the > policy file has permissions granted to different codebases, I cannot > tell which one is for which. 
> > Thanks > Max From sean.coffey at oracle.com Wed Jun 21 12:15:37 2017 From: sean.coffey at oracle.com (Seán Coffey) Date: Wed, 21 Jun 2017 13:15:37 +0100 Subject: RFR 8181975: Run sun/security/pkcs11 tests on Mac In-Reply-To: References: Message-ID: <3eb155c2-1ecc-1f09-3077-a1968290b5b7@oracle.com> Looks fine to me. Regards, Sean. On 21/06/17 12:27, Bhanu Gopularam wrote: > Hi all, > > Please review fix for following test bug: > > Bug Id - https://bugs.openjdk.java.net/browse/JDK-8181975 > > In test/sun/security/pkcs11/PKCS11Test.java updated path for nss-libs on MacOSX platform. > > Webrev - http://cr.openjdk.java.net/~bgopularam/bhanu/8181975/webrev.00/ > > Thanks, > Bhanu From sean.coffey at oracle.com Wed Jun 21 14:34:45 2017 From: sean.coffey at oracle.com (Seán Coffey) Date: Wed, 21 Jun 2017 15:34:45 +0100 Subject: How do I know which granted permission is not needed? In-Reply-To: References: Message-ID: you're most likely aware of this debug option but the java.security.debug option allows 'access' which should give you a lot more information about each permission check that's been made. Maybe it's a case of scanning the output for permissions not checked and seeing if they're really necessary in your policy file. https://docs.oracle.com/javase/8/docs/technotes/guides/security/troubleshooting-security.html Regards, Sean. On 21/06/17 12:53, Sean Mullan wrote: > On 6/21/17 3:05 AM, Weijun Wang wrote: >> Suppose I have a Java program running with a security manager and a >> policy file. There are quite a lot of permissions granted in the >> policy file but maybe not all of them are necessary. >> >> Is there a way I can find out which one is not needed? > > I don't know of any easy way to do that, other than code inspection > and writing tests that exercise different code paths. 
> > --Sean > >> >> I tried to write my own security manager to remember all permission >> objects checked and then compare it with the policy file, but if the >> policy file has permissions granted to different codebases, I cannot >> tell which one is for which. >> >> Thanks >> Max From sean.coffey at oracle.com Wed Jun 21 14:58:36 2017 From: sean.coffey at oracle.com (=?UTF-8?Q?Se=c3=a1n_Coffey?=) Date: Wed, 21 Jun 2017 15:58:36 +0100 Subject: Support for CFRG (curve25519 etc) in Java/JCE In-Reply-To: References: Message-ID: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com> This appears to be tracked via https://bugs.openjdk.java.net/browse/JDK-8171277 Regards, Sean. On 20/06/17 21:32, Anders Rundgren wrote: > Hi List, > I'm an long time user of Java and JCE. > > I've just begun looking into the recently standardized curve25519 crypto. > > Since I didn't find any JEP or JSR for this I took the liberty > creating an issue on Bouncycastle's GitHub: > https://github.com/bcgit/bc-java/issues/193#issuecomment-309183825 > > WDYT? > > Thanx, > Anders Rundgren, > Co-inventor of signed JavaScript/JSON objects: > https://cyberphone.github.io/doc/security/jcs.html From anders.rundgren.net at gmail.com Wed Jun 21 15:20:20 2017 From: anders.rundgren.net at gmail.com (Anders Rundgren) Date: Wed, 21 Jun 2017 17:20:20 +0200 Subject: Support for CFRG (curve25519 etc) in Java/JCE In-Reply-To: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com> References: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com> Message-ID: On 2017-06-21 16:58, Se?n Coffey wrote: > This appears to be tracked via > https://bugs.openjdk.java.net/browse/JDK-8171277 Thanx, but at this stage I'm mainly concerned about the specification. Is the specification also supposed to be created in the issue above? Regards, Anders https://github.com/bcgit/bc-java/issues/193#issuecomment-309183825 > > Regards, > Sean. > > On 20/06/17 21:32, Anders Rundgren wrote: >> Hi List, >> I'm an long time user of Java and JCE. 
>> >> I've just begun looking into the recently standardized curve25519 crypto. >> >> Since I didn't find any JEP or JSR for this I took the liberty >> creating an issue on Bouncycastle's GitHub: >> https://github.com/bcgit/bc-java/issues/193#issuecomment-309183825 >> >> WDYT? >> >> Thanx, >> Anders Rundgren, >> Co-inventor of signed JavaScript/JSON objects: >> https://cyberphone.github.io/doc/security/jcs.html > From weijun.wang at oracle.com Wed Jun 21 15:20:43 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Wed, 21 Jun 2017 23:20:43 +0800 Subject: How do I know which granted permission is not needed? In-Reply-To: References: Message-ID: <3cec82a4-5af5-74d3-64d8-ccb6eaa1831e@oracle.com> On 06/21/2017 10:34 PM, Se?n Coffey wrote: > you're mostly likely aware of this debug option but the > java.security.debug option allows 'access' which should give you alot > more information about each permission check that's been made. Maybe > it's a case of scanning the output for permissions not checked and > seeing if they're really necessary in your policy file. This is useful, but I still don't know what code source the permission is granted to. For example, suppose I have 2 codebases all granting the same permission. By reading the -Djava.security.debug=access output I cannot find out if one is actually not needed. Daniel suggests I can write my own Policy implementation. > > https://docs.oracle.com/javase/8/docs/technotes/guides/security/troubleshooting-security.html > > > Regards, > Sean. > > On 21/06/17 12:53, Sean Mullan wrote: >> On 6/21/17 3:05 AM, Weijun Wang wrote: >>> Suppose I have a Java program running with a security manager and a >>> policy file. There are quite a lot of permissions granted in the >>> policy file but maybe not all of them are necessary. >>> >>> Is there a way I can find out which one is not needed? >> >> I don't know of any easy way to do that, other than code inspection >> and writing tests that exercise different code paths. 
I didn't mean to achieve that goal. I only want to know what granted
permissions are not checked in one execution.

Thanks
Max

>>
>> --Sean
>>
>>>
>>> I tried to write my own security manager to remember all permission
>>> objects checked and then compare it with the policy file, but if the
>>> policy file has permissions granted to different codebases, I cannot
>>> tell which one is for which.
>>>
>>> Thanks
>>> Max
>

From sean.mullan at oracle.com Wed Jun 21 15:29:40 2017
From: sean.mullan at oracle.com (Sean Mullan)
Date: Wed, 21 Jun 2017 11:29:40 -0400
Subject: How do I know which granted permission is not needed?
In-Reply-To: <3cec82a4-5af5-74d3-64d8-ccb6eaa1831e@oracle.com>
References: <3cec82a4-5af5-74d3-64d8-ccb6eaa1831e@oracle.com>
Message-ID: 

On 6/21/17 11:20 AM, Weijun Wang wrote:
>
>
> On 06/21/2017 10:34 PM, Seán Coffey wrote:
>> you're most likely aware of this debug option but the
>> java.security.debug option allows 'access' which should give you a lot
>> more information about each permission check that's been made. Maybe
>> it's a case of scanning the output for permissions not checked and
>> seeing if they're really necessary in your policy file.
>
> This is useful, but I still don't know what code source the permission
> is granted to.
>
> For example, suppose I have 2 codebases all granting the same
> permission. By reading the -Djava.security.debug=access output I cannot
> find out if one is actually not needed.
>
> Daniel suggests I can write my own Policy implementation.
>
>>
>> https://docs.oracle.com/javase/8/docs/technotes/guides/security/troubleshooting-security.html
>>
>>
>> Regards,
>> Sean.
>>
>> On 21/06/17 12:53, Sean Mullan wrote:
>>> On 6/21/17 3:05 AM, Weijun Wang wrote:
>>>> Suppose I have a Java program running with a security manager and a
>>>> policy file. There are quite a lot of permissions granted in the
>>>> policy file but maybe not all of them are necessary.
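The custom Policy approach mentioned above can be sketched as a small recorder that a Policy subclass would delegate its implies() checks to. This is only an illustration — the GrantTracker class and its method names are invented here, and a real implementation would also have to merge in the JDK's default system grants:

```java
import java.net.URL;
import java.security.CodeSource;
import java.security.Permission;
import java.security.cert.Certificate;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class GrantTracker {
    // codesource -> permissions granted to it (mirrors the policy file)
    private final Map<CodeSource, Set<Permission>> grants = new HashMap<>();
    // string keys of (codesource, permission) grants that satisfied a check
    private final Set<String> used = new HashSet<>();

    public void grant(CodeSource cs, Permission p) {
        grants.computeIfAbsent(cs, k -> new HashSet<>()).add(p);
    }

    // A custom Policy.implies(ProtectionDomain, Permission) would call this,
    // so every satisfied check marks the exact grant that satisfied it.
    public boolean implies(CodeSource caller, Permission request) {
        for (Map.Entry<CodeSource, Set<Permission>> e : grants.entrySet()) {
            if (e.getKey().implies(caller)) {
                for (Permission granted : e.getValue()) {
                    if (granted.implies(request)) {
                        used.add(key(e.getKey(), granted));
                        return true;
                    }
                }
            }
        }
        return false;
    }

    // Grants never exercised during the run: candidates for removal.
    public List<String> unused() {
        List<String> out = new ArrayList<>();
        grants.forEach((cs, perms) -> perms.forEach(p -> {
            if (!used.contains(key(cs, p))) out.add(key(cs, p));
        }));
        return out;
    }

    private static String key(CodeSource cs, Permission p) {
        return cs.getLocation() + " -> " + p;
    }

    // Convenience wrapper so callers need not handle MalformedURLException.
    public static CodeSource source(String url) {
        try {
            return new CodeSource(new URL(url), (Certificate[]) null);
        } catch (java.net.MalformedURLException e) {
            throw new IllegalArgumentException(url, e);
        }
    }

    public static void main(String[] args) {
        GrantTracker tracker = new GrantTracker();
        tracker.grant(GrantTracker.source("file:/app/-"),
                new java.util.PropertyPermission("user.home", "read"));
        tracker.grant(GrantTracker.source("file:/app/-"),
                new RuntimePermission("exitVM"));

        // Simulate one permission check made while code from /app was running.
        tracker.implies(GrantTracker.source("file:/app/x.jar"),
                new java.util.PropertyPermission("user.home", "read"));

        tracker.unused().forEach(System.out::println); // only the exitVM grant is left
    }
}
```

In a real run the grant list would be loaded from the policy file and the wrapping Policy installed via Policy.setPolicy() before any checks happen; the unused() report then lists removal candidates per codebase, which is exactly the "which grant belongs to which codebase" information the access debug output lacks.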
>>>> >>>> Is there a way I can find out which one is not needed? >>> >>> I don't know of any easy way to do that, other than code inspection >>> and writing tests that exercise different code paths. > > I didn't meant to achieve that goal. I only want to know what granted > permissions are not checked in one execution. Hmm. Just remove all granted permissions then, and grant them one by one until it runs w/o error? --Sean > > Thanks > Max > >>> >>> --Sean >>> >>>> >>>> I tried to write my own security manager to remember all permission >>>> objects checked and then compare it with the policy file, but if the >>>> policy file has permissions granted to different codebases, I cannot >>>> tell which one is for which. >>>> >>>> Thanks >>>> Max >> From adam.petcher at oracle.com Wed Jun 21 15:31:47 2017 From: adam.petcher at oracle.com (Adam Petcher) Date: Wed, 21 Jun 2017 11:31:47 -0400 Subject: Support for CFRG (curve25519 etc) in Java/JCE In-Reply-To: References: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com> Message-ID: On 6/21/2017 11:20 AM, Anders Rundgren wrote: > > Thanx, but at this stage I'm mainly concerned about the specification. > > Is the specification also supposed to be created in the issue above? There is a JEP in development for RFC 7748, and I expect the API/spec will go in the JEP. It is still in an early phase of development, though. Also, there is a separate ticket for EdDSA: https://bugs.openjdk.java.net/browse/JDK-8166597 > > Regards, > Anders From anders.rundgren.net at gmail.com Wed Jun 21 15:41:39 2017 From: anders.rundgren.net at gmail.com (Anders Rundgren) Date: Wed, 21 Jun 2017 17:41:39 +0200 Subject: Support for CFRG (curve25519 etc) in Java/JCE In-Reply-To: References: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com> Message-ID: On 2017-06-21 17:31, Adam Petcher wrote: > On 6/21/2017 11:20 AM, Anders Rundgren wrote: >> >> Thanx, but at this stage I'm mainly concerned about the specification. 
>> >> Is the specification also supposed to be created in the issue above? > > There is a JEP in development for RFC 7748, and I expect the API/spec > will go in the JEP. It is still in an early phase of development, > though. Also, there is a separate ticket for EdDSA: > https://bugs.openjdk.java.net/browse/JDK-8166597 That's great! Note that the highly related draft https://tools.ietf.org/html/draft-ietf-curdle-pkix-04 and RFC https://tools.ietf.org/html/rfc8037 do not overload existing EC constructs. Regards, Anders > >> >> Regards, >> Anders > From weijun.wang at oracle.com Wed Jun 21 15:54:40 2017 From: weijun.wang at oracle.com (Weijun Wang) Date: Wed, 21 Jun 2017 23:54:40 +0800 Subject: How do I know which granted permission is not needed? In-Reply-To: References: <3cec82a4-5af5-74d3-64d8-ccb6eaa1831e@oracle.com> Message-ID: <2ecfe111-4f88-0f1d-84c5-25f38d939b74@oracle.com> On 06/21/2017 11:29 PM, Sean Mullan wrote: > > Hmm. Just remove all granted permissions then, and grant them one by one > until it runs w/o error? The test is meant to ensure that any future src code change will not accidentally "remove" a required permission. i.e. if perm A is needed today the test wants to ensure it is always needed in the future. The test is not manual and should run automatically. --Max From sha.jiang at oracle.com Thu Jun 22 02:40:58 2017 From: sha.jiang at oracle.com (sha.jiang at oracle.com) Date: Thu, 22 Jun 2017 10:40:58 +0800 Subject: RFR[10] JDK-8177017: com/oracle/security/ucrypto/TestAES.java fails intermittently Message-ID: <28fb221d-3d1c-8e4b-415e-dfacba28dc98@oracle.com> Hi, According to JDK-8173708, the cases on CFB128 in test com/oracle/security/ucrypto/TestAES.java should be skipped on Solaris 11.2 and previous versions due to a Solaris bug. 
Please review the patch at: http://cr.openjdk.java.net/~jjiang/8177017/webrev.00/

Best regards,
John Jiang

From ecki at zusammenkunft.net Thu Jun 22 08:29:52 2017
From: ecki at zusammenkunft.net (Bernd Eckenfels)
Date: Thu, 22 Jun 2017 08:29:52 +0000
Subject: RFR[10] JDK-8177017: com/oracle/security/ucrypto/TestAES.java fails intermittently
In-Reply-To: <28fb221d-3d1c-8e4b-415e-dfacba28dc98@oracle.com>
References: <28fb221d-3d1c-8e4b-415e-dfacba28dc98@oracle.com>
Message-ID: 

Would it be better to not skip the test but remove a broken cipher from the provider in those known circumstances?

Gruss
Bernd

--
http://bernd.eckenfels.net

________________________________
From: security-dev on behalf of sha.jiang at oracle.com
Sent: Thursday, June 22, 2017 4:40:58 AM
To: security-dev at openjdk.java.net; Valerie Peng
Subject: RFR[10] JDK-8177017: com/oracle/security/ucrypto/TestAES.java fails intermittently

Hi,
According to JDK-8173708, the cases on CFB128 in test com/oracle/security/ucrypto/TestAES.java should be skipped on Solaris 11.2 and previous versions due to a Solaris bug.
Please review the patch at: http://cr.openjdk.java.net/~jjiang/8177017/webrev.00/

Best regards,
John Jiang

From anders.rundgren.net at gmail.com Fri Jun 23 07:57:16 2017
From: anders.rundgren.net at gmail.com (Anders Rundgren)
Date: Fri, 23 Jun 2017 09:57:16 +0200
Subject: OKP Keys? Was: Support for CFRG (curve25519 etc) in Java/JCE
In-Reply-To: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com>
References: <5d106cb5-1b70-5a1a-d024-1d208caef353@oracle.com>
Message-ID: 

The COSE draft also refers to "OKP" keys:
https://tools.ietf.org/html/draft-ietf-cose-msg-24
which in my opinion speaks for separating the CFRG algorithms from EC.
OKP keys have as far as I can tell (I'm not a cryptographer, just an applier of cryptography), no ECPoint, coFactor, and ECField and an entirely different ASN.1 representation compared to P-256 et al. WDYT? Anders Just updated: https://github.com/bcgit/bc-java/issues/193#issuecomment-309183825 From anders.rundgren.net at gmail.com Sat Jun 24 19:33:14 2017 From: anders.rundgren.net at gmail.com (Anders Rundgren) Date: Sat, 24 Jun 2017 21:33:14 +0200 Subject: Java/JCE CFRG integration spec on GitHub Message-ID: I turned this topic into a separate GitHub repository instead of a BC issue (which it really isn't). I would be very happy to get some feedback on this proposal, including pull requests. There's a certain urgency, since the BC implementation is just about to start, while I guess Oracle is targeting JDK 10, or a JDK 9 update. https://github.com/cyberphone/java-cfrg-spec Anders From anders.rundgren.net at gmail.com Sun Jun 25 06:21:08 2017 From: anders.rundgren.net at gmail.com (Anders Rundgren) Date: Sun, 25 Jun 2017 08:21:08 +0200 Subject: JDK 8 does not comply with RFC 5915 Message-ID: During the work with https://github.com/cyberphone/java-cfrg-spec I had to look at the PKCS #8 spec as well. It turns out that JDK 8 does not comply with RFC 5915's SHOULD since EC private keys created by KeyPairGenerator do not contain public key info when getEncoded(). I didn't check PKCS #8 de-serialization and serialization but I guess it doesn't work for that either. This is by no means serious, but differs from BouncyCastle as well as OpenSSL. Anders From mstjohns at comcast.net Mon Jun 26 15:58:08 2017 From: mstjohns at comcast.net (Michael StJohns) Date: Mon, 26 Jun 2017 11:58:08 -0400 Subject: JDK 8 does not comply with RFC 5915 In-Reply-To: References: Message-ID: On 6/25/2017 2:21 AM, Anders Rundgren wrote: > During the work with https://github.com/cyberphone/java-cfrg-spec I > had to look at the PKCS #8 spec as well. 
> It turns out that JDK 8 does not comply with RFC 5915's SHOULD since > EC private keys created by KeyPairGenerator do not contain public key > info when getEncoded(). > I didn't check PKCS #8 de-serialization and serialization but I guess > it doesn't work for that either. > > This is by no means serious, but differs from BouncyCastle as well as > OpenSSL. > > Anders Umm... SHOULD is not a MUST - JDK8 does comply with the RFC, it just doesn't provide the "convenient" field: > The publicKey > field can be omitted when the public key has been distributed via > another mechanism, which is beyond the scope of this document. > Given the private key and the parameters, the public key can > always be recomputed; this field exists as a convenience to the > consumer. I always thought that RFC5915 should have specified "MAY" there instead. The main reason is that its trivial to reconstitute the public key from the private key so there is mostly no need to keep the two together and the actual text suggested as much. Ideally, there should be a way to control what gets included in the encoding - but the "getEncoded()" method doesn't permit an argument for format. Later, Mike -------------- next part -------------- An HTML attachment was scrubbed... URL: From anders.rundgren.net at gmail.com Mon Jun 26 18:56:12 2017 From: anders.rundgren.net at gmail.com (Anders Rundgren) Date: Mon, 26 Jun 2017 20:56:12 +0200 Subject: JDK 8 does not comply with RFC 5915 In-Reply-To: References: Message-ID: On 2017-06-26 17:58, Michael StJohns wrote: > On 6/25/2017 2:21 AM, Anders Rundgren wrote: >> During the work with https://github.com/cyberphone/java-cfrg-spec I had to look at the PKCS #8 spec as well. >> It turns out that JDK 8 does not comply with RFC 5915's SHOULD since EC private keys created by KeyPairGenerator do not contain public key info when getEncoded(). >> I didn't check PKCS #8 de-serialization and serialization but I guess it doesn't work for that either. 
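The serialization question quoted above is easy to check with JDK-only code. A quick sketch (the class name is invented for the example) showing that an EC private key survives a PKCS#8 encode/decode cycle even without the optional public key field:

```java
import java.security.KeyFactory;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.interfaces.ECPrivateKey;
import java.security.spec.ECGenParameterSpec;
import java.security.spec.PKCS8EncodedKeySpec;

public class Pkcs8RoundTrip {
    // Encode an EC private key as PKCS#8, decode it again, and report
    // whether the private scalar S survived the trip.
    public static boolean roundTrips() {
        try {
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC");
            kpg.initialize(new ECGenParameterSpec("secp256r1"));
            KeyPair kp = kpg.generateKeyPair();

            byte[] pkcs8 = kp.getPrivate().getEncoded(); // PKCS#8 DER bytes
            ECPrivateKey restored = (ECPrivateKey) KeyFactory.getInstance("EC")
                    .generatePrivate(new PKCS8EncodedKeySpec(pkcs8));

            return ((ECPrivateKey) kp.getPrivate()).getS().equals(restored.getS());
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("PKCS#8 round trip ok: " + roundTrips());
    }
}
```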
>> >> This is by no means serious, but differs from BouncyCastle as well as OpenSSL. >> >> Anders > > Umm... SHOULD is not a MUST - JDK8 does comply with the RFC, it just doesn't provide the "convenient" field: > >> The publicKey >> field can be omitted when the public key has been distributed via >> another mechanism, which is beyond the scope of this document. >> Given the private key and the parameters, the public key can >> always be recomputed; this field exists as a convenience to the >> consumer. > > I always thought that RFC5915 should have specified "MAY" there instead. The main reason is that its trivial to reconstitute the public key from the private key so there is mostly no need to keep the two together and the actual text suggested as much. Right, for standards developers this is clear. Practitioners OTOH only notice (usually the hard way) that some systems do this and some do that. In this particular case the SHOULD have been interpreted differently by Oracle than by BouncyCastle and OpenSSL. BTW, newer standards like JWK declares this feature as a MUST: https://tools.ietf.org/html/rfc7518#section-6.2.2 > > Ideally, there should be a way to control what gets included in the encoding - but the "getEncoded()" method doesn't permit an argument for format. Yes, indeed. At this late stage we can (at best) hope for "normalization". thanx, Anders > > Later, Mike > > > From mstjohns at comcast.net Mon Jun 26 21:44:56 2017 From: mstjohns at comcast.net (Michael StJohns) Date: Mon, 26 Jun 2017 17:44:56 -0400 Subject: JDK 8 does not comply with RFC 5915 In-Reply-To: References: Message-ID: Inline. On 6/26/2017 2:56 PM, Anders Rundgren wrote: > On 2017-06-26 17:58, Michael StJohns wrote: >> >> Umm... SHOULD is not a MUST - JDK8 does comply with the RFC, it just >> doesn't provide the "convenient" field: >> >> >> I always thought that RFC5915 should have specified "MAY" there >> instead. 
The main reason is that it's trivial to reconstitute the
>> public key from the private key so there is mostly no need to keep
>> the two together and the actual text suggested as much.
>
> Right, for standards developers this is clear. Practitioners OTOH only
> notice (usually the hard way) that some systems do this and some do that.
>
> In this particular case the SHOULD have been interpreted differently
> by Oracle than by BouncyCastle and OpenSSL.

And all three of them are compliant with RFC5915 AFAICT.

>
> BTW, newer standards like JWK declares this feature as a MUST:
> https://tools.ietf.org/html/rfc7518#section-6.2.2

After thinking about it, I'm not sure why you think this is necessary.

Here's why I think it isn't:
1) The whole idea of plugin providers is to be able to provide different
services and sometimes in different manners. E.g. it's OK that
Bouncycastle does it one way and Oracle does it another - I might
actually want the more compact format due to storage or transmission
constraints. (Or to fit encrypted into a barcode...)
2) PKCS8 as a standalone object is one of the least useful encodings -
I'd rather have a PKCS12 most times. Or the bare private value, or a
handle to the object on an HSM. I usually end up having to convert
anything generated by OpenSSL into one of those forms.
3) I don't think even bouncy castle's KeyFactory implementation of an EC
factory allows you to generate a public key from a PKCS8 object.
4) I'm pretty sure that the Sun EC key factory implementation doesn't
choke on a PKCS8 object that contains a public key.
5) All providers that do EC that I'm aware of (e.g. Bouncycastle, IAIK
and Sun) expect the input data for a key factory to generate an EC
public key to be a SubjectPublicKeyInfo blob.
6) It's *really* mostly not a good idea to be using the PKCS8 object
model - it has no protection at all for the private key.
I've used it exactly once and paired it with the PKCS12 SafeBag
constructs (PKCS8ShroudedKeyBag) to provide protection.
7) the java.security.spec.PKCS8EncodedKeySpec has no way of extracting
or setting the attributes section of the private key (where the public
key is stored), so there's really no reason to implement support for the
public key unless the implementing provider needs it or can use it in
some way.
8) Java doesn't even provide a way to figure out the type of the PKCS8
private key - e.g. no way to grab the OID that describes the underlying
key material from the PKCS8EncodedKeySpec - so you actually have to know
it's an EC key some other way.

So how would you use the additional Public Key data in Java?

Later, Mike

From anders.rundgren.net at gmail.com Tue Jun 27 03:04:25 2017
From: anders.rundgren.net at gmail.com (Anders Rundgren)
Date: Tue, 27 Jun 2017 05:04:25 +0200
Subject: JDK 8 does not comply with RFC 5915
In-Reply-To: 
References: 
Message-ID: 

Michael,
You are correct, I just wanted to point out the different interpretations of SHOULD.

Since you obviously are pretty familiar with Java/JCE, would it be possible for you to spend a few minutes reviewing:
https://github.com/cyberphone/java-cfrg-spec ?

I've just removed the note about private key serialization :-)

Anders

On 2017-06-26 23:44, Michael StJohns wrote:
> Inline.
>
> On 6/26/2017 2:56 PM, Anders Rundgren wrote:
>> On 2017-06-26 17:58, Michael StJohns wrote:
>>>
>>> Umm... SHOULD is not a MUST - JDK8 does comply with the RFC, it just
>>> doesn't provide the "convenient" field:
>>>
>>>
>>> I always thought that RFC5915 should have specified "MAY" there
>>> instead. The main reason is that it's trivial to reconstitute the
>>> public key from the private key so there is mostly no need to keep
>>> the two together and the actual text suggested as much.
>>
>> Right, for standards developers this is clear.
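Mike's observation that the public key can always be recomputed from the private scalar is easy to demonstrate with JDK-only types. The sketch below (class and helper names invented; prime-field curves only, and deliberately non-constant-time textbook arithmetic, so not for production use) multiplies the curve generator by S and compares the result with the provider's public point:

```java
import java.math.BigInteger;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.interfaces.ECPrivateKey;
import java.security.interfaces.ECPublicKey;
import java.security.spec.ECFieldFp;
import java.security.spec.ECGenParameterSpec;
import java.security.spec.ECParameterSpec;
import java.security.spec.ECPoint;

public class RecoverPublic {
    // Affine point addition on y^2 = x^3 + ax + b over F_p; null is the point at infinity.
    static BigInteger[] add(BigInteger[] P, BigInteger[] Q, BigInteger a, BigInteger p) {
        if (P == null) return Q;
        if (Q == null) return P;
        BigInteger lambda;
        if (P[0].equals(Q[0])) {
            if (P[1].add(Q[1]).mod(p).signum() == 0) return null; // P + (-P) = infinity
            lambda = P[0].pow(2).multiply(BigInteger.valueOf(3)).add(a)
                    .multiply(P[1].shiftLeft(1).modInverse(p)).mod(p); // tangent (doubling)
        } else {
            lambda = Q[1].subtract(P[1])
                    .multiply(Q[0].subtract(P[0]).modInverse(p)).mod(p); // chord
        }
        BigInteger x = lambda.pow(2).subtract(P[0]).subtract(Q[0]).mod(p);
        BigInteger y = lambda.multiply(P[0].subtract(x)).subtract(P[1]).mod(p);
        return new BigInteger[] { x, y };
    }

    // Left-to-right double-and-add scalar multiplication.
    static BigInteger[] mul(BigInteger k, BigInteger[] g, BigInteger a, BigInteger p) {
        BigInteger[] r = null;
        for (int i = k.bitLength() - 1; i >= 0; i--) {
            r = add(r, r, a, p);
            if (k.testBit(i)) r = add(r, g, a, p);
        }
        return r;
    }

    // Generates a fresh secp256r1 pair and checks S*G against the provider's public key.
    public static boolean matches() {
        try {
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC");
            kpg.initialize(new ECGenParameterSpec("secp256r1"));
            KeyPair kp = kpg.generateKeyPair();

            ECPrivateKey priv = (ECPrivateKey) kp.getPrivate();
            ECParameterSpec spec = priv.getParams();
            BigInteger p = ((ECFieldFp) spec.getCurve().getField()).getP();
            BigInteger a = spec.getCurve().getA();
            BigInteger[] g = { spec.getGenerator().getAffineX(),
                               spec.getGenerator().getAffineY() };

            BigInteger[] w = mul(priv.getS(), g, a, p);
            ECPoint expected = ((ECPublicKey) kp.getPublic()).getW();
            return w[0].equals(expected.getAffineX()) && w[1].equals(expected.getAffineY());
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("recomputed public key matches: " + matches());
    }
}
```

This is essentially what a consumer would have to do when the optional publicKey field of an RFC 5915 structure has been omitted.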
Practitioners OTOH only >> notice (usually the hard way) that some systems do this and some do that. >> >> In this particular case the SHOULD have been interpreted differently >> by Oracle than by BouncyCastle and OpenSSL. > > And all three of them are compliant with RFC5915 AFAICT. > >> >> BTW, newer standards like JWK declares this feature as a MUST: >> https://tools.ietf.org/html/rfc7518#section-6.2.2 > > After thinking about it, I'm not sure why you think this is necessary. > > Here's why I think it isn't: > 1) The whole idea of plugin providers is to be able to provide different > services and sometimes in different manners. E.g. it's OK that > Bouncycastle does it one way and Oracle does it another - I might > actually want the more compact format due to storage or transmission > constraints. (Or to fit encrypted into a barcode...) > 2) PKCS8 as a standalone object is one of the least useful encodings - > I'd rather have a PKCS12 most times. Or the bare private value, or a > handle to the object on an HSM. I usually end up having to convert > anything generated by OpenSSL into one of those forms. > 3) I don't think even bouncy castle's KeyFactory implementation of an EC > factory allows you to generate a public key from a PKCS8 object. > 4) I'm pretty sure that the Sun EC key factory implementation doesn't > choke on a PKCS8 object that contains an Public key. > 5) All providers that do EC that I'm aware of (e.g. Bouncycastle, IAIK > and Sun) expect the input data for a key factory to generate an EC > public key to be a SubjectPublicKeyInfo blob. > 6) It's *really* mostly not a good idea to be using the PKCS8 object > model - it has no protection at all for the private key. I've used it > exactly once and paired it with the PKCS12 SafeBag constructs > (PKCS8ShroudedKeyBag) to provide protection. 
> 7) the java.security.spec.PKCS8EncodedKeySpec has no way of extracting
> or setting the attributes section of the private key (where the public
> key is stored), so there's really no reason to implement support for the
> public key unless the implementing provider needs it or can use it in
> some way.
> 8) Java doesn't even provide a way to figure out the type of the PKCS8
> private key - e.g. no way to grab the OID that describes the underlying
> key material from the PKCS8EncodedKeySpec - so you actually have to know
> it's an EC key some other way.
>
> So how would you use the additional Public Key data in Java?
>
> Later, Mike

From anders.rundgren.net at gmail.com Wed Jun 28 16:11:50 2017
From: anders.rundgren.net at gmail.com (Anders Rundgren)
Date: Wed, 28 Jun 2017 18:11:50 +0200
Subject: ECNamedCurveSpec
Message-ID: 

Highly related to https://github.com/cyberphone/java-cfrg-spec (no takers?)
is the fact that quite a lot of code out there depends on
org.bouncycastle.jce.spec.ECNamedCurveSpec.

Are there any plans to create a "true" Java version of ECNamedCurveSpec,
or are we stuck with "workarounds":
https://stackoverflow.com/questions/22646792/how-does-one-convert-a-public-ec-code-point-and-curve-name-into-a-publickey

Anders
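For the record, the usual JDK-only workaround referenced above — resolving a curve name into a java.security.spec.ECParameterSpec via AlgorithmParameters — is short; a sketch (the class name is invented). The reverse mapping, from an ECParameterSpec back to its name, remains the part the JDK does not expose:

```java
import java.security.AlgorithmParameters;
import java.security.spec.ECGenParameterSpec;
import java.security.spec.ECParameterSpec;

public class NamedCurves {
    // Resolve a named curve (e.g. "secp256r1") into an ECParameterSpec
    // without BouncyCastle's ECNamedCurveSpec.
    public static ECParameterSpec byName(String name) {
        try {
            AlgorithmParameters params = AlgorithmParameters.getInstance("EC");
            params.init(new ECGenParameterSpec(name));
            return params.getParameterSpec(ECParameterSpec.class);
        } catch (Exception e) {
            throw new IllegalArgumentException("Unsupported curve: " + name, e);
        }
    }

    public static void main(String[] args) {
        ECParameterSpec p256 = byName("secp256r1");
        System.out.println("field size: " + p256.getCurve().getField().getFieldSize());
        System.out.println("cofactor:   " + p256.getCofactor());
    }
}
```

The resulting spec can be fed straight into KeyPairGenerator.initialize() or used to build ECPublicKeySpec objects from a raw point, which covers most of what code typically used ECNamedCurveSpec for.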