From staffan.larsen at oracle.com Fri Apr 25 12:02:53 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Fri, 25 Apr 2014 14:02:53 +0200 Subject: Proposal: jtreg tests with native components Message-ID: There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: 1) The binaries are pre-compiled and checked into the repository (often inside jar files). 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is being run. Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. #2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. We will also need a way to tell the jtreg tests where these pre-built binaries are located. I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it?s native components. This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. Thanks, /Staffan From jonathan.gibbons at oracle.com Fri Apr 25 15:31:06 2014 From: jonathan.gibbons at oracle.com (Jonathan Gibbons) Date: Fri, 25 Apr 2014 08:31:06 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: References: Message-ID: <535A7FBA.5070300@oracle.com> I'll quibble over the phrase "the same makefile logic". I think it is OK to use the same Makefile infrastructure (e.g. the configure.sh mechanism) and the same top level Makefile, but at some level this is going to need to be distinct Makefile logic specific to compiling the code for the tests. I agree with the general concept of pre-building binaries, but it would be good to see the next level of detail: -- where is the source code for the native code of the tests -- is it one library per test, or what -- what sort of hierarchy do the libraries end up in. I am also concerned for the developer experience. One of the characteristics of the jtreg design has always been that it is dev-friendly, meaning it is easy and fast for developers to edit a test and rerun it. 
For myself, I don't work on native code, or on repos containing native code, so I'd be interested to hear how this will impact developers working on tests, and on those folk that simply want to run the tests. -- Jon On 04/25/2014 05:02 AM, Staffan Larsen wrote: > There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: > > 1) The binaries are pre-compiled and checked into the repository (often inside jar files). > 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is being run. > > Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. #2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. > > I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. > > If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. We will also need a way to tell the jtreg tests where these pre-built binaries are located. > > I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. > > To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it?s native components. > > This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. > > Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. > > Thanks, > /Staffan > From staffan.larsen at oracle.com Mon Apr 28 08:08:59 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Mon, 28 Apr 2014 10:08:59 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: <535A7FBA.5070300@oracle.com> References: <535A7FBA.5070300@oracle.com> Message-ID: <920D7738-6071-4B84-B9EC-8AE1DBA1D0B2@oracle.com> On 25 apr 2014, at 17:31, Jonathan Gibbons wrote: > I'll quibble over the phrase "the same makefile logic". > > I think it is OK to use the same Makefile infrastructure (e.g. the configure.sh mechanism) and the same top level Makefile, but at some level this is going to need to be distinct Makefile logic specific to compiling the code for the tests. Yes, of course there will be specific makefile logic to compile the code for the tests. What I intended to say was that we can to a significant degree leverage the work that has already been done with the ?other? makefiles. 
For example, there is a configure step that sets up all of the environment, there are utility functions that can be used (SetupNativeCompilation in this case) and so on. It just makes sense to compile the tests in the same way. > I agree with the general concept of pre-building binaries, but it would be good to see the next level of detail: Agree that there are lots of details that need answers. I have started to develop a proof-of-concept and I can share some of the choices I've made below. > -- where is the source code for the native code of the tests I chose to put the native source code in the same directory as the Java test code. That makes it easy to see both the Java and native code at the same time. I wanted to make it easy to add a new native library and have it compiled, so by default the makefiles pick up any C source files and compile them into libraries, one library per file. By default, the libraries will be named according to the name of the C file, so a file called libMyTest.c will be compiled into libMyTest.so (or MyTest.dll). This is to make simple things simple. However, if you need to do something more complicated, you can add a makefile to the directory and that file will be invoked instead. It then has the full power to compile the native code as it sees fit. > -- is it one library per test, or what I think we can leave that up to the test. According to the above, there can be any number of C files in a test folder and they will all be compiled. A test can then load (System.loadLibrary()) one or more of these at runtime. > -- what sort of hierarchy do the libraries end up in. In my implementation I chose to make this as simple as possible and had all libraries end up in the same output folder. I realize this will create problems if two libraries have the same name. We can avoid this by having a naming convention for test libraries that minimizes conflicts. We can also add makefile logic to detect problems at compile-time. Because I use the existing makefile framework, the output folder is specific to the makefile configuration you currently use. In my implementation I have used $JDK_OUTPUTDIR/test/native, which translates to something like build/macosx-x86_64-normal-server-release/jdk/test/native/. Having a single output folder makes it easy to set up the correct java.library.path for a test: they all use the same path. > I am also concerned for the developer experience. One of the characteristics of the jtreg design has always been that it is dev-friendly, meaning it is easy and fast for developers to edit a test and rerun it. For myself, I don't work on native code, or on repos containing native code, so I'd be interested to hear how this will impact developers working on tests, and on those folk that simply want to run the tests. Yes, this is indeed the largest problem to solve here and I want to keep the feature of simple edit-rerun cycles. On the other hand, I don't want to complicate jtreg so that it has to know anything about how to compile native code or invoke makefiles. I have instead chosen to use the makefile "test" target as the way to run tests and make sure native tests are recompiled. In this case, if you edit a native test you would run "make test" (possibly with some more arguments) and that would compile the test libraries (and the product) and then call jtreg.
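As a rough sketch of that "make test" wiring (illustration only: JTREG, TEST_SELECTION and TEST_NATIVE_OUTPUT are made-up placeholder names, build-tests is the target name mentioned later in this thread, and -natives: is the jtreg option proposed further down rather than something that exists today; only $(JDK_OUTPUTDIR)/test/native comes from the description above; recipe lines are tab-indented as usual):

    # Sketch only; all names are placeholders.
    TEST_NATIVE_OUTPUT := $(JDK_OUTPUTDIR)/test/native

    # "make test" first builds the native test bits (and the product), then
    # invokes jtreg and tells it where the prebuilt binaries live.
    .PHONY: test
    test: build-tests
    	$(JTREG) -natives:$(TEST_NATIVE_OUTPUT) $(TEST_SELECTION)
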
Because there is no way (that I can think of at least) for jtreg to know where the compiled test libraries are located, the invokation of jtreg will have to include that path as an argument to jtreg. If you prefer to not use the makefile to run tests, you would have to do two step: first compile native tests (using the makefiles) and then run jtreg manually. You would then have to tell jtreg where the compiled tests are. This does complicate things, no doubt about that. It?s hard to make it completely transparent and this is the best that I?ve been able to come up with. Other ideas are more than welcome. Thanks, /Staffan > > -- Jon > > > On 04/25/2014 05:02 AM, Staffan Larsen wrote: >> There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: >> >> 1) The binaries are pre-compiled and checked into the repository (often inside jar files). >> 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is being run. >> >> Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. #2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. >> >> I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. >> >> If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. We will also need a way to tell the jtreg tests where these pre-built binaries are located. >> >> I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. >> >> To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it?s native components. >> >> This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. >> >> Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. >> >> Thanks, >> /Staffan >> > From staffan.larsen at oracle.com Mon Apr 28 08:13:08 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Mon, 28 Apr 2014 10:13:08 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: References: <535A7FBA.5070300@oracle.com> Message-ID: <9B1226E5-86DC-4C88-951F-BEE80EAF5D2B@oracle.com> On 25 apr 2014, at 18:09, Martin Buchholz wrote: > I don't see a good solution. Conceptually, the tests are built/executed > independently of the jdk they are testing. But it would be crazy to have a > separate configure/make infrastructure for each native test. 
If you build > the test native bits together with the jdk, you would have to be careful > not to copy those bits into "real" deployed jdks. That is why I was proposing a separate bundle for the tests. One zip with the product bits. One zip with the pre-compiled tests. > Here's an idea: you can ask a jdk how it was configured/built and reuse > that to build tests. There's prior art, e.g. emacs remembers how it was > configured (but that's not enough to let you build compatible test > binaries). That still has the problem that most tests machines are not setup to build things. On linux it may be reasonable to assume that gcc is installed, but what version? On windows, there are no compilers by default. What if a machine is used for testing several different versions of the JDK using many different compilers? How do we find the correct one? It may not be installed in the same location as on the build machine. These were all problems that I tried to attack by compiling the tests when the product is built. At that point we have the configuration set up. Thanks, /Staffan > > system-configuration-options is a variable defined in `C source code'. > Its value is > " '--build' 'x86_64-linux-gnu' '--build' 'x86_64-linux-gnu' '--prefix=/usr' > '--sharedstatedir=/var/lib' '--libexecdir=/usr/lib' > '--localstatedir=/var/lib' '--infodir=/usr/share/info' > '--mandir=/usr/share/man' '--with-pop=yes' > '--enable-locallisppath=/etc/emacs23:/etc/emacs:/usr/local/share/emacs/23.3/site-lisp:/usr/local/share/emacs/site-lisp:/usr/share/emacs/23.3/site-lisp:/usr/share/emacs/site-lisp:/usr/share/emacs/23.3/leim' > '--with-crt-dir=/usr/lib/x86_64-linux-gnu' '--with-x=yes' > '--with-x-toolkit=gtk' '--with-toolkit-scroll-bars' > 'build_alias=x86_64-linux-gnu' 'CFLAGS=-DDEBIAN -g -O2' 'LDFLAGS=-g' > 'CPPFLAGS=-D_FORTIFY_SOURCE=2'" > > Documentation: > String containing the configuration options Emacs was built with. > > > > On Fri, Apr 25, 2014 at 8:31 AM, Jonathan Gibbons < > jonathan.gibbons at oracle.com> wrote: > >> I'll quibble over the phrase "the same makefile logic". >> >> I think it is OK to use the same Makefile infrastructure (e.g. the >> configure.sh mechanism) and the same top level Makefile, but at some level >> this is going to need to be distinct Makefile logic specific to compiling >> the code for the tests. >> >> I agree with the general concept of pre-building binaries, but it would be >> good to see the next level of detail: >> >> -- where is the source code for the native code of the tests >> -- is it one library per test, or what >> -- what sort of hierarchy do the libraries end up in. >> >> I am also concerned for the developer experience. One of the >> characteristics of the jtreg design has always been that it is >> dev-friendly, meaning it is easy and fast for developers to edit a test and >> rerun it. For myself, I don't work on native code, or on repos >> containing native code, so I'd be interested to hear how this will impact >> developers working on tests, and on those folk that simply want to run the >> tests. >> >> -- Jon >> >> >> >> On 04/25/2014 05:02 AM, Staffan Larsen wrote: >> >>> There are a couple of jtreg tests today that depend on native components >>> (either JNI libraries or executables). These are handled in one of two ways: >>> >>> 1) The binaries are pre-compiled and checked into the repository (often >>> inside jar files). >>> 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is >>> being run. >>> >>> Neither of these are very good solutions. 
#1 makes it hard to run the >>> setup the test for all platforms and requires binaries in the source >>> control system. #2 is hit-and-miss: the correct compiler may or may not be >>> installed on the test machine, and the approach requires platform specific >>> logic to be maintained. >>> >>> I would like to propose that these native components are instead compiled >>> when the product is built by the same makefile logic as the product. At >>> product build time we know we have access to the (correct) compilers and we >>> have excellent support in the makefiles for building on all platforms. >>> >>> If we build the native test components together with the product, we also >>> have to take care of distributing the result together with the product when >>> we do testing across a larger number of machines. We will also need a way >>> to tell the jtreg tests where these pre-built binaries are located. >>> >>> I suggest that at the end of a distributed build run, the pre-built test >>> binaries are packaged in a zip or tar file (just like the product bits) and >>> stored next to the product bundles. When we run distributed tests, we need >>> to pick up the product bundle and the test bundle before the testing is >>> started. >>> >>> To tell the tests where the native code is, I would like to add a flag to >>> jtreg to point out the path to the binaries. This should cause jtreg to set >>> java.library.path before invoking a test and also set a test.* property >>> which can be used by test to find it?s native components. >>> >>> This kind of setup would make it easier to add and maintain tests that >>> have a native component. I think this will be especially important as more >>> tests are written using jtreg in the hotspot repository. >>> >>> Thoughts on this? Is the general approach ok? There are lots of details >>> to be figured out, but at this stage I would like to hear feedback on the >>> idea as such. >>> >>> Thanks, >>> /Staffan >>> >>> >> From christian.tornqvist at oracle.com Mon Apr 28 12:20:47 2014 From: christian.tornqvist at oracle.com (Christian Tornqvist) Date: Mon, 28 Apr 2014 08:20:47 -0400 Subject: Proposal: jtreg tests with native components In-Reply-To: References: Message-ID: <05f301cf62dc$49602a80$dc207f80$@oracle.com> Hi Staffan, This sounds like a great proposal that would solve many of our issues with tests requiring native code. Would it be possible for the make system to pick up a custom makefile from a test folder? The use cases I see are: 1. Launcher type tests (we have a few of these), these needs to be compiled into an executable 2. Test libraries that might depend on being compiled with certain compiler/linker flags (don't think we have tests like this today though). Thanks for working on this :) Thanks, Christian -----Original Message----- From: jtreg-dev [mailto:jtreg-dev-bounces at openjdk.java.net] On Behalf Of Staffan Larsen Sent: Friday, April 25, 2014 8:03 AM To: build-dev at openjdk.java.net build-dev; jtreg-dev at openjdk.java.net Subject: Proposal: jtreg tests with native components There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: 1) The binaries are pre-compiled and checked into the repository (often inside jar files). 2) The test will try to invoke a compiler (gcc, cl, .) when the test is being run. Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. 
#2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. We will also need a way to tell the jtreg tests where these pre-built binaries are located. I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it's native components. This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. Thanks, /Staffan From staffan.larsen at oracle.com Mon Apr 28 12:31:08 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Mon, 28 Apr 2014 14:31:08 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: <05f301cf62dc$49602a80$dc207f80$@oracle.com> References: <05f301cf62dc$49602a80$dc207f80$@oracle.com> Message-ID: Hi Christian, Yes, that is my intention. If there is a makefile in the folder, that file will be invoked (in the context of the ?normal? makefiles). I wrote a little about this in my response to Jon. The idea is that for simple native components the compilation is automagic, but it is also possible to have a complete makefile when the simple thing is not enough. /Staffan On 28 apr 2014, at 14:20, Christian Tornqvist wrote: > Hi Staffan, > > This sounds like a great proposal that would solve many of our issues with > tests requiring native code. Would it be possible for the make system to > pick up a custom makefile from a test folder? The use cases I see are: > > 1. Launcher type tests (we have a few of these), these needs to be compiled > into an executable > 2. Test libraries that might depend on being compiled with certain > compiler/linker flags (don't think we have tests like this today though). > > Thanks for working on this :) > > Thanks, > Christian > > -----Original Message----- > From: jtreg-dev [mailto:jtreg-dev-bounces at openjdk.java.net] On Behalf Of > Staffan Larsen > Sent: Friday, April 25, 2014 8:03 AM > To: build-dev at openjdk.java.net build-dev; jtreg-dev at openjdk.java.net > Subject: Proposal: jtreg tests with native components > > There are a couple of jtreg tests today that depend on native components > (either JNI libraries or executables). 
These are handled in one of two ways: > > 1) The binaries are pre-compiled and checked into the repository (often > inside jar files). > 2) The test will try to invoke a compiler (gcc, cl, .) when the test is > being run. > > Neither of these are very good solutions. #1 makes it hard to run the setup > the test for all platforms and requires binaries in the source control > system. #2 is hit-and-miss: the correct compiler may or may not be installed > on the test machine, and the approach requires platform specific logic to be > maintained. > > I would like to propose that these native components are instead compiled > when the product is built by the same makefile logic as the product. At > product build time we know we have access to the (correct) compilers and we > have excellent support in the makefiles for building on all platforms. > > If we build the native test components together with the product, we also > have to take care of distributing the result together with the product when > we do testing across a larger number of machines. We will also need a way to > tell the jtreg tests where these pre-built binaries are located. > > I suggest that at the end of a distributed build run, the pre-built test > binaries are packaged in a zip or tar file (just like the product bits) and > stored next to the product bundles. When we run distributed tests, we need > to pick up the product bundle and the test bundle before the testing is > started. > > To tell the tests where the native code is, I would like to add a flag to > jtreg to point out the path to the binaries. This should cause jtreg to set > java.library.path before invoking a test and also set a test.* property > which can be used by test to find it's native components. > > This kind of setup would make it easier to add and maintain tests that have > a native component. I think this will be especially important as more tests > are written using jtreg in the hotspot repository. > > Thoughts on this? Is the general approach ok? There are lots of details to > be figured out, but at this stage I would like to hear feedback on the idea > as such. > > Thanks, > /Staffan > > From jonathan.gibbons at oracle.com Mon Apr 28 17:31:04 2014 From: jonathan.gibbons at oracle.com (Jonathan Gibbons) Date: Mon, 28 Apr 2014 10:31:04 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: <920D7738-6071-4B84-B9EC-8AE1DBA1D0B2@oracle.com> References: <535A7FBA.5070300@oracle.com> <920D7738-6071-4B84-B9EC-8AE1DBA1D0B2@oracle.com> Message-ID: <535E9058.9050102@oracle.com> On 04/28/2014 01:08 AM, Staffan Larsen wrote: > If you prefer to not use the makefile to run tests, you would have to do two step: first compile native tests (using the makefiles) and then run jtreg manually. You would then have to tell jtreg where the compiled tests are. There should be a separate test-bundle target so you can go "make test-bundle" without running the tests. 
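Such a target might look roughly like the sketch below. This is only an illustration: MKDIR, ZIPEXE, BUILD_OUTPUT and the bundle file name are invented placeholders; build-tests and $(JDK_OUTPUTDIR)/test/native are the names used elsewhere in this thread.

    # Sketch only; variable names are placeholders.
    # Package the prebuilt native test binaries so the zip can be stored
    # next to the product bundles and picked up by distributed test runs.
    .PHONY: test-bundle
    test-bundle: build-tests
    	$(MKDIR) -p $(BUILD_OUTPUT)/bundles
    	cd $(JDK_OUTPUTDIR)/test/native && $(ZIPEXE) -qr $(BUILD_OUTPUT)/bundles/jdk-test-natives.zip .
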
-- Jon From staffan.larsen at oracle.com Mon Apr 28 17:57:41 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Mon, 28 Apr 2014 19:57:41 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: <535E9058.9050102@oracle.com> References: <535A7FBA.5070300@oracle.com> <920D7738-6071-4B84-B9EC-8AE1DBA1D0B2@oracle.com> <535E9058.9050102@oracle.com> Message-ID: <786D2A4F-A0EA-4254-85F4-5362B7D2E608@oracle.com> On 28 apr 2014, at 19:31, Jonathan Gibbons wrote: > On 04/28/2014 01:08 AM, Staffan Larsen wrote: >> If you prefer to not use the makefile to run tests, you would have to do two step: first compile native tests (using the makefiles) and then run jtreg manually. You would then have to tell jtreg where the compiled tests are. > > There should be a separate test-bundle target so you can go "make test-bundle" without running the tests. Good point. I've been calling the target that builds the tests "build-tests", which I'm not particularly fond of. "test-bundle", on the other hand, seems to imply an actual bundling (zipping, tarring) of the files (and maybe that was what you were referring to?). The makefiles today do not have logic for creating these bundles, even for the product. The closest you get for the product is "images", which will create an exploded bundle (what it looks like before you tar/zip it). This is also what I do for "build-tests". Still would like a better name. /Staffan From staffan.larsen at oracle.com Mon Apr 28 18:03:29 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Mon, 28 Apr 2014 20:03:29 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: References: Message-ID: <684658FB-9709-4283-AE54-BAFFB6C1C843@oracle.com> The change in jtreg that I would like to see is the addition of a flag to specify the path of the test binaries. I'm thinking something like "-natives:". This would do two things: - Set the java.library.path when invoking tests. This is needed for System.loadLibrary() to work. - Set a test.natives property. This is needed in the cases where the test binary is not a library, but a program. In this case the test will want to launch that program and needs the path to the program. Both of these would be set to the path given above. Jtreg should check that the path exists. Does that sound like a reasonable change? Thanks, /Staffan On 25 apr 2014, at 14:02, Staffan Larsen wrote: > There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: > > 1) The binaries are pre-compiled and checked into the repository (often inside jar files). > 2) The test will try to invoke a compiler (gcc, cl, ...) when the test is being run. > > Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. #2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. > > I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. > > If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. 
We will also need a way to tell the jtreg tests where these pre-built binaries are located. > > I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. > > To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it?s native components. > > This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. > > Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. > > Thanks, > /Staffan > From jonathan.gibbons at oracle.com Mon Apr 28 18:05:59 2014 From: jonathan.gibbons at oracle.com (Jonathan Gibbons) Date: Mon, 28 Apr 2014 11:05:59 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: <786D2A4F-A0EA-4254-85F4-5362B7D2E608@oracle.com> References: <535A7FBA.5070300@oracle.com> <920D7738-6071-4B84-B9EC-8AE1DBA1D0B2@oracle.com> <535E9058.9050102@oracle.com> <786D2A4F-A0EA-4254-85F4-5362B7D2E608@oracle.com> Message-ID: <535E9887.4050003@oracle.com> On 04/28/2014 10:57 AM, Staffan Larsen wrote: > Good point. I?ve been calling the target that build the test ?build-tests? which I?m not particularly fond of. ?test-bundle? on the other hand seems to imply an actual bundling (zipping, tarring) of the files (and maybe that was what you were referring to?). The makesfiles today do not have logic for creating these bundles, even for the product. The closest you get for the product is ?images? which will create an exploded bundle (what it looks like before you tar/zip it). This is also what I do for "build-tests?. Still would like a better name. > > /Staffan As long as there is a way of making the artifacts to be used by the tests that is enough for those folk that want to run jtreg directly. -- Jon From staffan.larsen at oracle.com Mon Apr 28 18:11:10 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Mon, 28 Apr 2014 20:11:10 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: <535E9887.4050003@oracle.com> References: <535A7FBA.5070300@oracle.com> <920D7738-6071-4B84-B9EC-8AE1DBA1D0B2@oracle.com> <535E9058.9050102@oracle.com> <786D2A4F-A0EA-4254-85F4-5362B7D2E608@oracle.com> <535E9887.4050003@oracle.com> Message-ID: <3BF9A24F-4530-45FD-9DCE-AAFC17E9AE33@oracle.com> On 28 apr 2014, at 20:05, Jonathan Gibbons wrote: > On 04/28/2014 10:57 AM, Staffan Larsen wrote: >> Good point. I?ve been calling the target that build the test ?build-tests? which I?m not particularly fond of. ?test-bundle? on the other hand seems to imply an actual bundling (zipping, tarring) of the files (and maybe that was what you were referring to?). The makesfiles today do not have logic for creating these bundles, even for the product. The closest you get for the product is ?images? which will create an exploded bundle (what it looks like before you tar/zip it). This is also what I do for "build-tests?. Still would like a better name. 
>> >> /Staffan > > As long as there is a way of making the artifacts to be used by the tests that is enough for those folk that want to run jtreg directly. Absolutely! > > -- Jon From jonathan.gibbons at oracle.com Mon Apr 28 18:24:53 2014 From: jonathan.gibbons at oracle.com (Jonathan Gibbons) Date: Mon, 28 Apr 2014 11:24:53 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: <535E9B72.40005@oracle.com> References: <535E9B72.40005@oracle.com> Message-ID: <535E9CF5.5060700@oracle.com> On 04/28/2014 11:18 AM, Sergey Bylokhov wrote: > Hello, > I know it is crazy idea but why we cannot prebuild all tests at once > and use only one jar like jck do? > Because the test execution model is intentionally different from JCK and other test harnesses like TestNG and JUnit. jtreg specifies that a test is a series of actions, some of which may be compilation steps and some of which may be execution steps. It is not the case that all tests want to be compiled up front and then executed. If you're testing consistent (or inconsistent) compilation, for example, you may want to compile some classes one way, then try compiling and using them a different way, as a way of testing separate compilation. There's a bunch of tests like that for serialization as well. Bottom line: JDK tests are not always plain old API tests. -- Jon From david.holmes at oracle.com Wed Apr 30 09:39:55 2014 From: david.holmes at oracle.com (David Holmes) Date: Wed, 30 Apr 2014 19:39:55 +1000 Subject: Proposal: jtreg tests with native components In-Reply-To: References: Message-ID: <5360C4EB.80201@oracle.com> Hi Staffan, On 25/04/2014 10:02 PM, Staffan Larsen wrote: > There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: > > 1) The binaries are pre-compiled and checked into the repository (often inside jar files). > 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is being run. > > Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. #2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. #2 is far from perfect but ... > I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. > > If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. We will also need a way to tell the jtreg tests where these pre-built binaries are located. don't under estimate the complexity involved in building then "distributing" the test binaries. You will still need to maintain platform specific logic as you won't necessarily be able to use the CFLAGS etc that the main build process uses. Also talk to SQE as I'm pretty sure there is an existing project to look at how to better handle this, at least for the internal test suites. David ----- > I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. 
When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. > > To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it?s native components. > > This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. > > Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. > > Thanks, > /Staffan > From staffan.larsen at oracle.com Wed Apr 30 11:39:54 2014 From: staffan.larsen at oracle.com (Staffan Larsen) Date: Wed, 30 Apr 2014 13:39:54 +0200 Subject: Proposal: jtreg tests with native components In-Reply-To: <5360C4EB.80201@oracle.com> References: <5360C4EB.80201@oracle.com> Message-ID: <6D79B1FC-EEDA-497B-B622-D791D89E0DD3@oracle.com> On 30 apr 2014, at 11:39, David Holmes wrote: > Hi Staffan, > > On 25/04/2014 10:02 PM, Staffan Larsen wrote: >> There are a couple of jtreg tests today that depend on native components (either JNI libraries or executables). These are handled in one of two ways: >> >> 1) The binaries are pre-compiled and checked into the repository (often inside jar files). >> 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is being run. >> >> Neither of these are very good solutions. #1 makes it hard to run the setup the test for all platforms and requires binaries in the source control system. #2 is hit-and-miss: the correct compiler may or may not be installed on the test machine, and the approach requires platform specific logic to be maintained. > > #2 is far from perfect but ... > >> I would like to propose that these native components are instead compiled when the product is built by the same makefile logic as the product. At product build time we know we have access to the (correct) compilers and we have excellent support in the makefiles for building on all platforms. >> >> If we build the native test components together with the product, we also have to take care of distributing the result together with the product when we do testing across a larger number of machines. We will also need a way to tell the jtreg tests where these pre-built binaries are located. > > don't under estimate the complexity involved in building then "distributing" the test binaries. I don?t. It will be complicated, but I?m sure we can do it. > > You will still need to maintain platform specific logic as you won't necessarily be able to use the CFLAGS etc that the main build process uses. Can you explain more? Why can?t I use CFLAGS as it is? > > Also talk to SQE as I'm pretty sure there is an existing project to look at how to better handle this, at least for the internal test suites. I have talked to SQE. I don?t know of any other projects to handle this. /Staffan > > David > ----- > >> I suggest that at the end of a distributed build run, the pre-built test binaries are packaged in a zip or tar file (just like the product bits) and stored next to the product bundles. When we run distributed tests, we need to pick up the product bundle and the test bundle before the testing is started. 
>> >> To tell the tests where the native code is, I would like to add a flag to jtreg to point out the path to the binaries. This should cause jtreg to set java.library.path before invoking a test and also set a test.* property which can be used by test to find it?s native components. >> >> This kind of setup would make it easier to add and maintain tests that have a native component. I think this will be especially important as more tests are written using jtreg in the hotspot repository. >> >> Thoughts on this? Is the general approach ok? There are lots of details to be figured out, but at this stage I would like to hear feedback on the idea as such. >> >> Thanks, >> /Staffan >> From martinrb at google.com Fri Apr 25 16:09:00 2014 From: martinrb at google.com (Martin Buchholz) Date: Fri, 25 Apr 2014 09:09:00 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: <535A7FBA.5070300@oracle.com> References: <535A7FBA.5070300@oracle.com> Message-ID: I don't see a good solution. Conceptually, the tests are built/executed independently of the jdk they are testing. But it would be crazy to have a separate configure/make infrastructure for each native test. If you build the test native bits together with the jdk, you would have to be careful not to copy those bits into "real" deployed jdks. Here's an idea: you can ask a jdk how it was configured/built and reuse that to build tests. There's prior art, e.g. emacs remembers how it was configured (but that's not enough to let you build compatible test binaries). system-configuration-options is a variable defined in `C source code'. Its value is " '--build' 'x86_64-linux-gnu' '--build' 'x86_64-linux-gnu' '--prefix=/usr' '--sharedstatedir=/var/lib' '--libexecdir=/usr/lib' '--localstatedir=/var/lib' '--infodir=/usr/share/info' '--mandir=/usr/share/man' '--with-pop=yes' '--enable-locallisppath=/etc/emacs23:/etc/emacs:/usr/local/share/emacs/23.3/site-lisp:/usr/local/share/emacs/site-lisp:/usr/share/emacs/23.3/site-lisp:/usr/share/emacs/site-lisp:/usr/share/emacs/23.3/leim' '--with-crt-dir=/usr/lib/x86_64-linux-gnu' '--with-x=yes' '--with-x-toolkit=gtk' '--with-toolkit-scroll-bars' 'build_alias=x86_64-linux-gnu' 'CFLAGS=-DDEBIAN -g -O2' 'LDFLAGS=-g' 'CPPFLAGS=-D_FORTIFY_SOURCE=2'" Documentation: String containing the configuration options Emacs was built with. On Fri, Apr 25, 2014 at 8:31 AM, Jonathan Gibbons < jonathan.gibbons at oracle.com> wrote: > I'll quibble over the phrase "the same makefile logic". > > I think it is OK to use the same Makefile infrastructure (e.g. the > configure.sh mechanism) and the same top level Makefile, but at some level > this is going to need to be distinct Makefile logic specific to compiling > the code for the tests. > > I agree with the general concept of pre-building binaries, but it would be > good to see the next level of detail: > > -- where is the source code for the native code of the tests > -- is it one library per test, or what > -- what sort of hierarchy do the libraries end up in. > > I am also concerned for the developer experience. One of the > characteristics of the jtreg design has always been that it is > dev-friendly, meaning it is easy and fast for developers to edit a test and > rerun it. For myself, I don't work on native code, or on repos > containing native code, so I'd be interested to hear how this will impact > developers working on tests, and on those folk that simply want to run the > tests. 
> > -- Jon > > > > On 04/25/2014 05:02 AM, Staffan Larsen wrote: > >> There are a couple of jtreg tests today that depend on native components >> (either JNI libraries or executables). These are handled in one of two ways: >> >> 1) The binaries are pre-compiled and checked into the repository (often >> inside jar files). >> 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is >> being run. >> >> Neither of these are very good solutions. #1 makes it hard to run the >> setup the test for all platforms and requires binaries in the source >> control system. #2 is hit-and-miss: the correct compiler may or may not be >> installed on the test machine, and the approach requires platform specific >> logic to be maintained. >> >> I would like to propose that these native components are instead compiled >> when the product is built by the same makefile logic as the product. At >> product build time we know we have access to the (correct) compilers and we >> have excellent support in the makefiles for building on all platforms. >> >> If we build the native test components together with the product, we also >> have to take care of distributing the result together with the product when >> we do testing across a larger number of machines. We will also need a way >> to tell the jtreg tests where these pre-built binaries are located. >> >> I suggest that at the end of a distributed build run, the pre-built test >> binaries are packaged in a zip or tar file (just like the product bits) and >> stored next to the product bundles. When we run distributed tests, we need >> to pick up the product bundle and the test bundle before the testing is >> started. >> >> To tell the tests where the native code is, I would like to add a flag to >> jtreg to point out the path to the binaries. This should cause jtreg to set >> java.library.path before invoking a test and also set a test.* property >> which can be used by test to find it?s native components. >> >> This kind of setup would make it easier to add and maintain tests that >> have a native component. I think this will be especially important as more >> tests are written using jtreg in the hotspot repository. >> >> Thoughts on this? Is the general approach ok? There are lots of details >> to be figured out, but at this stage I would like to hear feedback on the >> idea as such. >> >> Thanks, >> /Staffan >> >> > From martinrb at google.com Mon Apr 28 16:20:48 2014 From: martinrb at google.com (Martin Buchholz) Date: Mon, 28 Apr 2014 09:20:48 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: <9B1226E5-86DC-4C88-951F-BEE80EAF5D2B@oracle.com> References: <535A7FBA.5070300@oracle.com> <9B1226E5-86DC-4C88-951F-BEE80EAF5D2B@oracle.com> Message-ID: Staffan, Thanks - I think the direction you're going in is the best that can be achieved. On Mon, Apr 28, 2014 at 1:13 AM, Staffan Larsen wrote: > > On 25 apr 2014, at 18:09, Martin Buchholz wrote: > > > I don't see a good solution. Conceptually, the tests are built/executed > > independently of the jdk they are testing. But it would be crazy to > have a > > separate configure/make infrastructure for each native test. If you > build > > the test native bits together with the jdk, you would have to be careful > > not to copy those bits into "real" deployed jdks. > > That is why I was proposing a separate bundle for the tests. One zip with > the product bits. One zip with the pre-compiled tests. 
> > > Here's an idea: you can ask a jdk how it was configured/built and reuse > > that to build tests. There's prior art, e.g. emacs remembers how it was > > configured (but that's not enough to let you build compatible test > > binaries). > > That still has the problem that most tests machines are not setup to build > things. On linux it may be reasonable to assume that gcc is installed, but > what version? On windows, there are no compilers by default. What if a > machine is used for testing several different versions of the JDK using > many different compilers? How do we find the correct one? It may not be > installed in the same location as on the build machine. > > These were all problems that I tried to attack by compiling the tests when > the product is built. At that point we have the configuration set up. > > Thanks, > /Staffan > > > > > > system-configuration-options is a variable defined in `C source code'. > > Its value is > > " '--build' 'x86_64-linux-gnu' '--build' 'x86_64-linux-gnu' > '--prefix=/usr' > > '--sharedstatedir=/var/lib' '--libexecdir=/usr/lib' > > '--localstatedir=/var/lib' '--infodir=/usr/share/info' > > '--mandir=/usr/share/man' '--with-pop=yes' > > > '--enable-locallisppath=/etc/emacs23:/etc/emacs:/usr/local/share/emacs/23.3/site-lisp:/usr/local/share/emacs/site-lisp:/usr/share/emacs/23.3/site-lisp:/usr/share/emacs/site-lisp:/usr/share/emacs/23.3/leim' > > '--with-crt-dir=/usr/lib/x86_64-linux-gnu' '--with-x=yes' > > '--with-x-toolkit=gtk' '--with-toolkit-scroll-bars' > > 'build_alias=x86_64-linux-gnu' 'CFLAGS=-DDEBIAN -g -O2' 'LDFLAGS=-g' > > 'CPPFLAGS=-D_FORTIFY_SOURCE=2'" > > > > Documentation: > > String containing the configuration options Emacs was built with. > > > > > > > > On Fri, Apr 25, 2014 at 8:31 AM, Jonathan Gibbons < > > jonathan.gibbons at oracle.com> wrote: > > > >> I'll quibble over the phrase "the same makefile logic". > >> > >> I think it is OK to use the same Makefile infrastructure (e.g. the > >> configure.sh mechanism) and the same top level Makefile, but at some > level > >> this is going to need to be distinct Makefile logic specific to > compiling > >> the code for the tests. > >> > >> I agree with the general concept of pre-building binaries, but it would > be > >> good to see the next level of detail: > >> > >> -- where is the source code for the native code of the tests > >> -- is it one library per test, or what > >> -- what sort of hierarchy do the libraries end up in. > >> > >> I am also concerned for the developer experience. One of the > >> characteristics of the jtreg design has always been that it is > >> dev-friendly, meaning it is easy and fast for developers to edit a test > and > >> rerun it. For myself, I don't work on native code, or on repos > >> containing native code, so I'd be interested to hear how this will > impact > >> developers working on tests, and on those folk that simply want to run > the > >> tests. > >> > >> -- Jon > >> > >> > >> > >> On 04/25/2014 05:02 AM, Staffan Larsen wrote: > >> > >>> There are a couple of jtreg tests today that depend on native > components > >>> (either JNI libraries or executables). These are handled in one of two > ways: > >>> > >>> 1) The binaries are pre-compiled and checked into the repository (often > >>> inside jar files). > >>> 2) The test will try to invoke a compiler (gcc, cl, ?) when the test is > >>> being run. > >>> > >>> Neither of these are very good solutions. 
#1 makes it hard to run the > >>> setup the test for all platforms and requires binaries in the source > >>> control system. #2 is hit-and-miss: the correct compiler may or may > not be > >>> installed on the test machine, and the approach requires platform > specific > >>> logic to be maintained. > >>> > >>> I would like to propose that these native components are instead > compiled > >>> when the product is built by the same makefile logic as the product. At > >>> product build time we know we have access to the (correct) compilers > and we > >>> have excellent support in the makefiles for building on all platforms. > >>> > >>> If we build the native test components together with the product, we > also > >>> have to take care of distributing the result together with the product > when > >>> we do testing across a larger number of machines. We will also need a > way > >>> to tell the jtreg tests where these pre-built binaries are located. > >>> > >>> I suggest that at the end of a distributed build run, the pre-built > test > >>> binaries are packaged in a zip or tar file (just like the product > bits) and > >>> stored next to the product bundles. When we run distributed tests, we > need > >>> to pick up the product bundle and the test bundle before the testing is > >>> started. > >>> > >>> To tell the tests where the native code is, I would like to add a flag > to > >>> jtreg to point out the path to the binaries. This should cause jtreg > to set > >>> java.library.path before invoking a test and also set a test.* property > >>> which can be used by test to find it?s native components. > >>> > >>> This kind of setup would make it easier to add and maintain tests that > >>> have a native component. I think this will be especially important as > more > >>> tests are written using jtreg in the hotspot repository. > >>> > >>> Thoughts on this? Is the general approach ok? There are lots of details > >>> to be figured out, but at this stage I would like to hear feedback on > the > >>> idea as such. > >>> > >>> Thanks, > >>> /Staffan > >>> > >>> > >> > > From martinrb at google.com Wed Apr 30 20:02:38 2014 From: martinrb at google.com (Martin Buchholz) Date: Wed, 30 Apr 2014 13:02:38 -0700 Subject: Proposal: jtreg tests with native components In-Reply-To: <6D79B1FC-EEDA-497B-B622-D791D89E0DD3@oracle.com> References: <5360C4EB.80201@oracle.com> <6D79B1FC-EEDA-497B-B622-D791D89E0DD3@oracle.com> Message-ID: On Wed, Apr 30, 2014 at 4:39 AM, Staffan Larsen wrote: > > > > > You will still need to maintain platform specific logic as you won't > necessarily be able to use the CFLAGS etc that the main build process uses. > > Can you explain more? Why can?t I use CFLAGS as it is? As a general principle, CFLAGS used for the JDK itself may not be appropriate for tests and demos, which are *clients* of the JDK. As a trivial example, -DVENDOR='"Acme Corp"' would not make sense in tests (although harmless). In practice, you can probably get away with having test/demo CFLAGS default to the same values as JDK CFLAGS, but allow the flexibility to override.
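That default-with-override pattern is easy to express in make. In the sketch below, TESTS_CFLAGS and CFLAGS_JDKLIB are invented placeholder variable names (standing in for whatever the real test and JDK flag variables would be), the gcc-style -shared/-fPIC rule is just for illustration, and libMyTest.c/libMyTest.so follow the naming convention described earlier in the thread.

    # Sketch only; variable names are placeholders.
    # Default the native-test flags to the JDK's own flags, but leave them
    # overridable (by an earlier assignment or on the make command line).
    TESTS_CFLAGS ?= $(CFLAGS_JDKLIB)

    libMyTest.so: libMyTest.c
    	$(CC) $(TESTS_CFLAGS) -shared -fPIC -o $@ $<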