From stephan.herrmann at berlin.de Mon Apr 1 09:15:42 2013
From: stephan.herrmann at berlin.de (Stephan Herrmann)
Date: Mon, 01 Apr 2013 18:15:42 +0200
Subject: JSR 335 Lambda Specification, 0.6.2
In-Reply-To: <2EAEDC51-B0B5-4F00-8052-088FF78CCDC4@oracle.com>
References: <5DD2850A-F043-4460-B3AA-097D73B40825@oracle.com> <5133AE81.70600@oracle.com> <9C122CD5-066D-4A99-AFE9-0D11F5549719@oracle.com> <514B47E1.3030507@berlin.de> <2EAEDC51-B0B5-4F00-8052-088FF78CCDC4@oracle.com>
Message-ID: <5159B2AE.30407@berlin.de>

Hi Dan,

thanks for your answer. Let me translate my silence since then into a more explicit response.

>> "To do: define the parameterization of a class C for a type T; define the most specific array supertype of a type T."
>>
>> Could you give an ETA for a fix to these TODOs? In this case an informal sketch would already be quite helpful.
>
> This is a long-standing problem with JLS. See, for example, from JLS 7 15.12.2.7:
>
> "If F has the form G<..., Yk-1, U, Yk+1, ...>, where U is a type expression that involves Tj, then if A has a supertype of the form G<..., Xk-1, V, Xk+1, ...> where V is a type expression, this algorithm is applied recursively to the constraint V = U."
>
> The phrase "has a supertype" doesn't bother to explain how this supertype is derived. Add wildcards to the mix, and it's a bit of a mess.
>
> In practice, this has typically been handled by capturing A and recurring on its supertype, but that can lead to some unsound results, where new capture variables end up as inference variable bounds. The goal here is to come up with a well-defined, sound alternative.

I wholeheartedly support your notion of improving the situation.

> But in the mean time, whatever you were doing for Java 7 should continue to work as an approximation, and I wouldn't expect any regressions.

I should have mentioned that I observed these regressions *after* doing a best guess as to which part of the old implementation might come closest to what we need here. Close, but no cigar. I'm not surprised by the mismatch given that the functional breakdown in the existing Eclipse compiler implementation is probably quite different from the code you are looking at.

Looking deeper into those tests that make this approximation fail, I see that most of them involve a raw type in the position of the subtype (S). Could you please give an advance statement on how 18.5.5 will be integrated into 18.2.3 and friends? More specifically, if unchecked conversion is involved, should this be computed by a separate/nested invocation of the inference, or should a set of additional bounds and constraints be added to the current inference?

>> Additionally, I could use a minor clarification on the following items from 18.4:
>>
>> "
>> * If αi has one or more proper lower bounds, L1, ..., Lk, then Ti = lub(L1, ..., Lk).
>> * Otherwise, where αi has proper upper bounds U1, ..., Uk, Ti = glb(U1, ..., Uk)."
>>
>> These items don't explicitly specify how mixed proper and improper bounds should be handled. I assume for this part to apply, *all* bounds of the given kind must be proper bounds, right? I first interpreted this as a filter on the set of bounds, but that seems to yield bogus results.
>
> It should be a filter. The subset of bounds that are proper bounds are used, and the rest are ignored.

Thanks, so my first interpretation was right, and the cause of the resulting regression must be sought elsewhere.

So, yes, the spec is basically implementable, but, no, it doesn't work.
Or, sorry, it may work for some 70 percent of the interesting programs, but not close to any of the many nines we are striving for. As I see little benefit in building the new implementation on new guesswork, I'll basically just wait for the next version of the spec. I'd appreciate any hint regarding a schedule for further spec updates.

cheers,
Stephan

From daniel.smith at oracle.com Mon Apr 1 11:01:48 2013
From: daniel.smith at oracle.com (Dan Smith)
Date: Mon, 1 Apr 2013 12:01:48 -0600
Subject: JSR 335 Lambda Specification, 0.6.2
In-Reply-To: <5159B2AE.30407@berlin.de>
References: <5DD2850A-F043-4460-B3AA-097D73B40825@oracle.com> <5133AE81.70600@oracle.com> <9C122CD5-066D-4A99-AFE9-0D11F5549719@oracle.com> <514B47E1.3030507@berlin.de> <2EAEDC51-B0B5-4F00-8052-088FF78CCDC4@oracle.com> <5159B2AE.30407@berlin.de>
Message-ID: <598EC20B-731F-46D5-8093-796030F9748B@oracle.com>

On Apr 1, 2013, at 10:15 AM, Stephan Herrmann wrote:

>> But in the mean time, whatever you were doing for Java 7 should continue to work as an approximation, and I wouldn't expect any regressions.
>
> I should have mentioned that I observed these regressions *after* doing a best guess as to which part of the old implementation might come closest to what we need here. Close, but no cigar. I'm not surprised by the mismatch given that the functional breakdown in the existing Eclipse compiler implementation is probably quite different from the code you are looking at.
>
> Looking deeper into those tests that make this approximation fail, I see that most of them involve a raw type in the position of the subtype (S).
> Could you please give an advance statement on how 18.5.5 will be integrated into 18.2.3 and friends? More specifically, if unchecked conversion is involved, should this be computed by a separate/nested invocation of the inference, or should a set of additional bounds and constraints be added to the current inference?

Okay, so it sounds like your problem is in handling cases like:

  <T> void m(List<T> arg);

  m(new ArrayList()); // raw type

JLS 7: An initial constraint of the form "ArrayList << List<T>" is produced (15.12.2.2); no inference constraint is implied (15.12.2.7); but the method is applicable by subtyping, which allows for an unchecked conversion (15.12.2.2).

Lambda Spec 0.6.2: An initial constraint of the form "new ArrayList() -> List<α>" is produced (18.5.1); this reduces to "ArrayList -> List<α>" (18.2.1); as noted in 18.2.2, handling an unchecked conversion here is a to-do item.

Nothing substantial has changed here -- just the framework surrounding the problem. In JLS 7, we're happy to produce unsound results (a raw type targeting a parameterized type always gets you no new bounds, which effectively means "true"). In the Lambda Spec, the goal is to tighten that up by producing "true" or "false" depending on whether the conversion will be deemed legal or not.

The problem with being precise about when an unchecked assignment is allowed by inference is that the spec has always been vague about when it is allowed at all -- 5.1.9 says an unchecked conversion is to "any parameterized type of the form G<T1,...,Tn>" -- that's nondeterministic, and asking for trouble. Hence, I've toyed with 18.5.5, although that shouldn't be taken too seriously until we figure out how to plug it in.
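For concreteness, a self-contained version of that example might look like this (the enclosing class is just scaffolding; the interesting part is the raw-typed call, which javac accepts with an "unchecked" warning):

  import java.util.ArrayList;
  import java.util.List;

  class RawCall {
      static <T> void m(List<T> arg) { }

      static void test() {
          // The raw ArrayList is applicable to List<T> only via an
          // unchecked conversion -- the case under discussion.
          m(new ArrayList());
      }
  }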
In the mean time, javac and Eclipse have both traditionally just defined what constitutes a legal unchecked assignment on their own, presumably with much less complexity than using type inference (18.5.5), and probably not consistently with each other.

So, short answer: however you test for legal unchecked assignments -- do the same check for "ArrayList -> List<α>" to get a "true" or "false" result. (Yes, an inference variable is not a type, but given the existing vagueness, just making it work somehow would get a reasonable approximation.)

In the next spec iteration, we'll have a more concrete plan on how to plug in 18.5.5 or use some other strategy to decide whether an unchecked assignment is legal, and maybe even get some extra bounds out of the deal in certain corner cases.

> So, yes, the spec is basically implementable, but, no, it doesn't work.
> Or, sorry, it may work for some 70 percent of the interesting programs, but not close to any of the many nines we are striving for.
> As I see little benefit in building the new implementation on new guesswork, I'll basically just wait for the next version of the spec.

Given that this area is already built on guesswork, I don't see the lack of clarity so far as a fatal flaw. Yes, I'd like to clear up some of the ambiguity here. In the mean time, it's useful to know that this particular issue is one that caused you some trouble. If there are other areas, please keep asking, and I'll happily describe our current thinking and identify things that are still unresolved.

A new spec is a few weeks away. Hard to say exactly how many "a few" is. The issue right now is _solving_ problems like this; writing down the solutions comes next.

--Dan

From stephan.herrmann at berlin.de Mon Apr 1 12:42:57 2013
From: stephan.herrmann at berlin.de (Stephan Herrmann)
Date: Mon, 01 Apr 2013 21:42:57 +0200
Subject: Cross compilation and default methods
Message-ID: <5159E341.3080206@berlin.de>

At JavaOne in SF I discussed with some people how a new compiler should handle default methods found in class files when compiling with -source 1.7 -target 1.7. Obviously, this occurs, e.g., if no proper 1.7 JRE is given on the bootclasspath.

Given that Eclipse users typically expect that our compiler behaves the same as javac even in unspecified matters, I'd like to request confirmation of how javac handles this now and in the future.

While the answer I got in SF indicated that this is a quite unsupported situation to begin with, recent experiments indicate that javac does indeed employ some sophistication to reduce the number of unexpected errors.

Could you please confirm that the following behavior is correctly analyzed:

Given we're compiling in 1.7 mode and encounter a default method in a class file, this information is kept during compilation so that default methods can be specially handled in these ways (sketched in code after the list):
- clients of the interface see the default method just as a regular interface method; invocation is possible as normal.
- implementors of the interface do not see/inherit the default method at all, with twofold consequences:
  - implementors of the interface need not implement its default methods
  - the default method is not visible via the implementing class, not even in self calls.
- interfaces extending the given interface do inherit the default method; invocation via the sub-interface is possible
- implementors of the sub-interface still don't see the default method.
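To make that concrete, here is a sketch of the described behavior (I is compiled at -source 1.8; J and C are then compiled at -source 1.7 against I's class file; names are illustrative):

  // compiled at -source 1.8:
  interface I {
      default void m() { }
  }

  // compiled at -source 1.7 against I.class:
  interface J extends I { }          // J inherits m()

  class C implements I {             // C need not implement m()
      void test(J j) {
          I i = this;
          i.m();                     // OK: seen as a regular interface method
          j.m();                     // OK: inherited into the sub-interface
          // m();                    // expected error: not visible via the implicit this
      }
  }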
BTW, in earlier JRE versions (around b47) this caused funny errors against implementors of Collection, which saw the abstract method addAll() from Fillable but did not see the non-abstract override from Collection, thus flagging an undesirable error saying that addAll() must be implemented.

Is my above description of the behavior correct?
Is there more we should know in this area?
Is this behavior meant to stay, or should we expect any changes before Java 8 GA?

thanks,
Stephan

From maurizio.cimadamore at oracle.com Wed Apr 3 01:43:58 2013
From: maurizio.cimadamore at oracle.com (Maurizio Cimadamore)
Date: Wed, 03 Apr 2013 09:43:58 +0100
Subject: Cross compilation and default methods
In-Reply-To: <5159E341.3080206@berlin.de>
References: <5159E341.3080206@berlin.de>
Message-ID: <515BEBCE.40202@oracle.com>

Hi Stephan,
the current status quo is:

1) javac reads default methods, regardless of the source/target versions used to compile
2) a class is allowed to implement an interface and _not_ provide implementations for the defaults, regardless of source/target
3) default method resolution only kicks in when source = 8 (mainly for performance/compatibility risks)
4) well-formedness checks on defaults (i.e. clashes) are only enabled if source = 8

I believe this is an area where there's room for improvement; things like (3) seem counterintuitive given that the compiler knows about default methods. What has put me off going ahead and supporting default method resolution with source < 8 is the fact that the new resolution algorithm needs to look at more supertypes than the old one (i.e. it needs to look at all superinterfaces before being able to give up). This leads to subtle problems where you need more dependencies on your classpath in order to be able to compile a class, as a seemingly unrelated interface that was not needed before will suddenly be inspected by javac.

Maurizio

On 01/04/13 20:42, Stephan Herrmann wrote:
> At JavaOne in SF I discussed with some people how a new compiler should handle default methods found in class files when compiling with -source 1.7 -target 1.7. Obviously, this occurs, e.g., if no proper 1.7 JRE is given on the bootclasspath.
>
> Given that Eclipse users typically expect that our compiler behaves the same as javac even in unspecified matters, I'd like to request confirmation of how javac handles this now and in the future.
>
> While the answer I got in SF indicated that this is a quite unsupported situation to begin with, recent experiments indicate that javac does indeed employ some sophistication to reduce the number of unexpected errors.
>
> Could you please confirm that the following behavior is correctly analyzed:
>
> Given we're compiling in 1.7 mode and encounter a default method in a class file, this information is kept during compilation so that default methods can be specially handled in these ways:
> - clients of the interface see the default method just as a regular interface method; invocation is possible as normal.
> - implementors of the interface do not see/inherit the default method at all, with twofold consequences:
>   - implementors of the interface need not implement its default methods
>   - the default method is not visible via the implementing class, not even in self calls.
> - interfaces extending the given interface do inherit the default method; invocation via the sub-interface is possible
> - implementors of the sub-interface still don't see the default method.
>
> BTW, in earlier JRE versions (around b47) this caused funny errors against implementors of Collection, which saw the abstract method addAll() from Fillable but did not see the non-abstract override from Collection, thus flagging an undesirable error saying that addAll() must be implemented.
>
> Is my above description of the behavior correct?
> Is there more we should know in this area?
> Is this behavior meant to stay, or should we expect any changes before Java 8 GA?
>
> thanks,
> Stephan

From stephan.herrmann at berlin.de Thu Apr 4 08:43:34 2013
From: stephan.herrmann at berlin.de (Stephan Herrmann)
Date: Thu, 04 Apr 2013 17:43:34 +0200
Subject: Cross compilation and default methods
In-Reply-To: <515BEBCE.40202@oracle.com>
References: <5159E341.3080206@berlin.de> <515BEBCE.40202@oracle.com>
Message-ID: <515D9FA6.2070304@berlin.de>

Hi Maurizio,

On 04/03/2013 10:43 AM, Maurizio Cimadamore wrote:
> the current status quo is:
>
> 1) javac reads default methods, regardless of the source/target versions used to compile
> 2) a class is allowed to implement an interface and _not_ provide implementations for the defaults, regardless of source/target
> 3) default method resolution only kicks in when source = 8 (mainly for performance/compatibility risks)
> 4) well-formedness checks on defaults (i.e. clashes) are only enabled if source = 8

Thanks.
Unfortunately, I don't see how this explains the following observation:

compile this at -source 1.8:

interface I0 {
    void foo();
}
public interface I extends I0 {
    default void foo() { }
    default void bar() { }
}

then compile this at -source 1.7:

public class C implements I {
    void test() {
        foo();
        bar();
        I i = this;
        i.foo();
        i.bar();
    }
}

I get 3 errors:

C.java:1: error: C is not abstract and does not override abstract method foo() in I0
public class C implements I {
       ^
C.java:3: error: cannot find symbol
        foo();
        ^
  symbol:   method foo()
  location: class C
C.java:4: error: cannot find symbol
        bar();
        ^
  symbol:   method bar()
  location: class C

Sorry if I'm being dense.
I can see that I.foo() is not recognized as implementing I0.foo(), OK. That situation should probably just be avoided by library designers to avoid confusion.

But why can those methods be invoked via 'i' but not via implicit 'this'? Is this an implementation detail leaking out into observable behavior? How can this be explained to users?

> I believe this is an area where there's room for improvement; things like (3) seem counterintuitive given that the compiler knows about default methods. What has put me off going ahead and supporting default method resolution with source < 8 is the fact that the new resolution algorithm needs to look at more supertypes than the old one (i.e. it needs to look at all superinterfaces before being able to give up). This leads to subtle problems where you need more dependencies on your classpath in order to be able to compile a class, as a seemingly unrelated interface that was not needed before will suddenly be inspected by javac.

I can see. I don't think users will actually expect full-fledged analysis of default methods in 1.7 mode. Not being blamed about unimplemented default methods is already good. There has to be a line beyond which compilation in 1.7 against 1.8 class files tells the user: sorry, I can't figure this out, please align compiler settings and versions of class libraries. My main question is: where exactly will you draw this line?
I'd love to see javac and Eclipse drawing the same line, so can we make this explicit? From there, each compiler team can figure out how to explain the situation to the user.

cheers,
Stephan

From maurizio.cimadamore at oracle.com Thu Apr 4 09:10:50 2013
From: maurizio.cimadamore at oracle.com (Maurizio Cimadamore)
Date: Thu, 04 Apr 2013 17:10:50 +0100
Subject: Cross compilation and default methods
In-Reply-To: <515D9FA6.2070304@berlin.de>
References: <5159E341.3080206@berlin.de> <515BEBCE.40202@oracle.com> <515D9FA6.2070304@berlin.de>
Message-ID: <515DA60A.9070807@oracle.com>

On 04/04/13 16:43, Stephan Herrmann wrote:
> Hi Maurizio,
>
> On 04/03/2013 10:43 AM, Maurizio Cimadamore wrote:
>> the current status quo is:
>>
>> 1) javac reads default methods, regardless of the source/target versions used to compile
>> 2) a class is allowed to implement an interface and _not_ provide implementations for the defaults, regardless of source/target
>> 3) default method resolution only kicks in when source = 8 (mainly for performance/compatibility risks)
>> 4) well-formedness checks on defaults (i.e. clashes) are only enabled if source = 8
>
> Thanks.
> Unfortunately, I don't see how this explains the following observation:
>
> compile this at -source 1.8:
>
> interface I0 {
>     void foo();
> }
> public interface I extends I0 {
>     default void foo() { }
>     default void bar() { }
> }
>
> then compile this at -source 1.7:
>
> public class C implements I {
>     void test() {
>         foo();
>         bar();
>         I i = this;
>         i.foo();
>         i.bar();
>     }
> }
>
> I get 3 errors:

It might be a problem with that particular hierarchy; the following works ok:

//compiled with JDK 8
interface I {
    default void m() {}
}

//compiled with JDK 7
class E implements I { }

> C.java:1: error: C is not abstract and does not override abstract method foo() in I0
> public class C implements I {
>        ^
> C.java:3: error: cannot find symbol
>         foo();
>         ^
>   symbol:   method foo()
>   location: class C
> C.java:4: error: cannot find symbol
>         bar();
>         ^
>   symbol:   method bar()
>   location: class C
>
> Sorry if I'm being dense.
> I can see that I.foo() is not recognized as implementing I0.foo(), OK. That situation should probably just be avoided by library designers to avoid confusion.
>
> But why can those methods be invoked via 'i' but not via implicit 'this'? Is this an implementation detail leaking out into observable behavior? How can this be explained to users?

This is a result of the JDK 7 invariant on class well-formedness. If a class is well-formed, it should contain all required concrete methods implementing corresponding abstract methods in implemented interfaces. This means that javac doesn't even bother looking into interfaces when resolving a method call where the receiver is a concrete class.

Maurizio

>> I believe this is an area where there's room for improvement; things like (3) seem counterintuitive given that the compiler knows about default methods. What has put me off going ahead and supporting default method resolution with source < 8 is the fact that the new resolution algorithm needs to look at more supertypes than the old one (i.e. it needs to look at all superinterfaces before being able to give up). This leads to subtle problems where you need more dependencies on your classpath in order to be able to compile a class, as a seemingly unrelated interface that was not needed before will suddenly be inspected by javac.
>
> I can see.
> I don't think users will actually expect full-fledged analysis of default methods in 1.7 mode. Not being blamed about unimplemented default methods is already good. There has to be a line beyond which compilation in 1.7 against 1.8 class files tells the user: sorry, I can't figure this out, please align compiler settings and versions of class libraries. My main question is: where exactly will you draw this line? I'd love to see javac and Eclipse drawing the same line, so can we make this explicit?
> From there, each compiler team can figure out how to explain the situation to the user.
>
> cheers,
> Stephan

From stephan.herrmann at berlin.de Thu Apr 4 09:27:32 2013
From: stephan.herrmann at berlin.de (Stephan Herrmann)
Date: Thu, 04 Apr 2013 18:27:32 +0200
Subject: Cross compilation and default methods
In-Reply-To: <515DA60A.9070807@oracle.com>
References: <5159E341.3080206@berlin.de> <515BEBCE.40202@oracle.com> <515D9FA6.2070304@berlin.de> <515DA60A.9070807@oracle.com>
Message-ID: <515DA9F4.2090804@berlin.de>

From the perspective of a compiler developer your answers are perfectly suitable. IFF you confirm that this behavior is going to stay, we'll look into making Eclipse behave similarly.

Once we get bug reports about this, however, we'll need another way of explaining, I'm afraid.

> It might be a problem with that particular hierarchy; the following works ok:

That's the kind of answer I'm not planning to give to a user :)

cheers,
Stephan

On 04/04/2013 06:10 PM, Maurizio Cimadamore wrote:
> On 04/04/13 16:43, Stephan Herrmann wrote:
>> Hi Maurizio,
>>
>> On 04/03/2013 10:43 AM, Maurizio Cimadamore wrote:
>>> the current status quo is:
>>>
>>> 1) javac reads default methods, regardless of the source/target versions used to compile
>>> 2) a class is allowed to implement an interface and _not_ provide implementations for the defaults, regardless of source/target
>>> 3) default method resolution only kicks in when source = 8 (mainly for performance/compatibility risks)
>>> 4) well-formedness checks on defaults (i.e. clashes) are only enabled if source = 8
>>
>> Thanks.
>> Unfortunately, I don't see how this explains the following observation:
>>
>> compile this at -source 1.8:
>>
>> interface I0 {
>>     void foo();
>> }
>> public interface I extends I0 {
>>     default void foo() { }
>>     default void bar() { }
>> }
>>
>> then compile this at -source 1.7:
>>
>> public class C implements I {
>>     void test() {
>>         foo();
>>         bar();
>>         I i = this;
>>         i.foo();
>>         i.bar();
>>     }
>> }
>>
>> I get 3 errors:
>
> It might be a problem with that particular hierarchy; the following works ok:
>
> //compiled with JDK 8
> interface I {
>     default void m() {}
> }
>
> //compiled with JDK 7
> class E implements I { }
>
>> C.java:1: error: C is not abstract and does not override abstract method foo() in I0
>> public class C implements I {
>>        ^
>> C.java:3: error: cannot find symbol
>>         foo();
>>         ^
>>   symbol:   method foo()
>>   location: class C
>> C.java:4: error: cannot find symbol
>>         bar();
>>         ^
>>   symbol:   method bar()
>>   location: class C
>>
>> Sorry if I'm being dense.
>> I can see that I.foo() is not recognized as implementing I0.foo(), OK. That situation should probably just be avoided by library designers to avoid confusion.
>>
>> But why can those methods be invoked via 'i' but not via implicit 'this'? Is this an implementation detail leaking out into observable behavior? How can this be explained to users?
> This is a result of the JDK 7 invariant on class well-formedness. If a class is well-formed, it should contain all required concrete methods implementing corresponding abstract methods in implemented interfaces. This means that javac doesn't even bother looking into interfaces when resolving a method call where the receiver is a concrete class.
>
> Maurizio
>
>>> I believe this is an area where there's room for improvement; things like (3) seem counterintuitive given that the compiler knows about default methods. What has put me off going ahead and supporting default method resolution with source < 8 is the fact that the new resolution algorithm needs to look at more supertypes than the old one (i.e. it needs to look at all superinterfaces before being able to give up). This leads to subtle problems where you need more dependencies on your classpath in order to be able to compile a class, as a seemingly unrelated interface that was not needed before will suddenly be inspected by javac.
>>
>> I can see. I don't think users will actually expect full-fledged analysis of default methods in 1.7 mode. Not being blamed about unimplemented default methods is already good. There has to be a line beyond which compilation in 1.7 against 1.8 class files tells the user: sorry, I can't figure this out, please align compiler settings and versions of class libraries. My main question is: where exactly will you draw this line? I'd love to see javac and Eclipse drawing the same line, so can we make this explicit?
>> From there, each compiler team can figure out how to explain the situation to the user.
>>
>> cheers,
>> Stephan

From maurizio.cimadamore at oracle.com Thu Apr 4 09:39:32 2013
From: maurizio.cimadamore at oracle.com (Maurizio Cimadamore)
Date: Thu, 04 Apr 2013 17:39:32 +0100
Subject: Cross compilation and default methods
In-Reply-To: <515DA9F4.2090804@berlin.de>
References: <5159E341.3080206@berlin.de> <515BEBCE.40202@oracle.com> <515D9FA6.2070304@berlin.de> <515DA60A.9070807@oracle.com> <515DA9F4.2090804@berlin.de>
Message-ID: <515DACC4.2080009@oracle.com>

I think I've already made clear that

(i) "the current status quo is:" (as in _not set in stone_)
(ii) "I believe this is an area where there's room for improvement" (as in _not set in stone_)

But you asked why the compiler behaved in a certain odd way, and that's your answer :-)

Now, all this won't be covered in any JLS or JVMS - so this stuff is not, strictly speaking (well, up to a certain point), subject to EG discussions. But of course we should get our story straight before shipping.

That said, talking about these issues at this point in time, with the libraries, VM and language still under active (and heavy) development, would seem a bit premature - like talking calmly in the living room when you know there's water in your basement ;-). I would suggest postponing this discussion until after feature freeze. Once all the language features have been stabilized, it will become clearer what the right strategy is in order to support cross compilation.

Maurizio

On 04/04/13 17:27, Stephan Herrmann wrote:
> From the perspective of a compiler developer your answers are perfectly suitable. IFF you confirm that this behavior is going to stay, we'll look into making Eclipse behave similarly.
>
> Once we get bug reports about this, however, we'll need another way of explaining, I'm afraid.
> > It might be a problem with that particular hierarchy; the following works ok:
>
> That's the kind of answer I'm not planning to give to a user :)
>
> cheers,
> Stephan
>
> On 04/04/2013 06:10 PM, Maurizio Cimadamore wrote:
>> On 04/04/13 16:43, Stephan Herrmann wrote:
>>> Hi Maurizio,
>>>
>>> On 04/03/2013 10:43 AM, Maurizio Cimadamore wrote:
>>>> the current status quo is:
>>>>
>>>> 1) javac reads default methods, regardless of the source/target versions used to compile
>>>> 2) a class is allowed to implement an interface and _not_ provide implementations for the defaults, regardless of source/target
>>>> 3) default method resolution only kicks in when source = 8 (mainly for performance/compatibility risks)
>>>> 4) well-formedness checks on defaults (i.e. clashes) are only enabled if source = 8
>>>
>>> Thanks.
>>> Unfortunately, I don't see how this explains the following observation:
>>>
>>> compile this at -source 1.8:
>>>
>>> interface I0 {
>>>     void foo();
>>> }
>>> public interface I extends I0 {
>>>     default void foo() { }
>>>     default void bar() { }
>>> }
>>>
>>> then compile this at -source 1.7:
>>>
>>> public class C implements I {
>>>     void test() {
>>>         foo();
>>>         bar();
>>>         I i = this;
>>>         i.foo();
>>>         i.bar();
>>>     }
>>> }
>>>
>>> I get 3 errors:
>>
>> It might be a problem with that particular hierarchy; the following works ok:
>>
>> //compiled with JDK 8
>> interface I {
>>     default void m() {}
>> }
>>
>> //compiled with JDK 7
>> class E implements I { }
>>
>>> C.java:1: error: C is not abstract and does not override abstract method foo() in I0
>>> public class C implements I {
>>>        ^
>>> C.java:3: error: cannot find symbol
>>>         foo();
>>>         ^
>>>   symbol:   method foo()
>>>   location: class C
>>> C.java:4: error: cannot find symbol
>>>         bar();
>>>         ^
>>>   symbol:   method bar()
>>>   location: class C
>>>
>>> Sorry if I'm being dense.
>>> I can see that I.foo() is not recognized as implementing I0.foo(), OK. That situation should probably just be avoided by library designers to avoid confusion.
>>>
>>> But why can those methods be invoked via 'i' but not via implicit 'this'? Is this an implementation detail leaking out into observable behavior? How can this be explained to users?
>> This is a result of the JDK 7 invariant on class well-formedness. If a class is well-formed, it should contain all required concrete methods implementing corresponding abstract methods in implemented interfaces. This means that javac doesn't even bother looking into interfaces when resolving a method call where the receiver is a concrete class.
>>
>> Maurizio
>>
>>>> I believe this is an area where there's room for improvement; things like (3) seem counterintuitive given that the compiler knows about default methods. What has put me off going ahead and supporting default method resolution with source < 8 is the fact that the new resolution algorithm needs to look at more supertypes than the old one (i.e. it needs to look at all superinterfaces before being able to give up). This leads to subtle problems where you need more dependencies on your classpath in order to be able to compile a class, as a seemingly unrelated interface that was not needed before will suddenly be inspected by javac.
>>>
>>> I can see. I don't think users will actually expect full-fledged analysis of default methods in 1.7 mode.
>>> Not being blamed about unimplemented default methods is already good. There has to be a line beyond which compilation in 1.7 against 1.8 class files tells the user: sorry, I can't figure this out, please align compiler settings and versions of class libraries. My main question is: where exactly will you draw this line? I'd love to see javac and Eclipse drawing the same line, so can we make this explicit?
>>> From there, each compiler team can figure out how to explain the situation to the user.
>>>
>>> cheers,
>>> Stephan

From brian.goetz at oracle.com Mon Apr 15 09:52:33 2013
From: brian.goetz at oracle.com (Brian Goetz)
Date: Mon, 15 Apr 2013 12:52:33 -0400
Subject: RI update: division of bridging responsibility between VM and compiler
Message-ID: <516C3051.7030200@oracle.com>

As you may recall, adding default methods requires that the VM get involved in default method inheritance, because it is an explicit goal for the addition of an interface method with a default to be a binary-compatible change. We've had an implementation of default inheritance in the VM for quite a while. The basic inheritance algorithm was really easy to implement; it built on top of existing vtable building in a straightforward and well-defined way.

Some time back, we identified some cases where pushing default inheritance into the VM seemed to necessitate pushing bridge method generation into the VM as well. We have also had an implementation of this in the VM for a while. But this is a much bigger change, and we're not as comfortable with it -- it pushes the details of the generic type system into the VM, and risks exposing Java-language-specific type system details to classes generated by other language compilers.

At one point, we were convinced we had no choice. But since then, there were some simplifications in the definition of overriding with respect to defaults (specifically, outlawing abstract-default conflicts rather than silently merging them), and it turns out that this eliminates a number of the examples that led us to believe we had no choice in this matter. (Specifically, to land in a corner case, it now requires a bridge-requiring merge between a class and an interface; it can't happen any more with two interfaces.)

After having spent some time trying to specify what the invoke{virtual,interface,special} semantics might be in a VM-bridged world -- with the hope that this would be step 1 along the path of eventually moving all bridging out of the static compiler (where it clearly does not belong, and is basically pure technical debt left over from generics) -- we're getting more comfortable with the corner cases that we'd have without VM bridging. Indeed, most of them are analogous to corner cases we already have today and would continue to have tomorrow under separate compilation with ordinary classes.

Instead, we're now pursuing a path where we generate bridges into interfaces (since we can do that now) using an algorithm very similar to what we do with class bridges. We may need to extend the technique of compiler-generated bridges by generating additional classfile attributes that the VM might act on to avoid these anomalies; this is currently being explored.

This offers a significant reduction in complexity. We can rip out all existing bridge-related code from the VM, and do default inheritance using the simple "same erased signature" overriding the VM has always done. We can rip out all generic analysis, including verification of generic signatures.
Though we might have to add back processing of additional classfile attributes, and potentially use those to modify the behavior of inheritance; details TBD. And this keeps the generic type system in javac, eliminating risks of interference with other languages' inheritance semantics.

BRIEF NOTATION BREAK
--------------------

When we were discussing how to specify default inheritance, we invented a notation where we wrote things like:

  Cc(Id(Ja))

and wrote separate compilation examples as:

  Cc(Id(Ja)) -> Cc(Id(Jd))

which was much easier to reason about, and less ambiguity-prone, than writing the classes out longhand. Decoder chart:

  A, B: concrete or abstract classes
  C: concrete class to be instantiated
  I, J, K: interfaces

In this world, like in FD, there's one method, named "m", with no arguments. Classes or interfaces have some extra letters after them to describe how m is declared:

  C  -- no declaration of m
  Cc -- m() declared in C as concrete
  Ca, Ia -- m() declared in C or I as abstract
  Id -- m() declared in I as default
  Cm -- m() declared in C as either abstract or concrete

We now extend this notation with indicators describing covariant overrides, imagining a linear hierarchy of types T2 <: T1 <: T0:

  Cc0 -- m() declared in C as returning T0
  Cc1 -- m() declared in C as returning T1

Supertypes are written in parentheses: Cc(Id(Jd)) means that C extends I and I extends J. Separate compilation is written as:

  Cc(Id(Ja)) -> Cc(Id(Jd))

Since only J is changed, only J is assumed to be recompiled.

MOTIVATING EXAMPLE
------------------

Here's a problem we have today (and which the path we'd been pursuing would not have fixed for 8):

  Cc1(A) -> Cc1(Ac0)

(This is a "contravariant underride.") This means we go from:

  abstract class A { }
  class C <: A { T1 m() { } }

to

  abstract class A { T0 m() { } }
  class C <: A { T1 m() { } }

without recompiling C. What will happen at runtime is:

  m()T1 -> C
  m()T0 -> A

whereas with a global recompile, we would get:

  m()T1 -> C
  m()T0 -> C

Note that:
- This problem exists today and has existed since Java 5
- It would get no better under the "default VM bridging" plan
- No one seems particularly bothered by this long-standing issue.

Now consider the defender analogue of this example:

  Cc1(I) -> Cc1(Id0)

  m()T1 -> C
  m()T0 -> I

Is this any worse than the previous version? For default methods, we say "classes that don't override this method will get the default, which by definition meets the contract of I." A moldy class file that had no idea that its m()T1 declaration was overriding an as-yet-unborn m()T0 in a supertype could well be described as "not overriding the method." In which case they get the default. This does not seem so bad, or any worse than many other similar separate compilation scenarios today. Turning it around, if we handled this case but not the class-based version of the same issue, might that not be even weirder?

Note also that with the decision to rule out abstract-default conflicts (i.e., outlawing K(Ia,Jd)), the set of possible bad cases is reduced a lot; many of the scary examples came from that space.

INTERFACE BRIDGES
-----------------

We anticipate that (consistently compiled) interface hierarchies like Id1(Jd0) will be common. (Consider a method like Collection.immutable(), which might be covariantly overridden by List.immutable().)
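In source form, such a hierarchy might look like the following sketch (immutable() is used here as an illustrative name, not an existing API; MyList plays the role of I and MyCollection the role of J):

  interface MyCollection<E> {
      // J: declares m() returning T0
      default MyCollection<E> immutable() { return this; }
  }

  interface MyList<E> extends MyCollection<E> {
      // I: covariantly overrides m() to return T1
      @Override
      default MyList<E> immutable() { return this; }
      // Here the compiler would also emit a bridge into MyList's class file,
      // with descriptor ()MyCollection, dispatching to the method above.
  }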
So, to support consistently compiled hierarchies like this (that is, I and J updated together) without forcing a recompile of concrete classes implementing I, the compiler could generate a bridge in I redirecting m()T0 to m()T1, with a suitable cast; I is the highest point in the hierarchy where we can determine a bridge is needed. In a consistently compiled world, this is all that is needed.

But we don't live in a consistently compiled world. So we must make some allowance for what might happen in a separately compiled world. The current scheme of only compiling bridges into the class where the bridgee lives helps reduce certain separate compilation artifacts. I think we should probably continue doing this, so that class bridges will, at times, override interface bridges. There does not seem to be harm in this, and it changes fewer things, and eliminates some risk vectors.

(Ultimately the problem is that compiler bridges suffer from "premature bytecode". When the compiler generates a bridge, it is trying to reify the notion of "method m()T1 was known to override method m()T0 at compile time", but this is opaque to the VM, which can only slavishly propagate the bridge through subclass vtables as if it were code written by the user. If, instead of bridges (or in addition to them), the compiler generated a class attribute of the form "I believe that m()T1 overrides m()T0", the VM could act on that information directly, and this might buy us out of some of the worst possible problems.)

WORST CASE SCENARIO
-------------------

The cases above are not terrible, because the program continues to link after separate compilation and even does something vaguely justifiable. Here's a worse scenario (relevant humor break: http://www.youtube.com/watch?v=_W-qxpN2oEI).

  Cc1(Bc0(Ac0)) -> Cc1(Bc1(Ac0))

If the implementation in C does:

  super.m()

one gets a StackOverflowError. This happens because when we invoke C.m(), we are really invoking C.m()T1. C.m()T1 invokes B.m()T0 via invokespecial, thinking that it is invoking the parent implementation. But really B.m()T0 is a bridge for B.m()T1, which invokes B.m()T1 with invokevirtual. But B.m()T1 is overridden by C.m()T1, and so the invokevirtual is dispatched there. Which is where we started, so we ping-pong between C.m()T1 and B.m()T1 until we fall off the stack.

Again, note that (a) we have had this problem since Java 5 and (b) the complex solution we were pursuing would not have fixed it for 8. But this is definitely worse than the problems above, and we want to not widen this hole. We need to explore further what kinds of separate compilation anomalies with bridges in interfaces might cause similar problems.

EXHAUSTIVE PATTERN CATALOG
--------------------------

Dan did a nearly-exhaustive catalog of inheritance scenarios. The question is, do we find any of these anomalies so bad (worse than existing anomalies) that we cannot live with them? On review, none of them seem any worse than the pain of bridge methods under separate compilation we've been living with for years.

They are annotated with what happens:

  0: Description of the behavior of an invocation on an instance of C, targeting the descriptor of index 0.
  0*: Behavior inconsistent with a full compilation of the final state.
The following cases are not considered:
- Illegal hierarchies, in either the initial or final state
- Redundant extra classes/interfaces that have no effect on the outcome
- Redundant permutations of 'implements' clauses
- Final states that require recompiling C

===== Linear inheritance (one ancestor, two methods)

--- Cc1(A) -> Cc1(Ac0)
0*: Inherited from A
1: Declared in C

--- Cc1(I) -> Cc1(Id0)
0*: Inherited default from I
1: Declared in C

===== Linear inheritance (two ancestors, no method in C)

--- C(Bc1(A)) -> C(Bc1(Ac0))
0*: Inherited from A
1: Inherited from B

--- C(B(Ac0)) -> C(Bc1(Ac0))
0: Inherited bridge from B
1: Inherited from B

--- C(B(A)) -> C(Bc1(Ac0))
0: Inherited bridge from B
1: Inherited from B

--- C(Ac1(I)) -> C(Ac1(Id0))
0*: Inherited default from I
1: Inherited from A

--- C(A(Id0)) -> C(Ac1(Id0))
0: Inherited bridge from A
1: Inherited from A

--- C(A(I)) -> C(Ac1(Id0))
0: Inherited bridge from A
1: Inherited from A

--- C(Id1(J)) -> C(Id1(Jd0))
0*: Inherited default from J
1: Inherited default from I

--- C(I(Jd0)) -> C(Id1(Jd0))
0: Inherited bridge from I
1: Inherited default from I

--- C(I(J)) -> C(Id1(Jd0))
0: Inherited bridge from I
1: Inherited default from I

===== Linear inheritance (two ancestors, method in C)

--- Cc2(B(Am0)) -> Cc2(Bc1(Am0))
0: Bridge in C
1*: Inherited from B
2: Declared in C

--- Cc2(Bm1(A)) -> Cc2(Bm1(Ac0))
0*: Inherited from A
1: Bridge in C
2: Declared in C

--- Cc2(B(A)) -> Cc2(Bc1(Ac0))
0*: Inherited bridge from B
1*: Inherited from B
2: Declared in C

--- Cc2(A(Im0)) -> Cc2(Ac1(Im0))
0: Bridge in C
1*: Inherited from A
2: Declared in C

--- Cc2(Am1(I)) -> Cc2(Am1(Id0))
0*: Inherited from I
1: Bridge in C
2: Declared in C

--- Cc2(A(I)) -> Cc2(Ac1(Id0))
0*: Inherited bridge from A
1*: Inherited from A
2: Declared in C

--- Cc2(J(Im0)) -> Cc2(Jd1(Im0))
0: Bridge in C
1*: Inherited default from J
2: Declared in C

--- Cc2(Jm1(I)) -> Cc2(Jm1(Id0))
0*: Inherited default from I
1: Bridge in C
2: Declared in C

--- Cc2(J(I)) -> Cc2(Jd1(Id0))
0*: Inherited bridge from J
1*: Inherited default from J
2: Declared in C

===== Independent branches (no method in C)

--- C(Ac1, I) -> C(Ac1, Id0)
0*: Inherited default from I
1: Inherited from A

--- C(A, Id0) -> C(Ac1, Id0)
0*: Inherited default from I
1: Inherited from A

--- C(A, I) -> C(Ac1, Id0)
0*: Inherited default from I
1: Inherited from A

===== Independent branches (method in C)

--- Cc2(Am0, I) -> Cc2(Am0, Id1)
0: Bridge in C
1*: Inherited default from I
2: Declared in C

--- Cc2(A, Im1) -> Cc2(Ac0, Im1)
0*: Inherited from A
1: Bridge in C
2: Declared in C

--- Cc2(A, I) -> Cc2(Ac0, Id1)
0*: Inherited from A
1*: Inherited default from I
2: Declared in C

--- Cc2(Im0, J) -> Cc2(Im0, Jd1)
0: Bridge in C
1*: Inherited default from J
2: Declared in C

--- Cc2(I, J) -> Cc2(Id0, Jd1)
0*: Inherited default from I
1*: Inherited default from J
2: Declared in C

===== Diamond branches (no method in C)

--- C(A(Id0), J(Id0)) -> C(Ac1(Id0), J(Id0))
0: Inherited bridge from A
1: Inherited from A

--- C(A(Id0), J(Id0)) -> C(A(Id0), Jd1(Id0))
0: Inherited bridge from J
1: Inherited default from J

--- C(A(Id0), J(Id0)) -> C(Ac2(Id0), Jd1(Id0))
0: Inherited bridge from A (beats new bridge in J)
1*: Inherited default from J
2: Inherited from A

--- C(J(Id0), K(Id0)) -> C(Jd1(Id0), K(Id0))
0: Inherited bridge from J
1: Inherited default from J

--- C(Ac2(Im0), J(Im0)) -> C(Ac2(Im0), Jd1(Im0))
0: Inherited bridge from A (beats new bridge in J)
1*: Inherited default from J
2: Inherited from A

--- C(A(Im0), Jd1(Im0)) -> C(Ac2(Im0), Jd1(Im0))
0: Inherited bridge from A (beats old bridge in J)
1*: Inherited default from J
2: Inherited from A

===== Diamond branches (method in C)

--- Cc2(A(Im0), J(Im0)) -> Cc2(Ac1(Im0), J(Im0))
0: Bridge in C
1*: Inherited from A
2: Declared in C

--- Cc2(A(Im0), J(Im0)) -> Cc2(A(Im0), Jd1(Im0))
0: Bridge in C
1*: Inherited default from J
2: Declared in C

--- Cc3(A(Im0), J(Im0)) -> Cc3(Ac1(Im0), Jd2(Im0))
0: Bridge in C
1*: Inherited from A
2*: Inherited default from J
3: Declared in C

--- Cc2(J(Im0), K(Im0)) -> Cc2(Jd1(Im0), K(Im0))
0: Bridge in C
1*: Inherited default from J
2: Declared in C

--- Cc3(J(Im0), K(Im0)) -> Cc3(Jd1(Im0), Kd2(Im0))
0: Bridge in C
1*: Inherited default from J
2*: Inherited default from K
3: Declared in C

--- Cc3(Am1(Im0), J(Im0)) -> Cc3(Am1(Im0), Jd2(Im0))
0: Bridge in C
1: Bridge in C
2*: Inherited default from J
3: Declared in C

--- Cc3(A(Im0), Jm2(Im0)) -> Cc3(Ac1(Im0), Jm2(Im0))
0: Bridge in C
1*: Inherited from A
2: Bridge in C
3: Declared in C

--- Cc3(Jm1(Im0), K(Im0)) -> Cc3(Jm1(Im0), Kd2(Im0))
0: Bridge in C
1: Bridge in C
2*: Inherited default from K
3: Declared in C

From john.r.rose at oracle.com Mon Apr 22 21:35:45 2013
From: john.r.rose at oracle.com (John Rose)
Date: Tue, 23 Apr 2013 04:35:45 -0000
Subject: [jsr-292-eg] Conversion of method reference to MethodHandle
In-Reply-To: <50E18202.7080908@gmx.org>
References: <50E0DF61.3060608@univ-mlv.fr> <50E18202.7080908@gmx.org>
Message-ID:

[Catching up on some 292 EG stuff...]

On Dec 31, 2012, at 4:16 AM, Jochen Theodorou wrote:
> I think not allowing overloaded methods will restrict the usage quite a lot.

For me, the restriction against overloaded methods is a killer. I'm sad but philosophical about this. As I told Brian, I was hoping the Lambda folks would find themselves required to add overload disambiguation to member references, along the lines Jochen suggested -- 'PrintStream::println(int)' -- at which point we 292'ers could ask for MethodHandle (and jlr.Method) as target types. I still think it may be worth doing, but only if the syntax is competent beyond the 99% level to describe all methods and constructors (which the overloading restriction prevents). Fields would be good too, but those seem to be even farther off the table.

For the record, there might be a couple of other approaches. I am not proposing these seriously for JDK 8, but here they are, A and B.

A. Define enough extra poly-to-interface conversions to allow a universal functional type to be defined. [1]

  UniversalFunction uf0 = ()->8;
  UniversalFunction uf1 = String::concat;
  UniversalFunction uf2 = (int n)->{System.out.println(n);};

Then define a cracking API that can conspire with the metafactory for uf0, uf1, and uf2 to crack out the insides.

B. (This is from an idea suggested by Brian.) Define javac-time constant folding rules for the following methods:

  MethodHandles.lookup()
  MethodType methodType(...)
  Lookup.findVirtual()
  ...

When an expression involving only these (with constant String and Class leaves) can be proven to resolve at compile time, replace the expression with a CONSTANT_MethodHandle or CONSTANT_MethodType. Maybe gate the whole thing on an annotation @MustBeAConstantPoolConstant, to avoid surprises after compile time. This would require either harmonizing the CONSTANT_MethodHandle resolution rules with the Lookup.findFoo rules, or special dispensation to vary the exception types from ReflectiveOperationException to LinkageError.

Anyway, this note is to put some ideas out for longer-term thought.
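To illustrate approach B at a use site, here is a sketch (the constant folding itself and the @MustBeAConstantPoolConstant gate are hypothetical; the lookup calls are the existing java.lang.invoke API, and the expression compiles and runs today, just without any folding):

  import java.io.PrintStream;
  import java.lang.invoke.MethodHandle;
  import java.lang.invoke.MethodHandles;
  import java.lang.invoke.MethodType;

  class ConstantHandles {
      // @MustBeAConstantPoolConstant   // hypothetical gate on the folding
      static final MethodHandle PRINTLN_INT;
      static {
          try {
              // All leaves are compile-time constants, so under approach B javac
              // could replace this whole expression with a CONSTANT_MethodHandle.
              PRINTLN_INT = MethodHandles.lookup().findVirtual(
                      PrintStream.class, "println",
                      MethodType.methodType(void.class, int.class));
          } catch (ReflectiveOperationException e) {
              throw new ExceptionInInitializerError(e);
          }
      }
      // Usage: PRINTLN_INT.invokeExact(System.out, 42) prints "42".
  }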
My next message will discuss the proposal for the MethodHandleInfo API, for cracking "direct" method handles only.

-- John

[1] A universal functional type could look something like this:

  @FunctionalInterface
  interface MyUniversal<R, X extends Throwable> {
      R myApply(@CastThisCovariantPolyArgument Object... anyArgs) throws X;
  }

So:

  MyUniversal foo = (int n, String s)->{System.out.println(n+": "+s);};

evaluates to something like:

  MyUniversal foo = new MyUniversal() {
      @Override
      public Void myApply(Object... av) {
          if (av.length != 2) throw new WrongMethodTypeException();
          int n = (int)av[0];
          String s = (String)av[1];
          {System.out.println(n+": "+s);}
          return null;
      }
  };

I'm not sure if this idea is good for much besides a target type that always wins. Any random call like 'foo.myApply(1, 2, false)' would likely throw a WMTE or CCE. Then there is the question of how these covariant varargs guys interact with the "most specific overloading" rules. (Method handles make signature polymorphism work by exposing their type and allowing asType conversions.)