From stephan.herrmann at berlin.de Tue Oct 8 13:06:01 2013 From: stephan.herrmann at berlin.de (Stephan Herrmann) Date: Tue, 08 Oct 2013 22:06:01 +0200 Subject: arrays and raw types in 18.2.3 Message-ID: <525465A9.7040507@berlin.de> Maybe I'm slow today but the following phrase doesn't speak to me: 18.2.3 o If T is an array type, T'[], then let S'[] be the most specific array type that is a supertype of S (or S itself) What reasons exist why S'[] would be different from S? If S is an array type, then S'[] = S. If S is not an array type, how can S have an array type as a supertype? Secondly, how should this constraint be reduced: C <: C According to 18.2.3 I need a parameterization of C that is a supertype of C (raw). Since no such supertype exists, some invocations of generic methods with raw arguments seem to be illegal now, which were legal in Java 7. I'm currently looking at this test:

public class X {
    public static void main(String[] args) {
        EntityKey entityKey = null;
        new EntityCondenser().condense(entityKey);
    }
    public static class EntityCondenser {
        <I, E extends EntityType<I, E, K>, K extends EntityKey<I>> void condense(K entityKey) {
        }
    }
    public class EntityKey<I> {}
    public interface EntityType<
        I,
        E extends EntityType<I, E, K>,
        K extends EntityKey<I>> {
    }
}

With my current understanding of 18.2.3 we cannot find a valid instantiation for K given the argument of raw type EntityKey. Even seeing sect. 18.5.5 (Unchecked Conversion Inference) I don't see how this can be leveraged from 18.2.3. thanks, Stephan From daniel.smith at oracle.com Tue Oct 8 15:23:08 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Tue, 8 Oct 2013 16:23:08 -0600 Subject: arrays and raw types in 18.2.3 In-Reply-To: <525465A9.7040507@berlin.de> References: <525465A9.7040507@berlin.de> Message-ID: <0BCAFD6A-0378-4557-BB26-AB4B726239E7@oracle.com> Thanks for the questions. I'm happy to help; keep them coming as you encounter points of confusion.
On Oct 8, 2013, at 2:06 PM, Stephan Herrmann wrote: > Maybe I'm slow today but the following phrase doesn't speak to me: > > 18.2.3 > o If T is an array type, T'[], then let S'[] be the most specific array type that is a supertype of S (or S itself) > > What reasons exist why S'[] would be different from S? > If S is an array type, then S'[] = S. > If S is not an array type, > how can S have an array type as a supertype? This is to account for type variables and intersection types. I know there are some restrictions on what bounded type variables can look like in source (including their intersection upper bounds), but through things like capture or lub I'm pretty sure you can end up with cases in which both of these kinds of types can be subtypes of array types. > Secondly, how should this constraint be reduced: > C <: C > According to 18.2.3 I need a parameterization of C > that is a supertype of C (raw). > Since no such supertype exists, > some invocations of generic methods with raw arguments > seem to be illegal now, which were legal in Java 7. Before you get to subtyping, you'll usually have a compatibility constraint of the form "C -> C"; see 18.2.2. The note #1 points out that the text still needs to account for unchecked exceptions. Actual spec text to come in the next round. These compatibility constraints should work as before: if the statement can be made true via unchecked conversion, then the result is "true". (And there will be an unchecked warning based on the requirement somewhere in 15.12.2.) 
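Dan's point that a compatibility constraint can be made true via unchecked conversion corresponds to the everyday raw-argument pattern. The following is a minimal, hypothetical sketch (the class and method names are invented for illustration, not taken from the thread); the unchecked warning is the visible effect he mentions:

```java
import java.util.ArrayList;
import java.util.List;

class UncheckedDemo {
    // A generic method standing in for the parameterized "C<T>" position.
    static <T> List<T> wrap(List<T> list) {
        return list;
    }

    @SuppressWarnings({"rawtypes", "unchecked"})
    static int sizeOfWrapped() {
        List raw = new ArrayList();   // a raw type, as in pre-generics code
        raw.add("x");
        // Passing the raw List where List<T> is expected succeeds via
        // unchecked conversion; without @SuppressWarnings this produces an
        // unchecked warning, but it compiles and runs in Java 7 and 8 alike.
        List<String> wrapped = wrap(raw);
        return wrapped.size();
    }
}
```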
> I'm currently looking at this test:
>
> public class X {
>     public static void main(String[] args) {
>         EntityKey entityKey = null;
>         new EntityCondenser().condense(entityKey);
>     }
>     public static class EntityCondenser {
>         <I, E extends EntityType<I, E, K>, K extends EntityKey<I>> void condense(K entityKey) {
>         }
>     }
>     public class EntityKey<I> {}
>     public interface EntityType<
>         I,
>         E extends EntityType<I, E, K>,
>         K extends EntityKey<I>> {
>     }
> }

Ah, okay, in this case, there is no compatibility constraint for the two types ("Raw -> Parameterized"), just a subtyping constraint ("Raw <: Parameterized"), which is derived from the bound of K. The old spec was somewhat vague here, but I believe the correct behavior for both 7 and 8 is to fail. If the method is applicable, that means there exist choices for I and K such that i) K <: EntityKey<I> (*see note) and ii) (raw) EntityKey is method-invocation-compatible with K. Under 7, what are the choices for I and K that satisfy these two assertions? If the compiler hasn't produced them, then it hasn't proved, per 15.12.2.2 or 15.12.2.3, that the method is applicable. (*Subtyping is used here, not a more general notion of compatibility, because that's what 'extends' means when used as a variable bound.) You could explicitly satisfy these requirements with something like ec.<...>condense(entityKey) // unchecked conversion on entityKey But there's no provision in the 7 or 8 specs of inference that would make it smart enough to come up with this. Instead, the constraint "EntityKey -> α3" is always reduced to "EntityKey <: α3". (The fact that reduction throws away the possibility that α3 = EntityKey<...> here is what the Lambda Spec refers to as a non-completeness-preserving reduction step.) > With my current understanding of 18.2.3 we cannot find > a valid instantiation for K given the argument of raw type EntityKey. > Even seeing sect. 18.5.5 (Unchecked Conversion Inference) > I don't see how this can be leveraged from 18.2.3.
With some more advanced reduction logic, we could do something like EntityKey -> α3 reduces to EntityKey <: α3 *or* EntityKey<...> <: α3 (Handling the unchecked conversion as described in 18.5.5.) But bound sets can't encode disjunctive logic ("or"), so inference is forced to make a greedy choice, and it generally prefers the left branch. (As I suggested above when I mentioned 18.2.2, though, it may take the right branch if that's obviously necessary, as in "C -> C". Yes, I know it remains to specify what "obviously necessary" means.) --Dan From stephan.herrmann at berlin.de Thu Oct 10 05:01:32 2013 From: stephan.herrmann at berlin.de (Stephan Herrmann) Date: Thu, 10 Oct 2013 14:01:32 +0200 Subject: arrays and raw types in 18.2.3 In-Reply-To: <0BCAFD6A-0378-4557-BB26-AB4B726239E7@oracle.com> References: <525465A9.7040507@berlin.de> <0BCAFD6A-0378-4557-BB26-AB4B726239E7@oracle.com> Message-ID: <5256971C.10200@berlin.de> On 10/09/2013 12:23 AM, Dan Smith wrote: > On Oct 8, 2013, at 2:06 PM, Stephan Herrmann wrote: >> 18.2.3 >> o If T is an array type, T'[], then let S'[] be the most specific array type that is a supertype of S (or S itself) >> >> ... > > This is to account for type variables and intersection types. I know there are some restrictions on what bounded type variables can look like in source (including their intersection upper bounds), but through things like capture or lub I'm pretty sure you can end up with cases in which both of these kinds of types can be subtypes of array types. Thanks, makes sense. >> Secondly, how should this constraint be reduced: >> C <: C >> According to 18.2.3 I need a parameterization of C >> that is a supertype of C (raw). >> Since no such supertype exists, >> some invocations of generic methods with raw arguments >> seem to be illegal now, which were legal in Java 7. > > Before you get to subtyping, you'll usually have a compatibility constraint of the form "C -> C"; see 18.2.2.
> > The note #1 points out that the text still needs to account for unchecked exceptions. Actual spec text to come in the next round. These compatibility constraints should work as before: if the statement can be made true via unchecked conversion, then the result is "true". (And there will be an unchecked warning based on the requirement somewhere in 15.12.2.) I read that IFF the constraint comes via a compatibility constraint a future version of the spec will tell me how to leverage unchecked conversions (not exceptions :)). I'm eagerly looking forward to that update :) >> I'm currently looking at this test: >> >> public class X { >> public static void main(String[] args) { >> EntityKey entityKey = null; >> new EntityCondenser().condense(entityKey); >> } >> public static class EntityCondenser { >> , K extends EntityKey> void condense(K entityKey) { >> } >> } >> public class EntityKey {} >> public interface EntityType< >> I, >> E extends EntityType, >> K extends EntityKey> { >> } >> } > > Ah, okay, in this case, there is no compatibility constraint for the two types ("Raw -> Parameterized"), just a subtyping constraint ("Raw <: Parameterized"), which is derived from the bound of K. > > The old spec was somewhat vague here, but I believe the correct behavior for both 7 and 8 is to fail. If that is so, then all(?) existing compilers have a bug: they do accept this program. I just checked, even javac8 b109 accepts the above test. Will this be changed in the near future? best, Stephan From stephan.herrmann at berlin.de Sun Oct 13 08:19:07 2013 From: stephan.herrmann at berlin.de (Stephan Herrmann) Date: Sun, 13 Oct 2013 17:19:07 +0200 Subject: Collected comments / questions re type inference in 0.6.3 Message-ID: <525AB9EB.7000706@berlin.de> I finally found a contiguous time slot for updating my inference implementation according to spec version 0.6.3. 
These are my comments and questions: I couldn't find a statement on how type inference supports "pass-through" varargs semantics, where a provided array is directly bound to the varargs array variable. This way Java 8 would change the semantics in cases like this: String[][] x = {{"X"}, {"Y"}}; List<String[]> l = Arrays.asList(x); In Java 7 'l' has two elements of type String[] but the new inference seems to produce one element of type String[][]. Is this intended? Incorporation of capture bounds (18.3) is unclear in the use of indices of α and B. In the former case I believe this is just a matter of lax notation (replacing all α by αi seems to trivially mend the issue). However, for the case of B I'm not sure about the intention: B1,...,Bn are introduced as the bounds of type parameters P1,...,Pn, i.e., each Bi is a list of bounds. The first bullet splits these into Bij, which is fine. But what am I to make of a constraint ⟨θ Bi <: R⟩? A list of types cannot be a subtype of a type, can it? Is s.t. like "foreach j : Bij" implied here? BTW, the notation for substitution is not consistent, e.g.: - 18.3: θ Bi - 18.4: L1θ Is θ prefix or postfix? :) Sect. 18.4. requires some type variables to "have well-formed bounds". I could find no reference to well-formedness rules for type variables in the given sense. I tend to say that this is a consequence of loose terminology in the classification of types: On the one hand great care is taken to distinguish "proper types" from inference variables (good!), but detailed classification of proper types is still blurred. First it might help to link usage of the term "type argument" (notably in 18.2.3 second half, 18.2.4 second half) to 4.5.1 in order to remind the reader that "Type arguments may be either reference types or wildcards." Otherwise the condition "If T is a type" (should be "reference type") looks confusing. Secondly, usage of the term "type variable" conflicts with 4.4.
I found this conflict to be in the tradition of 5.1.10, but it seems that 0.6.3 aggravates the situation, because in places like 18.4 ("Let Z1, ..., Zn be fresh type variables") we now speak of type variables with a lower bound and *arbitrary* upper bounds. Can a type variable have both upper and lower bounds? Can a type variable have multiple bounds that are classes? Can a type variable have multiple bounds that are arrays? To me it is not clear what rules would apply to these type variables in terms of well-formedness and in terms of compatibility/ equivalence/subtyping/erasure, but it seems that these type variables may survive inference (in contrast to inference variables) and thus must be handled by all downstream phases of compilation, right? I don't see how a type variable in this sense falls into the existing classification, it seems to be neither of type, inference variable, type argument, type variable à la 4.5.1. That would mean none of the existing rules are applicable to type variables? OTOH, from some uses of "type variable" I could _infer_ that these are indeed considered as types. But why a type variable should be a type and a wildcard should not is obscure to me. Given a suitable interpretation this may all be very clean, but currently it seems in order to understand type inference you first have to apply type meta inference :) 18.2.1 calls for applying 18.5.2 to method invocations *and* class instance creations. Inside 18.5.2, however, I don't see proper handling of constructors. Specifically: "let R be the return type of m" Given that constructors have no return type (8.8: "... the constructor declaration looks just like a method declaration that has no result (§8.4.5)."), I believe that inference only works as desired, if we extend this to include the type of the class instance creation. Else the rule "If R is void, then a compile-time error occurs" would cause trouble for constructors.
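The constructor case Stephan raises is the one diamond inference exercises, where the class type effectively plays the role of the "return type". A minimal sketch (class and method names are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;

class DiamondDemo {
    // For 'new ArrayList<>()', JLS 7 15.9.3 constructs a notional generic
    // method whose "return type" is ArrayList<E>; inference then picks
    // E = String from the assignment context, so a constructor never hits
    // the "If R is void" rule.
    static List<String> make() {
        List<String> l = new ArrayList<>();
        l.add("inferred");
        return l;
    }
}
```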
Repeating from a previous mail: integration of unchecked conversions into 18.5.1 and 18.5.2 is unclear. From Dan's last statement it seems that Java 8 will be significantly more restrictive, rejecting some programs legal in Java 7. Before implementing such semantics I need confirmation by the EG, given that javac (b109) does *not* enforce the stricter semantics. I might add that I'm actually in favor of gradually fading out some uses of unchecked conversions, but this must happen in a coordinated way. thanks, Stephan From daniel.smith at oracle.com Tue Oct 15 15:04:45 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Tue, 15 Oct 2013 16:04:45 -0600 Subject: arrays and raw types in 18.2.3 In-Reply-To: <5256971C.10200@berlin.de> References: <525465A9.7040507@berlin.de> <0BCAFD6A-0378-4557-BB26-AB4B726239E7@oracle.com> <5256971C.10200@berlin.de> Message-ID: <0ECA4D4C-2E5E-4417-9227-93E43EADA44C@oracle.com> On Oct 10, 2013, at 6:01 AM, Stephan Herrmann wrote: >>> I'm currently looking at this test: >>> >>> public class X { >>> public static void main(String[] args) { >>> EntityKey entityKey = null; >>> new EntityCondenser().condense(entityKey); >>> } >>> public static class EntityCondenser { >>> , K extends EntityKey> void condense(K entityKey) { >>> } >>> } >>> public class EntityKey {} >>> public interface EntityType< >>> I, >>> E extends EntityType, >>> K extends EntityKey> { >>> } >>> } >> >> Ah, okay, in this case, there is no compatibility constraint for the two types ("Raw -> Parameterized"), just a subtyping constraint ("Raw <: Parameterized"), which is derived from the bound of K. >> >> The old spec was somewhat vague here, but I believe the correct behavior for both 7 and 8 is to fail. > > If that is so, then all(?) existing compilers have a bug: they do accept this program. > I just checked, even javac8 b109 accepts the above test. > Will this be changed in the near future? 
It would not surprise me if there are some subtle changes between JLS 7 and JLS 8 lurking here, but I'm confident that JLS 7 calls this particular example an error. I've confirmed that this is longstanding behavior in javac, and will report a bug.

---

Here's a simplified variation:

class EntityKey<I> {}

<I, K extends EntityKey<I>> void condense(K entityKey) {}

EntityKey entityKey = null;
condense(entityKey);

-----

Here's what JLS says: 15.12.2.2 tests applicability by subtyping. U1 and U2 are the type arguments inferred for this invocation via initial constraint: EntityKey << K

15.12.2.7 says:
- 'K' involves a type parameter
- 'EntityKey' is not the type of 'null'
- 'EntityKey' is not a primitive type
- 'K' is a type parameter, so the constraint 'K :> EntityKey' is implied
- Final constraints: 'K :> EntityKey'
- There are no equality constraints
- The type of K is inferred as lub(EntityKey) = EntityKey

So U1=??? (this is not well-defined) and U2=EntityKey.

Let S1 = K[I=???, K=EntityKey] = EntityKey

Is it the case that:
- A1 <: S1, that is, EntityKey <: EntityKey? Yes.
- U2 <: B2[I=???, K=EntityKey], that is, EntityKey <: EntityKey<???>? No.

So the method is not applicable. The rules go out of their way to allow an unchecked conversion, if necessary, in that first condition, but not the second. The same reasoning applies for 15.12.2.3. This flirts with the problem that we haven't clearly specified what U1 is supposed to be, but, really, it's irrelevant. If U2=EntityKey, no choice for U1 will allow U2 to be within its bound.
?Dan From daniel.smith at oracle.com Tue Oct 15 15:24:43 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Tue, 15 Oct 2013 16:24:43 -0600 Subject: arrays and raw types in 18.2.3 In-Reply-To: <0ECA4D4C-2E5E-4417-9227-93E43EADA44C@oracle.com> References: <525465A9.7040507@berlin.de> <0BCAFD6A-0378-4557-BB26-AB4B726239E7@oracle.com> <5256971C.10200@berlin.de> <0ECA4D4C-2E5E-4417-9227-93E43EADA44C@oracle.com> Message-ID: On Oct 15, 2013, at 4:04 PM, Dan Smith wrote: > On Oct 10, 2013, at 6:01 AM, Stephan Herrmann wrote: > >>>> I'm currently looking at this test: >>>> >>>> public class X { >>>> public static void main(String[] args) { >>>> EntityKey entityKey = null; >>>> new EntityCondenser().condense(entityKey); >>>> } >>>> public static class EntityCondenser { >>>> , K extends EntityKey> void condense(K entityKey) { >>>> } >>>> } >>>> public class EntityKey {} >>>> public interface EntityType< >>>> I, >>>> E extends EntityType, >>>> K extends EntityKey> { >>>> } >>>> } >>> >>> Ah, okay, in this case, there is no compatibility constraint for the two types ("Raw -> Parameterized"), just a subtyping constraint ("Raw <: Parameterized"), which is derived from the bound of K. >>> >>> The old spec was somewhat vague here, but I believe the correct behavior for both 7 and 8 is to fail. >> >> If that is so, then all(?) existing compilers have a bug: they do accept this program. >> I just checked, even javac8 b109 accepts the above test. >> Will this be changed in the near future? > > It would not surprise me if there are some subtle changes between JLS 7 and JLS 8 lurking here, but I'm confident that JLS 7 calls this particular example an error. > > I've confirmed that this is longstanding behavior in javac, and will report a bug. 
> > --- > > Here's a simplified variation: > > class EntityKey {} > > > void condense(K entityKey) {} > > EntityKey entityKey = null; > condense(entityKey); Turns out this problem is actually independent of inference: the following will also compile with both javac and ecj: EntityKey entityKey = null; this.condense(entityKey); ?Dan From daniel.smith at oracle.com Wed Oct 16 09:05:39 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Wed, 16 Oct 2013 10:05:39 -0600 Subject: Collected comments / questions re type inference in 0.6.3 In-Reply-To: <525AB9EB.7000706@berlin.de> References: <525AB9EB.7000706@berlin.de> Message-ID: Thanks for all the feedback. It is very helpful. On Oct 13, 2013, at 9:19 AM, Stephan Herrmann wrote: > I finally found a contiguous time slot for updating my inference > implementation according to spec version 0.6.3. > These are my comments and questions: > > > I couldn't find a statement on how type inference supports > "pass-through" varargs semantics, where a provided array > is directly bound to the varargs array variable. > This way Java 8 would change the semantics in cases like this: > String[][] x = {{"X"}, {"Y"}}; > List l = Arrays.asList(x); > In Java 7 'l' has two elements of type String[] but the new > inference seems to produce one element of type String[][]. > Is this intended? In fact, I tried to add text to support the intended behavior (a list of length 2, as you say), where it was unclear before. See the discussion in Part F, 15.12.2.4: "The previous 'applicable variable arity method' terminology incorrectly hinted that, if a variable-arity method is applicable in any phase, it is applicable in and only in Phase 3. This overlooks the fact that variable arity methods can act as fixed-arity methods in Phases 1 and 2. What is relevant is the kinds of adaptations actually used to determine applicability, not the kinds of adaptations allowed by the method declaration." 
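The behavior Dan confirms here (a list of length 2, with the array bound directly to the varargs parameter) is easy to check directly; a small sketch, with an illustrative class name:

```java
import java.util.Arrays;
import java.util.List;

class VarargsDemo {
    static int elementCount() {
        String[][] x = {{"X"}, {"Y"}};
        // x binds directly to the T... parameter of Arrays.asList, so
        // T = String[] and the existing array is passed through unchanged;
        // no fresh one-element wrapper array is allocated.
        List<String[]> l = Arrays.asList(x);
        return l.size();
    }
}
```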
In JLS 7, 15.12.2.2 and 15.12.2.3 assume (incorrectly) that any potentially-applicable method, m, will have exactly n formal parameters, where n is the number of argument expressions. If we take this to be a restriction ("let m be a potentially-applicable method with n formal parameters"), then 'asList' is applicable by subtyping. 15.12.2.4, had we gotten there, would have tested that String[][] <: Object ("can be converted by method invocation conversion to the component type of Sn"), which it is, and we would have ended up with a 1-element array. The actual runtime behavior comes from 15.12.4.2, which only allocates a new array if "the type of the k'th argument expression is not assignment compatible with T[]". In the Lambda Spec, none of this is substantially changed. We should probably clarify that 15.12.2.2 and 15.12.2.3 apply to all methods for which there are n formal parameters. > Incorporation of capture bounds (18.3) is unclear in the use > of indices of α and B. In the former case I believe this is > just a matter of lax notation (replacing all α by αi seems to > trivially mend the issue). Yes, someone pointed this out to me already and it's fixed. > However, for the case of B I'm not > sure about the intention: B1,...,Bn are introduced as the > bounds of type parameters P1,...,Pn, i.e., each Bi is a list of > bounds. The first bullet splits these into Bij, which is fine. > But what am I to make of a constraint ⟨θ Bi <: R⟩? > A list of types cannot be a subtype of a type, can it? > Is s.t. like "foreach j : Bij" implied here? Yes, it's a slight abuse, but I didn't invent it. :-) The intent is that Bi is an intersection type as implied by the bound. See, for example, 4.5, which also treats Bi as a type. Note that intersection types do _not_ have "for all" behavior on the left side of subtyping: the constraint is that "there exists" an element of the list that is a subtype of R. If R were allowed to be a type variable, I would be concerned about ⟨θ Bi <: R⟩ and want to break it into Bijs. But, since subtyping reduction (18.2.3) will go ahead and do the right thing with an intersection anyway, I think I'm okay with it. (Generally, we do need to tighten up or eliminate the distinction between intersections and lists of types, but not today.) > BTW, the notation for substitution is not consistent, e.g.: > - 18.3: θ Bi > - 18.4: L1θ > Is θ prefix or postfix? :) Thanks for noting that. I'm trying to follow the precedent from sections like 4.10.2; it should be postfix. > Sect. 18.4. requires some type variables to "have well-formed bounds". > I could find no reference to well-formedness rules for type variables > in the given sense. I'll make a note to clean this up as best I can. The idea is that if capture would allow it (5.1.10), it's okay. One thing capture prohibits is two different unrelated classes as upper bounds. One thing it doesn't prohibit, but should, is a lower bound that is not a subtype of any upper bound. (Another to-do for another day: clarify the well-formedness rules for types.) > I tend to say that this is a consequence of loose terminology in > the classification of types: On the one hand great care is taken > to distinguish "proper types" from inference variables (good!), > but detailed classification of proper types is still blurred. > First it might help to link usage of the term "type argument" > (notably in 18.2.3 second half, 18.2.4 second half) to 4.5.1 > in order to remind the reader that > "Type arguments may be either reference types or wildcards." > Otherwise the condition "If T is a type" (should be "reference type") > looks confusing. Good suggestion. Done. > Secondly, usage of the term "type variable" conflicts with 4.4.
> I found this conflict to be in the tradition of 5.1.10, but it > seems that 0.6.3 aggravates the situation, because in places like > 18.4 ("Let Z1, ..., Zn be fresh type variables") we now speak of > type variables with a lower bound and *arbitrary* upper bounds. > Can a type variable have both upper and lower bounds? > Can a type variable have multiple bounds that are classes? > Can a type variable have multiple bounds that are arrays? > To me it is not clear what rules would apply to these type > variables in terms of well-formedness and in terms of compatibility/ > equivalence/subtyping/erasure, but it seems that these type > variables may survive inference (in contrast to inference variables) > and thus must be handled by all downstream phases of compilation, > right? Yes, this is messy. 4.4 does a surprisingly good job of referring to a "type variable declared as a type parameter". Capture variables (including the variables produced by 18.4) are type variables that are _not_ declared as type parameters. 5.1.10 makes a weak attempt to provide some well-formedness rules, but, as I said, we need to do a better job here. Note that pretty much any type can already be the upper or lower bound of a capture variable, due to inference: <T> List<? extends T> m(...); // any type that can be inferred for T is an upper bound of the capture of the result Also note that the idea of using "capture" variables for inference is not new -- see JLS 7 15.12.2.8. Despite these qualifiers, yes, there are some new possibilities for variable bounds that were not possible before. One thing that needs tweaking (a known issue) is the use of lub in 18.4. The inputs to lub might not be fully defined types -- they might involve type variables whose bounds we are in the middle of instantiating. So I need to back off a bit on what we can do there, probably only taking the lub of the proper lower bounds. (glb is more of a syntactic operation, so we can get away with it in 5.1.10 and here.)
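Dan's distinction between declared type parameters and capture variables can be seen in a small sketch (the class and method names are illustrative):

```java
import java.util.List;

class CaptureDemo {
    // Capture conversion (JLS 5.1.10) replaces the wildcard with a fresh
    // type variable, conventionally shown as CAP#1, that was never declared
    // as a type parameter; the expression l.get(0) has type CAP#1.
    static Object head(List<?> l) {
        return l.get(0);
    }
}
```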
> I don't see how a type variable in this sense falls into the > existing classification, it seems to be neither of type, > inference variable, type argument, type variable à la 4.5.1. > That would mean none of the existing rules are applicable to > type variables? > OTOH, from some uses of "type variable" I could _infer_ that these > are indeed considered as types. But why a type variable should be > a type and a wildcard should not is obscure to me. > Given a suitable interpretation this may all be very clean, > but currently it seems in order to understand type inference > you first have to apply type meta inference :) Type variables are definitely types. A type variable produced by capture may, for example, be the type of an expression. Chapter 4 needs to do a better job of acknowledging that some types (capture variables, intersection types, the 'null' type) are not Types -- they can't be written down in the syntax -- but are still types. Wildcards are not types because they make no sense except in the context of a parameterized type. It is impossible for a wildcard to be the type of an expression. > 18.2.1 calls for applying 18.5.2 to method invocations *and* > class instance creations. Inside 18.5.2, however, I don't see > proper handling of constructors. Specifically: > "let R be the return type of m" > Given that constructors have no return type > (8.8: "... the constructor declaration looks just like a method > declaration that has no result (§8.4.5)."), > I believe that inference only works as desired, if we extend > this to include the type of the class instance creation. > Else the rule "If R is void, then a compile-time error occurs" > would cause trouble for constructors. See JLS 7 15.9.3. Overload resolution and inference have always been defined for class instance creations somewhat informally in terms of the behavior for methods. In the diamond case, we explicitly define a "method" that gets plugged in to the analysis.
I've added a sentence to clarify that the "method" is as defined in 15.9.3. > Repeating from a previous mail: integration of unchecked > conversions into 18.5.1 and 18.5.2 is unclear. Yes, on my to-do list. > From Dan's last statement it seems that Java 8 will be significantly > more restrictive, rejecting some programs legal in Java 7. > Before implementing such semantics I need confirmation by the EG, > given that javac (b109) does *not* enforce the stricter semantics. > I might add that I'm actually in favor of gradually fading out some uses > of unchecked conversions, but this must happen in a coordinated way. What we're seeing is an old inconsistency that just hasn't been fixed yet. There are many other bugs, some reported and some not yet identified, of a similar scope. And Eclipse often prefers consistency with javac over consistency with the spec. There may be subtle differences between 7 and 8 due to 8's commitment to soundness; this is by design. I haven't encountered any cases that seem to be of serious concern to users. (In other words, 7 defaults to "true" during reduction, while 8 prefers "false". So in some corners, 7 is more powerful by pretending something is true even though it can't prove it, and then happening to get lucky. These corners are obscure.) ?Dan From stephan.herrmann at berlin.de Thu Oct 17 15:59:15 2013 From: stephan.herrmann at berlin.de (Stephan Herrmann) Date: Fri, 18 Oct 2013 00:59:15 +0200 Subject: Collected comments / questions re type inference in 0.6.3 In-Reply-To: References: <525AB9EB.7000706@berlin.de> Message-ID: <52606BC3.1090300@berlin.de> On 10/16/2013 06:05 PM, Dan Smith wrote: > On Oct 13, 2013, at 9:19 AM, Stephan Herrmann wrote: >> I couldn't find a statement on how type inference supports >> "pass-through" varargs semantics, where a provided array >> is directly bound to the varargs array variable. 
>> This way Java 8 would change the semantics in cases like this: >> String[][] x = {{"X"}, {"Y"}}; >> List<String[]> l = Arrays.asList(x); >> In Java 7 'l' has two elements of type String[] but the new >> inference seems to produce one element of type String[][]. >> Is this intended? > > In fact, I tried to add text to support the intended behavior (a list of length 2, as you say), where it was unclear before. > > See the discussion in Part F, 15.12.2.4: "The previous 'applicable variable arity method' terminology incorrectly hinted that, if a variable-arity method is applicable in any phase, it is applicable in and only in Phase 3. This overlooks the fact that variable arity methods can act as fixed-arity methods in Phases 1 and 2. What is relevant is the kinds of adaptations actually used to determine applicability, not the kinds of adaptations allowed by the method declaration." Thanks, I indeed overlooked that discussion bullet. I saw a special mention in 15.12.2.1 ("the nth argument of the method invocation is potentially compatible with either T or T[].") and thought a similar rule might be missing from 18.5.1. > In the Lambda Spec, none of this is substantially changed. We should probably clarify that 15.12.2.2 and 15.12.2.3 apply to all methods for which there are n formal parameters. Yes, turning that note from the discussion into s.t. explicit in the spec would be helpful. >> However, for the case of B I'm not >> sure about the intention: B1,...,Bn are introduced as the >> bounds of type parameters P1,...,Pn, i.e., each Bi is a list of >> bounds. The first bullet splits these into Bij, which is fine. >> But what am I to make of a constraint ⟨θ Bi <: R⟩? >> A list of types cannot be a subtype of a type, can it? >> Is s.t. like "foreach j : Bij" implied here? > > Yes, it's a slight abuse, but I didn't invent it. :-) The intent is that Bi is an intersection type as implied by the bound. See, for example, 4.5, which also treats Bi as a type.
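Dan's reading of Bi as an intersection type, with "there exists" behavior on the left of subtyping, can be loosely illustrated at the source level (names invented for illustration; here each element of the bound individually justifies a use of T):

```java
import java.io.Serializable;

class IntersectionDemo {
    // T's declared bound is the intersection Comparable<T> & Serializable.
    // T is a subtype of Serializable because *one* element of the bound
    // list is Serializable -- no "for all" condition is needed.
    static <T extends Comparable<T> & Serializable> Serializable asSerializable(T value) {
        return value;
    }
}
```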
OK, with this unfolding it's much clearer :) > Note that intersection types do _not_ have "for all" behavior on the left side of subtyping: the constraint is that "there exists" an element of the list that is a subtype of R. Thanks for mentioning! >> Sect. 18.4. requires some type variables to "have well-formed bounds". >> I could find no reference to well-formedness rules for type variables >> in the given sense. > > I'll make a note to clean this up as best I can. The idea is that if capture would allow it (5.1.10), it's okay. One thing capture prohibits is two different unrelated classes as upper bounds. One thing it doesn't prohibit, but should, is a lower bound that is not a subtype of any upper bound. (Another to-do for another day: clarify the well-formedness rules for types.) I think I can proceed with this statement for now. > [...] > One thing that needs tweaking (a known issue) is the use of lub in 18.4. The inputs to lub might not be fully defined types -- they might involve type variables whose bounds we are in the middle of instantiating. So I need to back off a bit on what we can do there, probably only taking the lub of the proper lower bounds. (glb is more of a syntactic operation, so we can get away with it in 5.1.10 and here.) I'll keep that in mind if I get into trouble with this lub... > [...] > Type variables are definitely types. A type variable produced by capture may, for example, be the type of an expression. Chapter 4 needs to do a better job of acknowledging that some types (capture variables, intersection types, the 'null' type) are not Types -- they can't be written down in the syntax -- but are still types. So, capitalization is semantically relevant? :) > [...] > See JLS 7 15.9.3. Overload resolution and inference have always been defined for class instance creations somewhat informally in terms of the behavior for methods. In the diamond case, we explicitly define a "method" that gets plugged in to the analysis.
> > I've added a sentence to clarify that the "method" is as defined in 15.9.3. OK, that's a substantial substitution. >> From Dan's last statement it seems that Java 8 will be significantly >> more restrictive, rejecting some programs legal in Java 7. >> Before implementing such semantics I need confirmation by the EG, >> given that javac (b109) does *not* enforce the stricter semantics. >> I might add that I'm actually in favor of gradually fading out some uses >> of unchecked conversions, but this must happen in a coordinated way. > > What we're seeing is an old inconsistency that just hasn't been fixed yet. There are many other bugs, some reported and some not yet identified, of a similar scope. And Eclipse often prefers consistency with javac over consistency with the spec. Even if we (committers) preferred the spec, most of our users prefer consistency with javac, including bug-compatibility. Particularly, if javac accepts an illegal program, it's impossible to convince a user that the more restrictive compiler is better. > There may be subtle differences between 7 and 8 due to 8's commitment to soundness; this is by design. I haven't encountered any cases that seem to be of serious concern to users. (In other words, 7 defaults to "true" during reduction, while 8 prefers "false". So in some corners, 7 is more powerful by pretending something is true even though it can't prove it, and then happening to get lucky. These corners are obscure.) Judging from the number of bug reports and the discussions in those bugs, I have the (not-validated) impression that combined use of parameterized and some raw types is fairly wide-spread. It should be possible to mend these programs by properly parameterizing all types, but I'd predict that users will see a difference. If we can tell our users that Java 8 will be stricter in those situations, it may be easier to argue than if bug fixes in Java 7 compilers break existing programs. 
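For context, the raw/parameterized mixing at issue can be as simple as the following sketch (class and method names are made up; javac accepts this with an unchecked warning, and the question is whether the Java 8 inference rules still do):

```java
import java.util.Arrays;
import java.util.List;

public class RawMix {
    // A generic method whose applicability must be decided for a raw argument.
    static <T> T first(List<T> list) {
        return list.get(0);
    }

    public static void main(String[] args) {
        @SuppressWarnings("rawtypes")
        List raw = Arrays.asList("a", "b"); // a raw List, as in legacy code
        // Under JLS 7 this invocation succeeds via unchecked conversion,
        // with T inferred (effectively) as Object and an unchecked warning.
        Object o = first(raw);
        System.out.println(o); // a
    }
}
```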
thanks, Stephan From daniel.smith at oracle.com Thu Oct 17 16:32:34 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Thu, 17 Oct 2013 17:32:34 -0600 Subject: Collected comments / questions re type inference in 0.6.3 In-Reply-To: <52606BC3.1090300@berlin.de> References: <525AB9EB.7000706@berlin.de> <52606BC3.1090300@berlin.de> Message-ID: On Oct 17, 2013, at 4:59 PM, Stephan Herrmann wrote: >> Type variables are definitely types. A type variable produced by capture may, for example, be the type of an expression. Chapter 4 needs to do a better job of acknowledging that some types (capture variables, intersection types, the 'null' type) are not Types -- they can't be written down in the syntax -- but are still types. > > So, capitalization is semantically relevant? :) And typefaces. :-) —Dan From brian.goetz at oracle.com Mon Oct 21 10:27:33 2013 From: brian.goetz at oracle.com (Brian Goetz) Date: Mon, 21 Oct 2013 13:27:33 -0400 Subject: Serialization spec clarifications for lambda Message-ID: <52656405.7080300@oracle.com> The serialization spec currently has some language about how inner classes are risky to serialize -- see the Note at the bottom of section 1.10 here: http://docs.oracle.com/javase/7/docs/platform/serialization/spec/serial-arch.html#4539 I propose to add the following additional note regarding the serialization risks of lambdas. Comments welcome (though note Public Review starts next week, so not much time to dither.) Note - As with inner classes, serialization of lambda expressions is strongly discouraged. Names of synthetic methods generated by javac (or other Java(TM) compilers) to implement lambda expressions are implementation-dependent, may vary between compilers, and may change due to unrelated modifications in the same source file; differences in such names can disrupt compatibility. Lambda expressions may refer to values from the enclosing scope; when the lambda expressions serialized, these values will be serialized as well. 
The order in which values from the enclosing scope are captured is implementation-dependent, may vary between compilers, and any modification of the source file containing the lambda expression may change this capture order, affecting deserialization correctness. Lambda expressions cannot use field- or method-based mechanisms to control their serialized form. If serializable lambdas are used, to minimize compatibility risks, it is recommended that identical classfiles as were present at serialization time be present at deserialization time. From paul.sandoz at oracle.com Tue Oct 22 05:41:11 2013 From: paul.sandoz at oracle.com (Paul Sandoz) Date: Tue, 22 Oct 2013 14:41:11 +0200 Subject: Serialization spec clarifications for lambda In-Reply-To: <52656405.7080300@oracle.com> References: <52656405.7080300@oracle.com> Message-ID: <440A6710-1971-4FF2-A01E-14A46A988FB6@oracle.com> On Oct 21, 2013, at 7:27 PM, Brian Goetz wrote: > The serialization spec currently has some language about how inner classes are risky to serialize -- see the Note at the bottom of section 1.10 here: http://docs.oracle.com/javase/7/docs/platform/serialization/spec/serial-arch.html#4539 > > I propose to add the following additional note regarding the serialization risks of lambdas. Comments welcome (though note Public Review starts next week, so not much time to dither.) > > Note - As with inner classes, serialization of lambda expressions is strongly discouraged. Names of synthetic methods generated by javac (or other Java(TM) compilers) to implement lambda expressions are implementation-dependent, may vary between compilers, and may change due to unrelated modifications in the same source file; differences in such names can disrupt compatibility. Lambda expressions may refer to values from the enclosing scope; when the lambda expressions serialized, these values will be serialized as well. The order in which values Typo: when the lambda expressions *are* serialized. 
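The capture behavior the proposed note describes can be seen in a serialization round trip (a minimal sketch; the LambdaSer class and its SerializableSupplier interface are made up for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Supplier;

public class LambdaSer {
    // A lambda is serializable only if its target type is Serializable.
    interface SerializableSupplier<T> extends Supplier<T>, Serializable {}

    public static void main(String[] args) throws Exception {
        String captured = "enclosing value"; // captured from the enclosing scope
        SerializableSupplier<String> s = () -> captured;

        // Round trip: the captured value travels with the serialized lambda.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bos);
        out.writeObject(s);
        out.flush();
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        @SuppressWarnings("unchecked")
        SerializableSupplier<String> copy = (SerializableSupplier<String>) in.readObject();
        System.out.println(copy.get()); // enclosing value
    }
}
```

Note the deserialization here succeeds only because the same classfile that serialized the lambda is present, which is precisely the recommendation in the note.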
Might read better if "lambda expressions" is re-written in the singular, as is the case in the next sentence. Paul. > from the enclosing scope are captured is implementation-dependent, may vary between compilers, and any modification of the source file containing the lambda expression may change this capture order, affecting deserialization correctness. Lambda expressions cannot use field- or method-based mechanisms to control their serialized form. If serializable lambdas are used, to minimize compatibility risks, it is recommended that identical classfiles as were present at serialization time be present at deserialization time. From forax at univ-mlv.fr Tue Oct 29 10:26:27 2013 From: forax at univ-mlv.fr (Remi Forax) Date: Tue, 29 Oct 2013 18:26:27 +0100 Subject: Defaut methods are not visible if -source 1.7 is set Message-ID: <526FEFC3.30909@univ-mlv.fr> This is from the enhanced-metadata-spec-discuss mailing list, but I think this issue can interest this audience because it seems that javac doesn't respect the spec we have drafted or the spec has changed without me noticing it. Anyway, there is a discrepancy somewhere. From Joe Darcy: >> Wouldn't this risk the same issues as when we turned >> isAnnotationPresent() into a default? >> >> > > No; we don't have the same hazard here as with isAnnotationPresent. > The issue we ran into with making isAnnotationPresent a default method > was that isAnnotationPresent was part of the original AnnotatedElement > interface defined way back in Java SE 5. An implementation decision in > javac did not expose the existence of default methods to code being > compiled under source levels less than 8. That is a pragmatic choice > and usually gives the desired result, but not in this case. > So it seems that javac 8 doesn't see default methods if the source level is 1.7 (-source 1.7), and as Joe notes, this hampers transforming any existing abstract method into a default method. 
I'm pretty sure that we have talked about that and said that adding default to a method should be source compatible when we discussed Iterator.remove. Otherwise it goes against the intuition that providing a body to an existing abstract method is harmless. So either the javac8 implementation is not correct (BTW, javac7 sees default methods) or the spec should be changed to say that adding a default implementation is not a compatible change and Iterator.remove should be re-abstracted. regards, Rémi From brian.goetz at oracle.com Tue Oct 29 10:48:53 2013 From: brian.goetz at oracle.com (Brian Goetz) Date: Tue, 29 Oct 2013 13:48:53 -0400 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <526FEFC3.30909@univ-mlv.fr> References: <526FEFC3.30909@univ-mlv.fr> Message-ID: <526FF505.8070609@oracle.com> We discussed this extensively at some point; it's not as simple as this, though I admit the details are paging in slowly. IIRC, the root problem is this: you have a class class Foo extends List { } that you want to compile with the 1.8 javac but with -source 1.7. What could happen? 1. You could fail to compile, because the supertype uses language/VM features you don't support. This would be not very nice. 2. You could force the user to implement the default methods, since we can't assume they're inherited. This is also not very nice. 3. We could ignore them. The root problem is that -source 1.7 still exposes 1.8 libraries to the compilation, which is just wrong. What should happen is we should be compiling with the fictitious -platform 1.7, which not only enforces the 1.7 language level, but also puts the 1.7 JDK classes on the bootclasspath. Eventually, modularity was supposed to help here. At the time, (3) seemed the least bad alternative. But this is indeed a problem for methods that acquire defaults. 
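Remi's intuition upthread is that giving a body to an existing abstract method (as was done for Iterator.remove) is harmless to existing implementors. A toy illustration of why (the Greeter interface is made up):

```java
public class DefaultRetrofit {
    interface Greeter {
        String name();
        // Previously 'String greet();' was abstract; giving it a default
        // body lets old implementors like Impl below compile unchanged.
        default String greet() { return "Hello, " + name(); }
    }

    // Written against the old, fully abstract interface: overrides only name().
    static class Impl implements Greeter {
        public String name() { return "world"; }
    }

    public static void main(String[] args) {
        // Impl inherits the retrofitted body without any source change.
        System.out.println(new Impl().greet()); // Hello, world
    }
}
```

The dispute in this thread is what should happen when Impl is recompiled with -source 1.7, where the compiler may choose to hide the default body.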
> spec shoud be changed to say that adding a default implementation > is not a compatible change There's no way we're doing this to work around a tooling bug. On 10/29/2013 1:26 PM, Remi Forax wrote: > This is from enhanced-metadata-spec-discuss mailing list > but I think this issue can interest this audience because it seems > that javac doesn't respect the spec we have drafted or the spec has > changed without me noticing it. > Anyway there is a discrepancy somewhere. > > From Joe Darcy: > >>> Wouldn't this risk the same issues as when we turned >>> isAnnotationPresent() into a default? >>> >>> >> >> No; we don't have the same hazard here as with isAnnotationPresent. >> The issue we ran into with making isAnnotationPresent a default method >> was that isAnnotationPresent was part of the original AnnotatedElement >> interface defined way back in Java SE 5. In implementation decision in >> javac did not expose the existence of default methods to code being >> compiled under source levels less than 8. That is a pragmatic choice >> and usually gives the desired result, but not in this case. >> > > So it seems that javac 8 doesn't see default methods if source level is > 1.7 (-source 1.7) > and as Joe notice this hamper to transform any existing abstract method > to a default method. > > I'm pretty sure that we have talked about that and said that adding > default to a method should be > source compatible when we have discussed about Iterator.remove. > Otherwise it goes against the intuition that providing a body to an > existing abstract method is harmless. > > So either javac8 implementation is not correct (BTW, javac7 see default > methods) > or the spec shoud be changed to say that adding a default implementation > is not a compatible change > and Iterator.remove should be re-abstracted. 
> > regards, > R?mi > From joe.darcy at oracle.com Tue Oct 29 11:19:08 2013 From: joe.darcy at oracle.com (Joe Darcy) Date: Tue, 29 Oct 2013 11:19:08 -0700 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <526FF505.8070609@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> Message-ID: <526FFC1C.5050308@oracle.com> On 10/29/2013 10:48 AM, Brian Goetz wrote: > We discussed this extensively at some point; its not as simple as > this, though I admit the details are paging in slowly. > > IIRC, the root problem is this: you have a class > > class Foo extends List { } > > that you want to compile with the 1.8 javac but with -source 1.7. What > could happen? > > 1. You could fail to compile, because the supertype uses language/VM > features you don't support. This would be not very nice. > > 2. You could force the user to implement the default methods, since > we can't assume they're inherited. This is also not very nice. > > 3. We could ignore them. > > The root problem is that -source 1.7 still exposes 1.8 libraries to > the compilation, which is just wrong. What should happen is we should > be compiling with the fictitious -platform 1.7, which not only > enforces the 1.7 language level, but also puts the 1.7 JDK classes on > the bootclasspath. > > Eventually, modularity was supposed help here. At the time, (3) > seemed the least bad alternative. But this is indeed a problem for > methods that acquire defaults. > >> spec shoud be changed to say that adding a default implementation >> is not a compatible change > > There's no way we're doing this to work around a tooling bug. IMO, the current behavior of javac is the least-bad option. (The issue before was how AnnotatedElement.isAnnotationPresent should be handled. It was a Java SE 5 method on AnnotatedElement that was turned into a default. 
The behavior of javac meant that the method "disappeared" from the implementations of the interface, like java.lang.Class, for code being compiled under earlier source versions. We addressed this by adding back concrete implementations in the interface implementations that called back to the default method. That means the method appears present in all source versions and we still get some sharing of the implementation code.) There is no specification that governs how a compiler should present new-in-8 language features when compiling under a older-than-8 source level. More generally, the -source and -target options are not required parts of the platform specification, just a convenience to developers. -Joe From forax at univ-mlv.fr Tue Oct 29 11:31:08 2013 From: forax at univ-mlv.fr (Remi Forax) Date: Tue, 29 Oct 2013 19:31:08 +0100 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <526FF505.8070609@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> Message-ID: <526FFEEC.5040702@univ-mlv.fr> On 10/29/2013 06:48 PM, Brian Goetz wrote: > We discussed this extensively at some point; its not as simple as > this, though I admit the details are paging in slowly. > > IIRC, the root problem is this: you have a class > > class Foo extends List { } > > that you want to compile with the 1.8 javac but with -source 1.7. What > could happen? > > 1. You could fail to compile, because the supertype uses language/VM > features you don't support. This would be not very nice. > > 2. You could force the user to implement the default methods, since > we can't assume they're inherited. This is also not very nice. > > 3. We could ignore them. > > The root problem is that -source 1.7 still exposes 1.8 libraries to > the compilation, which is just wrong. 
What should happen is we should > be compiling with the fictitious -platform 1.7, which not only > enforces the 1.7 language level, but also puts the 1.7 JDK classes on > the bootclasspath. There is already a warning for that, if you compile -source 1.7 -target 1.7 with javac, you will have a warning saying that you have to set the bootclasspath to a 1.7 rt.jar So given that using -source 1.7 -target 1.7 with the wrong rt.jar is a corner case, I think your solution 2 is a good compromise but 1 is valid too. > > Eventually, modularity was supposed help here. At the time, (3) > seemed the least bad alternative. But this is indeed a problem for > methods that acquire defaults. and the fact that javac7 doesn't work like that. > >> spec shoud be changed to say that adding a default implementation >> is not a compatible change > > There's no way we're doing this to work around a tooling bug. R?mi > > > On 10/29/2013 1:26 PM, Remi Forax wrote: >> This is from enhanced-metadata-spec-discuss mailing list >> but I think this issue can interest this audience because it seems >> that javac doesn't respect the spec we have drafted or the spec has >> changed without me noticing it. >> Anyway there is a discrepancy somewhere. >> >> From Joe Darcy: >> >>>> Wouldn't this risk the same issues as when we turned >>>> isAnnotationPresent() into a default? >>>> >>>> >>> >>> No; we don't have the same hazard here as with isAnnotationPresent. >>> The issue we ran into with making isAnnotationPresent a default method >>> was that isAnnotationPresent was part of the original AnnotatedElement >>> interface defined way back in Java SE 5. In implementation decision in >>> javac did not expose the existence of default methods to code being >>> compiled under source levels less than 8. That is a pragmatic choice >>> and usually gives the desired result, but not in this case. 
>>> >> >> So it seems that javac 8 doesn't see default methods if source level is >> 1.7 (-source 1.7) >> and as Joe notice this hamper to transform any existing abstract method >> to a default method. >> >> I'm pretty sure that we have talked about that and said that adding >> default to a method should be >> source compatible when we have discussed about Iterator.remove. >> Otherwise it goes against the intuition that providing a body to an >> existing abstract method is harmless. >> >> So either javac8 implementation is not correct (BTW, javac7 see default >> methods) >> or the spec shoud be changed to say that adding a default implementation >> is not a compatible change >> and Iterator.remove should be re-abstracted. >> >> regards, >> R?mi >> From forax at univ-mlv.fr Tue Oct 29 11:37:37 2013 From: forax at univ-mlv.fr (Remi Forax) Date: Tue, 29 Oct 2013 19:37:37 +0100 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <526FFC1C.5050308@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFC1C.5050308@oracle.com> Message-ID: <52700071.4050201@univ-mlv.fr> On 10/29/2013 07:19 PM, Joe Darcy wrote: > On 10/29/2013 10:48 AM, Brian Goetz wrote: >> We discussed this extensively at some point; its not as simple as >> this, though I admit the details are paging in slowly. >> >> IIRC, the root problem is this: you have a class >> >> class Foo extends List { } >> >> that you want to compile with the 1.8 javac but with -source 1.7. >> What could happen? >> >> 1. You could fail to compile, because the supertype uses language/VM >> features you don't support. This would be not very nice. >> >> 2. You could force the user to implement the default methods, since >> we can't assume they're inherited. This is also not very nice. >> >> 3. We could ignore them. >> >> The root problem is that -source 1.7 still exposes 1.8 libraries to >> the compilation, which is just wrong. 
What should happen is we >> should be compiling with the fictitious -platform 1.7, which not only >> enforces the 1.7 language level, but also puts the 1.7 JDK classes on >> the bootclasspath. >> >> Eventually, modularity was supposed help here. At the time, (3) >> seemed the least bad alternative. But this is indeed a problem for >> methods that acquire defaults. >> >>> spec shoud be changed to say that adding a default implementation >>> is not a compatible change >> >> There's no way we're doing this to work around a tooling bug. > > IMO, the current behavior of javac is the least-bad option. > > (The issue before was how AnnotatedElement.isAnnotationPresent should > be handled. It was a Java SE 5 method on AnnotatedElement that was > turned into a default. The behavior of javac meant that the method > "disappeared" from the implementations of the interface, like > java.lang.Class, for code being compiled under earlier source > versions. We addressed this by adding back concrete implementations in > the interface implementations that called back to the default method. > That means the method appears present in all source versions and we > still get some sharing of the implementation code.) Yes I understand the details but I disagree with the premise, it's not the least bad-option because it means that Iterator.remove or any abstract method which is converted from an abstract method to a default method not working with javac8 -source 1.7. Default methods are abstract with -source 1.7 is more appealing to me. > > There is no specification that governs how a compiler should present > new-in-8 language features when compiling under a older-than-8 source > level. More generally, the -source and -target options are not > required parts of the platform specification, just a convenience to > developers. 
> > -Joe > Rémi From brian.goetz at oracle.com Tue Oct 29 11:44:23 2013 From: brian.goetz at oracle.com (Brian Goetz) Date: Tue, 29 Oct 2013 14:44:23 -0400 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <52700071.4050201@univ-mlv.fr> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFC1C.5050308@oracle.com> <52700071.4050201@univ-mlv.fr> Message-ID: <52700207.1060208@oracle.com> No, this is crazy. This means anyone who has an existing implementation of List, who tries to recompile it using javac with -source 7, will not be able to recompile their code. But this is the primary use case for -source 1.7! > Default methods are abstract with -source 1.7 is more appealing to me. From forax at univ-mlv.fr Tue Oct 29 11:42:07 2013 From: forax at univ-mlv.fr (Remi Forax) Date: Tue, 29 Oct 2013 19:42:07 +0100 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> Message-ID: <5270017F.60709@univ-mlv.fr> On 10/29/2013 07:26 PM, Paul Benedict wrote: > It makes sense to me that -source 1.7 can't see default methods. > Really, the issue comes down to: does it make sense to use a JDK 8 > compiler to compile JDK 7 source? I never liked the idea of splitting > -source/-target. Thankfully, these options will be phased out eventually. > > Paul It's a practical issue: there is a lot of code that compiles using -source/-target. I've always found it easier not to do that, but to compile with the latest version and then retrofit the bytecode (or fail) to the version you want to be compatible with. Rémi > > > On Tue, Oct 29, 2013 at 12:48 PM, Brian Goetz > wrote: > > We discussed this extensively at some point; its not as simple as > this, though I admit the details are paging in slowly. > > IIRC, the root problem is this: you have a class > > class Foo extends List { } > > that you want to compile with the 1.8 javac but with -source 1.7. 
> What could happen? > > 1. You could fail to compile, because the supertype uses > language/VM features you don't support. This would be not very nice. > > 2. You could force the user to implement the default methods, > since we can't assume they're inherited. This is also not very nice. > > 3. We could ignore them. > > The root problem is that -source 1.7 still exposes 1.8 libraries > to the compilation, which is just wrong. What should happen is we > should be compiling with the fictitious -platform 1.7, which not > only enforces the 1.7 language level, but also puts the 1.7 JDK > classes on the bootclasspath. > > Eventually, modularity was supposed help here. At the time, (3) > seemed the least bad alternative. But this is indeed a problem > for methods that acquire defaults. > > spec shoud be changed to say that adding a default implementation > is not a compatible change > > > There's no way we're doing this to work around a tooling bug. > > > On 10/29/2013 1:26 PM, Remi Forax wrote: > > This is from enhanced-metadata-spec-discuss mailing list > but I think this issue can interest this audience because it seems > that javac doesn't respect the spec we have drafted or the > spec has > changed without me noticing it. > Anyway there is a discrepancy somewhere. > > From Joe Darcy: > > Wouldn't this risk the same issues as when we turned > isAnnotationPresent() into a default? > > > > No; we don't have the same hazard here as with > isAnnotationPresent. > The issue we ran into with making isAnnotationPresent a > default method > was that isAnnotationPresent was part of the original > AnnotatedElement > interface defined way back in Java SE 5. In implementation > decision in > javac did not expose the existence of default methods to > code being > compiled under source levels less than 8. That is a > pragmatic choice > and usually gives the desired result, but not in this case. 
> > > So it seems that javac 8 doesn't see default methods if source > level is > 1.7 (-source 1.7) > and as Joe notice this hamper to transform any existing > abstract method > to a default method. > > I'm pretty sure that we have talked about that and said that > adding > default to a method should be > source compatible when we have discussed about Iterator.remove. > Otherwise it goes against the intuition that providing a body > to an > existing abstract method is harmless. > > So either javac8 implementation is not correct (BTW, javac7 > see default > methods) > or the spec shoud be changed to say that adding a default > implementation > is not a compatible change > and Iterator.remove should be re-abstracted. > > regards, > R?mi > > > > > -- > Cheers, > Paul From forax at univ-mlv.fr Tue Oct 29 11:44:29 2013 From: forax at univ-mlv.fr (Remi Forax) Date: Tue, 29 Oct 2013 19:44:29 +0100 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <52700207.1060208@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFC1C.5050308@oracle.com> <52700071.4050201@univ-mlv.fr> <52700207.1060208@oracle.com> Message-ID: <5270020D.2010507@univ-mlv.fr> On 10/29/2013 07:44 PM, Brian Goetz wrote: > No, this is crazy. THis means anyone who has an existing > implementation of List, who tries to recompile it using javac with > -source 7, will not be able to recompile their code. But this is the > primary use case for -source 1.7! you forget the: "or have to set the bootclasspath" at the end of your first sentence. > >> Default methods are abstract with -source 1.7 is more appealing to me. 
Rémi From daniel.smith at oracle.com Tue Oct 29 12:14:12 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Tue, 29 Oct 2013 13:14:12 -0600 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <526FF505.8070609@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> Message-ID: <125D9978-692A-48DB-A562-801AE2F313D9@oracle.com> On Oct 29, 2013, at 11:48 AM, Brian Goetz wrote: > We discussed this extensively at some point; its not as simple as this, though I admit the details are paging in slowly. > > IIRC, the root problem is this: you have a class > > class Foo extends List { } > > that you want to compile with the 1.8 javac but with -source 1.7. What could happen? > > 1. You could fail to compile, because the supertype uses language/VM features you don't support. This would be not very nice. > > 2. You could force the user to implement the default methods, since we can't assume they're inherited. This is also not very nice. > > 3. We could ignore them. > > The root problem is that -source 1.7 still exposes 1.8 libraries to the compilation, which is just wrong. What should happen is we should be compiling with the fictitious -platform 1.7, which not only enforces the 1.7 language level, but also puts the 1.7 JDK classes on the bootclasspath. > > Eventually, modularity was supposed help here. At the time, (3) seemed the least bad alternative. But this is indeed a problem for methods that acquire defaults. The problem is that _implementing_ an interface is a different use case than _using_ an interface. (3) makes sense for the "implementing" case. In the "using" case, the most convenient thing would be for all methods to be visible. Remi's concern is focused on the "using" case, while the current behavior of javac is focused on the "implementing" case. I don't know how hard it would be to get javac (or some other tool) to treat the two cases differently, although it seems reasonably achievable... 
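Dan's implementing/using split can be made concrete (the Shape interface is made up): an implementor compiled against the interface need not supply the default method, while a user needs the default to be visible in order to call it.

```java
public class Roles {
    interface Shape {
        double area();
        // A method that acquired a default body, in the style of new-in-8 defaults.
        default String describe() { return "area=" + area(); }
    }

    // Implementing role: hiding describe() from this class would be harmless,
    // since Square is not required to provide it anyway.
    static class Square implements Shape {
        final double side;
        Square(double side) { this.side = side; }
        public double area() { return side * side; }
    }

    // Using role: this call only compiles if describe() is visible to the caller.
    public static void main(String[] args) {
        System.out.println(new Square(2).describe()); // area=4.0
    }
}
```

Under javac's current -source 1.7 behavior, both roles lose sight of describe(): Square still compiles (implementing), but the main method's call would not (using) — which is the asymmetry Dan suggests a tool could exploit.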
—Dan From daniel.smith at oracle.com Tue Oct 29 12:16:27 2013 From: daniel.smith at oracle.com (Dan Smith) Date: Tue, 29 Oct 2013 13:16:27 -0600 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <526FFEEC.5040702@univ-mlv.fr> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> Message-ID: <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> On Oct 29, 2013, at 12:31 PM, Remi Forax wrote: > On 10/29/2013 06:48 PM, Brian Goetz wrote: >> The root problem is that -source 1.7 still exposes 1.8 libraries to the compilation, which is just wrong. What should happen is we should be compiling with the fictitious -platform 1.7, which not only enforces the 1.7 language level, but also puts the 1.7 JDK classes on the bootclasspath. > > There is already a warning for that, if you compile -source 1.7 -target 1.7 with javac, > you will have a warning saying that you have to set the bootclasspath to a 1.7 rt.jar I've made this point before, but do note that other libraries on the classpath are not subject to this constraint or warning. —Dan From forax at univ-mlv.fr Tue Oct 29 12:20:55 2013 From: forax at univ-mlv.fr (Remi Forax) Date: Tue, 29 Oct 2013 20:20:55 +0100 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <125D9978-692A-48DB-A562-801AE2F313D9@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <125D9978-692A-48DB-A562-801AE2F313D9@oracle.com> Message-ID: <52700A97.7060600@univ-mlv.fr> On 10/29/2013 08:14 PM, Dan Smith wrote: > On Oct 29, 2013, at 11:48 AM, Brian Goetz wrote: >> We discussed this extensively at some point; its not as simple as this, though I admit the details are paging in slowly. >> >> IIRC, the root problem is this: you have a class >> >> class Foo extends List { } >> >> that you want to compile with the 1.8 javac but with -source 1.7. What could happen? >> >> 1. 
You could fail to compile, because the supertype uses language/VM features you don't support. This would be not very nice. >> >> 2. You could force the user to implement the default methods, since we can't assume they're inherited. This is also not very nice. >> >> 3. We could ignore them. >> >> The root problem is that -source 1.7 still exposes 1.8 libraries to the compilation, which is just wrong. What should happen is we should be compiling with the fictitious -platform 1.7, which not only enforces the 1.7 language level, but also puts the 1.7 JDK classes on the bootclasspath. >> >> Eventually, modularity was supposed help here. At the time, (3) seemed the least bad alternative. But this is indeed a problem for methods that acquire defaults. > The problem is that _implementing_ an interface is a different use case than _using_ an interface. > > (3) makes sense for the "implementing" case. > > In the "using" case, the most convenient thing would be for all methods to be visible. > > Remi's concern is focused on the "using" case, while the current behavior of javac is focused on the "implementing" case. > > I don't know how hard it would be to get javac (or some other tool) to treat the two cases differently, although it seems reasonably achievable... > > ?Dan yes, great idea. the implementing vs using comes to my mind but I did not think that the policy can be different. 
Rémi From joe.darcy at oracle.com Tue Oct 29 14:54:11 2013 From: joe.darcy at oracle.com (Joseph Darcy) Date: Tue, 29 Oct 2013 14:54:11 -0700 Subject: Defaut methods are not visible if -source 1.7 is set In-Reply-To: <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> Message-ID: <52702E83.5060700@oracle.com> On 10/29/2013 12:16 PM, Dan Smith wrote: > On Oct 29, 2013, at 12:31 PM, Remi Forax wrote: > >> On 10/29/2013 06:48 PM, Brian Goetz wrote: >>> The root problem is that -source 1.7 still exposes 1.8 libraries to the compilation, which is just wrong. What should happen is we should be compiling with the fictitious -platform 1.7, which not only enforces the 1.7 language level, but also puts the 1.7 JDK classes on the bootclasspath. >> There is already a warning for that, if you compile -source 1.7 -target 1.7 with javac, >> you will have a warning saying that you have to set the bootclasspath to a 1.7 rt. jar > I've made this point before, but do note that other libraries on the classpath are not subject to this constraint or warning. > And at least some parties do not use javac's bootclasspath mechanism to limit the core API that the compiled code uses; a third party API checking tool is used instead. In such workflows, by design the javac check is circumvented. (I don't necessarily think this is a good workflow, but it is being used.) 
-Joe

From spullara at gmail.com Tue Oct 29 15:07:59 2013
From: spullara at gmail.com (Sam Pullara)
Date: Tue, 29 Oct 2013 18:07:59 -0400
Subject: Defaut methods are not visible if -source 1.7 is set
In-Reply-To: <52702E83.5060700@oracle.com>
References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> <52702E83.5060700@oracle.com>
Message-ID: <6ECE83DB-A48B-4DB8-8012-2DAECC2BDAFC@gmail.com>

Why would you want to use the new compiler if you have the old one on the machine? I'm not really understanding the use case for the bootclasspath version.

Sam

---
Sent from Boxer

On 10/29/2013 12:16 PM, Dan Smith wrote:
> [...]
From brian.goetz at oracle.com Tue Oct 29 16:37:10 2013
From: brian.goetz at oracle.com (Brian Goetz)
Date: Tue, 29 Oct 2013 19:37:10 -0400
Subject: Defaut methods are not visible if -source 1.7 is set
In-Reply-To: <6ECE83DB-A48B-4DB8-8012-2DAECC2BDAFC@gmail.com>
References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> <52702E83.5060700@oracle.com> <6ECE83DB-A48B-4DB8-8012-2DAECC2BDAFC@gmail.com>
Message-ID: <527046A6.6020000@oracle.com>

Reason 1: The new compiler often has bugfixes that have not always been backported to older compilers.

Reason 2: Lots of shops want to run on the latest JVM (to get performance or serviceability improvements) but not necessarily on the latest language level ("I haven't trained all my people on 'Strings in switch' yet.") So people upgrade to a recent JDK but want to prevent new language features from leaking into their source base.

On 10/29/2013 6:07 PM, Sam Pullara wrote:
> Why would you want to use the new compiler if you have the old one on
> the machine? I'm not really understanding the use case for the
> bootclasspath version.
> [...]
-Joe

From brian.goetz at oracle.com Tue Oct 29 17:00:13 2013
From: brian.goetz at oracle.com (Brian Goetz)
Date: Tue, 29 Oct 2013 20:00:13 -0400
Subject: An unintended interaction
Message-ID: <52704C0D.50809@oracle.com>

If you have:

    public interface Foo {
        public static void main(String[] args) {
            System.out.println("Foo!");
        }
    }

you can say

    java Foo

and it will run. Is that what we intended? Should the java launcher detect this and not look for main() methods in interfaces?

From spullara at gmail.com Tue Oct 29 17:02:37 2013
From: spullara at gmail.com (Sam Pullara)
Date: Tue, 29 Oct 2013 20:02:37 -0400
Subject: Defaut methods are not visible if -source 1.7 is set
In-Reply-To: <527046A6.6020000@oracle.com>
References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> <52702E83.5060700@oracle.com> <6ECE83DB-A48B-4DB8-8012-2DAECC2BDAFC@gmail.com> <527046A6.6020000@oracle.com>
Message-ID:

1) I don't believe this one. The chances are much higher that you accidentally use a method that didn't exist in the previous version.

2) You can run your compiled code from an old JDK on the latest JVM.

Sam

---
Sent from Boxer

On Tue, Oct 29, 2013 at 07:37 PM, Brian Goetz wrote:
> Reason 1: The new compiler often has bugfixes that have not always been
> backported to older compilers.
> [...]
From brian.goetz at oracle.com Tue Oct 29 17:04:59 2013
From: brian.goetz at oracle.com (Brian Goetz)
Date: Tue, 29 Oct 2013 20:04:59 -0400
Subject: Defaut methods are not visible if -source 1.7 is set
In-Reply-To:
References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> <52702E83.5060700@oracle.com> <6ECE83DB-A48B-4DB8-8012-2DAECC2BDAFC@gmail.com> <527046A6.6020000@oracle.com>
Message-ID: <52704D2B.9030209@oracle.com>

So, this is an argument to get rid of -source entirely. But if we did that, people would HOWL.

On 10/29/2013 8:02 PM, Sam Pullara wrote:
> 1) I don't believe this one. The chances are much higher that you
> accidentally use a method that didn't exist in the previous version.
> 2) you can run your compiled code from an old JDK on the latest JVM.
> [...]

From david.holmes at oracle.com Tue Oct 29 17:22:34 2013
From: david.holmes at oracle.com (David Holmes)
Date: Wed, 30 Oct 2013 10:22:34 +1000
Subject: An unintended interaction
In-Reply-To: <52704C0D.50809@oracle.com>
References: <52704C0D.50809@oracle.com>
Message-ID: <5270514A.2060307@oracle.com>

On 30/10/2013 10:00 AM, Brian Goetz wrote:
> If you have:
>
>     public interface Foo {
>         public static void main(String[] args) {
>             System.out.println("Foo!");
>         }
>     }
>
> you can say
>
>     java Foo
>
> and it will run. Is that what we intended? Should the java launcher
> detect this and not look for main() methods in interfaces?
Seems harmless to me but probably implies a need to update documentation in a number of places.

David

From david.lloyd at redhat.com Tue Oct 29 17:35:13 2013
From: david.lloyd at redhat.com (David M. Lloyd)
Date: Tue, 29 Oct 2013 19:35:13 -0500
Subject: Defaut methods are not visible if -source 1.7 is set
In-Reply-To: <52704D2B.9030209@oracle.com>
References: <526FEFC3.30909@univ-mlv.fr> <526FF505.8070609@oracle.com> <526FFEEC.5040702@univ-mlv.fr> <82D94186-2837-40E7-A0E0-5A0D99ADF9D5@oracle.com> <52702E83.5060700@oracle.com> <6ECE83DB-A48B-4DB8-8012-2DAECC2BDAFC@gmail.com> <527046A6.6020000@oracle.com> <52704D2B.9030209@oracle.com>
Message-ID: <52705441.8030102@redhat.com>

The primary benefit was already somewhat messed up by 1.7: the ability to write code that runs on an old JDK *but* detects and uses new APIs when possible. That doesn't really work anymore; now you pretty much have to stub in manually hacked-up, weird hybrid classes to do this, which is unfortunate.

On 10/29/2013 07:04 PM, Brian Goetz wrote:
> So, this is an argument to get rid of -source entirely. But if we did
> that, people would HOWL.
> [...]

--
- DML

From spullara at gmail.com Tue Oct 29 17:40:11 2013
From: spullara at gmail.com (Sam Pullara)
Date: Tue, 29 Oct 2013 20:40:11 -0400
Subject: An unintended interaction
In-Reply-To: <5270514A.2060307@oracle.com>
References: <52704C0D.50809@oracle.com> <5270514A.2060307@oracle.com>
Message-ID: <756C9134-18D5-4246-B52B-5DCC708A7AC7@gmail.com>

I kind of like the idea that the interface might choose its application implementation based on the arguments passed on the command line.

Sam

---
Sent from Boxer

On 30/10/2013 10:00 AM, Brian Goetz wrote:
> [...]

From kevinb at google.com Tue Oct 29 23:23:59 2013
From: kevinb at google.com (Kevin Bourrillion)
Date: Tue, 29 Oct 2013 23:23:59 -0700
Subject: An unintended interaction
In-Reply-To: <52704C0D.50809@oracle.com>
References: <52704C0D.50809@oracle.com>
Message-ID:

I also feel this is harmless.

On Tue, Oct 29, 2013 at 5:00 PM, Brian Goetz wrote:
> [...]

--
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com
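The detect-and-use pattern David Lloyd describes in the -source thread above is typically written with reflection. A minimal sketch follows; the probed class (java.util.Objects, introduced in JDK 7) is only an example of a "newer" API, and the class name ApiProbe is invented:

```java
// Sketch of runtime API detection with a fallback, as used when targeting
// an old JDK while opportunistically using newer APIs where available.
public class ApiProbe {
    static boolean hasJdk7Objects() {
        try {
            // Probe for an API that only exists on newer runtimes.
            Class.forName("java.util.Objects")
                 .getMethod("equals", Object.class, Object.class);
            return true;
        } catch (Exception e) {   // single catch keeps this 1.6-compilable
            return false;         // older runtime: take the fallback path
        }
    }

    public static void main(String[] args) {
        System.out.println(hasJdk7Objects() ? "using new API" : "using fallback");
    }
}
```

The catch is why this breaks under the behavior discussed above: the probe compiles only if the compiler can still see (or at least not reject) references resolved against the newer class library.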
From andrey.breslav at jetbrains.com Wed Oct 30 00:33:09 2013
From: andrey.breslav at jetbrains.com (Andrey Breslav)
Date: Wed, 30 Oct 2013 11:33:09 +0400
Subject: An unintended interaction
In-Reply-To:
References: <52704C0D.50809@oracle.com>
Message-ID:

Looks harmless to me

--
Andrey Breslav
http://kotlin.jetbrains.org
http://jetbrains.com
Develop with pleasure!

On Oct 30, 2013, at 10:23, Kevin Bourrillion wrote:
> I also feel this is harmless.
> [...]

From forax at univ-mlv.fr Wed Oct 30 00:50:17 2013
From: forax at univ-mlv.fr (Remi Forax)
Date: Wed, 30 Oct 2013 08:50:17 +0100
Subject: An unintended interaction
In-Reply-To:
References: <52704C0D.50809@oracle.com>
Message-ID: <5270BA39.90806@univ-mlv.fr>

Yes, main is an entry point, so it works, and I am not able to see why it should not work.

Rémi

On 10/30/2013 08:33 AM, Andrey Breslav wrote:
> Looks harmless to me
> [...]
From Vladimir.Zakharov at gs.com Wed Oct 30 05:53:58 2013
From: Vladimir.Zakharov at gs.com (Zakharov, Vladimir)
Date: Wed, 30 Oct 2013 08:53:58 -0400
Subject: An unintended interaction
In-Reply-To:
References: <52704C0D.50809@oracle.com>
Message-ID:

I too feel it is harmless. Not even mostly harmless, just harmless. I feel it is not more unexpected than having main() running in an abstract class.

The Goldman Sachs Group, Inc. All rights reserved. See http://www.gs.com/disclaimer/global_email for important risk disclosures, conflicts of interest and other terms and conditions relating to this e-mail and your reliance on information contained in it. This message may contain confidential or privileged information. If you are not the intended recipient, please advise us immediately and delete this message. See http://www.gs.com/disclaimer/email for further information on confidentiality and the risks of non-secure electronic communication. If you cannot access these links, please notify us by reply message and we will send the contents to you.

From: lambda-spec-experts-bounces at openjdk.java.net [mailto:lambda-spec-experts-bounces at openjdk.java.net] On Behalf Of Kevin Bourrillion
Sent: Wednesday, October 30, 2013 2:24 AM
To: Brian Goetz
Cc: lambda-spec-experts at openjdk.java.net
Subject: Re: An unintended interaction

I also feel this is harmless.
On Tue, Oct 29, 2013 at 5:00 PM, Brian Goetz wrote:
> [...]

--
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com

From daniel.smith at oracle.com Wed Oct 30 12:36:45 2013
From: daniel.smith at oracle.com (Dan Smith)
Date: Wed, 30 Oct 2013 13:36:45 -0600
Subject: JSR 335 Lambda Specification, 0.7.0
Message-ID: <74AB836A-DD97-4F93-92BD-5A4164EF9E58@oracle.com>

An updated specification can be found here:
http://cr.openjdk.java.net/~dlsmith/jsr335-0.7.0/

This is being submitted as the JLS/JVMS portion of the JSR 335 Public Review. There are just a few bug fixes since 0.6.3.

Other links:
Diff: http://cr.openjdk.java.net/~dlsmith/jsr335-0.7.0-diff.html
One-page HTML: http://cr.openjdk.java.net/~dlsmith/jsr335-0.7.0.html
Downloadable zip: http://cr.openjdk.java.net/~dlsmith/jsr335-0.7.0.zip

Full change log, from the document:

> Typing and Evaluation: Added a paragraph for the Java Object Serialization Specification. Cleaned up presentation of method reference resolution logic.
>
> Overload Resolution: Ensured that, in most-specific testing, all varargs parameter types are considered, even when there are 0 varargs arguments.
>
> Type Inference: Provided full rules for "more specific method" inference. Backed off of the approach to unchecked conversion inference, and specified when unchecked conversions are allowed by reduction. In resolution, adjusted to perform lub only on proper types.
>
> Default Methods: Added a discussion about binary compatibility to Chapter 13.
>
> Java Virtual Machine: Fixed text in method resolution to properly ignore static and private methods in superinterfaces.

--Dan
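The Overload Resolution item in the change log can be illustrated with a sketch (names invented). With zero varargs arguments both overloads below are applicable, so most-specific testing must still compare the varargs element types themselves; since int is a subtype of long, the int... overload should be chosen:

```java
public class VarargsDemo {
    static String m(int... xs)  { return "int...";  }
    static String m(long... xs) { return "long..."; }

    public static void main(String[] args) {
        // Both overloads are applicable to a zero-argument call; the
        // varargs parameter types alone decide which is more specific.
        System.out.println(m());
    }
}
```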