[lworld] RFR: 8244231: [lworld] Add support for ref-default and val-default inline classes.

Srikanth Adayapalam sadayapalam at openjdk.java.net
Thu Jul 22 11:41:02 UTC 2021


On Thu, 22 Jul 2021 11:10:23 GMT, Srikanth Adayapalam <sadayapalam at openjdk.org> wrote:

>> More generally, I think I'm slightly uncomfortable with the use of the letters `Q` and `L` in the enum constants `X_TypeOf_Y`. Now, when `L` and `Q` are in the `X` position, the semantics is simple: `L` means `.ref`, `Q` means `.val`.
>> 
>> But when `L` and `Q` appear in the `Y` position, then it becomes messy - because the choice is no longer binary:
>> 
>> * the class is a primitive class or not (1 bit)
>> * if primitive, the class is reference-favoring or not (another bit)
>> 
>> Right now, it seems that you use `Q` to denote whether the class is a primitive class or not, which is useful, I assume, to detect the distinction between a legacy reference type (`L_TypeOf_L`) and a reference projection of a primitive class (`L_TypeOf_Q`). This distinction is, I believe, an important one, as the former has identity and the latter doesn't, so type checking would probably differ.
>> 
>> What I'm less sure about is whether you want/need different type checking rules for `L_TypeOf_Q && reference-favoring` vs. `L_TypeOf_Q && !reference-favoring`. What you have is still an identity-less reference type - so why should it be important whether you get a redundant projection or not (which might even be disabled at some point during type checking)?
>> 
>> So, back to your question, I guess I don't see why you have defined the current behavior as being "natural". To me, getting an X projection on an X-favoring primitive class is an idempotent operation, which you can repeat even 20 times, and the type-system shouldn't care. Are there places in the code where you need this sharper distinction?
>
>> > when we have a type that is an L_TypeOf_Q, it could be
>> > either (a) the reference projection of a value-default primitive class or (b) the reference type of a reference-default primitive class.
>> 
>> This seems at odds with what the Javadoc for that flavor says:
>> 
>> ```
>> /**
>>  * Reference projection type of a primitive-favoring aka primitive-default
>>  * plain vanilla primitive class type,
>>  */
>> L_TypeOf_Q,
>> ```
>> 
>> E.g. the javadoc suggests that Q means primitive-default already, so there should be no need to check the tsym?
> 
> Good catch! This is certainly a problem in the javadoc of L_TypeOf_Q and needs to be corrected. It was always the intent that L_TypeOf_Q is simply a primitive reference type.

> More generally, I think I'm slightly uncomfortable with the use of the letters `Q` and `L` in the enum constants `X_TypeOf_Y`. Now, when `L` and `Q` are in the `X` position, the semantics is simple: `L` means `.ref`, `Q` means `.val`.
> 
> But when `L` and `Q` appear in the `Y` position, then it becomes messy - because the choice is no longer binary:
> 
> * the class is a primitive class or not (1 bit)
> * if primitive, the class is reference-favoring or not (another bit)
> 
> Right now, it seems that you use `Q` to denote whether the class is a primitive class or not, which is useful, I assume, to detect the distinction between a legacy reference type (`L_TypeOf_L`) and a reference projection of a primitive class (`L_TypeOf_Q`). This distinction is, I believe, an important one, as the former has identity and the latter doesn't, so type checking would probably differ.

First of all, I agree that the javadoc of com.sun.tools.javac.code.Type.ClassType.Flavor#L_TypeOf_Q is incorrect and confusing, and that this confusion has ripple effects on comprehension in other areas.

So let me try to clarify the intent.

L_TypeOf_Q is simply a primitive reference type; its underlying primitive class type could be either reference-favoring or value-favoring.
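
For concreteness, here is a minimal sketch of what the corrected javadoc could say; the wording is mine, not taken from the actual patch:

```
// Sketch only; wording is my suggestion, not the actual patch.
/**
 * Reference type of a primitive class, i.e. a primitive reference type.
 * The underlying primitive class may be either reference-favoring
 * (ref-default) or primitive-favoring (val-default); this flavor does
 * not record which.
 */
L_TypeOf_Q,
```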

With this clarification in place, if you look at the flavors, they naturally model the details associated with the domain.

There is an inherent asymmetry among the flavors: a primitive reference type is always available for a primitive class (whether it is reference-default or value-default), but the reverse does not hold, since a reference type can exist without any corresponding primitive class type; that is the legacy L_TypeOf_L flavor.
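
To make the asymmetry concrete, here is a hypothetical illustration; the class names are invented, Q_TypeOf_Q is my assumption about the value-side flavor's name, and I am not committing to any particular ref-default declaration syntax:

```
// Hypothetical illustration; class names and the Q_TypeOf_Q flavor name are
// my assumptions, not quotes from the patch.
class Legacy { }             // identity class: its type is L_TypeOf_L,
                             // and no Q flavor exists for it at all

primitive class Point { }    // value-default primitive class:
                             //   Point     -> Q_TypeOf_Q (the value type)
                             //   Point.ref -> L_TypeOf_Q (primitive reference type)

// A reference-default primitive class (as added by this PR) likewise has a
// primitive reference type, again L_TypeOf_Q, plus a value projection; so
// every primitive class has an L_TypeOf_Q, but no identity class ever does.
```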

> 
> What I'm less sure about is whether you want/need different type checking rules for `L_TypeOf_Q && reference-favoring` vs. `L_TypeOf_Q && !reference-favoring`. What you have is still an identity-less reference type - so why should it be important whether you get a redundant projection or not (which might even be disabled at some point during type checking)?
> 
> So, back to your question, I guess I don't see why you have defined the current behavior as being "natural". To me, getting an X projection on an X-favoring primitive class is an idempotent operation, which you can repeat even 20 times, and the type-system shouldn't care. Are there places in the code where you need this sharper distinction?

Let me punt on the question by asking this: with the admission that the javadoc of L_TypeOf_Q was wrong and confusing, do we agree that once it is corrected, the flavors capture and model the domain information faithfully? Off the top of my head I don't recall places where this sharper distinction is needed, but if there is consensus that the model is valid, even if a bit baroque, that is reason to preserve it until we have more experience to definitively prune what we think is valid but superfluous information captured in the abstraction.

-------------

PR: https://git.openjdk.java.net/valhalla/pull/482


