Questions on default values

Brian Goetz brian.goetz at oracle.com
Fri Mar 19 14:43:14 UTC 2021


Your reading of this sketch of one of the options for a feature we might 
or might not pursue is mostly correct, but it is still missing some 
subtleties.  Clarifications inline.

> Three linked questions:
>
> My reading of the above is that a brand new NoGoodDefault (nullable)
> primitive type `YearWeek(int,int)` would work as follows:
>
>    YearWeek yw = new YearWeek(2021, 6);
>    YearWeek ywn = null;
>    YearWeek ywd = YearWeek.default;

In some of these there is an implicit conversion; `null` is of type 
YW.ref (null is always a reference), while YW.default is of type 
YW.val.  In the second line, there's an implicit conversion from the ref 
type to the val type.  Similarly, if you were to compare (null == 
YW.default), one side would be converted to the type of the other.  It 
doesn't matter in which direction, since the conversion works both ways.

Note that this is introducing new complexity into JLS Ch5 (Conversions 
and Contexts.)  This is not free.

> - this code compiles
> - all three variables are of the same primitive type - `YearWeek`
> - all three variables are primitives, not references
> - `ywn` and `ywd` are == and indistinguishable
> - if these are instance variables then all three are flattened
> - any method call on `ywn` or `ywd` throws NPE

These are all correct interpretations of the sketch.
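For concreteness, here is a minimal sketch of those semantics simulated in today's Java.  The `YearWeek` primitive class does not exist; a plain final class stands in, with the all-zero-bits value ("vull") modeled as a static DEFAULT instance, and the NGD check on member access written out by hand:

```java
// Hypothetical simulation of the sketched NGD semantics; in the real
// feature, YearWeek would be a primitive class and the check implicit.
final class YearWeek {
    final int year, week;
    YearWeek(int year, int week) { this.year = year; this.week = week; }

    // Models YearWeek.default: the all-zero-bits instance.
    static final YearWeek DEFAULT = new YearWeek(0, 0);

    // Every member access would first check for vull and throw NPE.
    int year() {
        if (year == 0 && week == 0)
            throw new NullPointerException("uninitialized YearWeek");
        return year;
    }
}

public class NgdDemo {
    public static void main(String[] args) {
        YearWeek yw = new YearWeek(2021, 6);
        System.out.println(yw.year());       // prints 2021
        try {
            YearWeek.DEFAULT.year();         // vull: throws NPE
        } catch (NullPointerException e) {
            System.out.println("NPE on default");
        }
    }
}
```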

> - there is no such thing as `YearWeek.val` (so a developer cannot
> refer to a non-null YearWeek)

This last one is incorrect.  YW.val is a real type, and all of the above 
are YW.val.  The only difference here is the interpretation of the 
all-zero-bits flat value ("vull").  A vull would be interpreted as null 
when converting to YW.ref, rather than as an element of the domain, and 
vice versa.

>    Object obj = yw;
>    YearWeek[] arr = new YearWeek[5];
>
> - variable 'obj' is a reference of type `YearWeek.ref`

Not quite.  (This has nothing to do with NGD.)   For any primitive class 
P, there is a universe of instances of P (primitive objects) which have 
no identity.  The type `P.val` (for which P is usually an alias) is this 
universe of P instances.  The type `P.ref` is the universe of 
_references to_ those instances of P, plus null.  But there are *no* 
instances of P.ref; it's more like an interface.

     P p = ...;
     Object obj = p;

What happens here is that we do a primitive widening conversion from P 
to P.ref (take a reference to the instance), and then a reference 
widening from P.ref to Object (subtyping).  If you then ask for 
`obj.getClass()`, it will say P.  With NGD this doesn't change; it just 
affects what happens when p is vull.
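An analogous two-step conversion already exists in today's Java with boxing (only an analogy: an Integer carries identity, which a P.ref would not):

```java
public class WideningDemo {
    public static void main(String[] args) {
        int p = 42;
        // Step 1: boxing conversion int -> Integer
        //         (analogous to the primitive widening P -> P.ref).
        // Step 2: reference widening Integer -> Object (subtyping).
        Object obj = p;
        // The dynamic type is the underlying class, not Object.
        System.out.println(obj.getClass().getSimpleName()); // prints Integer
    }
}
```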

> - variable 'arr' is an array of the primitive type `YearWeek`
> - `arr[0] == YearWeek.default` is true
> - `arr[0] == null` is true
> - the elements of the array are flattened into a contiguous piece of memory

Correct.

> Is this analysis correct?
> (If so, then I'm pleased. I wrote some "requirements" in July 2020
> exactly along these lines but never sent them to the list)

The fact that you're pleased is not necessarily encouraging, sad to say; 
it's evidence for our biggest concern about this approach, which is that 
it will be an attractive nuisance ("Finally, my nullable value types"), 
developers will be unaware of the performance tradeoffs they are making, 
and then conclude "value types suck."

The feature here is *not* "nullable value types", as much as some would 
like that to be the feature!  The feature here is "detection of 
uninitialized primitives."  It is convenient that the latter travels 
through null, but null is a means to the end, not the end.

> Is the performance of method calls on NoGoodDefault primitives likely
> to be of the same order of magnitude as calls on references? ie. does
> the extra null/default checks on a NoGoodDefault primitive type
> effectively equate to those already done on reference types today?

No, and that's the problem.  Every dereference must compare to vull 
first, and throw on vull.  Comparison to vull is significantly more 
expensive than comparison to null.  (Further, the VM has many tricks for 
optimizing null checks that are unlikely to scale to vulls.) You pick up 
the flattening/density/calling convention/inlining benefits, but you pay 
a cost every time you access one of its fields or call its methods.
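A hypothetical sketch of why the check is more expensive: a null check on a reference is a single pointer comparison, while a vull check on a flattened value must test every field of the flat encoding (field names and shapes here are illustrative only):

```java
public class VullCheckDemo {
    // Reference dereference guard: ONE comparison, heavily optimized by VMs.
    static void checkRef(Object ref) {
        if (ref == null) throw new NullPointerException();
    }

    // Flattened two-field value: the vull guard must test ALL fields,
    // i.e. one comparison per word of the flat layout.
    static void checkFlat(int year, int week) {
        if (year == 0 && week == 0) throw new NullPointerException();
    }

    public static void main(String[] args) {
        checkRef("ok");       // single compare
        checkFlat(2021, 6);   // two compares for a two-word value
        System.out.println("checks passed");
    }
}
```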

There are tricks we can use to reduce the cost of the vull check (e.g., 
pivot fields, hoisting, vull-tracking), but these come out of the same 
implementation and complexity budgets that other features come out of, 
so it's not obvious that this is where we should spend it, especially if 
people want it to be something it never will be.

> If the analysis is correct is it now the case that there is no need
> for the "reference-favoring primitive classes" concept. ie. that
> `java.time.LocalDate` can be migrated to a normal NoGoodDefault fully
> flattened primitive type?

Nope, sorry!  The ref-favoring model exists to support _compatible 
migration_ for classes like Optional (and to a different degree, 
Integer), which this doesn't give us (and which is far^2 more 
important.)  If we were going to ditch one in favor of the other, it 
would be a slam-dunk to say "just use ref-favoring primitives if you 
want nullability."  Migration compatibility is far more important than 
filing down this rough edge.




