Value types, encapsulation, and uninitialized values

Stephen Colebourne scolebourne at
Tue Oct 16 07:31:19 UTC 2018

On Tue, 16 Oct 2018 at 01:39, Brian Goetz <brian.goetz at> wrote:
>> What I'm keen to avoid is a situation where NonNull values have a much
>> better performance model than Nullable ones, i.e. a Nullable value type
>> should still be able to gain the benefits of flattening, otherwise
>> what's the point? (A nullable value type is just a value type where the
>> bit-pattern of zero is given a special name and prevented from being
>> operated on. Neither of those things prevents the value from being
>> flattened.)
> Be careful here what you mean by "performance model". Nullable value types WILL be more expensive than non-nullable ones in at least some dimensions. Remember, value types are a performance win on multiple dimensions:
>  - flatness
>  - density
>  - dispatch (due to monomorphicity)
>  - calling convention (scalarizability, non-identity, etc)
>  - .. more …
> Nullable value types would retain the flatness win; they might retain the density win (but in the worst case might need more bits, backsliding on that count); the extra code paths required for null checking would likely put a dent in the others. So the advice would definitely be "use this with care; it has a cost." It's surely not 100% of what you gained by going to values in the first place, but it's not 0% either. Which is a good argument for making the user choose.

Agreed. The choice seems sound. What I was driving at with "much
better" is a performance model where nullable values get closer to 80%
of the benefits rather than only 50%. Not that it can really be
distilled to a percentage, of course...

I also agree that a nullable value might need one extra bit (which
might align to a lot more), but I want to ensure that value types
which can arrange their layout so that the natural all-zero bit
pattern is not valid (like LocalDate or Money) can avoid that extra bit.
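To make the idea concrete, here is a minimal hand-rolled sketch in plain Java (not actual Valhalla syntax; `PackedDate` and its encoding are hypothetical, loosely modeled on LocalDate): because the month and day fields are 1-based, no valid date ever has the all-zero bit pattern, so flattened storage can reuse that pattern as the null sentinel with no extra bit.

```java
// Sketch only: a date value whose month/day are 1-based, so the
// all-zero bit pattern never encodes a valid date. A flattened
// container can therefore use zero bits to mean "null" for free.
final class PackedDate {
    final int year;   // any value
    final int month;  // 1..12; never 0 for a valid date
    final int day;    // 1..31; never 0 for a valid date

    PackedDate(int year, int month, int day) {
        if (month < 1 || month > 12 || day < 1 || day > 31)
            throw new IllegalArgumentException("invalid date");
        this.year = year;
        this.month = month;
        this.day = day;
    }

    // Flattened encoding: year in the high bits, month and day in one
    // byte each. Because month >= 1, a valid date is never all-zero.
    static long encode(PackedDate d) {
        if (d == null) return 0L;                 // null -> zero pattern
        return ((long) d.year << 16) | (d.month << 8) | d.day;
    }

    static PackedDate decode(long bits) {
        if (bits == 0L) return null;              // zero pattern -> null
        return new PackedDate((int) (bits >> 16),
                              (int) (bits >> 8) & 0xFF,
                              (int) bits & 0xFF);
    }

    public static void main(String[] args) {
        long bits = encode(new PackedDate(2018, 10, 16));
        PackedDate d = decode(bits);
        System.out.println(d.year + "-" + d.month + "-" + d.day);
        System.out.println(decode(0L) == null);
    }
}
```

A type like an unconstrained `int` wrapper could not play this trick, since zero is a legal value; that is exactly the case Brian describes where nullability may cost an extra bit (and possibly more, after alignment).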


More information about the valhalla-dev mailing list