What we have lost?

Dan Smith daniel.smith at oracle.com
Tue Sep 6 22:08:57 UTC 2022


> On Sep 6, 2022, at 1:32 AM, Remi Forax <forax at univ-mlv.fr> wrote:
> 
> What is missing/not supported by the current model is value classes that should not be used by reference,
> either because doing so causes performance issues or because the user will not get the semantics they think they will get.

This is a useful question to explore, so thanks for bringing it up: can we think of use cases in which a class should, in the informed opinion of its author, always be referred to via its value type? If so, a reference-by-default approach is definitely problematic, because one of our starting assumptions was that most uses of value classes would be totally fine using a reference type.

> Here is a list of such value types:
> - unit types: value types with no fields, for example Nothing (which means that a method never returns).
>  Because creating a ref on it creates something :)

If you truly mean for such a class to have no instances, that's not something a class with a value type can assert—the default instance always exists. I can see how it would be nice, for example, to have a type like Void that is also non-nullable, but value types are not the feature to accomplish that.
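Today's nearest analogue to "a field-less value class still has an instance" is a field-less record: it admits exactly one logical state, much as a field-less value class would have an always-available all-zeros default instance. A minimal sketch (the `Unit` record here is hypothetical, purely for illustration):

```java
// Sketch: a field-less record admits exactly one logical state.
// A field-less value class would be similar: its default instance
// always exists, so it cannot model an uninhabited type like Nothing.
public class UnitDemo {
    record Unit() {}

    public static void main(String[] args) {
        Unit a = new Unit();
        Unit b = new Unit();
        // All instances are indistinguishable: one logical state, not zero.
        System.out.println(a.equals(b)); // prints "true"
    }
}
```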

> - wrappers/monads that modify the semantics, for example a generic value class Atomic that plays the same role as AtomicReference, AtomicInteger, etc.
>  The problem here is that the default semantics is not the semantics the user wants.

Okay, so say we have value class Atomic<T>, and we're in a future where this gets specialized. I think you're saying it will be important to say 'Atomic.val<Foo>' at all uses rather than 'Atomic<Foo>'. But I'm not clear on the argument for why that would be. Can you elaborate?
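For context, the atomic-update semantics in question is already available today through the hand-specialized classes in `java.util.concurrent.atomic`; a specialized value class `Atomic<T>` (hypothetical here) would aim to subsume this manual pairing. A sketch of the status quo:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicReference;

public class AtomicDemo {
    public static void main(String[] args) {
        // Hand-specialized today: AtomicInteger holds a flat int...
        AtomicInteger flat = new AtomicInteger(0);
        // ...while AtomicReference<Integer> holds a boxed reference.
        AtomicReference<Integer> boxed = new AtomicReference<>(0);

        flat.incrementAndGet();
        boxed.updateAndGet(x -> x + 1);

        // Both provide the same atomic-update semantics; a specialized
        // Atomic<T> value class would unify the two shapes.
        System.out.println(flat.get() + " " + boxed.get()); // prints "1 1"
    }
}
```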

> - SIMD vectors: if those are nullable, the VM/JIT will insert implicit null checks, which are not usually a problem except in tight loops like the ones users write with SIMD vectors.

The *storage* should definitely use a value type, so in this sort of application we'd encourage value-typed array allocations (and value-typed type arguments for wrapping data structures).

In a loop over a flat array, I would expect it to be okay to talk about the reference type in source, and have the JIT generate optimal code to work with the underlying flat storage, without any allocations or null checks. My sense is that we suspect that this can work reliably, but it could use more targeted performance testing to confirm.
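As a present-day stand-in for "flat storage, reference-typed source", compare a tight loop over a primitive array with one over a boxed array. The hope sketched above is that a loop written against the reference type, over a value-typed flat array, would JIT down to the primitive-array shape, with no per-element null checks or allocations. A rough sketch of the two code shapes (no timings, just structure):

```java
public class FlatLoopDemo {
    public static void main(String[] args) {
        // Flat storage: elements stored inline, no headers, no nulls.
        int[] flat = {1, 2, 3, 4};
        // Boxed storage: each element is a nullable reference.
        Integer[] boxed = {1, 2, 3, 4};

        long flatSum = 0, boxedSum = 0;
        for (int v : flat) flatSum += v;       // no null check possible
        for (Integer v : boxed) boxedSum += v; // implicit null check + unbox

        System.out.println(flatSum + " " + boxedSum); // prints "10 10"
    }
}
```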

> - existing value classes in Scala or Kotlin: those are not nullable by default, but in the current design, getClass() will happily reflect them with a nullable class, making Scala/Kotlin second-class citizens of the Java platform.

Is the problem here that Scala/Kotlin will want reference-default interpretation of names in Java source? (If so, <shrug>, if you want a good user experience with Kotlin types, write Kotlin source.)

Or is the problem that they will want the reflection API to behave differently, making Foo.val.class the "primary" class object, not a secondary one? (If so, <shrug> again, the Java reflection API is Java-oriented, interop is not a major factor in its design.)
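The closest existing analogue to the two-class-objects question is primitives and their boxes: `getClass()` on a boxed value reports the wrapper class, which is distinct from the primitive's class object, much as `Foo.class` and `Foo.val.class` would be distinct. A sketch:

```java
public class ReflectDemo {
    public static void main(String[] args) {
        Object o = 42; // autoboxing produces an Integer
        // Reflection reports the reference (wrapper) class...
        System.out.println(o.getClass() == Integer.class); // prints "true"
        // ...which is a different Class object from the primitive's.
        System.out.println(int.class == Integer.class);    // prints "false"
    }
}
```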

Maybe I'm missing your point on this one?



More information about the valhalla-spec-observers mailing list