[External] Foo / Foo.ref is a backward default; should be Foo.val / Foo
Kevin Bourrillion
kevinb at google.com
Mon Apr 25 16:31:52 UTC 2022
On Mon, Apr 25, 2022 at 10:05 AM Brian Goetz <brian.goetz at oracle.com> wrote:
> Bucket 2 — This is the obvious migration target for value-based classes,
It also seems useful as the migration stepping-stone for bucket 1 -> 3.
Which makes me feel good about the possibility of shipping 2 first.
> Bucket 3 — here’s where it gets a little fuzzier how we stack it. Bucket 3
> drops reference-ness, or more precisely, gives you the option to drop
> reference-ness.
(and notice how much nicer this phrase is than "... drops reference-ness
and gives you the option to claw it back")
> I think we are all happy with Bucket 2; it has a single and understandable
> difference from B1, with clear consequences, it supports migration,
There is still one major problem, which I'll try to take to another thread
soon.
> I think we are all still bargaining with Bucket 3, ... for me, the main
> question is “how do we let people get more flattening without fooling
> themselves into thinking that there aren’t additional concurrency risks
> (tearing).”
>
The degree of worry over tearing is something we will have to figure out
how to size appropriately. From my/Google's perspective I will continue
arguing that we are making too much of it. If I'm learning how all this
works, and I read/hear a statement like, "By the way, writing racy code can
work out more badly than usual when these things are involved" ... my
reaction would be "okay, noted. I'll keep right on trying to never write
racy code, and if I'm ever diagnosing a puzzling concurrency error I'll
come back and learn what this is all about. Maybe I'll check that my static
analysis tool has a 'data race when accessing value class' finding enabled.
Okay they're working on it, cool...."
And then I'd be fine. My mental model would be fine.
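(For concreteness, here is a deterministic sketch of the hazard being sized up. The class `TornReadDemo`, its two-field "range" layout, and `writeValue` are hypothetical stand-ins for a flattened B3 value; today's identity objects cannot tear this way, because the whole value is published by one atomic reference write. The unlucky interleaving is staged by hand rather than left to a real race:)

```java
// Hypothetical sketch: two fields standing in for a flattened B3 value whose
// invariant is lo <= hi. With today's identity objects this can't happen; a
// flattened, non-atomic value could expose it under a data race. The racy
// interleaving is staged by hand so the outcome is deterministic.
final class TornReadDemo {
    static long lo = 0, hi = 10;   // current "value": the range [0, 10]

    // Writes the new "value" [newLo, newHi] as two separate field stores,
    // running the reader between them to model an unlucky racy read.
    static void writeValue(long newLo, long newHi, Runnable racyReader) {
        lo = newLo;        // first half of the non-atomic write lands...
        racyReader.run();  // ...the racy read happens here...
        hi = newHi;        // ...then the second half lands
    }

    public static void main(String[] args) {
        writeValue(20, 30, () -> {
            // The reader sees the new lo but the old hi: a "torn" value
            // [20, 10] that no thread ever wrote, violating lo <= hi.
            System.out.println("saw [" + lo + ", " + hi + "]; lo <= hi "
                    + (lo <= hi ? "holds" : "VIOLATED"));
        });
    }
}
```

(Note that the precedent is old: the JMM already permits unsynchronized `long` and `double` fields to tear, per JLS §17.7, and racy code has always had to reckon with that.)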
(I'm *much* more concerned about the proliferation of 1970-type bugs from
people using uninitialized values, or the proliferation of
pseudo-nullability patterns required to prevent those bugs. If not for this
concern alone, I think I'd favor letting every B2 class *automatically* be
B3, with no extra permission!)
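(The "1970" flavor, sketched: a flattened value array starts out as all-zeros defaults, so a slot the programmer forgot to fill reads as a plausible-looking value instead of failing fast. A `long[]` of epoch millis stands in here for a hypothetical flattened timestamp value; real B3 syntax isn't needed to show the shape of the bug:)

```java
import java.time.Instant;

// Sketch of the "1970" failure mode: zero defaults silently mean "the epoch"
// rather than "not yet set". A long[] of epoch millis stands in for a
// hypothetical flattened timestamp value.
final class DefaultValueDemo {
    public static void main(String[] args) {
        // With references, the forgotten slot is null and blows up on use.
        Instant[] asReferences = new Instant[3];
        System.out.println(asReferences[1]);   // prints "null"

        // With a zero default, the forgotten slot silently means the epoch.
        long[] asFlattenedMillis = new long[3];
        System.out.println(Instant.ofEpochMilli(asFlattenedMillis[1]));
        // prints 1970-01-01T00:00:00Z -- a valid-looking, wrong timestamp
    }
}
```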
> That one class gives rise to two types is already weird, and creates
> opportunity for people to think that one is the “real” type and one is the
> “hanger on.” Unfortunately, depending on which glasses you are wearing,
> the relationship inverts. We see this with int and Integer. From a user
> perspective, int is usually the real type, and Integer is this weird
> compatibility shim.
But I think delivering Valhalla means -- perhaps ironically, sure -- that
we can and should invert that expectation. If you can always think of the
reference type as the real thing, then you're getting the "unification" we
promised. Substitute a value type when x-y-or-z. Your static analysis tool
will propose refactoring your code to use the value type when x-y (and
maybe it misses case z).
I think it's only on a surface level that this story makes the value types
look "lesser". We'd still move a lot of units because of their genuinely
compelling advantages.
> In the future world, which of these declarations do we expect to see?
>
> public final class Integer { … }
>
> or
>
> public mumble value class int { … }
>
> The tension is apparent here too; I think most Java developers would hope
> that, were we writing the world from scratch, that we’d declare the latter,
> and then do something to associate the compatibility shim with the real
> type.
imho, we will "just" have to retrain them on this. And as I'll keep
repeating, we can't escape this need for retraining no matter which way we
go. I think the story is to a point now where the retraining won't
be *nearly* as hard as it once was.
--
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com