Primitives in instanceof and patterns
John Rose
john.r.rose at oracle.com
Fri Sep 9 21:32:04 UTC 2022
On 9 Sep 2022, at 11:07, Brian Goetz wrote:
> … Regardless, a better way to think about `instanceof` is that it is
> the precondition for "would a cast to this type be safe and useful."
> In the world where we restrict to reference types, the two notions
> coincide.
And, in the future world where every value (except possibly `null`) is
an *instance*, the two notions will coincide again, without the
restriction to reference types. We are taking reasonable incremental
steps toward that world here, IMO.
> But the safe-cast-precondition is clearly more general (this is like
> the difference between defining the function 2^n on Z, vs on R or C;
> of course they have to agree at the integers, but the continuous
> exponential function is far more useful than the discrete one.)
> Moreover, the general mental model is just as simple: how do you know
> a cast is safe? Ask instanceof. What does safe mean? No error or
> material loss of precision.
And (to pile on a bit here), the casts you are speaking of here, Brian,
*are the casts we have in Java*, not some idealized or restricted or
cleaned up cast. So we have to deal with the oddities of primitive
value conversion.
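To make the safe-cast precondition concrete, here is a minimal sketch in
today's Java of the test a pattern like `v instanceof byte b` would
perform under the proposal. The class and method names (`ExactCast`,
`fitsInByte`) are hypothetical illustrations, not anything in the spec;
the idea is that a cast is "safe" exactly when it round-trips without
loss:

```java
// Hypothetical helper: the exactness check behind "would a cast to
// this type be safe" for a primitive narrowing conversion.
public class ExactCast {
    static boolean fitsInByte(int v) {
        // Narrow to byte and compare against the original: true iff
        // the cast loses no information (no "material loss of precision").
        return (byte) v == v;
    }

    public static void main(String[] args) {
        System.out.println(fitsInByte(100));   // in byte range
        System.out.println(fitsInByte(1000));  // overflows byte
    }
}
```

The same round-trip idea generalizes to any pair of primitive types: cast
there, cast back, and compare.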
The payoff from dealing with this is that the meaning of patterns is
derived systematically from the meaning of casts (and other
conversions). That is hugely desirable, because it means a very complex
new feature is firmly anchored to existing features. Getting this kind
of thing right preserves and extends Java’s role as a world-class
programming language.
> A more reasonable way to state this objection would be: "most users
> believe that `instanceof` is purely about subtyping, and it will take
> some work to bring them around to a more general interpretation, how
> are we going to do that?"
This is subjective and esthetic, but I think two thoughts help here
(with teaching and rationale): First, everything (except `null`) is an
instance, or will eventually be. Second, subtyping in Java includes the
murky rules for primitive typing.
Those specific rules more or less systematically determine how casts
work. They should also systematically determine (in the same way) how
patterns work. After all, casts and patterns are (and very much should
be!) mirror-image counterparts of each other, or dance partners holding
hands.
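The mirror-image relationship can be seen with today's casts alone. This
sketch (names are mine, not the spec's) shows the exactness predicate
that, on my reading of the proposal, a pattern such as `i instanceof
float f` would test, namely whether the existing cast `(float) i` is
reversible:

```java
// Hypothetical illustration: a pattern succeeds exactly when the
// corresponding cast would be value-preserving.
public class CastMirror {
    static boolean exactlyRepresentableAsFloat(int i) {
        // The cast (float) i is "safe" iff converting back recovers i.
        return (int) (float) i == i;
    }

    public static void main(String[] args) {
        // float has a 24-bit significand, so 2^24 is exact but 2^24 + 1
        // rounds to a neighboring value.
        System.out.println(exactlyRepresentableAsFloat(1 << 24));
        System.out.println(exactlyRepresentableAsFloat((1 << 24) + 1));
    }
}
```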
(I visualize such things as boxes on the whiteboard with reversible
arrows between them. You could say “category” if you like. Brian
likes to say “dual”, and I took linear algebra too, but I doubt most
folks took the trouble in that class to be curious about exactly what a
“dual space” really is all about.)
Rather than extending the language we wish we had, we are extending the
one we *do* have, and that means aligning even the murky parts of casts
with pattern behavior.
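One genuinely murky corner of the language we do have: the JLS classifies
`long` to `double` as a *widening* primitive conversion, yet it can lose
precision, because `double` carries only 53 bits of significand. A quick
sketch of that corner:

```java
public class WideningMurk {
    public static void main(String[] args) {
        // "Widening" long -> double can still round.
        long big = (1L << 53) + 1;            // 9007199254740993
        double d = big;                        // implicit widening, rounds
        System.out.println((long) d == big);   // round trip fails
    }
}
```

Any systematic account of patterns over primitives has to answer for
cases like this one, which is exactly why deriving pattern semantics from
the existing conversion rules is the honest move.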
In the end, I don’t think it’s very murky at all in practice, except
of course for the outraged theoretical purist (who lives in each of us).
There is certainly *no new murk*. IMO what Brian is showing works out
surprisingly well, so kudos to him for following his nose to a design
with liveable details. This success also IMO demonstrates the foresight
of the original authors and current maintainers of the spec, even in the
“murky” parts of primitive value conversions.
— John
More information about the amber-spec-observers
mailing list