Primitive type pattern (as actually specified) is considered harmful
Brian Goetz
brian.goetz at oracle.com
Wed Sep 10 13:03:43 UTC 2025
> considered harmful
We can add that to the list of phrases that Remi is not allowed to utter
on this list.
> # This is the wrong semantics
I believe we've discussed several times how this is the "wrong" way to
make your case. But, presentation aside, I can't really figure out which
argument you are actually trying to make.
What I see, though, in your objections, is a classic fallacious pattern:
- language already has features X_i (for i in 0..N, N is large)
- language adds feature Y, which is orthogonal to all X_i
- the combination of Y and X_4302 is "confusing"
- Y gets blamed
- the inevitable bargaining begins, in which attempts are made to make Y
1000x more complicated to avoid this particular interaction
One reason that this is an unhelpful way to go about things is that it
is optimizing for the wrong thing. The extension of instanceof to
primitive types is utterly simple and straightforward, and consistent
with its meaning for reference types; the simplicity of this is a source
of strength.
To be clear, here's what's going on:
- Today, for reference types, `x instanceof T` asks "can the reference
x be safely cast to T", where "safe" means "not null, and conversion
without CCE".
- A language that has casts needs instanceof (or the equivalent),
since you should be able to ask "would it succeed" before doing
something that might fail.
- We extend the interpretation of `x instanceof T` to _all_ types,
appealing to the same rule: would casting be safe. We extend the
interpretation of "not safe" to include "losing data" in addition to errors.
- Type patterns for primitives are built on instanceof in the same way
that they are for reference types.
That's it; that's the whole feature. But, because it is built on
casting, to the degree that you might be confused by casting, you will
be confused (in exactly the same way) by instanceof.
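To make that concrete, here is a rough sketch of the rule in action (my
illustration, not text from the JEP; it assumes a JDK 25-era compiler with
the JEP 507 preview feature enabled, e.g. --enable-preview, and the names
are made up):

    public class InstanceofSketch {
        public static void main(String[] args) {
            int i = 1000;
            // Does not match: 1000 cannot be cast to byte without losing data.
            if (i instanceof byte b) {
                System.out.println("byte " + b);
            } else {
                System.out.println("1000 would not survive a cast to byte");
            }

            double d = 42.0;
            // Matches: 42.0 casts to int exactly, so the conversion is safe.
            if (d instanceof int n) {
                System.out.println("exact int " + n);
            }
        }
    }

The question asked is the same one we already ask for reference types --
"would the cast succeed without losing information" -- just extended to
primitive conversions.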
The corner cases around nullity are mostly a distraction; Valhalla will
refine these in a way that gives you more control (e.g., `instanceof T?`
vs `instanceof T!`) and makes it more clear what is going on with
respect to nullity. We should set these aside for now.
The details of conversion between integral and floating point types are
already problematic; we know this. Specifically, we allow certain
conversions between int/long and float/double _in assignment and method
context_ when we should not; this is sadly an example of copying too
literally from the C spec in 1995. We can talk about that (but, not in
the context of an "X is wrong" discussion), and there are things we can
do to improve the situation.
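As a concrete reminder of the existing behavior (plain Java today, no
preview features; the number is chosen to sit just past float's 24 bits
of precision):

    public class LossyWidening {
        public static void main(String[] args) {
            int i = 16_777_217;     // 2^24 + 1, not exactly representable as a float
            float f = i;            // implicit int -> float widening, compiles with no warning
            System.out.println(f);  // prints 1.6777216E7 -- the value silently changed

            // By contrast, a narrowing that might lose data is rejected:
            // short s = i;         // does not compile: possible lossy conversion from int to short
        }
    }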
In your example, the problematic part is really:
Plane plane = new Plane(200_000_007, 16_777_219);
because this is a lossy implicit conversion. And it would be totally
reasonable to warn here (in fact, we would like to do so.) But don't
blame the pattern match for this. That amounts to complaining "conversion
between int and float is confusing, and now you gave me a feature that
lets me do more conversions, so it is more confusing!" That's fine, but
you have to stop before you leap madly to "so it's wrong."
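To see where the loss actually happens, a small sketch (standard Java, no
pattern matching involved; the record mirrors the one in your example):

    public class PlaneLoss {
        record Plane(float x, float y) {}

        public static void main(String[] args) {
            // The implicit int -> float conversions happen here, at construction
            // time; 16_777_219 is not exactly representable as a float (every
            // float value at or above 2^24 is an even integer), so the stored
            // components already differ from the literals that were written.
            Plane plane = new Plane(200_000_007, 16_777_219);
            System.out.println(plane.x() + " " + plane.y());
        }
    }

Any pattern that later inspects those components can only report what was
actually stored.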
I suggest that, instead of replying to individual points, you start
over, and think carefully about what you are actually trying to say.
Maybe you have an argument here, but all I see is a mess of "I found a
confusing example, so it's all wrong." Best to start over.
On 9/10/2025 5:39 AM, Remi Forax wrote:
> Hello all,
> The idea of JEP 507 can be illustrated with the following code:
>
> Object o = "foo";
> switch(o) {
>     case String s -> ...
>     default -> ...
> }
>
> The "case String" recovers the dynamic class of the value.
> The assignment does a widening and the pattern matching does the narrowing back to the original class.
> This can be seen as the transformation chain, String -> Object -> String.
>
> JEP 507 proposes to apply the same principle to primitive types.
> For example, in the transformation byte -> int -> byte, the pattern matching acts as a kind of inverse operation.
>
> For me, while this idea is internally coherent, it fails on several levels.
>
>
> # Primitive conversions can be lossy
>
> With primitive types, the widening can be lossy (int to float is lossy, long to double is lossy),
> and from a mathematical point of view inverting a lossy function makes no sense.
>
> This gives us this kind of puzzler:
>
> record Plane(float x, float y) {}
>
> void main() {
>     Plane plane = new Plane(200_000_007, 16_777_219);
>
>     switch (plane) {
>         case Plane(int x, int y) -> IO.println("plane " + x + " " + y);
>         default -> IO.println("not a plane");
>     }
> }
>
> You may say that the bug lies in the fact that Java should not allow lossy conversions,
> and I would agree, but that does not change the fact that, conceptually, the pattern matching is trying to invert a lossy function, which again makes no sense.
>
>
> # This is the wrong semantics
>
> For most people, int is equivalent to Integer!; this is also where we are heading with Valhalla.
> Given that a switch can match null, but only via a separate case null, matching a primitive type and its corresponding wrapper type should be equivalent.
>
> Sadly, this is not the semantics defined by JEP 507.
>
> I propose, instead of the semantics of JEP 507, to use two rules:
> - If the value switched upon is an Object, a "case int" should be equivalent to a "case Integer" and vice versa.
>
> For example:
>
> Object o = ...
> switch(o) {
>     case int i -> ...
>     default -> ...
> }
>
> should be equivalent to
>
> Object o = ...
> switch(o) {
>     case Integer i -> ...
>     default -> ...
> }
>
>
> - If the value switched upon is a primitive type, then the only conversion that can occur is a boxing conversion.
>
> int v = ...
> switch(v) {
>     case Integer _ -> ... // ok
> }
>
> If people want to know if an int can be safely converted to a byte, I think that using a static deconstructor method is better.
>
> regards,
> Rémi