Draft Spec for Fourth Preview of Pattern Matching for Switch (JEP 433) and Second Preview of Record Patterns (JEP 432) now available

Dan Smith daniel.smith at oracle.com
Fri Oct 28 00:51:26 UTC 2022

I've reviewed the draft and put together a few patches; here are some additional stray bits of feedback from me, most of which get into concepts and design:


5.5: There's this "downcast compatible" notion that was apparently introduced with instanceof pattern matching. It means a cast is legal and not unchecked. Ok. But—not all legal casts are downcasts, right? It's pretty confusing to use a term that suggests that they are.

jshell> Serializable ser = "abc"
ser ==> "abc"

jshell> if (ser instanceof Comparable<?> c) { System.out.println("matches");}
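
Spelled out as a compilable sketch (class name is mine, just for illustration): the conversion from Serializable to Comparable<?> is a sideways cast, not a downcast, since neither interface is a subtype of the other, yet it is legal and not unchecked, because a String is both.

```java
import java.io.Serializable;

public class SidewaysCast {
    public static void main(String[] args) {
        Serializable ser = "abc";
        // Serializable and Comparable<?> are unrelated interface types, so this
        // is not a downcast; the instanceof (and the cast it implies) is still
        // legal and not unchecked, because String implements both interfaces.
        if (ser instanceof Comparable<?> c) {
            System.out.println("matches");
        }
    }
}
```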

--- Maybe someday we'll have expressions nested in patterns, but for now isn't it meaningless to ask whether a pattern variable is in scope within another part of the same pattern?


13.4.2: I don't know if I'd talk about "migration compatibility" here. It's a rather common thing for a separately-compiled change to cause a runtime error that is not a linkage error. But before this change, the phrase "migration compatible" was never mentioned in Chapter 13, and we don't go out of our way to label these conditions. It's just... a runtime exception.


14.11.1: I'm not sure the prohibition of 'when false' makes sense. We don't complain about unreachable code in 'if (false) { ... }', for example, or consider variables touched inside the 'if' to be DU.

A possible use case is to turn a rule on or off based on a constant somewhere, or just by directly editing the code during a debugging session.
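
For what it's worth, here's a runnable sketch of the toggle use case (class and member names are mine; this assumes pattern-switch semantics as in the preview). It uses a non-constant field deliberately, since a constant-false guard is exactly what's prohibited:

```java
public class GuardToggle {
    // A non-final field is not a constant expression, so 'when TRACE' below is
    // accepted even when its value happens to be false at run time. (A literal
    // 'when false' is what the draft rejects.)
    static boolean TRACE = false;

    static String describe(Object obj) {
        return switch (obj) {
            case String s when TRACE -> "traced string: " + s;
            case String s -> "string of length " + s.length();
            default -> "something else";
        };
    }

    public static void main(String[] args) {
        System.out.println(describe("abc")); // string of length 3
        TRACE = true;
        System.out.println(describe("abc")); // traced string: abc
    }
}
```

Flipping TRACE to a compile-time constant (static final) with value false would turn the first rule into exactly the 'when false' case under discussion.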


14.11.1: The compatibility rules are specified as "for all" rules, which means you can do some surprising things when they're vacuously true...

double d = 23.0;
switch (d) { default -> System.out.println("ok"); } // no error

(This is an "enhanced switch", so requires a default to be exhaustive.)

This seems okay to me—Java doesn't prevent all stupid programs—but worth a sanity check. And we should be careful not to make assertions that claim such switches can't be written.


14.11.1: A couple of dominance design notes:

- Claiming that a guarded type pattern dominates a constant seems overly-nannying to me. Usually, these sorts of errors occur because the compiler can *prove* you've done something stupid (e.g., "this cast will always fail"). Here, the compiler is assuming the switch rule is stupid unless the programmer can prove it's not, to the compiler's satisfaction. And the compiler isn't very smart! Suggestion: just leave guarded type patterns out of the dominance relation; they shouldn't dominate anything.

- It's weird that 'null' is the one case constant that can't come after a 'default', given that 'null' is the one value that can't be handled by a 'default'. :-) Since we have to allow other case constants after 'default' anyway, I think it's best to allow 'null' as one more. ("default dominates everything except constants" is a rule I can remember.)

(Also note—and I think this is good—that the dominance rules for 'default' don't care whether this is an "enhanced switch" or not. The fewer things that depend on enhanced vs. traditional, the better.)
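
For reference, constant case labels after 'default' have long been legal in a traditional switch, and lexical order doesn't affect which label matches. A quick sketch (names are mine):

```java
public class ConstantsAfterDefault {
    static String classify(int n) {
        // 'case 0' appears after 'default', which has always been allowed;
        // the switch still selects the most specific matching label.
        switch (n) {
            default:
                return "other";
            case 0:
                return "zero";
        }
    }

    public static void main(String[] args) {
        System.out.println(classify(0)); // zero
        System.out.println(classify(7)); // other
    }
}
```

Allowing 'case null' in the same position would just extend this existing behavior to one more constant.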


14.11.1: I don't think we prevent this scenario:

switch (obj) { case String s: case Object o: }

The rules about duplicate pattern labels apply to "a statement is labeled...". There's no labeled statement here.

I think we should probably prohibit this, for the same code hygiene reason that we prohibit it before a statement group that doesn't try to use any of the variables.

(This would not be a legal switch expression, because switch expressions can't fall out. But enhanced switch statements can, as long as they're exhaustive.)


14.11.2: Design suggestion: rename "enhanced switch" to "pattern switch", define it as only those switches that make use of patterns, and don't worry about the remaining "switch with new features" corner cases. It's just such an important concept that I think there's a benefit to making the distinction really clean and obvious. E.g., asking someone new to Java to memorize the ad hoc set of types that don't demand exhaustiveness seems unhelpfully complicated.

(Corner cases I'm thinking about: want to use a null constant but not be exhaustive? Fine. Want to have an Object input type but an empty switch body? Pointless, but fine. Etc.)


14.11.3, 15.28.2: Opinionated take: we're continuing to throw ICCE when an unmatched enum constant slips through a switch, because it would be a (behaviorally) incompatible change to throw something else. Meanwhile, unmatched sealed classes get a MatchException. I think there are a tiny number of people who would notice if we changed from ICCE to MatchException for enums too, and a lot more people who will have to cope with the historical baggage going forward if we don't. We should just standardize on MatchException.

Further complication: as written, a sealed-interface switch that doesn't mention any enum constants will still throw if the value that gets through happens to be an enum constant.

switch (sealedThing) {
    case Foo f -> {}
    case Bar b -> {}
    // an enum class gets added as a permitted subclass: ICCE
    // a regular class gets added as a permitted subclass: ME
}

We could specify this differently, but I'm not sure there's a bright line that will be intuitive.


14.14.2: I'm sure there's been some discussion about this already, but the use of MatchException for nulls in 'for' loops but NPE for nulls in switches also seems like a sad historical wart. What's wrong with an NPE from a 'for' loop?


14.30.1: Parenthesized patterns are no fun for a spec writer. :-) Are they actually useful? I'm not sure I've seen an example demonstrating what they're for. (The JEP only talks about them abstractly.)

Followup: would it make sense for a 'for' loop to permit a parenthesized pattern?


14.30.1, 14.30.2: I'm not sold on *any patterns*, *resolved patterns*, and *executable switch blocks*.

The semantics are fine—some type patterns will match null, others will not. But it seems to me that this can be a property of the type pattern, one of many properties of programs that we determine at compile time. No need to frame it as rewriting a program into something else. (Compare the handling of the '+' operator. We don't rewrite to a non-denotable "concatenation expression".)

- The pattern matching runtime rules can just say "the null reference matches a type pattern if the type pattern is unconditional".
- We can make it a little more clear that a type pattern is determined to be unconditional, or not, based on its context-dependent match type (is that what we call it?)

For a *compiler*, it will be useful to come up with an encoding that preserves the compile-time "unconditional" property in bytecode. But that's a compiler problem, not something JLS needs to comment on.


14.30.3: To my ear (maybe this is an American thing?), "applicable at" sounds wrong, and it would read more naturally to say "applicable for" or "applicable to". Same for "unconditional at".


14.30.3: A record pattern can't match null, but for the purpose of dominance, it's not clear to me why a record pattern can't be considered unconditional, and thus dominate the equivalent type pattern or a 'default'.

switch (point) {
    case Point(int x, int y) -> {}
    case Point p -> {} // unreachable
    default -> {} // unreachable
}
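
A runnable sketch of why the record pattern acts unconditionally for non-null inputs (class and method names are mine; assumes the record-pattern preview semantics):

```java
public class RecordPatternDemo {
    record Point(int x, int y) {}

    static int sum(Point p) {
        // The record pattern deconstructs every non-null Point, so for non-null
        // inputs it covers exactly what 'case Point p' would; only null slips
        // past it, and null is rejected before any label is considered.
        return switch (p) {
            case Point(int x, int y) -> x + y;
        };
    }

    public static void main(String[] args) {
        System.out.println(sum(new Point(2, 3))); // 5
    }
}
```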
