RFR: 8367530: The exhaustiveness errors could be improved [v4]

Aggelos Biboudis abimpoudis at openjdk.org
Tue Nov 4 16:12:44 UTC 2025


On Mon, 3 Nov 2025 15:10:49 GMT, Jan Lahoda <jlahoda at openjdk.org> wrote:

>> Consider code like:
>> 
>> package test;
>> public class Test {
>>     private int test(Root r) {
>>         return switch (r) {
>>             case Root(R2(R1 _), R2(R1 _)) -> 0;
>>             case Root(R2(R1 _), R2(R2 _)) -> 0;
>>             case Root(R2(R2 _), R2(R1 _)) -> 0;
>>         };
>>     }
>>     sealed interface Base {}
>>     record R1() implements Base {}
>>     record R2(Base b1) implements Base {}
>>     record Root(R2 b2, R2 b3) {}
>> }
>> 
>> This is missing a case for `Root(R2(R2 _), R2(R2 _))`. javac will produce an error correctly, but the error is not very helpful:
>> 
>> $ javac test/Test.java
>> .../test/Test.java:4: error: the switch expression does not cover all possible input values
>>         return switch (r) {
>>                ^
>> 1 error
>> 
>> 
>> The goal of this PR is to improve the error, at least in some cases to something along these lines:
>> 
>> $ javac test/Test.java 
>> .../test/Test.java:4: error: the switch expression does not cover all possible input values
>>         return switch (r) {
>>                ^
>>   missing patterns: 
>>     test.Test.Root(test.Test.R2(test.Test.R2 _), test.Test.R2(test.Test.R2 _))
>> 1 error
>> 
>> 
>> The (very simplified) algorithm works recursively (by induction):
>> - start with defining the missing pattern as the binding pattern for the selector type. This would certainly exhaust the switch.
>> - for a current missing pattern, try to enhance it:
>>     - if the current type is a sealed type, try to expand it to its (direct) permitted subtypes. Remove those that are not needed.
>>     - if the current (binding pattern) type is a record type, expand it to a record pattern, generating all possible combinations of its component types based on sealed hierarchies. Remove those that are not needed.
>> 
>> This approach relies heavily on our ability to compute exhaustiveness, which is evaluated repeatedly in the process.
>> 
>> There are some cases where the algorithm does not produce ideal results (see the tests), but overall it seems much better than what we have now.
>> 
>> Another significant limitation is the speed of the process. Evaluating exhaustiveness is not fast, and this algorithm evaluates exhaustiveness repeatedly, potentially for many combinations of patterns (esp. for record patterns). So part of the proposal here is to have a time deadline for the computation. The default is 5s, and it can be changed with `-XDexhaustivityTimeout=<timeout-in-ms>`.
>> 
>> There's also an open possibility for select tools to...
>
> Jan Lahoda has updated the pull request with a new target base due to a merge or a rebase. The pull request now contains 23 commits:
> 
>  - Merge branch 'JDK-8364991' into JDK-8367530
>  - Fixing tests
>  - Merge branch 'JDK-8364991-2' into JDK-8367530-2
>  - Better visualisation.
>  - Merge branch 'JDK-8367499' into JDK-8367530
>  - Merge branch 'JDK-8364991' into JDK-8367499
>  - Fixing tests.
>  - Cleanup.
>  - Cleanup.
>  - Cleanup.
>  - ... and 13 more: https://git.openjdk.org/jdk/compare/e6a3f0da...b7c33349

This is a MEGA PR. Many congratulations @lahodaj. 🎉 

This is a first pass. I will come back to it again as I slowly work my way towards understanding it better.
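
To check that I am reading the overall approach correctly, here is my attempt to summarize the expansion step as a toy, standalone sketch. The names and types below are made up and do not correspond to anything in `ExhaustivenessComputer`; the two helper methods are placeholders for the real exhaustiveness machinery:

```
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

//hypothetical, heavily simplified model of the search described in the PR text;
//not the actual javac implementation
public class MissingPatternSketch {

    sealed interface Pattern permits BindingPattern, RecordPattern {}
    record BindingPattern(String type) implements Pattern {}
    record RecordPattern(String type, List<Pattern> components) implements Pattern {}

    //stand-in for the existing exhaustiveness computation: is 'candidate'
    //already covered by the patterns present in the switch?
    boolean coveredByExistingPatterns(List<Pattern> existing, Pattern candidate) {
        return false; //placeholder
    }

    //stand-in for the expansion step: permitted subtypes of a sealed type,
    //or all combinations of component patterns for a record type;
    //empty when the pattern cannot be refined further
    List<Pattern> refine(Pattern current) {
        return List.of(); //placeholder
    }

    List<Pattern> findMissingPatterns(String selectorType, List<Pattern> existing) {
        List<Pattern> missing = new ArrayList<>();
        Deque<Pattern> work = new ArrayDeque<>();
        //start from the binding pattern for the selector type:
        //adding it would certainly make the switch exhaustive
        work.push(new BindingPattern(selectorType));
        while (!work.isEmpty()) {
            Pattern current = work.pop();
            List<Pattern> refined = refine(current);
            if (refined.isEmpty()) {
                missing.add(current); //cannot be refined further; report as missing
                continue;
            }
            for (Pattern candidate : refined) {
                //"remove those that are not needed": drop refinements that the
                //existing case patterns already cover, keep refining the rest
                if (!coveredByExistingPatterns(existing, candidate)) {
                    work.push(candidate);
                }
            }
            //(the real algorithm also folds fully missing permitted subtypes back
            //into their sealed supertype, and enforces a time deadline; omitted here)
        }
        return missing;
    }
}
```

Please correct me if this misrepresents the flow.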

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 157:

> 155:                         .stream()
> 156:                         .flatMap(pd -> {
> 157:                             if (pd instanceof BindingPattern bp && enum2Constants.containsKey(bp.type.tsym)) {

Can this be subsumed into `computeMissingPatternDescriptions`, so that the generation of all kinds of incomplete patterns is handled inside there? It seems to me that this client side would be clearer, but ignore this if it would be too much fuss.

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 173:

> 171:     }
> 172: 
> 173:     private CoverageResult computeCoverage(Type selectorType, Set<PatternDescription> patterns, boolean search) {

This extra boolean parameter is subtle (previously there was only a single mode of execution). Could you introduce a top-level comment on the `computeCoverage` method, a more descriptive name than `search`, or an enumeration with the two modes? Now the method calculates both the coverage result and its inverse (the uncovered patterns).
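
For example, something along these lines is what I have in mind (hypothetical names, only to illustrate the suggestion):

```
enum CoverageMode {
    /** only determine whether the given patterns cover the selector type */
    CHECK_EXHAUSTIVENESS,
    /** additionally search for the uncovered (missing) patterns */
    SEARCH_MISSING_PATTERNS
}
```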

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 351:

> 349:             Type instantiated = instantiatePatternType(targetType, csym);
> 350: 
> 351:             return instantiated != null && types.isCastable(targetType, instantiated);

If inference fails, does this mean that we get an exhaustiveness error with a sample generic instantiation?

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 836:

> 834:         if (toExpand instanceof BindingPattern bp) {
> 835:             if (bp.type.tsym.isSealed()) {
> 836:                 //try to replace binding patterns for sealed types with all their immediate permitted types:

Can you reuse the word "viablePermittedPatterns" in this comment? That way the reader can work out the properties of what this variable represents faster.

IIUC this is a minimization process going from `viablePermittedPatterns` to `reducedPermittedPatterns`. Not only that, but (I think) there is a property that reduction prunes the combinatorial space as long as there are no variables *and* the pattern is the most general. The actual invariants come from `isPossibleSubtypePredicate`, which ensures castability and pattern inference.

I wonder how we can end up here with completely incompatible classes (final A, final B), so that the castability check returns false in `isPossibleSubtypePredicate`. Wouldn't that be an error of applicability?
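
To make the question concrete: is the case this guards against something like the following (made-up) generic hierarchy, or can truly unrelated classes also reach this check?

```
//made-up example: C2 is a permitted subtype of S, but for a selector of type
//S<String> it can never match, so isCastable(S<String>, C2) would be false
sealed interface S<T> permits C1, C2 {}
final class C1 implements S<String> {}
final class C2 implements S<Integer> {}
```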

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 893:

> 891: 
> 892:                 for (Type componentType : componentTypes) {
> 893:                     List<Type> variants;

Is this conceptually the same as `viablePermittedPatterns` in the binding patterns case? If so, could we reuse the same name (just to match the concepts)?

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 924:

> 922:                                                 .collect(Collectors.toCollection(HashSet::new));
> 923: 
> 924:                 //remove unnecessary:

It seems that the removal of the unnecessary patterns is common to the next step as well? If so, would it be beneficial to extract it into a shared helper?

src/jdk.compiler/share/classes/com/sun/tools/javac/comp/ExhaustivenessComputer.java line 948:

> 946:                 //combine sealed subtypes into the supertype, if all is covered.
> 947:                 //but preserve more specific record types in positions where there are record patterns in the original patterns
> 948:                 //this is particularly for the case where the sealed supertype only has one permitted type, the record

There is a word missing: "this is particularly *important* for", I think, right?

-------------

PR Review: https://git.openjdk.org/jdk/pull/27256#pullrequestreview-3416390483
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2490487291
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2490461430
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2491048541
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2490585550
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2490667198
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2490594825
PR Review Comment: https://git.openjdk.org/jdk/pull/27256#discussion_r2490682402

