capturing array of wildcard?

Dan Smith daniel.smith at oracle.com
Fri Dec 6 10:01:47 PST 2013


On Dec 1, 2013, at 9:27 AM, Stephan Herrmann <stephan.herrmann at berlin.de> wrote:

> Me again,
> 
> Javac accepts the following example, and I don't know why:
> 
>  interface OO<T,E> {}
>  interface TO<T> extends OO<String,T> {}
>  interface TT extends TO<String> {}
> 
>  public class X {
>    <E, T> TO<T> combine(final TO<? super E> x, final OO<E, T>[] y) { return null; }
>    void foo(TT tt, TO<? super Object>[] too) {
>      combine(tt, too);
>    }
>  }
> 
> How would the spec allow this?
> Looking at the second parameter ("too") being passed into combine(),
> I see this sequence of reduction:
> - ⟨too -> OO<E#0,T#1>[]⟩
> - ⟨TO<? super java.lang.Object>[] -> OO<E#0,T#1>[]⟩
> - ⟨TO<? super java.lang.Object>[] <: OO<E#0,T#1>[]⟩
> - ⟨TO<? super java.lang.Object> <: OO<E#0,T#1>⟩
> -- ⟨java.lang.String <= E#0⟩
> -- ⟨? super java.lang.Object <= T#1⟩
> --- FALSE
> 
> Type argument containment in 18.2.3, with a wildcard on the left and a
> type on the right, yields FALSE.
> 
> If we omit the array dimension in the example, we start with
> - ⟨too -> OO<E#0,T#1>⟩
> - ⟨TO<capture#1-of ? super java.lang.Object> -> OO<E#0,T#1>⟩
> and all will be well.
> 
> To the best of my knowledge, capture of an array type is
> the identity function, and inference does not perform
> additional capture, right?
> That would mean the correct behavior is to reject the above?

The problem is in the vaguely defined notion of the supertypes of a wildcard-parameterized type.  Here's the current spec text (18.2.3):

"If T is a parameterized class type, C<A1, ..., An>, then among the supertypes of S that are parameterizations of C, a most-specific type is identified: C<B1, ...,Bn>"

In JLS 7 (15.12.2.7), the same concept is invoked with phrases like "if A has a supertype of the form G<..., Xk-1, V, Xk+1, ...>".
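
For concreteness, here is the kind of computation both passages appeal to, sketched with library types that are not part of this thread (List, Collection, Number); the point is just that among the supertypes of List<? extends Number> that are parameterizations of Collection, a most specific one -- Collection<? extends Number> -- exists, and it is what makes this compile:

import java.util.Collection;
import java.util.List;

class SupertypeDemo {
  // javac accepts this widening because Collection<? extends Number> is the
  // most specific parameterization of Collection among the supertypes of
  // List<? extends Number>.
  Collection<? extends Number> widen(List<? extends Number> s) {
    return s;
  }
}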

You might recall we considered doing something to define exactly how this supertype computation worked, but decided against it for now.  Something to clean up next time around...

The easy answers are wrong.  For example:

1) It's incorrect to perform substitution with a wildcard.  Type variables represent _types_; wildcards are not types, and make no sense if used as arbitrary types.  So a substitution like [ E := ? super Object ] is just wrong.

Just one example:
interface A<T> { T get(); void put(T arg); }
class B<U> implements A<List<U>> { ... }

B<? extends Number> b = new B<Integer>();
A<List<? extends Number>> a = b; // wrong!
a.put(new ArrayList<Double>()); // would store a List<Double> into what is really a B<Integer>

2) It's incorrect to perform capture to find the supertypes of a wildcard-parameterized type.  Capture is useful when reasoning about subtyping -- e.g., S <: T if capture(S) <: T.  But the capture of a type is not its own supertype.

ArrayList<?> <: List<CAP> // wrong!
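
One way to see why that direction is wrong (my own illustration, using a generic method's type variable to play the role of CAP):

import java.util.List;

class CaptureDemo {
  // T stands in for a single capture variable CAP.
  <T> void swap(List<T> a, List<T> b) {
    T tmp = a.get(0);
    a.set(0, b.get(0));
    b.set(0, tmp);
  }

  void demo(List<?> x, List<?> y) {
    // If List<?> were a subtype of List<CAP> for one fixed CAP, both
    // arguments could be viewed as List<CAP> and this call would be fine.
    // In reality x and y capture to two different variables, and javac
    // rejects the call -- otherwise we could swap elements between, say,
    // a List<String> and a List<Integer>.
    // swap(x, y);   // does not compile
  }
}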

---

The right answer is probably to do something like this: take the capture, derive the supertype, then "uncapture" the type by inserting some wildcards and removing capture variables.  The details would need to be specified, since there's more than one way to do it.  The key, though, is that the result must actually be a supertype of the type you started with.
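
Applied to the A/B example above (restating the declarations so the sketch compiles on its own, with B's members filled in), that recipe would yield A<? extends List<? extends Number>> rather than A<List<? extends Number>>. This is only one possible reading of "uncapture", but the result really is a supertype:

import java.util.ArrayList;
import java.util.List;

interface A<T> { T get(); void put(T arg); }

class B<U> implements A<List<U>> {
  public List<U> get() { return new ArrayList<>(); }
  public void put(List<U> arg) {}
}

class UncaptureDemo {
  void demo() {
    B<? extends Number> b = new B<Integer>();
    // capture:     B<CAP> with CAP <: Number
    // supertype:   A<List<CAP>>
    // "uncapture": re-wrap the argument containing CAP in a wildcard,
    //              giving A<? extends List<? extends Number>>
    A<? extends List<? extends Number>> a = b;   // accepted by javac
    // Unlike the substitution-based A<List<? extends Number>>, this type
    // does not admit the unsound call from the earlier example:
    // a.put(new ArrayList<Double>());   // does not compile
  }
}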

In the meantime, what javac does right now is more or less #2 -- take the capture of the type, and then just keep going.  As you've seen, this leads to logical errors like the one in your example.  I imagine Eclipse has historically done something similar.

—Dan

