capturing array of wildcard?

Stephan Herrmann stephan.herrmann at
Sun Dec 1 08:27:11 PST 2013

Me again,

Javac accepts the following example, and I don't see why:

   interface OO<T,E> {}
   interface TO<T> extends OO<String,T> {}
   interface TT extends TO<String> {}

   public class X {
     <E, T> TO<T> combine(final TO<? super E> x, final OO<E, T>[] y) { return null; }
     void foo(TT tt, TO<? super Object>[] too) {
       combine(tt, too);
     }
   }

How would the spec allow this?
Tracing the second argument ("too") being passed into combine(),
I see this sequence of reductions:
- ⟨too -> OO<E#0,T#1>[]⟩
- ⟨TO<? super java.lang.Object>[] -> OO<E#0,T#1>[]⟩
- ⟨TO<? super java.lang.Object>[] <: OO<E#0,T#1>[]⟩
- ⟨TO<? super java.lang.Object> <: OO<E#0,T#1>⟩
-- ⟨java.lang.String <= E#0⟩
-- ⟨? super java.lang.Object <= T#1⟩

Type-argument containment (18.2.3) with a wildcard on the left
and a type (here the inference variable T#1) on the right
reduces to FALSE.
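The same rule shows up outside inference, in plain assignment
conversion; a minimal illustration of my own (not part of the
javac case above):

```java
import java.util.List;

class ContainmentDemo {
  // <? super Object <= Object> is false: a wildcard is never contained
  // by a non-wildcard type (4.5.1), so this would be rejected:
  //   List<Object> bad(List<? super Object> x) { return x; }  // error
  // Whereas <? super Object <= ?> holds (4.5.1), so this compiles:
  static List<?> widen(List<? super Object> x) { return x; }
}
```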

If we omit the array dimension in the example, we start with
- ⟨too -> OO<E#0,T#1>⟩
- ⟨TO<capture#1-of ? super java.lang.Object> -> OO<E#0,T#1>⟩
and all will be well.
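For reference, here is that array-free variant as a complete,
compilable sketch (my transcription of the example; combine still
just returns null):

```java
interface OO<T, E> {}
interface TO<T> extends OO<String, T> {}
interface TT extends TO<String> {}

public class X {
  static <E, T> TO<T> combine(final TO<? super E> x, final OO<E, T> y) {
    return null;
  }

  static void foo(TT tt, TO<? super Object> too) {
    // Capture conversion of the (non-array) argument type yields
    // TO<capture-of ? super Object>, so reduction succeeds:
    combine(tt, too);
  }
}
```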

To the best of my knowledge, capture conversion is the identity
function on array types (5.1.10), and inference does not perform
additional capture, right?
That would mean the correct behavior is to reject the example above?


More information about the lambda-spec-experts mailing list