enhanced enums - back from the dead?

Maurizio Cimadamore maurizio.cimadamore at oracle.com
Fri Dec 7 23:57:58 UTC 2018


<snip>
> so let's retry
>
> public enum Foo<T extends Comparable<T>> {
>    S<String>(""), I<Integer>(42);   // JEP 301 mentions the diamond syntax
>    
>    private T t;
>    
>    Foo(T t) {
>      this.t = t;
>    }
>    
>    T t() {
>      return t;
>    }
>    
>    public static void main(String[] args) {
>      Arrays.stream(values()).sorted(Comparator.comparing(Foo::t)).forEach(System.out::println);
>    }
> }
>
> My original point was: when you introduce a raw type into a Stream, you end up with a warning on every method because of the inference.
> Whereas if you use the correct type, with an unbounded wildcard, the compiler stops you from doing stupid things;
> basically, you are trading an error for a series of warnings that people will be happy to suppress with @SuppressWarnings.
>
>    Arrays.stream(values()).sorted(Comparator.comparing(Foo::t)).forEach(System.out::println);
>    emits a series of warnings
Yes, in this case the unchecked warning will hit. I understand that this 
is suboptimal (see the end of this message).
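For reference, the effect Rémi describes can be reproduced today with a plain
generic class standing in for the generic enum (Box below is purely
illustrative, and rawValues() is a hypothetical stand-in for what values()
returns in the prototype). Compiling with -Xlint:unchecked reports unchecked
warnings along the stream chain:

    import java.util.Arrays;
    import java.util.Comparator;

    class Box<T extends Comparable<T>> {
        private final T t;
        Box(T t) { this.t = t; }
        T t() { return t; }

        // stand-in for values() on a generic enum: raw element type
        @SuppressWarnings("rawtypes")
        static Box[] rawValues() {
            return new Box[] { new Box<>("b"), new Box<>("a") };
        }

        public static void main(String[] args) {
            // the raw Box element type erases the result of t(), so the
            // comparing/sorted calls need unchecked conversions; with mixed
            // element types (String and Integer, as in Foo) the suppressed
            // check would resurface as a ClassCastException at run time
            Arrays.stream(rawValues())
                  .sorted(Comparator.comparing(Box::t))
                  .forEach(b -> System.out.println(b.t()));
        }
    }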
>    Stream.of(S, I).sorted(Comparator.comparing(Foo::t)).forEach(System.out::println);
>    emits an error
Yes, this is no different than using classes, really.
> BTW, it seems there is an issue somewhere in the compiler because
>    Arrays.stream((Foo<?>[])values()).sorted(Comparator.comparing(Foo::t)).forEach(System.out::println);
> happily compiles ??

Well, Foo and Foo<?> are interchangeable, so I guess the compiler is 
also trading Foo[] for Foo<?>[]. As per JLS 5.1.9:

"There is an unchecked conversion from the raw array type G[]k to any 
array type of the form G<T1,...,Tn>[]k. (The notation []k indicates an 
array type of k dimensions.)

Use of an unchecked conversion causes a compile-time unchecked warning 
unless all type arguments Ti (1 ≤ i ≤ n) are unbounded wildcards 
(§4.5.1), or the warning is suppressed by @SuppressWarnings (§9.6.4.5). "
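A small, self-contained illustration of that rule, using java.util.List rather
than the enum prototype:

    import java.util.ArrayList;
    import java.util.List;

    class RawArrayConversion {
        @SuppressWarnings("rawtypes")
        static List[] rawArray() {              // the raw array type G[] from the JLS wording
            return new List[] { new ArrayList<String>() };
        }

        public static void main(String[] args) {
            List<?>[] quiet = rawArray();       // unchecked conversion, no warning:
                                                // the only type argument is an unbounded wildcard
            List<String>[] noisy = rawArray();  // unchecked conversion *with* a warning
            System.out.println(quiet.length + " " + noisy.length);
        }
    }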


> You can inspect any class, even a private one, by reflection, but you cannot call the constructor if the class is private; if the class is package-private, the constructor is package-private too, so you are widening the access.
As for access/security, we already check in many places that no call to 
the constructor of an enum can occur, so this doesn't seem like a new 
issue? I mean, we already have to protect against people reflectively 
calling a package-private constructor of the enum class; maybe this will 
require a few more checks for the enum class subclasses, but what I'm 
saying is that a check is already in place.
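As a minimal sketch of the kind of check that is already in place (the exact
exception may vary across JDK versions):

    import java.lang.reflect.Constructor;

    enum Color { RED, GREEN }

    class EnumReflection {
        public static void main(String[] args) throws Exception {
            // the implicit Color constructor is private and takes (String name, int ordinal)
            Constructor<?> ctor = Color.class.getDeclaredConstructors()[0];
            ctor.setAccessible(true);
            try {
                ctor.newInstance("BLUE", 2);
            } catch (Exception e) {
                // on current JDKs Constructor.newInstance rejects enum types with
                // IllegalArgumentException: "Cannot reflectively create enum objects"
                System.out.println("rejected: " + e);
            }
        }
    }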
>
> It's not that I don't like the feature; it's that, for me, it's a feature you cannot even put in the box of features we could do. We start with "hey, we could do this!" but there are some typing issues. Now, what you are saying is that we can use raw types to avoid the typing issues, but as I said above, you are trading an error for a bunch of warnings, which doesn't seem to be a good deal*.

I agree that having too many warnings is bad - in my experiment, 
although I touched a lot of code, including stream chains, I did not 
run into them; Comparator.comparing is probably one of the worst beasts 
(and doesn't play well with target typing even aside from generic enums). 
Not sure if that shifts the balance one way or the other, but point taken.

On this topic, while I was at it, I tried to tweak the prototype so that 
Enum.values() and Enum.valueOf() return the wildcard type Foo<?>, while 
the supertype is still Enum<Foo>, and this seems to work surprisingly 
well, both in the tests I had and in the new one you suggested. Maybe 
that would minimize raw type usage, pushing it largely behind the 
curtains, strictly as a migration aid for APIs such as EnumSet/EnumMap?
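To give a feel for what the tweak buys at the use site, here is a variant of
the toy Box class from earlier in this message, with the lookup method
returning a wildcard element type (again just a sketch of the idea, not the
prototype itself). As with the (Foo<?>[]) cast above, the stream chain then
compiles without unchecked warnings:

    import java.util.Arrays;
    import java.util.Comparator;

    class Box<T extends Comparable<T>> {
        private final T t;
        Box(T t) { this.t = t; }
        T t() { return t; }

        // stand-in for the tweaked values(): wildcard element type, not raw
        static Box<?>[] values() {
            return new Box<?>[] { new Box<>("b"), new Box<>("a") };
        }

        public static void main(String[] args) {
            // the Box<?> element type keeps the Comparable bound visible to
            // Comparator.comparing, so no unchecked conversion is needed
            Arrays.stream(values())
                  .sorted(Comparator.comparing(Box::t))
                  .forEach(b -> System.out.println(b.t()));
        }
    }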

Maurizio

>>>> Maurizio
>>> Rémi
>>>
>>>>> public class LineParsing {
>>>>>      private final HashMap<String, Consumer<? super Iterator<String>>> actionMap =
>>>>>      new HashMap<>();
>>>>>      
>>>>>      public LineParsing with(String option, Consumer<? super Iterator<String>>
>>>>>      action) {
>>>>>        actionMap.put(option, action);
>>>>>        return this;
>>>>>      }
>>>>>      
>>>>>      public void parse(List<String> args) {
>>>>>        var it = args.iterator();
>>>>>        while(it.hasNext()) {
>>>>>          actionMap.get(it.next()).accept(it);
>>>>>        }
>>>>>      }
>>>>>      
>>>>>      public static void main(String[] args) {
>>>>>        var bean = new Object() {
>>>>>          Path input = Path.of("input.txt");
>>>>>          boolean all = false;
>>>>>        };
>>>>>        
>>>>>        new LineParsing()
>>>>>            .with("-input", it -> bean.input = Path.of(it.next()))
>>>>>            .with("-all", it -> bean.all = true)
>>>>>            .parse(List.of(args));
>>>>>      }
>>>>> }
>>>>>
>>>>> regards,
>>>>> Rémi
>>>>>
>>>>>
>>>>> ----- Original Message -----
>>>>>> From: "Maurizio Cimadamore"<maurizio.cimadamore at oracle.com>
>>>>>> To: "amber-spec-experts"<amber-spec-experts at openjdk.java.net>
>>>>>> Sent: Wednesday, December 5, 2018 17:14:59
>>>>>> Subject: enhanced enums - back from the dead?
>>>>>> Hi,
>>>>>> as mentioned in [1], the work on enhanced enums stopped a while ago, as we
>>>>>> had found some interoperability issues between generic enums and
>>>>>> standard enum APIs such as EnumSet/EnumMap.
>>>>>>
>>>>>> Recently, we have discussed a possible approach that might get us out of
>>>>>> the woods, which is described in greater detail here:
>>>>>>
>>>>>> http://cr.openjdk.java.net/~mcimadamore/amber/enhanced-enums.html
>>>>>>
>>>>>> We have done some internal testing to convince ourselves that, from an
>>>>>> operational perspective, where we end up is indeed good. Some external
>>>>>> validation might also be very helpful, which is why we're in the
>>>>>> process of releasing the patch we have tested internally in the
>>>>>> 'enhanced-enums' amber branch (we'll need to polish it a little :-)).
>>>>>>
>>>>>> Assuming that, usability-wise, our story ticks all the boxes, I think it
>>>>>> might be worth discussing a few points:
>>>>>>
>>>>>> * Do we still like the features described in JEP 301, from an
>>>>>> expressiveness point of view?
>>>>>>
>>>>>> * Both features described in JEP 301 require some sort of massaging. On
>>>>>> the one hand, sharper typing of enum constants has to take the binary
>>>>>> compatibility of enum constant subclasses into account (for this reason
>>>>>> we redefine the accessibility of said subclasses along with their binary
>>>>>> names). On the other hand, with the newly proposed approach, generic
>>>>>> enums also need some language aid (the treatment of raw enum constant
>>>>>> supertypes). Do we feel that the steps needed in order to accommodate
>>>>>> these sharp edges are worth the increase in expressive power delivered
>>>>>> by JEP 301?
>>>>>>
>>>>>> * Our proposed treatment for generic enums raises an additional, more
>>>>>> philosophical, question: what are raw types *for* and how happy are we
>>>>>> in seeing more of them (in the form of raw enum types)?
>>>>>>
>>>>>> Cheers
>>>>>> Maurizio
>>>>>>
>>>>>> [1] -
>>>>>> http://mail.openjdk.java.net/pipermail/amber-spec-experts/2017-May/000041.html

