Updated State of the Specialization
Simon Ochsenreither
simon at ochsenreither.de
Sat Dec 20 19:05:58 UTC 2014
> We considered that approach, but it seemed a pretty bad idea. Null
> means "no object is there". But zero is a very useful and common
> integer; using zero to also mean "no number" seems infeasible.
> Returning zero from Map.get() in an int-valued Map when the element is
> not mapped seems even more error-prone than returning null in a
> ref-valued map. So we think null should retain its current meaning,
> and not define conversion from null to value types.
I'm not sure I understand the connection you are making to Map#get
here... Yes, it's a poor API, but why is it the job of this draft to fix it?
Imho, a real fix would be to provide an alternative method which works
better, not to introduce a huge language feature to work around one
method. I don't want to imagine what the JVM/Java will look like if you
add language-feature workarounds for every case of poor API design.
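Just to illustrate the kind of alternative method I have in mind (a rough
sketch of my own, nothing more): the absent-vs-zero distinction can live in
the return type instead of in a sentinel value.

    // Sketch: distinguishes "absent" from "mapped to zero" via Option
    // rather than via a sentinel such as null or 0.
    def getOption[K, V](m: java.util.Map[K, V], k: K): Option[V] =
      if (m.containsKey(k)) Some(m.get(k)) else None

(Java 8's Map#getOrDefault already goes in a similar direction.)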
> Yes, there needs to be some way to do this; this is largely a syntax
> issue. Suffice it to say there will be some way to do this.
Regarding zero values of T, I'm not even sure something like (int) null
works in Java, but Scala handles it consistently with null.asInstanceOf[T].
(Whether or how the syntax will work in Java is not my main interest here.
What matters is the bytecode. There were also a few ideas in the past to
wrap this idiom in a "nicer" API like def default[T] =
null.asInstanceOf[T] in Scala, but considering that a) zero values are
not used that often and b) null.asInstanceOf[T] works fine as an idiom,
no further action was taken.)
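For reference, this is how the idiom behaves today (quick sketch):

    // null.asInstanceOf[T] yields null for reference types and the
    // corresponding zero value once a primitive result is unboxed.
    def default[T]: T = null.asInstanceOf[T]

    val i: Int     = default[Int]     // 0
    val b: Boolean = default[Boolean] // false
    val s: String  = default[String]  // null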
With the introduction of value types there needs to be a way to achieve
this in bytecode, because an approach that checks for every primitive
type and falls back to null otherwise just doesn't work with
user-definable value types.
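To spell that out, the best one can do today without compiler support is a
per-primitive enumeration along these lines (my own sketch), and that
obviously cannot cover value types the list has never heard of:

    import scala.reflect.ClassTag

    // Enumerates the eight primitives and falls back to null; a
    // user-defined value type is neither in this list nor nullable.
    def zeroByCases[T](implicit ct: ClassTag[T]): T = (ct.runtimeClass match {
      case java.lang.Integer.TYPE   => 0
      case java.lang.Long.TYPE      => 0L
      case java.lang.Short.TYPE     => 0.toShort
      case java.lang.Byte.TYPE      => 0.toByte
      case java.lang.Character.TYPE => 0.toChar
      case java.lang.Float.TYPE     => 0.0f
      case java.lang.Double.TYPE    => 0.0
      case java.lang.Boolean.TYPE   => false
      case _                        => null
    }).asInstanceOf[T]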
Of course one could just do new Array[T](1)(0), but it's kind of
ridiculous to create a one-element array just to pull out the
uninitialized element.
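I.e. something like this (it also needs a ClassTag, and the allocation is
pure waste):

    import scala.reflect.ClassTag

    // JVM arrays are zero-initialized, so element 0 of a fresh Array[T]
    // is exactly T's zero value.
    def zeroViaArray[T: ClassTag]: T = {
      val a = new Array[T](1)
      a(0)
    }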
> Today, classes have two choices:
> - Non-generic
> - Erased generics
>
> We'll add a third choice:
> - Any-fied generics
>
> but we're not taking away the first two choices or changing anything
> about them. So at the very least, you have all the tools you
> currently have. If you want to suggest some use cases that you are
> hoping we can support, please do; we can't make any promises, but
> we're happy to have the input.
I think it would be kind of embarrassing if the same draft that tries to
solve the problem of people having to decide between abstraction and
performance (e.g. one generic method vs. one method for refs plus 8
methods for primitives) recreated the very same issue again, just in a
different place (HKT abstraction vs. performance).
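Just to make that existing trade-off concrete (my own illustration, not
code from the draft): today you either write the one generic method and
accept boxing, or you copy it once per primitive type.

    // One generic method: works for every T, but boxes primitives.
    def max[T <: Comparable[T]](a: T, b: T): T =
      if (a.compareTo(b) >= 0) a else b

    // Or hand-specialized copies to avoid boxing:
    def max(a: Int, b: Int): Int          = if (a >= b) a else b
    def max(a: Long, b: Long): Long       = if (a >= b) a else b
    def max(a: Double, b: Double): Double = if (a >= b) a else b
    // ...and so on for the remaining primitive types.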
It feels a bit like, with the current draft, specialization will end up
as another Java-only feature, like Java 8's defender methods or Java 8's
function types. Imho, that would be unfortunate.
> It's more than a single use case. When you look at the code for
> Collections, it's rife with examples where implementation by parts is
> likely to be your only compatible choice. Further, this is something
> C# programmers complain about; you can't write a different
> implementation for value instantiations as for reference, and this is
> limiting. Reference implementations have tools (nulls, synchronization
> on object parameters) that value implementations don't have, and value
> implementations have tools (known non-polymorphism) that reference
> implementations don't have.
I'd love to see an example. It hasn't been an issue in Scala so far.
(Where reference-only tools are required, the upper bounds reflect
that, but that's about it.)
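For example (my own sketch), an implementation that really needs monitors
or null simply states that in its bounds, and everything else stays fully
generic:

    // K and V are bounded by AnyRef because weak references and
    // synchronization only make sense for reference types; null can then
    // be used internally and turned into Option at the boundary.
    class RefCache[K <: AnyRef, V <: AnyRef] {
      private val map = new java.util.WeakHashMap[K, V]
      def put(k: K, v: V): Unit = map.synchronized { map.put(k, v); () }
      def get(k: K): Option[V]  = map.synchronized { Option(map.get(k)) }
    }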
> Here's another example, then: the problem of "Maps from int to int" is
> a well-studied special case, with many clever implementation tricks
> that are not available to more general-purpose maps.
That's certainly a better example! I'm not sure whether opening the door
to subtle behavioral differences between different collection
instantiations is an acceptable price to pay for that, though.
> But, we're also convinced that if our libraries have this problem,
> other libraries will too (and it's way more than one or two methods --
> just try any-fying java.util.concurrent!) So I don't think it's "one
> case of poor library design". We'll know more once we have a more
> complete implementation and can do some corpus analysis on existing
> generic code.
The main issue for JDK libraries is that they can't be changed anymore,
a constraint which simply doesn't apply to pretty much any other library
out there.
> I wish it was just two. Perhaps a bet! I give you $1000 now, and you
> pay me $500 for every library out there that can't anyfy cleanly. I
> think you will fund my retirement... :)
For non-JDK libraries, moving to Java 10 is an explicit choice. Existing
code will not break, and the impact of having to slightly adapt a
library for Java 10 is completely negligible from my perspective,
especially compared to the upcoming _real_ breaking changes like
modularization or the changes around sun.misc.Unsafe.
Every library that wants to use Java 10 features needs to be adapted
manually anyway; just slapping "any" on type parameters won't work, so
making sure that clashing methods have non-clashing alternatives is
probably not the biggest issue by far when migrating to Java 10.
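(Such a clash typically has the same shape as java.util.List's
remove(int)/remove(Object) pair; the trait below is a hypothetical sketch
of mine. Under erasure the two overloads coexist because T erases to
Object, but with T instantiated to int both would become remove(int), so
an any-fiable version needs a differently named alternative.)

    trait Bag[T] {
      def remove(index: Int): T            // removal by position
      def removeElement(elem: T): Boolean  // removal by value, renamed so T = Int cannot clash
    }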
Which example couldn't be handled with the approach I described? I think
that approach combines non-intrusiveness at both the library level and
the language level with clean, well-known semantics and an easy upgrade
path (which could even be automated in some cases).