Migrating methods in Collections
Ron Pressler
ron at paralleluniverse.co
Tue Dec 22 09:36:24 UTC 2015
>
> On Dec 22, 2015, at 1:55 AM, Brian Goetz <brian.goetz at oracle.com> wrote:
>
>
> As a general rule, the pain of migrating should go to those who want to migrate, and should not fall on those who don’t want to migrate. So existing Map code that uses reference types, and wants to keep using reference types, should be able to completely ignore the changes to the API.
>
>
> Another mitigating factor is that the new methods are total. That means you can migrate your code to be “any-collections-ready” without actually using any of the anyfied classes, and without changing the semantics of anything. Which gives us a path to eventually deprecating the old methods — though realistically it would probably be a VERY long time before we removed them.
>
>
>
I agree with everything, but it all comes down to the following question: does backwards _source_ compatibility alone, something Java already has good coping mechanisms for (javac source levels, IDE/tool support), justify the addition of a feature that is neither trivial nor very general (certainly not as general as extension methods)? We’re talking about receiver type-matching that is finer-grained than a class, something that feels foreign (and “un-simple”) in OOP.
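
To make the distinction concrete, here is a small illustrative sketch in today's Java (class and variable names are mine; the anyfied Map<K, any V> exists only in the comments):

    import java.util.HashMap;
    import java.util.Map;

    class TotalVsPartial {
        static void demo() {
            Map<String, String> capitals = new HashMap<>();

            // get() encodes "absent" as null, so over an anyfied
            // Map<K, any V> it could only be offered when V is a
            // reference type -- its availability would depend on the
            // type arguments, not just on the class. That is the
            // finer-than-a-class receiver matching I mean above.
            String maybe = capitals.get("Oz");                     // null

            // getOrDefault() is total: the caller supplies the fallback,
            // so nothing about it assumes V can be null. Reference-typed
            // code can migrate to it today without touching any anyfied
            // class, which is the migration path described above.
            String sure = capitals.getOrDefault("Oz", "Emerald City");
        }
    }
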
> Something slightly more ambitious. I’d like to deprecate {Int,Long,Double}Stream, but allow Stream<int> to respond to all methods currently supported by IntStream. This provides a path to getting rid of the manual specializations (probably faster than the legacy collection methods) because Stream<int> would be just as good as the old IntStream.
>
>
>
But couldn’t it be just as good a replacement even if some of those methods were plain static methods, something Java developers are already quite familiar with? It will require code migration either way. Yes, it won’t have the same fluent API, but neither will the new methods people come up with on their own. Is hand-specializing the _public interface_ (I have no qualms with hand-specializing the hidden implementation) an important enough feature to justify non-class-based receiver-type-matching? It feels like a new and unfamiliar form of ad-hoc, almost-but-not-quite extension methods (sadly, actual extension methods won’t solve this particular problem). If anything, backwards source compatibility is a stronger argument, as it is very important (though, IMO, not important enough to justify this).
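
To spell out the static-method shape I have in mind (IntStats is an invented name, and Stream<Integer> stands in for a future Stream<int> so the sketch compiles today):

    import java.util.stream.IntStream;
    import java.util.stream.Stream;

    final class IntStats {              // invented helper, for illustration
        private IntStats() {}

        // With specialization this parameter would be Stream<int>;
        // Stream<Integer> is just a stand-in here.
        static int sum(Stream<Integer> s) {
            return s.mapToInt(Integer::intValue).sum();
        }
    }

    class Pipelines {
        int fluentToday() {
            // Hand-specialized receiver methods, fluent:
            return IntStream.of(1, 2, 3).map(x -> x * 2).sum();
        }

        int staticAlternative() {
            // Plain static method: not fluent, but ordinary Java, and no
            // receiver-conditional methods needed on Stream itself.
            return IntStats.sum(Stream.of(1, 2, 3).map(x -> x * 2));
        }
    }
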
Default methods had both the urgency — binary compatibility — and the generality. It seems to me that partial methods have neither. I’m not saying they’re not a cool feature or that they don’t solve the problem, but they don’t feel very blue-collar.
Anyway, I’ve said my piece on this matter :)