<!DOCTYPE html><html><head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
<blockquote type="cite" cite="mid:CACzrW9CzirrZsOw4Ov+rupGPqOa_cim-qpgeyURms-Ha9U8F+g@mail.gmail.com">
<blockquote type="cite">
<pre wrap="" class="moz-quote-pre">Zooming out, design almost always involves "lump vs split" choices; do we highlight the specific differences between cases, or their commonality?
</pre>
</blockquote>
<pre wrap="" class="moz-quote-pre">
Another way to express this distinction is "what level of magic is acceptable?"</pre>
</blockquote>
<br>
Heh, that's a pretty loaded way to express it. <br>
<br>
Having semantics depend on static types is not "magic", whether or
not the types are repeated at every line of code where they are
used. When we say <br>
<br>
int x = (int) anObject<br>
<br>
vs<br>
<br>
int x = (int) aLong<br>
<br>
the two casts to int have different semantics _based on the type of
what is being cast_; one will attempt to cast the Object to Integer
and then unbox (possibly CCEing), and the other will attempt to
narrow the long to an int (possibly losing data). And yet, they
both appear to be "the same thing" -- casting a value to int. <br>
<br>
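To make the difference concrete, here is a small sketch (the values
are invented for illustration):<br>
<pre>
Object anObject = "not an Integer";
long aLong = 4_294_967_296L;   // 2^32, does not fit in an int

int x = (int) anObject;  // checked cast to Integer, then unbox:
                         // throws ClassCastException for this value
int y = (int) aLong;     // narrowing primitive conversion: the high
                         // bits are silently dropped, so y == 0
</pre>
<br>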
Your arguments about JEP 507 could equally well be applied to the
semantic difference between the pair of statements above. So why is
this not a disaster, or even "magic"? Because static types are a
core part of the Java language! Appealing to them, even if they are
explicitly denoted only somewhere else, is not magic. <br>
<br>
It would be a useful thought experiment to ask yourself why the
above two examples don't offend you to the point of proposing new
syntax. Because all the implicit, type-driven variation in
semantics that is present in `anObject instanceof int x` is equally
present in `int x = (int) anObject`. (In fact, it should be,
because they are _the same thing_.)<br>
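<br>
To see the parallel concretely (a sketch, assuming the JEP 507
preview semantics):<br>
<pre>
Object anObject = 42;              // a boxed Integer

if (anObject instanceof int x)     // asks: is this an Integer that
    System.out.println(x);         // unboxes to an int? binds x if so

int y = (int) anObject;            // the very same type-driven
                                   // conversion; partiality surfaces as
                                   // a possible ClassCastException
                                   // instead of a false answer
</pre>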
<br>
So no, I can't agree that this is about "magic" at all. Let's use
the right word: "implicit". Your core argument is that "too much is
left implicit here, and therefore no one will be able to understand
what is going on." This sort of "it's OK now, but if we do one
more thing it will get out of hand" argument reminds me of the
arguments around previous features that involved new implicit
behaviors driven by static types, which were predicted by their
detractors to be raging disasters, and which turned out to be ...
fine. <br>
<br>
Example 1: autoboxing<br>
<br>
Prior to Java 5, there was no implicit or explicit conversion
between `int` and `Integer` (not even casting); boxing and unboxing
were done manually through `new Integer(n)`, `Integer.valueOf(n)`,
and `Integer::intValue`. In Java 5, we added boxing and unboxing
conversions to the list of conversions, and also, somewhat more
controversially, supported "implicit" boxing and unboxing
conversions (more precisely, allowing them in assignment/method
context) as well as "explicit" boxing and unboxing conversions
(casting). <br>
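<br>
For instance (a minimal sketch of the Java 5 behavior):<br>
<pre>
Integer boxed = 42;        // implicit boxing in assignment context
int n = boxed;             // implicit unboxing in assignment context
Object o = boxed;
int m = (Integer) o;       // explicit cast to Integer, then unboxing

Integer absent = null;
int k = absent;            // unboxing null: NullPointerException
</pre>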
<br>
Of course, some people cheered ("yay, less ceremony") but others
gasped in horror. An assignment that ... can throw? What black
magic is this? This will make programs less reliable! And the
usual bargaining: "why does this have to be implicit, what's wrong
with requiring an explicit cast?" 20 years later, this may seem
comical or hard to believe, but there was plenty of controversy over
this in its day.<br>
<br>
While the residue of complexity this left in the spec was nontrivial
(it added to the complexity of both conversions and overload
selection, each a nontrivial area of the language), overall this was a win for
Java programmers. The static type system was still in charge,
clearly defining the semantics of our programs, but the explicit
ceremony of "go from int to Integer" receded into the background.
The world didn't end; Java programs didn't become wildly less
reliable. And if we asked people today if they wanted to go back,
the answer would surely be a resounding "hell, no." <br>
<br>
Example 2: local variable type inference (`var`)<br>
<br>
The arguments on both sides of this were more dramatic; its
supporters went on about "drowning in ceremony", while its
detractors cried "too much! too much!", warning that Java codebases
would collapse into unreadability due to bad programmers being
unable to resist the temptation of implicitness. Many strawman
examples were offered as evidence of how unreadable Java code would
become. (To be fair, many of these people were legitimately afraid
of how such a feature would be used, and how it would affect their
experience of programming in Java, fearing it would be overused or
abused and that we wouldn't be able to reclose Pandora's box. But
some of it was just misguided mudslinging; the silliest example was
"you're turning Java into JavaScript", when in fact type inference
is based entirely on ... static types. Unfortunately, there is no
qualification exam for making strident arguments.)<br>
<br>
Fortunately, some clearer arguments eventually emerged from this
chaos. People pointed out that for many local variables, the
_variable name_ carried far more information than the variable type,
and that the requirement to manifestly type all variables led to
distortions in how people coded (such as more complicated and deeply
nested expressions that would have benefited from pulling
subexpressions out into named variables). <br>
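<br>
For example (names invented for illustration):<br>
<pre>
// before: the manifest type mostly repeats the initializer
Map&lt;String, List&lt;Order&gt;&gt; ordersByCustomer = groupByCustomer(orders);

// after: same static type, inferred; the name still does the work
var ordersByCustomer = groupByCustomer(orders);
</pre>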
<br>
In the end, it was mostly a nothingburger. Developers learned to
use `var` responsibly, and there was no collapse in the
maintainability or readability of Java code. The fears were
unfounded. <br>
<br>
<br>
When people react to a new feature that does not immediately address
a pain point they personally feel, a common response is to focus on
all the things that might go wrong. This is natural and usually
healthy, but it has a problem: when the motivation of a feature
doesn't speak directly to us, we often have no realistic idea of
how, when, and how often it will come up in real code. In the
absence of a concrete "yes, I can see 100 places I would have used
this yesterday", we substitute speculative, often distorted
examples, and react to a fear of the unrealistic future they
imply. <br>
<br>
Yes, it is easy to imagine cases where something confusing could
arise out of "so much implicitness" (though really, it's not so
much; it's just new flavors of the same old stuff). But I will note
that almost all of the examples offered involve floating point,
which mainstream Java developers _rarely use_. That casts some
doubt on whether these examples of "look how confusing this is" are
realistic. <br>
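<br>
For reference, the canonical floating-point example looks something
like this (a sketch, assuming the exactness test described in the
JEP):<br>
<pre>
int i = 16_777_216;                // 2^24: exactly representable as float
int j = 16_777_217;                // 2^24 + 1: would round if converted

boolean a = i instanceof float;    // true: the conversion is exact
boolean b = j instanceof float;    // false: it would lose precision
</pre>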
<br>
<br>
(This might seem like a topic change, but it is actually closer to
the real point.) At this point you might be tempted to argue "but
then why don't we 'just' exclude floating point from this feature?"
And the reason is: that would go against the _whole point_ of this
feature. This JEP is about _regularization_. Right now, there are
all sorts of random and gratuitous restrictions about what types can
be used where; we can only use reference types in instanceof, we
can't switch on float, constant case switches are not really
patterns yet, we can't use `null` in nested pattern context, etc
etc. Each of these restrictions may have been individually
justifiable at the time, but in the aggregate they are a pile of
pure accidental complexity: they make the language harder to use and
learn, create unexpected interactions and gaps, and make it much
harder to evolve the language in the ways that Valhalla aims to,
such as expanding the set of numeric types that can "work like
primitives". We can get to a better place, but we
can't bring all our accidental complexity with us. <br>
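<br>
As a sketch of where the regularized ground lets us go (preview
semantics, illustrative only):<br>
<pre>
static String describe(double d) {
    return switch (d) {             // switching on double: newly legal
        case 0.0d -> "positive zero";               // constant case
        case double x when x > 0.0 -> "positive";   // guarded pattern
        case double x -> "negative, -0.0, or NaN";  // unconditional
    };
}
</pre>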
<br>
When confronted with a new feature, especially one that does not
speak to pain points one is currently experiencing, the temptation
is to respond with a highly localized critique: take the claimed
goals of the feature and try to make it "safer" or "simpler" (which
usually also means "smaller"). But such localized responses carry
two big risks: they risk missing the point of the feature (which is
easy if it is already not speaking directly to you), and they risk
adding new complexity elsewhere in the language in aid of "fixing"
what seems "too much" about the feature in front of you.<br>
<br>
This feature is about creating level ground for future work to build
on -- constant patterns, numeric conversions between `Float16` and
`double`, etc. But to make those features possible, we first have
to undo the accidental complexity of past hyperlocal feature design;
the ad-hoc restrictions have to go. This JEP may appear to create
complicated new situations (really, just less familiar ones), but it
actually makes instanceof and switch _simpler_ -- both by removing
restrictions and by defining everything in terms of a small number
of more fundamental concepts, rather than a larger pile of ad-hoc
rules and restrictions. It's hard to see that at first, so you have
to give it time to sink in. <br>
<br>
*If* it turns out, when we get to that future, that things are still
too implicit for Java developers to handle, we still have the
opportunity _then_ to offer new syntactic options for finer control
over conversions and partiality. But I'm not compelled by the idea
of going there preemptively (and I honestly don't think it is
actually going to be a problem). <br>
<br>
<br>
<br>
<blockquote type="cite" cite="mid:CACzrW9CzirrZsOw4Ov+rupGPqOa_cim-qpgeyURms-Ha9U8F+g@mail.gmail.com">
<pre wrap="" class="moz-quote-pre">Circling back to "what level of magic is acceptable?". The trouble
here is that partial type patterns and unconditional type patterns
already share the same syntax, and that is bad enough. To add in type
conversions is just way too far. This isn't lumping, it is magic.
Trying to read and decipher code with merged type checks and type
conversions in patterns simply isn't possible without an excessive
amount of external context, which is potentially very difficult to do
in PRs for example.
All my proposal really argues is that alternative syntaxes are
available that make the code readable again. With ~ the visible syntax
question becomes "if I can convert to an int ....". Other options are
available.
</pre>
</blockquote>
</body>
</html>