Enhancing expressions with mutability?
Maurizio Cimadamore
maurizio.cimadamore at oracle.com
Mon Apr 6 10:33:49 UTC 2020
On 05/04/2020 19:25, B. Blaser wrote:
> On Wed, 1 Apr 2020 at 22:59, Maurizio Cimadamore
> <maurizio.cimadamore at oracle.com> wrote:
>> On 01/04/2020 19:32, B. Blaser wrote:
>>> So, here is a concrete experiment:
>>>
>>> https://bugs.openjdk.java.net/secure/attachment/87565/learning.patch
>>>
>>> Quoting from my comment on JBS, it uses a parallel genetic algorithm
>>> to solve all contextual inference variables at once requiring only a
>>> small subset of propagation steps which reduces JDK-8152289 example's
>>> compilation time to a couple of seconds! Note that this heuristic is
>>> dedicated to similar examples and the usual inference algorithm is
>>> still used if the former doesn't converge rapidly enough.
>>>
>>> The current design is absolutely not affected as this implementation
>>> additionally uses very expressive mutable expressions like "Term gene
>>> = (`able).link(`adn, `undetvars.get(i), `type());" employing the
>>> quotation operator `. It simply requires a boot JDK (14) including the
>>> patch minus inference changes in Attr, Infer & InferenceContext.
>>>
>>> Still not convinced?
>> Sadly yes :-)
> Paradoxically, I agree with you; a few questions still remain...
>
>> I mentioned a number of issues, performance aside, in my previous
>> email; error recovery seems particularly nasty with an approach such as
>> this,
> Yes, this is the most problematic issue but I'm rather confident that
> if a solution exists the genetic algorithm will converge rapidly
> enough (less than 10 generations) otherwise we can consider that an
> error occurred. Of course, this would have to be confirmed with many
> measurements, but as far as I know,
> * either the context has few inference variables (1 to 4) and a clever
> choice of the initial population might simply perform an exhaustive
> combinatorial search using multiple threads,
> * or the context has potentially many variables which are most of the
> time exact copies of themselves, resulting in very few degrees of
> freedom and thus a very fast convergence too.
>
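For concreteness, the bounded-generation strategy described above ("converge within a small number of generations, otherwise consider that an error occurred") can be illustrated with a toy genetic algorithm over bit strings (the classic OneMax problem). This is only a sketch under assumed parameters; it is not code from the actual patch, and all names here are illustrative:

```java
import java.util.*;

// Toy sketch: a generational GA with elitism over bit strings.
// Fitness = number of set bits (OneMax). The point illustrated is the
// failure mode discussed in the thread: evolve() returns null if the
// optimum is not reached within a fixed generation budget.
public class GaSketch {
    static final Random RND = new Random(42);          // fixed seed for repeatability
    static final int LEN = 16, POP = 30, MAX_GEN = 100; // illustrative parameters

    static int fitness(boolean[] g) {
        int f = 0;
        for (boolean b : g) if (b) f++;
        return f;
    }

    static boolean[] randomGene() {
        boolean[] g = new boolean[LEN];
        for (int i = 0; i < LEN; i++) g[i] = RND.nextBoolean();
        return g;
    }

    // Single-point crossover of two parents.
    static boolean[] crossover(boolean[] a, boolean[] b) {
        int cut = RND.nextInt(LEN);
        boolean[] c = new boolean[LEN];
        for (int i = 0; i < LEN; i++) c[i] = i < cut ? a[i] : b[i];
        return c;
    }

    // Flip one random bit.
    static void mutate(boolean[] g) { g[RND.nextInt(LEN)] ^= true; }

    // Returns an optimal individual, or null if the GA does not
    // converge within MAX_GEN generations ("an error occurred").
    static boolean[] evolve() {
        List<boolean[]> pop = new ArrayList<>();
        for (int i = 0; i < POP; i++) pop.add(randomGene());
        for (int gen = 0; gen < MAX_GEN; gen++) {
            pop.sort(Comparator.comparingInt(GaSketch::fitness).reversed());
            if (fitness(pop.get(0)) == LEN) return pop.get(0); // converged
            // Elitism: keep the top half, breed the rest from it.
            List<boolean[]> next = new ArrayList<>(pop.subList(0, POP / 2));
            while (next.size() < POP) {
                boolean[] child = crossover(pop.get(RND.nextInt(POP / 2)),
                                            pop.get(RND.nextInt(POP / 2)));
                mutate(child);
                next.add(child);
            }
            pop = next;
        }
        return null; // did not converge: report failure
    }

    public static void main(String[] args) {
        boolean[] best = evolve();
        System.out.println(best == null ? "no solution"
                                        : "converged, fitness=" + fitness(best));
    }
}
```

Elitism makes the best fitness monotonically non-decreasing across generations, which is what makes a small, fixed generation budget a plausible convergence test in the first place.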
> Of course, you might argue that this genetic engine might fail to
> infer valid expressions, but I believe this is inherent to inference
> algorithms in general. Recall JDK-8219318: you said yourself that the
> current engine might be "out of gas" in some situations.
Let me underline all the things that raise my concern level with your
message:
* " I'm rather confident that if a solution exists the genetic algorithm
will converge rapidly enough (less than 10 generations) otherwise we can
consider that an error occurred" - pretty sure that this statement alone
is unjustifiable from a JLS perspective, where subtyping chains can be
arbitrarily long and complex
* "... either the context has few inference variables (1 to 4)" - in
such cases the existing inference engine is uber fast, why would you
want to improve it?
* "combinatorial search using multiple threads" - and... even more
complexity.
* "or the context has potentially many variables which are most of the
time exact copies of themselves" - another assumption which holds for
the case you are looking at, but doesn't hold in general. When working
with collectors I've seen many cases of inference contexts which, after
normalization (e.g. equivalent type variables dropped), were still in the
10-15 range.
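For concreteness, the "combinatorial search using multiple threads" idea quoted above could look roughly like the following sketch using parallel streams. None of this is javac code: `satisfiesBounds` is a hypothetical stand-in for a real bound-checking predicate, and the candidate encoding is purely illustrative:

```java
import java.util.*;
import java.util.stream.*;

// Illustrative sketch: for a handful of inference variables, each with a
// small finite candidate set, every combination can be enumerated and
// checked in parallel. With v variables and c candidates each, the search
// space is c^v, which is tiny for v in the 1-4 range discussed above.
public class ExhaustiveSearch {
    // Hypothetical stand-in for "does this assignment satisfy all bounds?".
    // Toy constraint used here: assignments must be strictly increasing.
    static boolean satisfiesBounds(int[] assignment) {
        for (int i = 1; i < assignment.length; i++)
            if (assignment[i] <= assignment[i - 1]) return false;
        return true;
    }

    static Optional<int[]> search(int vars, int candidates) {
        long total = (long) Math.pow(candidates, vars);
        return LongStream.range(0, total)
            .parallel()                      // fan the search out over threads
            .mapToObj(ix -> {
                // Decode index ix into one candidate assignment (base-c digits).
                int[] a = new int[vars];
                long rest = ix;
                for (int i = 0; i < vars; i++) {
                    a[i] = (int) (rest % candidates);
                    rest /= candidates;
                }
                return a;
            })
            .filter(ExhaustiveSearch::satisfiesBounds)
            .findAny();                      // any satisfying assignment will do
    }

    public static void main(String[] args) {
        Optional<int[]> sol = search(3, 4);  // 4^3 = 64 combinations
        System.out.println(sol.map(Arrays::toString).orElse("no solution"));
    }
}
```

Note that `findAny()` on a parallel stream is deliberately nondeterministic: which satisfying assignment is returned may vary between runs, which is fine when any solution is acceptable.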
In any case, I think we're talking past each other here, and using the
term "inference engine" to mean different things.
When I said in my comment to JDK-8219318 that the engine was running out
of gas, I meant the engine "as specified in the JLS". In fact, this
thread seems to forget that, no matter what the implementation does, it
will have to be backed up by whatever text is in the JLS - each pass and
each fail will have to be explained in terms of normative text in the spec.
So, popping back, JDK-8219318 and JDK-8152289 are two _very_ different
issues; the former is essentially tied to how inference works in the
spec; the latter is an implementation performance issue. I have already
talked at length about why I wouldn't like the javac codebase to evolve
and embrace some kind of bi-modal genetic inference engine in order to
speed up performance. I think that, by extension, my skepticism also
applies (to a much greater degree) to _changing the language
specification_ to allow for such changes (e.g. to support JDK-8219318).
Popping back even more: we started this thread by discussing your
proposal for mutable expressions (although this list shouldn't be used
for proposing language changes - but, fine). Both Paul and I replied,
connecting the work you did with some of the stuff we have in the oven.
From there we descended into some kind of inference engine war, which I
don't think was the point of your original communication?
Inference engine changes have a lot of strings attached to them; it's
not about the code and, to a degree, not even about performance, as long
as it's acceptable - which I think it is. It's primarily about
maintenance and conformance with the spec. Invariably, issues will pop
up and stuff will break, and when that happens, the more similar the
spec is to the implementation, the better. So, in my view, the way to
move the inference code forward is to make it _more in sync_ with the
JLS, so that concepts are implemented in a more 1-1 way (which wasn't
possible at the outset in Java 8, as we had to be compatible with
pre-Java 8 inference - this requirement might go away at some point, so
we might be able to clean up some more).
Maurizio
More information about the discuss mailing list