Enhancing expressions with mutability?
Maurizio Cimadamore
maurizio.cimadamore at oracle.com
Wed Apr 1 20:59:30 UTC 2020
On 01/04/2020 19:32, B. Blaser wrote:
> On Mon, 23 Mar 2020 at 20:33, Maurizio Cimadamore
> <maurizio.cimadamore at oracle.com> wrote:
>>> I'm planning to employ them experimentally to solve some
>>> tricky performance issues in javac's type-inference solving system,
>>> see [3].
>> With respect to implementing type inference using these algorithms, I
>> feel less sanguine.
> So, here is a concrete experiment:
>
> https://bugs.openjdk.java.net/secure/attachment/87565/learning.patch
>
> Quoting from my comment on JBS, it uses a parallel genetic algorithm
> to solve all contextual inference variables at once requiring only a
> small subset of propagation steps which reduces JDK-8152289 example's
> compilation time to a couple of seconds! Note that this heuristic is
> dedicated to similar examples and the usual inference algorithm is
> still used if the former doesn't converge rapidly enough.
>
> The current design is absolutely not affected as this implementation
> additionally uses very expressive mutable expressions like "Term gene
> = (`able).link(`adn, `undetvars.get(i), `type());" employing the
> quotation operator `. It simply requires a boot JDK (14) including the
> patch minus inference changes in Attr, Infer & InferenceContext.
>
> Still not convinced?
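[Editor's note: for readers without access to the attachment, the heuristic described above — a genetic search over assignments of inference variables, falling back to the standard engine when it fails to converge — can be sketched roughly as follows. This is purely illustrative: the class name, the toy "equality bounds" model, and all parameters are invented for exposition and are not the actual patch.]

```java
import java.util.*;

// Illustrative sketch only -- NOT the attached patch. Each genome assigns
// one of `numTypes` candidate types to every inference variable; fitness
// counts satisfied bounds (toy model: bounds[i] = j means variable i must
// receive the same assignment as variable j); a null result means the search
// did not converge and the usual inference algorithm should be used instead.
public class ToyGeneticSolver {
    final int[] bounds;   // toy equality bounds between variables
    final int numTypes;   // number of candidate types per variable
    final Random rnd = new Random(42);

    ToyGeneticSolver(int[] bounds, int numTypes) {
        this.bounds = bounds;
        this.numTypes = numTypes;
    }

    int fitness(int[] genome) {
        int score = 0;
        for (int i = 0; i < bounds.length; i++)
            if (genome[i] == genome[bounds[i]]) score++;
        return score;
    }

    int[] randomGenome() {
        int[] g = new int[bounds.length];
        for (int i = 0; i < g.length; i++) g[i] = rnd.nextInt(numTypes);
        return g;
    }

    int[] crossover(int[] a, int[] b) {
        int[] child = new int[a.length];
        for (int i = 0; i < a.length; i++)
            child[i] = rnd.nextBoolean() ? a[i] : b[i];
        return child;
    }

    int[] mutate(int[] g) {
        if (rnd.nextInt(4) == 0)  // occasionally reassign one variable
            g[rnd.nextInt(g.length)] = rnd.nextInt(numTypes);
        return g;
    }

    // Returns a genome satisfying all bounds, or null if the search did not
    // converge within maxGens (caller then falls back to standard inference).
    int[] solve(int popSize, int maxGens) {
        int[][] pop = new int[popSize][];
        for (int p = 0; p < popSize; p++) pop[p] = randomGenome();
        for (int gen = 0; gen < maxGens; gen++) {
            Arrays.sort(pop, Comparator.comparingInt(this::fitness).reversed());
            if (fitness(pop[0]) == bounds.length) return pop[0];
            // replace the weaker half with offspring of the fitter half
            for (int p = popSize / 2; p < popSize; p++)
                pop[p] = mutate(crossover(pop[rnd.nextInt(popSize / 2)],
                                          pop[rnd.nextInt(popSize / 2)]));
        }
        return null;
    }
}
```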
Sadly yes :-)
I mentioned a number of issues, performances aside, in my previous
email; error recovery seems particularly nasty with an approach such as
this, but more generally, having two inference engines opens up a whole
new can of worms, where now you have two implementations to maintain and
keep in sync, and there are many more things that can go wrong - and
testing becomes correspondingly harder. What if the two algorithms don't
agree? Might we end up in situations where, depending on how long one
engine takes to converge, a different solver is picked and we get a
different result? These are the kinds of problems that worry me.
In my eyes, 8152289 is "just" a performance bug, and one that we know
how to solve (*), but we have not had enough time to work on it. Also,
as far as I know, the pathological cases described in that bug are the
only thing left to solve performance-wise (i.e. programs that keep
running through inference for minutes). Coming up with a new inference
engine to solve this class of problems is, in my opinion, the wrong way
to go about things. It might have been a nice
validation for your mutable expression experiment, but we can't consider
that as a candidate fix for JDK-8152289, sorry.
Maurizio
(*) The crux of the issue in that bug is detecting that many inference
variables are just "duplicates" of each other: they are going to get
exactly the same bounds and be resolved in exactly the same way.
Vicente, Dan and I brainstormed months ago about ways to take an
inference context, minimize it aggressively, solve the minimized
context, then project resolutions back onto the original context. We
already applied a similar technique in cases like
a(b(c(d .... )
and that worked really well. But the logic we have implemented has some
limitations and, currently, it doesn't scale to cases where hundreds of
tvars are added to the _same_ inference context. This is fixable,
though, with some work.
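[Editor's note: the minimize-then-project idea in the footnote can be sketched as follows. This is a hypothetical, heavily simplified model — javac's actual InferenceContext and bound representation are far richer — in which variables whose bound sets are identical form one equivalence class, a pluggable solver resolves one representative per class, and the result is copied to every duplicate.]

```java
import java.util.*;

// Illustrative sketch (not javac's Infer code): merge inference variables
// that carry identical bound sets, solve one representative per class, then
// project the solution back onto every duplicate.
public class ContextMinimizer {
    static Map<String, String> solveMinimized(
            Map<String, Set<String>> boundsPerVar,
            java.util.function.Function<Set<String>, String> solver) {
        // 1. Group variables whose bound sets are identical.
        Map<Set<String>, List<String>> classes = new LinkedHashMap<>();
        boundsPerVar.forEach((var, bounds) ->
            classes.computeIfAbsent(bounds, b -> new ArrayList<>()).add(var));
        // 2. Solve once per equivalence class; 3. project onto all members.
        Map<String, String> solution = new LinkedHashMap<>();
        classes.forEach((bounds, vars) -> {
            String solved = solver.apply(bounds); // resolve the representative
            for (String v : vars) solution.put(v, solved);
        });
        return solution;
    }
}
```

With hundreds of duplicated tvars, the expensive solver runs once per distinct bound set rather than once per variable — which is the scaling behavior the footnote is after.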
>
> Ciao,
> Bernard