Truffle CallNode API. Is it possible to keep the old inline API?

Wei Zhang ndrzmansn at gmail.com
Mon Feb 10 20:34:19 PST 2014


Hi Christian,

> I definitely don't want to support both inlining APIs at the same time
> because the old API would not support the changes that are currently in my
> pipeline.

I understand.
Thanks for offering options to resolve my issue.

> 1) From the way CallGeneratorNode#execute is implemented, I assume that you
> do not really want to use the inlining heuristic provided by the Truffle
> framework.
> Instead you just want to inline always, right? Wouldn't it be easier for you
> to always transform generator function calls rather than wait for the
> inlining heuristic to say so?
> In my opinion the generator transformation is not really inlining; it's a
> different, more advanced concept and should also be treated as such.
> If you want to perform this transformation just before compilation of a
> method, we could also think of adding an API for getting notified just
> before Truffle compilation.
> I would prefer if we could go that way.

You are right, it is not exactly inlining.
But the inlining heuristic helps too: if a generator call is not hot
enough, we can ignore it.

My only concern with this option is that I need Truffle to further
inline calls inside the 'inlined', or transformed, generator, so it can
potentially peel off multiple levels of generator calls.
Do you think this option would support that?
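To make the nesting concern concrete, here is a minimal, self-contained
sketch (plain Java, not the actual Truffle or ZipPy API; all class and
method names are hypothetical) of the peeling pattern: executing a
generator call swaps in the generator body, and when that body contains
another generator call, it gets peeled on the same pass.

```java
// Hypothetical sketch of multi-level generator call peeling.
// In real Truffle the call node would rewrite itself via Node#replace;
// here we only count the peels and execute the body directly.
public class GeneratorPeelSketch {
    static int peelCount = 0;

    /** Minimal stand-in for a Truffle-style AST node. */
    abstract static class Node {
        abstract int execute();
    }

    /** Plain constant leaf, standing in for the generator's real body. */
    static final class Constant extends Node {
        final int value;
        Constant(int value) { this.value = value; }
        int execute() { return value; }
    }

    /** Stand-in for CallGeneratorNode: peels itself, then runs the body. */
    static final class GeneratorCall extends Node {
        final Node body; // may itself be another GeneratorCall
        GeneratorCall(Node body) { this.body = body; }
        int execute() {
            peelCount++;           // record that this level was peeled
            return body.execute(); // a nested GeneratorCall peels in turn
        }
    }

    /** Builds call(call(constant)); executing it peels both levels. */
    static int run() {
        peelCount = 0;
        Node ast = new GeneratorCall(new GeneratorCall(new Constant(42)));
        return ast.execute();
    }
}
```

The point is only that whatever mechanism replaces the outer call must
leave the resulting body eligible for the same treatment, so nested
generator calls are not stuck behind a one-shot transformation.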

> 2) I could provide you with an API to fully customize the behavior of
> inlining in an individual CallNode.

This would keep things closer to how it works now.
But again, if the first option works for me I can go with that.


> Some other questions:
> Can you undo generator call transformations? What if a generator call gets
> inlined but the generator callsite becomes megamorphic later on?

Yes, I have to keep the original loop around and switch back to it if
things change.
I still have to add that part at some point... : )
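For the record, the switch-back idea can be sketched like this (again
plain Java with hypothetical names and an assumed threshold, not Truffle
code): the peeled loop keeps a reference to the original generic loop
and reverts to it once the callsite has seen enough foreign generator
functions.

```java
// Hypothetical sketch: a loop specialized for one generator function
// keeps the original loop node around and falls back to it once the
// callsite looks megamorphic.
public class GeneratorFallbackSketch {
    static final int MEGAMORPHIC_LIMIT = 2; // assumed threshold

    abstract static class LoopNode {
        abstract int execute(int generatorId);
    }

    /** The original, unspecialized loop: handles any generator. */
    static final class GenericLoop extends LoopNode {
        int execute(int generatorId) { return generatorId * 10; }
    }

    /** Loop "peeled" for one particular generator function. */
    static final class PeeledLoop extends LoopNode {
        final GenericLoop original;  // kept around so we can undo
        final int cachedGeneratorId; // the generator we specialized for
        int seenOthers = 0;
        LoopNode current = this;     // what the parent should run next

        PeeledLoop(GenericLoop original, int cachedGeneratorId) {
            this.original = original;
            this.cachedGeneratorId = cachedGeneratorId;
        }

        int execute(int generatorId) {
            if (generatorId == cachedGeneratorId) {
                return generatorId * 10;          // fast specialized path
            }
            if (++seenOthers >= MEGAMORPHIC_LIMIT) {
                current = original;               // undo the transformation
            }
            return original.execute(generatorId); // slow path this time
        }
    }

    /** Drives the peeled loop megamorphic and checks that it reverts. */
    static boolean revertsWhenMegamorphic() {
        PeeledLoop peeled = new PeeledLoop(new GenericLoop(), 1);
        peeled.execute(1); // cached generator: stays specialized
        peeled.execute(2); // first foreign generator
        peeled.execute(3); // second foreign generator -> revert
        return peeled.current instanceof GenericLoop;
    }
}
```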

> Besides CallGeneratorNode, the other call nodes can be migrated without
> trouble?

Another one is CallBuiltinInlinableNode, in which BuiltinIntrinsifier
is invoked.
You already know about this.
It needs the same solution as CallGeneratorNode does.

Thanks,

/Wei


>> On Sun, Feb 9, 2014 at 1:01 AM, Christian Humer
>> <christian.humer at gmail.com> wrote:
>> > Hi Wei,
>> >
>> > I had a very brief look at ZipPy calls and they seem to be implemented
>> > nicely.
>> > Do I guess correctly that your problems migrating is due to
>> > BuiltinIntrinsifier?
>> > Can you quickly outline the rationale behind it?
>> > Can you point out other areas which got you into troubles?
>> >
>> > I will have an in depth look on them on Monday.
>> >
>> > Thx.
>> >
>> > - Christian Humer
>> >
>> >
>> > On Sun, Feb 9, 2014 at 2:54 AM, Wei Zhang <ndrzmansn at gmail.com> wrote:
>> >>
>> >> Hi Christian Humer,
>> >>
>> >> I've been looking at the new CallNode API for a while now.
>> >> I tried a couple of times to adopt it, but it hasn't been successful so
>> >> far.
>> >>
>> >> The new inlining API does look cleaner and more compact.
>> >> It makes more sense for the most part, but it requires a big change in
>> >> ZipPy.
>> >>
>> >> One thing that ZipPy relies on in the old API is that one can
>> >> customize the inlining logic.
>> >> A Python-level call inlining could trigger some additional
>> >> transformation in the caller's AST.
>> >> In the new API, inlining is pretty much hidden from the caller.
>> >>
>> >> Admittedly there's always another way to achieve the same thing, but
>> >> it would be nice to have the old API around at least before we can
>> >> successfully migrate to the new one.
>> >> Another option for me is to stop merging with Truffle until I figure
>> >> everything out.
>> >> But it is going to take a while before I can put my focus back on the
>> >> new CallNode.
>> >> And I know it is not healthy to fall behind for too long.
>> >>
>> >> I'm not sure how much it would affect you, but keeping it around would
>> >> definitely make my life easier for the next month or so.
>> >> Please let me know.
>> >>
>> >> Thanks,
>> >>
>> >> /Wei
>> >
>> >
>>


More information about the graal-dev mailing list