RFR [XS] 8054492: Casting can result in redundant null checks in generated code

Vladimir Kozlov vladimir.kozlov at oracle.com
Fri Oct 24 21:57:39 UTC 2014


On 10/23/14 3:45 AM, Paul Sandoz wrote:
> Hi Vladimir,
>
> Do you consider these two aspects (intrinsic and inline) so closely interrelated as to be considered under the same issue? If not, I can log another issue for the inlining.

I am working on the next version of my intrinsic after discussion with John 
Rose. It will use the gen_checkcast() code which we use for the checkcast 
bytecode. It will be more general and less dependent on inlining. 
But I still want to force inlining of Class.cast() in the general case, 
when we can still use profiling data.

I am concerned about the changes in castReference() to use Class.cast(). They 
will need additional performance testing, because castReference() was a big 
performance bottleneck for jsr292, and I don't want to cause a regression.

Thanks,
Vladimir

>
> Paul.
>
> On Oct 23, 2014, at 7:09 AM, Vladimir Kozlov <vladimir.kozlov at oracle.com> wrote:
>
>> On 10/22/14 2:29 AM, Paul Sandoz wrote:
>>> On Oct 22, 2014, at 10:54 AM, Roland Westrelin <roland.westrelin at oracle.com> wrote:
>>>>> Here is intrinsic implementation:
>>>>>
>>>>> http://cr.openjdk.java.net/~kvn/8054492/webrev.01/
>>>>
>>>> That looks good to me.
>>>>
>>> Same here.
>>>
>>> On Oct 22, 2014, at 2:02 AM, Vladimir Kozlov <vladimir.kozlov at oracle.com> wrote:
>>>> Paul, is it enough for you?
>>>>
>>>
>>> Yes, many thanks. I verified it in a number of scenarios.
>>>
>>> With Class.cast intrinsified, can we replace uses of the following method in j.l.i.MethodHandleImpl with Class.cast?
>>
>> The intrinsic is not enough. That code was added because we do not inline Class.cast() for deep call stacks: C2 has an inline depth limit of 9. But we can work around that with the following changes:
>>
>> --- a/src/share/vm/opto/bytecodeInfo.cpp	Fri Aug 22 09:55:49 2014 -0700
>> +++ b/src/share/vm/opto/bytecodeInfo.cpp	Wed Oct 22 22:07:21 2014 -0700
>> @@ -161,6 +161,7 @@
>>    if ((freq >= InlineFrequencyRatio) ||
>>        (call_site_count >= InlineFrequencyCount) ||
>>        is_unboxing_method(callee_method, C) ||
>> +      (callee_method->intrinsic_id() == vmIntrinsics::_class_cast) ||
>>        is_init_with_ea(callee_method, caller_method, C)) {
>>
>>      max_inline_size = C->freq_inline_size();
>> @@ -262,6 +263,11 @@
>>      return false;
>>    }
>>
>> +  if (callee_method->intrinsic_id() == vmIntrinsics::_class_cast) {
>> +    // Inline Class.cast() method.
>> +    return false;
>> +  }
>> +
>>    if (callee_method->has_compiled_code() &&
>>        callee_method->instructions_size() > InlineSmallCode) {
>>      set_msg("already compiled into a big method");
>>
>>
>> Thanks,
>> Vladimir
>>
>>>
>>>      @ForceInline
>>>      @SuppressWarnings("unchecked")
>>>      static <T,U> T castReference(Class<? extends T> t, U x) {
>>>          // inlined Class.cast because we can't ForceInline it
>>>          if (x != null && !t.isInstance(x))
>>>              throw newClassCastException(t, x);
>>>          return (T) x;
>>>      }
>>>
>>> Paul.
>>>
>
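The premise of Paul's question is that castReference() and Class.cast() agree in behavior. A minimal standalone sketch illustrating that equivalence (the CastDemo class and its helper are hypothetical, not JDK code; the check inside castReference mirrors the quoted j.l.i.MethodHandleImpl helper):

```java
// Hypothetical standalone sketch, not JDK code: shows that the hand-written
// check in castReference() matches Class.cast() semantics -- null passes
// through, an instance is returned unchanged, and a non-instance throws
// ClassCastException.
public class CastDemo {
    @SuppressWarnings("unchecked")
    static <T, U> T castReference(Class<? extends T> t, U x) {
        // Same test as the quoted helper: only a non-null non-instance throws.
        if (x != null && !t.isInstance(x))
            throw new ClassCastException("Cannot cast " + x + " to " + t.getName());
        return (T) x;
    }

    // Helper: run an action and report whether it threw ClassCastException.
    static boolean throwsCCE(Runnable r) {
        try { r.run(); return false; } catch (ClassCastException e) { return true; }
    }

    public static void main(String[] args) {
        // null passes through both paths
        if (castReference(String.class, null) != null) throw new AssertionError();
        if (String.class.cast(null) != null) throw new AssertionError();

        // an instance is returned unchanged by both paths
        if (!castReference(CharSequence.class, "ok").equals(CharSequence.class.cast("ok")))
            throw new AssertionError();

        // a non-instance throws ClassCastException on both paths
        Object notAnInt = "str";
        if (!throwsCCE(() -> castReference(Integer.class, notAnInt))) throw new AssertionError();
        if (!throwsCCE(() -> Integer.class.cast(notAnInt))) throw new AssertionError();

        System.out.println("castReference and Class.cast agree");
    }
}
```

With the intrinsic and the forced inlining from the quoted patch, the Class.cast() path gets the same generated code quality that the hand-inlined castReference() was written to achieve.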

