RFR: Enable C2 loop strip mining by default

Krystal Mok rednaxelafx at gmail.com
Sat Dec 16 00:47:15 UTC 2017


(Not a Reviewer) but Aleksey's version for Shenandoah makes more sense to
me.
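
For reference, applying the Shenandoah pattern to ZGC would presumably look
something like the sketch below. This just mirrors the Shenandoah block quoted
further down (the exact file and surrounding code in Per's webrev may differ),
and covers the case Aleksey raises where the user sets
-XX:+UseCountedLoopSafepoints explicitly but leaves the iteration count alone:

  #ifdef COMPILER2
    // Like Shenandoah, ZGC cares more about pause times than raw
    // throughput, so enable counted-loop safepoints by default.
    if (FLAG_IS_DEFAULT(UseCountedLoopSafepoints)) {
      FLAG_SET_DEFAULT(UseCountedLoopSafepoints, true);
    }
    // Pick a strip-mining iteration count whenever counted-loop
    // safepoints are on (including when the user enabled them
    // explicitly), as long as the count itself is still at its default.
    if (UseCountedLoopSafepoints && FLAG_IS_DEFAULT(LoopStripMiningIter)) {
      FLAG_SET_DEFAULT(LoopStripMiningIter, 1000);
    }
  #endif // COMPILER2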

Thanks,
Kris

On Fri, Dec 15, 2017 at 1:24 PM, Aleksey Shipilev <shade at redhat.com> wrote:

> On 12/15/2017 01:38 PM, Per Liden wrote:
> > Patch to enable loop strip mining by default when using ZGC. I also
> > noticed that the file had an incorrect header, so I fixed that too.
> >
> > http://cr.openjdk.java.net/~pliden/zgc/c2_loop_strip_mining_by_default/webrev.0/
>
> Yup. It worked very well for Shenandoah.
>
> But the relevant code block in Shenandoah is:
>
> #ifdef COMPILER2
>   // Shenandoah cares more about pause times, rather than raw throughput.
>   if (FLAG_IS_DEFAULT(UseCountedLoopSafepoints)) {
>     FLAG_SET_DEFAULT(UseCountedLoopSafepoints, true);
>   }
>   if (UseCountedLoopSafepoints && FLAG_IS_DEFAULT(LoopStripMiningIter)) {
>     FLAG_SET_DEFAULT(LoopStripMiningIter, 1000);
>   }
> #ifdef ASSERT
>
> ...which is slightly different from what you are suggesting for ZGC.
> Don't you want to enable LoopStripMiningIter when the user explicitly
> sets -XX:+UseCountedLoopSafepoints (which, I guess, is what most users
> concerned with TTSP-related latency would do)?
>
> Thanks,
> -Aleksey

