RFR: Enable C2 loop strip mining by default
Aleksey Shipilev
shade at redhat.com
Fri Dec 15 21:24:03 UTC 2017
On 12/15/2017 01:38 PM, Per Liden wrote:
> Patch to enable loop strip mining by default when using ZGC. I also noticed that the file had an
> incorrect header, so I fixed that too.
>
> http://cr.openjdk.java.net/~pliden/zgc/c2_loop_strip_mining_by_default/webrev.0/
Yup. It worked very well for Shenandoah.
But, the relevant code block from Shenandoah code is:
#ifdef COMPILER2
  // Shenandoah cares more about pause times, rather than raw throughput.
  if (FLAG_IS_DEFAULT(UseCountedLoopSafepoints)) {
    FLAG_SET_DEFAULT(UseCountedLoopSafepoints, true);
  }
  if (UseCountedLoopSafepoints && FLAG_IS_DEFAULT(LoopStripMiningIter)) {
    FLAG_SET_DEFAULT(LoopStripMiningIter, 1000);
  }
#ifdef ASSERT
...which is slightly different from what you are suggesting for ZGC. Don't you want to enable
LoopStripMiningIter also when the user explicitly sets -XX:+UseCountedLoopSafepoints (who are, I
would guess, mostly the users concerned with TTSP-related latency)?
Thanks,
-Aleksey