regarding long young generation pause when using G1 GC
Amit Mishra
amit.mishra at redknee.com
Thu Jan 21 12:30:19 UTC 2016
Hello team,
Please help me with a long young-generation pause we are seeing with the G1 GC.
At one of our customer deployments we were using CMS with a 48 GB heap and hitting a concurrent mode failure every month.
To solve this we started a load test with G1 GC in the lab, but we are seeing intermittent long GC pauses of up to 148 seconds.
Could you please look at the logs and advise how best to ascertain the root cause, and what tuning might fix this?
Thanks in advance for your help.
We are using mostly default GC parameters (the full GC log file is attached as well):
JVM_ARGS="${JVM_ARGS} -XX:-OmitStackTraceInFastThrow"
JVM_ARGS="${JVM_ARGS} -XX:-UseSSE42Intrinsics"
JVM_ARGS="${JVM_ARGS} -XX:-ReduceInitialCardMarks"
JVM_ARGS="${JVM_ARGS} -XX:+UseG1GC"
JVM_ARGS="${JVM_ARGS} -XX:MaxGCPauseMillis=500"
JVM_ARGS="${JVM_ARGS} -XX:+UseCompressedOops"
JVM_ARGS="${JVM_ARGS} -XX:+PrintFlagsFinal"
JVM_ARGS="${JVM_ARGS} -XX:-EliminateLocks"
JVM_ARGS="${JVM_ARGS} -XX:+UseLargePages"
export GC_LOG="${PROJECT_HOME}/log/gcstats.log.$$"
JVM_ARGS="${JVM_ARGS} -Xloggc:${GC_LOG}"
JVM_ARGS="${JVM_ARGS} -verbose:gc"
JVM_ARGS="${JVM_ARGS} -XX:+DisableExplicitGC"
JVM_ARGS="${JVM_ARGS} -XX:+AggressiveOpts"
JVM_ARGS="${JVM_ARGS} -XX:+HeapDumpOnOutOfMemoryError"
JVM_ARGS="${JVM_ARGS} -XX:ReservedCodeCacheSize=128m"
JVM_ARGS="${JVM_ARGS} -XX:PermSize=256m"
JVM_ARGS="${JVM_ARGS} -XX:MaxPermSize=256m"
JVM_ARGS="${JVM_ARGS} -Xmx28672m"
JVM_ARGS="${JVM_ARGS} -Xms28672m
Here is the long GC pause:
5020.853: [GC pause (young), 148.0343389 secs]
   [Parallel Time: 142474.3 ms, GC Workers: 13]
      [GC Worker Start (ms): Min: 5022695.7, Avg: 5022705.6, Max: 5022714.4, Diff: 18.7]
      [Ext Root Scanning (ms): Min: 143.0, Avg: 2550.5, Max: 6442.5, Diff: 6299.4, Sum: 33157.1]
      [Update RS (ms): Min: 5704.6, Avg: 19017.1, Max: 45977.9, Diff: 40273.3, Sum: 247222.7]
         [Processed Buffers: Min: 0, Avg: 26.9, Max: 291, Diff: 291, Sum: 350]
      [Scan RS (ms): Min: 70538.2, Avg: 92539.7, Max: 114688.0, Diff: 44149.8, Sum: 1203016.2]
      [Object Copy (ms): Min: 894.7, Avg: 9265.9, Max: 20674.1, Diff: 19779.4, Sum: 120456.9]
      [Termination (ms): Min: 10169.3, Avg: 19073.0, Max: 26684.1, Diff: 16514.7, Sum: 247949.3]
      [GC Worker Other (ms): Min: 0.1, Avg: 4.5, Max: 18.2, Diff: 18.0, Sum: 58.4]
      [GC Worker Total (ms): Min: 142441.2, Avg: 142450.8, Max: 142461.1, Diff: 19.8, Sum: 1851860.7]
      [GC Worker End (ms): Min: 5165155.6, Avg: 5165156.4, Max: 5165162.0, Diff: 6.4]
   [Code Root Fixup: 2.2 ms]
   [Clear CT: 11.2 ms]
   [Other: 5546.7 ms]
      [Choose CSet: 0.0 ms]
      [Ref Proc: 47.4 ms]
      [Ref Enq: 0.9 ms]
      [Free CSet: 3642.1 ms]
   [Eden: 13.2G(13.2G)->0.0B(1416.0M) Survivors: 16.0M->16.0M Heap: 24.7G(28.0G)->11.5G(28.0G)]
 [Times: user=184.95 sys=34.49, real=148.07 secs]
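For reference, this is a rough way to pull the per-pause wall times out of the attached log and spot the other outliers (it assumes the exact "[GC pause (young), N secs]" line format shown above, so initial-mark and mixed pauses are not counted):
# List the 20 longest young-collection pauses recorded in the GC log.
grep -o 'GC pause (young), [0-9.]* secs' "${GC_LOG}" \
    | awk '{ print $(NF-1) }' \
    | sort -n \
    | tail -20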
Thanks,
Amit
Attachment: gcstats.log.15989 (1111579 bytes)
URL: <http://mail.openjdk.java.net/pipermail/hotspot-gc-use/attachments/20160121/89011910/gcstats.log-0001.15989>