g1 gc pause (young) high object copy time

Amarin Phaosawasdi phaosaw2 at illinois.edu
Thu Apr 13 19:15:48 UTC 2017


I've looked at the rest of the logs, and the survivor space in the second
program is indeed much bigger than in the first (for example, 15 MB -> 35 MB
across the young GC in the second snippet below, versus 16 MB -> 19 MB in
the first), so that does seem to explain the extra object copy time.

Thanks for the pointers.

Amarin


On 04/13/2017 01:46 PM, Wolfgang Pedot wrote:
> Hi,
>
> What I see is that your first snippet shows only a small amount of 
> surviving objects (survivor space grows by 3 MB), while your second 
> log shows an increase of ~20 MB in the survivor space, which means 
> far more objects (or bigger objects) are still alive and need to be 
> copied.
>
> I don't know whether that can explain all of that time, but I have 
> looked at some of my own logs and there is always more object copy 
> time when the survivor space increases.
>
> kind regards
> Wolfgang Pedot
>
> On 13.04.2017 at 20:22, Amarin Phaosawasdi wrote:
>> Hi,
>>
>> I'm trying to explain the GC behavior of two programs that produce 
>> the same results but use memory differently.
>>
>> One program seems to spend much more time in GC than the other, 
>> even though the GC counts and average heap space freed are similar.
>>
>> ------------------------------
>>
>> java version "1.7.0_79"
>> Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
>> Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
>> Flags: -XX:+UseG1GC -verbose:gc -XX:+PrintGCDetails 
>> -XX:+PrintGCTimeStamps -XX:+PrintTenuringDistribution 
>> -XX:+PrintHeapAtGC -XX:+PrintAdaptiveSizePolicy -XX:MaxPermSize=256m
>>
>> ------------------------------
>>
>> Program 1: uses a file reader and iterates line by line. Each line 
>> can be garbage collected as soon as the program finishes processing 
>> that line.
>> Extra flag: -Xmx500M
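>>
>> For reference, here is a minimal sketch of the kind of code program 1 
>> runs (the class and method names are made up; this is not the actual 
>> program):
>>
>> import java.io.BufferedReader;
>> import java.io.FileReader;
>> import java.io.IOException;
>>
>> public class LineByLine {
>>     public static void main(String[] args) throws IOException {
>>         try (BufferedReader reader =
>>                 new BufferedReader(new FileReader(args[0]))) {
>>             String line;
>>             while ((line = reader.readLine()) != null) {
>>                 processLine(line); // line is unreachable right after this call
>>             }
>>         }
>>     }
>>
>>     private static void processLine(String line) { /* per-line work */ }
>> }
>>
>> With this pattern each line becomes unreachable as soon as 
>> processLine() returns, so very little is live at any young GC.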
>>
>> Number of young GCs: 3599
>> Average heap space freed (young): 296.74 MB
>> Total young GC pause time: 11.84 seconds
>> Number of mixed GCs: 0
>> Number of full GCs: 0
>>
>> ------------------------------
>>
>> Program 2: reads a portion of the file into memory and iterates 
>> through that portion first. The whole portion is released at once 
>> when the program finishes processing all the lines in it. The rest 
>> of the file is then read line by line with a file reader.
>> Extra flag: -Xmx1000M
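>>
>> And a similarly rough sketch of program 2 (again, the names and the 
>> batch size are made up, not the actual code):
>>
>> import java.io.BufferedReader;
>> import java.io.FileReader;
>> import java.io.IOException;
>> import java.util.ArrayList;
>> import java.util.List;
>>
>> public class BatchThenStream {
>>     private static final int BATCH_LINES = 1000000; // assumed portion size
>>
>>     public static void main(String[] args) throws IOException {
>>         try (BufferedReader reader =
>>                 new BufferedReader(new FileReader(args[0]))) {
>>             List<String> batch = new ArrayList<String>(BATCH_LINES);
>>             String line = null;
>>             // read the first portion of the file into memory
>>             while (batch.size() < BATCH_LINES
>>                     && (line = reader.readLine()) != null) {
>>                 batch.add(line);
>>             }
>>             for (String l : batch) {
>>                 processLine(l); // lines stay reachable via the list until here
>>             }
>>             batch = null; // the whole portion is released at once
>>             // the rest of the file is read line by line
>>             while ((line = reader.readLine()) != null) {
>>                 processLine(line);
>>             }
>>         }
>>     }
>>
>>     private static void processLine(String line) { /* per-line work */ }
>> }
>>
>> While the batch is being processed, every line in it stays reachable, 
>> so those objects have to survive (and be copied by) any young GCs 
>> that happen in that window.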
>>
>> Number of young GCs: 3086
>> Average heap space freed (young): 369.42 MB
>> Total young GC pause time: 129.37 seconds
>> Number of mixed GCs: 344
>> Average heap space freed (mixed): 56.80 MB
>> Total mixed GC pause time: 9.35 seconds
>> Number of full GCs: 0
>>
>> ------------------------------
>>
>> Program 2 spends much more time doing GC than program 1. When I look 
>> at the GC logs, most of the extra time seems to go to object copying 
>> (for example, the Object Copy sum is 16.8 ms in the program 1 snippet 
>> below versus 355.7 ms in the program 2 snippet).
>>
>> ------------------------------
>>
>> Snippet from program 1's GC log:
>>
>> {Heap before GC invocations=11 (full 0):
>>  garbage-first heap   total 512000K, used 391986K 
>> [0x00000000d0c00000, 0x00000000f0000000, 0x00000000f0000000)
>>   region size 1024K, 300 young (307200K), 16 survivors (16384K)
>>  compacting perm gen  total 31744K, used 30804K [0x00000000f0000000, 
>> 0x00000000f1f00000, 0x0000000100000000)
>>    the space 31744K,  97% used [0x00000000f0000000, 
>> 0x00000000f1e152e0, 0x00000000f1e15400, 0x00000000f1f00000)
>> No shared spaces configured.
>> 2.666: [GC pause (young)
>> Desired survivor size 19922944 bytes, new threshold 15 (max 15)
>> - age   1:     758408 bytes,     758408 total
>> - age   2:     929040 bytes,    1687448 total
>> - age   3:    1163024 bytes,    2850472 total
>> - age   4:      39368 bytes,    2889840 total
>> - age   5:        408 bytes,    2890248 total
>> - age   9:    6765952 bytes,    9656200 total
>> - age  10:     818416 bytes,   10474616 total
>> - age  11:     993376 bytes,   11467992 total
>>  2.666: [G1Ergonomics (CSet Construction) start choosing CSet, 
>> _pending_cards: 1024, predicted base time: 3.57 ms, remaining time: 
>> 196.43 ms, target pause time: 200.00 ms]
>>  2.666: [G1Ergonomics (CSet Construction) add young regions to CSet, 
>> eden: 284 regions, survivors: 16 regions, predicted young region 
>> time: 25.28 ms]
>>  2.666: [G1Ergonomics (CSet Construction) finish choosing CSet, eden: 
>> 284 regions, survivors: 16 regions, old: 0 regions, predicted pause 
>> time: 28.85 ms, target pause time: 200.00 ms]
>> , 0.0044930 secs]
>>    [Parallel Time: 3.0 ms, GC Workers: 18]
>>       [GC Worker Start (ms): Min: 2666.3, Avg: 2667.2, Max: 2669.0, 
>> Diff: 2.8]
>>       [Ext Root Scanning (ms): Min: 0.0, Avg: 0.6, Max: 1.1, Diff: 
>> 1.1, Sum: 10.9]
>>       [Update RS (ms): Min: 0.0, Avg: 0.0, Max: 0.1, Diff: 0.1, Sum: 
>> 0.3]
>>          [Processed Buffers: Min: 0, Avg: 0.2, Max: 1, Diff: 1, Sum: 4]
>>       [Scan RS (ms): Min: 0.0, Avg: 0.0, Max: 0.1, Diff: 0.1, Sum: 0.5]
>>       [Code Root Scanning (ms): Min: 0.0, Avg: 0.0, Max: 0.1, Diff: 
>> 0.1, Sum: 0.2]
>>       [Object Copy (ms): Min: 0.0, Avg: 0.9, Max: 1.5, Diff: 1.5, 
>> Sum: 16.8]
>>       [Termination (ms): Min: 0.0, Avg: 0.3, Max: 0.4, Diff: 0.4, 
>> Sum: 5.7]
>>       [GC Worker Other (ms): Min: 0.0, Avg: 0.0, Max: 0.0, Diff: 0.0, 
>> Sum: 0.2]
>>       [GC Worker Total (ms): Min: 0.0, Avg: 1.9, Max: 2.8, Diff: 2.8, 
>> Sum: 34.6]
>>       [GC Worker End (ms): Min: 2669.1, Avg: 2669.1, Max: 2669.1, 
>> Diff: 0.0]
>>    [Code Root Fixup: 0.1 ms]
>>    [Code Root Migration: 0.4 ms]
>>    [Clear CT: 0.2 ms]
>>    [Other: 0.9 ms]
>>       [Choose CSet: 0.0 ms]
>>       [Ref Proc: 0.4 ms]
>>       [Ref Enq: 0.0 ms]
>>       [Free CSet: 0.4 ms]
>>    [Eden: 284.0M(284.0M)->0.0B(281.0M) Survivors: 16.0M->19.0M Heap: 
>> 382.8M(500.0M)->101.3M(500.0M)]
>> Heap after GC invocations=12 (full 0):
>>  garbage-first heap   total 512000K, used 103737K 
>> [0x00000000d0c00000, 0x00000000f0000000, 0x00000000f0000000)
>>   region size 1024K, 19 young (19456K), 19 survivors (19456K)
>>  compacting perm gen  total 31744K, used 30804K [0x00000000f0000000, 
>> 0x00000000f1f00000, 0x0000000100000000)
>>    the space 31744K,  97% used [0x00000000f0000000, 
>> 0x00000000f1e152e0, 0x00000000f1e15400, 0x00000000f1f00000)
>> No shared spaces configured.
>> }
>>  [Times: user=0.04 sys=0.00, real=0.00 secs]
>>
>> ------------------------------
>>
>> Snippet from program 2's GC log
>>
>> {Heap before GC invocations=13 (full 0):
>>  garbage-first heap   total 1024000K, used 852897K 
>> [0x00000000b1800000, 0x00000000f0000000, 0x00000000f0000000)
>>   region size 1024K, 368 young (376832K), 15 survivors (15360K)
>>  compacting perm gen  total 31744K, used 30938K [0x00000000f0000000, 
>> 0x00000000f1f00000, 0x0000000100000000)
>>    the space 31744K,  97% used [0x00000000f0000000, 
>> 0x00000000f1e36bf8, 0x00000000f1e36c00, 0x00000000f1f00000)
>> No shared spaces configured.
>> 3.623: [GC pause (young)
>> Desired survivor size 24117248 bytes, new threshold 15 (max 15)
>> - age   1:    8124360 bytes,    8124360 total
>>  3.623: [G1Ergonomics (CSet Construction) start choosing CSet, 
>> _pending_cards: 11449, predicted base time: 26.10 ms, remaining time: 
>> 173.90 ms, target pause time: 200.00 ms]
>>  3.623: [G1Ergonomics (CSet Construction) add young regions to CSet, 
>> eden: 353 regions, survivors: 15 regions, predicted young region 
>> time: 52.90 ms]
>>  3.623: [G1Ergonomics (CSet Construction) finish choosing CSet, eden: 
>> 353 regions, survivors: 15 regions, old: 0 regions, predicted pause 
>> time: 79.00 ms, target pause time: 200.00 ms]
>>  3.656: [G1Ergonomics (Heap Sizing) attempt heap expansion, reason: 
>> recent GC overhead higher than threshold after GC, recent GC 
>> overhead: 44.58 %, threshold: 10.00 %, uncommitted: 0 bytes, 
>> calculated expansion amount: 0 bytes (20.00 %)]
>> , 0.0330160 secs]
>>    [Parallel Time: 28.1 ms, GC Workers: 18]
>>       [GC Worker Start (ms): Min: 3623.0, Avg: 3627.2, Max: 3640.1, 
>> Diff: 17.1]
>>       [Ext Root Scanning (ms): Min: 0.0, Avg: 0.6, Max: 1.0, Diff: 
>> 1.0, Sum: 11.0]
>>       [SATB Filtering (ms): Min: 0.0, Avg: 0.0, Max: 0.1, Diff: 0.1, 
>> Sum: 0.1]
>>       [Update RS (ms): Min: 0.0, Avg: 3.0, Max: 5.4, Diff: 5.4, Sum: 
>> 54.4]
>>          [Processed Buffers: Min: 0, Avg: 3.6, Max: 10, Diff: 10, 
>> Sum: 64]
>>       [Scan RS (ms): Min: 0.0, Avg: 0.1, Max: 0.6, Diff: 0.6, Sum: 2.7]
>>       [Code Root Scanning (ms): Min: 0.0, Avg: 0.0, Max: 0.0, Diff: 
>> 0.0, Sum: 0.0]
>>       [Object Copy (ms): Min: 10.5, Avg: 19.8, Max: 22.4, Diff: 11.9, 
>> Sum: 355.7]
>>       [Termination (ms): Min: 0.0, Avg: 0.3, Max: 0.4, Diff: 0.4, 
>> Sum: 5.0]
>>       [GC Worker Other (ms): Min: 0.0, Avg: 0.0, Max: 0.1, Diff: 0.0, 
>> Sum: 0.4]
>>       [GC Worker Total (ms): Min: 10.9, Avg: 23.8, Max: 28.0, Diff: 
>> 17.1, Sum: 429.3]
>>       [GC Worker End (ms): Min: 3651.0, Avg: 3651.0, Max: 3651.0, 
>> Diff: 0.0]
>>    [Code Root Fixup: 0.0 ms]
>>    [Code Root Migration: 0.0 ms]
>>    [Clear CT: 0.2 ms]
>>    [Other: 4.6 ms]
>>       [Choose CSet: 0.0 ms]
>>       [Ref Proc: 3.9 ms]
>>       [Ref Enq: 0.0 ms]
>>       [Free CSet: 0.6 ms]
>>    [Eden: 353.0M(353.0M)->0.0B(330.0M) Survivors: 15.0M->35.0M Heap: 
>> 832.9M(1000.0M)->500.4M(1000.0M)]
>> Heap after GC invocations=14 (full 0):
>>  garbage-first heap   total 1024000K, used 512417K 
>> [0x00000000b1800000, 0x00000000f0000000, 0x00000000f0000000)
>>   region size 1024K, 35 young (35840K), 35 survivors (35840K)
>>  compacting perm gen  total 31744K, used 30938K [0x00000000f0000000, 
>> 0x00000000f1f00000, 0x0000000100000000)
>>    the space 31744K,  97% used [0x00000000f0000000, 
>> 0x00000000f1e36bf8, 0x00000000f1e36c00, 0x00000000f1f00000)
>> No shared spaces configured.
>> }
>>  [Times: user=0.39 sys=0.00, real=0.03 secs]
>>
>> ------------------------------
>>
>> I would like to understand why the object copy for program 2 takes 
>> much longer.
>>
>> How should I debug this further?
>>
>> Please let me know if you need more information or would like me to 
>> run anything.
>>
>> Thank you.
>>
>> Amarin
>> _______________________________________________
>> hotspot-gc-use mailing list
>> hotspot-gc-use at openjdk.java.net
>> http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use


