Long Reference Processing Time

Yu Zhang yu.zhang at oracle.com
Wed May 20 22:35:21 UTC 2015


Can you dump the heap and examine it with Eclipse MAT or a similar tool?
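
For example, a heap dump can typically be captured with jmap or jcmd
(the <pid> and the output path below are placeholders):

    jmap -dump:live,format=b,file=/tmp/app-heap.hprof <pid>
    # or, equivalently:
    jcmd <pid> GC.heap_dump /tmp/app-heap.hprof

The resulting .hprof file can then be opened in Eclipse MAT.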

Thanks,
Jenny

On 5/20/2015 3:12 PM, Joy Xiong wrote:
> Thank you, Poonam.
>
> Also, is there a way to get more information on the weak references, 
> such as the reference names? Our application does not use weak 
> references directly, so the weak references most likely come from an 
> underlying library, and I'd like to know which library is using lots 
> of weak references.
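>
> (For context, one low-overhead way to get a first impression would be a
> class histogram filtered for reference types, with <pid> as a placeholder:
>
>     jmap -histo <pid> | grep -i reference
>
> and for attributing them to a specific library, a heap dump loaded into
> Eclipse MAT can list the WeakReference instances, for example via the OQL
> query SELECT * FROM java.lang.ref.WeakReference, and show what their
> referents are and which objects hold them.)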
>
> thanks,
> -Joy
>
>
>
> On Wednesday, May 20, 2015 2:26 PM, Poonam Bajaj Parhar 
> <poonam.bajaj at oracle.com> wrote:
>
>
> Hello Joy,
>
> 8u40 is the latest update release that contains new enhancements and 
> bug fixes, and 8u45 is the latest security release that includes 
> security fixes on top of 8u40.
>
> So, for your test run I think you can try with 8u40.
>
> regards,
> Poonam
>
> On 5/20/2015 1:17 PM, Joy Xiong wrote:
>> Yu and Poonam,
>>
>> Thank you for your quick response.
>> In terms of JDK version, we have 8u40 available, so I want to check 
>> with you how 8u40 differs from 8u45.
>>
>> thanks,
>> -Joy
>>
>>
>>
>> On Wednesday, May 20, 2015 11:18 AM, Poonam Bajaj Parhar 
>> <poonam.bajaj at oracle.com> wrote:
>>
>>
>> Hello Joy,
>>
>> Could you try running with the latest JDK 8 update release (8u45)? It 
>> looks like you are trying out G1 with 8u5, and there have been many 
>> G1 GC improvements and fixes since 8u5. Please test with the latest 8u 
>> release and let us know the results.
>>
>> Thanks,
>> Poonam
>>
>> On 5/20/2015 10:35 AM, Joy Xiong wrote:
>>> Hi All,
>>>
>>> I recently moved our application from CMS to G1 due to heap 
>>> fragmentation. Here is the full JVM command line used for the 
>>> application:
>>> /export/apps/jdk/JDK-1_8_0_5/bin/java -server -Xms5g -Xmx5g 
>>> -XX:+UseG1GC -XX:MaxGCPauseMillis=40 -XX:+ParallelRefProcEnabled 
>>> -XX:G1HeapRegionSize=8M -XX:ParallelGCThreads=22 
>>> -XX:MetaspaceSize=256m -XX:MaxMetaspaceSize=256m
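>>>
>>> (The per-reference-type timings, tenuring age tables, G1Ergonomics
>>> messages and application-stopped-time lines in the logs below come from
>>> GC diagnostic flags roughly along these lines, which are not shown in
>>> the command line above:
>>>
>>>     -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintGCTimeStamps
>>>     -XX:+PrintReferenceGC -XX:+PrintTenuringDistribution
>>>     -XX:+PrintAdaptiveSizePolicy -XX:+PrintGCApplicationStoppedTime)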
>>>
>>> With G1, I observe long reference processing times. They fall into 
>>> two categories:
>>> 1) They occur during young GC pauses. The processing time does not 
>>> make sense to me, as the majority of the time is spent processing 
>>> soft references, of which there are 0. Is there some hidden time 
>>> contributing to soft reference processing?
>>> 2) They occur in the remark pause of the concurrent marking cycle. Our 
>>> application has a large number of weak references, but I don't quite 
>>> understand why the processing time is so much larger with G1 than 
>>> with CMS (a simple log filter for spotting these phases is sketched 
>>> just below).
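>>>
>>> (These phases can be pulled out of the full GC log with a simple filter,
>>> for example:
>>>
>>>     grep "Ref Proc" gc.log
>>>     grep "WeakReference" gc.log
>>>
>>> where gc.log stands for the actual log file name; the first line shows
>>> the total reference-processing time of each young pause, the second the
>>> per-type counts and times, including the remark-time WeakReference
>>> counts.)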
>>>
>>> Detailed log records are shown below:
>>> *1. Processing soft references takes a long time*, even though there 
>>> are 0 soft references:
>>> 2015-05-15T19:39:57.849+0000: 30271.428: [GC pause (G1 Evacuation 
>>> Pause) (young)
>>> Desired survivor size 201326592 bytes, new threshold 15 (max 15)
>>> - age   1:  6197672 bytes,    6197672 total
>>> - age   2: 553864 bytes,    6751536 total
>>> - age   3: 321216 bytes,    7072752 total
>>> - age   4: 563120 bytes,    7635872 total
>>> - age   5: 261920 bytes,    7897792 total
>>> - age   6: 265768 bytes,    8163560 total
>>> - age   7: 319856 bytes,    8483416 total
>>> - age   8: 132328 bytes,    8615744 total
>>> - age   9: 153768 bytes,    8769512 total
>>> - age  10: 194256 bytes,    8963768 total
>>> - age  11:  64600 bytes,    9028368 total
>>> - age  12: 160208 bytes,    9188576 total
>>> - age  13:  69376 bytes,    9257952 total
>>> - age  14: 151832 bytes,    9409784 total
>>> - age  15: 186920 bytes,    9596704 total
>>>  30271.429: [G1Ergonomics (CSet Construction) start choosing CSet, 
>>> _pending_cards: 13708, predicted base time: 22.33 ms, remaining 
>>> time: 17.67 ms, target pause time: 40.00 ms]
>>>  30271.429: [G1Ergonomics (CSet Construction) add young regions to 
>>> CSet, eden: 380 regions, survivors: 2 regions, predicted young 
>>> region time: 5.51 ms]
>>>  30271.429: [G1Ergonomics (CSet Construction) finish choosing CSet, 
>>> eden: 380 regions, survivors: 2 regions, old: 0 regions, predicted 
>>> pause time: 27.83 ms, target pause time: 40.00 ms]
>>> 30271.445: [*SoftReference, 0 refs, 0.9021283 secs*]30272.347: 
>>> [WeakReference, 5 refs, 0.0031983 secs]30272.350: [FinalReference, 2 
>>> refs, 0.0019730 secs]30272.352: [PhantomReference, 102 refs, 
>>> 0.0019032 secs]30272.354: [JNI Weak Reference, 0.0000124 secs], 
>>> 0.9305765 secs]
>>>    [Parallel Time: 14.4 ms, GC Workers: 22]
>>>       [GC Worker Start (ms): Min: 30271429.4, Avg: 30271429.7, Max: 
>>> 30271429.9, Diff: 0.5]
>>>       [Ext Root Scanning (ms): Min: 4.3, Avg: 5.5, Max: 10.8, Diff: 
>>> 6.4, Sum: 120.1]
>>>       [Update RS (ms): Min: 0.0, Avg: 3.7, Max: 8.7, Diff: 8.7, Sum: 
>>> 80.8]
>>>  [Processed Buffers: Min: 0, Avg: 53.7, Max: 109, Diff: 109, Sum: 1181]
>>>       [Scan RS (ms): Min: 0.0, Avg: 1.6, Max: 4.9, Diff: 4.9, Sum: 35.0]
>>>       [Code Root Scanning (ms): Min: 0.0, Avg: 0.0, Max: 0.0, Diff: 
>>> 0.0, Sum: 0.1]
>>>       [Object Copy (ms): Min: 0.1, Avg: 2.5, Max: 3.1, Diff: 3.1, 
>>> Sum: 55.3]
>>> [Termination (ms): Min: 0.0, Avg: 0.3, Max: 0.4, Diff: 0.4, Sum: 7.3]
>>>       [GC Worker Other (ms): Min: 0.0, Avg: 0.1, Max: 0.3, Diff: 
>>> 0.2, Sum: 2.0]
>>>       [GC Worker Total (ms): Min: 13.4, Avg: 13.7, Max: 14.0, Diff: 
>>> 0.6, Sum: 300.7]
>>>       [GC Worker End (ms): Min: 30271443.3, Avg: 30271443.3, Max: 
>>> 30271443.5, Diff: 0.2]
>>>    [Code Root Fixup: 0.2 ms]
>>>    [Code Root Migration: 0.1 ms]
>>>    [Clear CT: 1.0 ms]
>>>    [Other: 914.8 ms]
>>>       [Choose CSet: 0.0 ms]
>>>       [Ref Proc: 910.3 ms]
>>>       [Ref Enq: 0.5 ms]
>>>       [Free CSet: 1.7 ms]
>>>    [Eden: 3040.0M(3040.0M)->0.0B(240.0M) Survivors: 16.0M->16.0M 
>>> Heap: 4588.4M(5120.0M)->1551.4M(5120.0M)]
>>>  [Times: user=0.29 sys=0.00, real=0.93 secs]
>>>
>>> *2. Processing weak references takes a long time*
>>> 93967.047: [SoftReference, 0 refs, 0.0032025 secs]93967.051: 
>>> [WeakReference, 1 refs, 0.0012743 secs]93967.052: [FinalReference, 2 
>>> refs, 0.0010594 secs]93967.053: [PhantomReference, 97 refs, 
>>> 0.0009133 secs]93967.054: [JNI Weak Reference, 0.0000160 secs], 
>>> 0.0455414 secs]
>>>    [Parallel Time: 33.1 ms, GC Workers: 22]
>>>       [GC Worker Start (ms): Min: 93967012.9, Avg: 93967013.3, Max: 
>>> 93967013.6, Diff: 0.7]
>>>       [Ext Root Scanning (ms): Min: 4.7, Avg: 5.6, Max: 15.1, Diff: 
>>> 10.4, Sum: 122.9]
>>>       [Code Root Marking (ms): Min: 0.0, Avg: 1.1, Max: 9.2, Diff: 
>>> 9.2, Sum: 25.2]
>>>       [Update RS (ms): Min: 0.0, Avg: 3.1, Max: 4.2, Diff: 4.2, Sum: 
>>> 67.3]
>>>  [Processed Buffers: Min: 0, Avg: 53.3, Max: 125, Diff: 125, Sum: 1173]
>>>       [Scan RS (ms): Min: 0.0, Avg: 1.1, Max: 1.5, Diff: 1.4, Sum: 24.6]
>>>       [Code Root Scanning (ms): Min: 0.0, Avg: 0.0, Max: 0.0, Diff: 
>>> 0.0, Sum: 0.1]
>>>       [Object Copy (ms): Min: 16.8, Avg: 21.0, Max: 21.8, Diff: 5.0, 
>>> Sum: 463.0]
>>> [Termination (ms): Min: 0.0, Avg: 0.1, Max: 0.1, Diff: 0.1, Sum: 1.4]
>>>       [GC Worker Other (ms): Min: 0.0, Avg: 0.2, Max: 0.6, Diff: 
>>> 0.5, Sum: 4.9]
>>>       [GC Worker Total (ms): Min: 31.8, Avg: 32.2, Max: 32.7, Diff: 
>>> 0.9, Sum: 709.3]
>>>       [GC Worker End (ms): Min: 93967045.3, Avg: 93967045.5, Max: 
>>> 93967045.8, Diff: 0.5]
>>>    [Code Root Fixup: 0.2 ms]
>>>    [Code Root Migration: 0.1 ms]
>>>    [Clear CT: 0.5 ms]
>>>    [Other: 11.7 ms]
>>>       [Choose CSet: 0.0 ms]
>>>       [Ref Proc: 7.8 ms]
>>>       [Ref Enq: 0.5 ms]
>>>       [Free CSet: 0.8 ms]
>>>    [Eden: 1696.0M(1696.0M)->0.0B(1544.0M) Survivors: 32.0M->32.0M 
>>> Heap: 4021.0M(5120.0M)->2321.6M(5120.0M)]
>>>  [Times: user=0.66 sys=0.00, real=0.04 secs]
>>> 2015-05-16T13:21:33.478+0000: 93967.057: [GC 
>>> concurrent-root-region-scan-start]
>>> 2015-05-16T13:21:33.479+0000: 93967.058: Total time for which 
>>> application threads were stopped: 0.0652331 seconds
>>> 2015-05-16T13:21:33.487+0000: 93967.066: [GC 
>>> concurrent-root-region-scan-end, 0.0082516 secs]
>>> 2015-05-16T13:21:33.487+0000: 93967.066: [GC concurrent-mark-start]
>>> 2015-05-16T13:21:33.888+0000: 93967.467: [GC concurrent-mark-end, 
>>> 0.4016735 secs]
>>> 2015-05-16T13:21:33.905+0000: 93967.484: [GC remark 93967.486: [GC 
>>> ref-proc93967.486: [SoftReference, 725 refs, 0.0043522 
>>> secs]93967.490: [*WeakReference, 1430199 refs, 0.7913479 
>>> secs*]93968.281: [FinalReference, 367 refs, 0.0036350 
>>> secs]93968.285: [PhantomReference, 221 refs, 0.0031875 
>>> secs]93968.288: [JNI Weak Reference, 0.0001281 secs], 1.0652076 
>>> secs], 1.0832167 secs]
>>>  [Times: user=15.10 sys=0.19, real=1.08 secs]
>>>
>>> Appreciate your help,
>>> -Joy
>>>
>>>
>>>
