CRR (S): 7097586: G1: improve the per-space output when using jmap -heap
Tony Printezis
tony.printezis at oracle.com
Thu Jan 12 18:32:33 UTC 2012
Hi all,
I'd like a couple of code reviews for this change that enhances the heap
summary information generated by the SA (which is used for the jmap
-heap output):
http://cr.openjdk.java.net/~tonyp/7097586/webrev.0/
Currently, the heap summary generated for G1 mirrors what's generated
for the other GCs as closely as possible. Bengt made a good suggestion
that it'd be helpful to enhance the output with some G1-specific
information in order to make it more informative. The important changes
are the 15 lines or so that were changed in HeapSummary.java; the rest
is boilerplate needed to be able to access the relevant fields and
objects from the SA. I included before/after jmap -heap output below.
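For reviewers who'd rather not open the webrev right away, the
HeapSummary.java part essentially boils down to a small per-space
printing helper along the lines of the sketch below. This is only a
sketch of the output format, assuming the raw values have already been
read via the SA; the class, method and parameter names (G1SpacePrinter,
printG1Space, etc.) are made up for illustration and are not the names
used in the webrev.

// Hypothetical sketch of the per-space printing added for G1; the
// names here are illustrative, not the ones in the actual webrev.
public class G1SpacePrinter {
    private static final double M = 1024.0 * 1024.0;

    // Prints one space in the "regions / capacity / used / free /
    // % used" format shown in the AFTER output below.
    static void printG1Space(String name, long regions,
                             long capacityBytes, long usedBytes) {
        long freeBytes = capacityBytes - usedBytes;
        System.out.println(name);
        System.out.println("   regions  = " + regions);
        printValue("capacity", capacityBytes);
        printValue("used    ", usedBytes);
        printValue("free    ", freeBytes);
        double percentUsed =
            capacityBytes == 0 ? 0.0 : usedBytes * 100.0 / capacityBytes;
        System.out.println("   " + percentUsed + "% used");
    }

    private static void printValue(String label, long bytes) {
        System.out.println("   " + label + " = " + bytes
                           + " (" + (bytes / M) + "MB)");
    }

    public static void main(String[] args) {
        // Values copied from the sample AFTER output below.
        printG1Space("G1 Heap:", 57, 59768832L, 18018304L);
    }
}

Running its main() with the G1 Heap numbers from the AFTER output below
prints a block like the one shown there.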
Note that we actually had a small bug in the code that caused the sizing
information in the G1MonitoringSupport object to become inconsistent
between a cleanup and the subsequent GC: the old space information was
not updated to reflect any old regions reclaimed during cleanup. I fixed
this as part of this change too (I'll add a note to the CR).
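To make the inconsistency a bit more concrete, here is a toy model of
the accounting involved. It is purely illustrative Java, not the actual
G1MonitoringSupport code (which lives on the VM side), and it assumes
the old space figure is derived from the overall heap figures; the point
is simply that if the recalculation is not triggered at the end of
cleanup, the cached old space value stays stale until the next GC
refreshes it.

// Toy model of the monitoring accounting, for illustration only; the
// real G1MonitoringSupport is VM code and these names are made up.
class G1MonitoringModel {
    long edenUsed;      // bytes cached for the eden space
    long survivorUsed;  // bytes cached for the survivor space
    long oldUsed;       // bytes cached for the old space
    long overallUsed;   // bytes used in the whole heap

    // Recomputes the cached old-space figure from the overall figure.
    void updateSizes() {
        oldUsed = Math.max(0L, overallUsed - edenUsed - survivorUsed);
    }

    // Cleanup reclaims empty old regions. If the recalculation is not
    // done here, oldUsed keeps its pre-cleanup value until the next GC
    // refreshes it; doing the update keeps the values consistent.
    void cleanup(long reclaimedOldBytes) {
        overallUsed -= reclaimedOldBytes;
        updateSizes();
    }
}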
Tony
BEFORE:
using thread-local object allocation.
Garbage-First (G1) GC with 8 thread(s)

Heap Configuration:
   MinHeapFreeRatio = 40
   MaxHeapFreeRatio = 70
   MaxHeapSize      = 1073741824 (1024.0MB)
   NewSize          = 1048576 (1.0MB)
   MaxNewSize       = 4294967295 (4095.9999990463257MB)
   OldSize          = 4194304 (4.0MB)
   NewRatio         = 2
   SurvivorRatio    = 8
   PermSize         = 16777216 (16.0MB)
   MaxPermSize      = 67108864 (64.0MB)

Heap Usage:
G1 Young Generation
Eden Space:
   capacity = 19922944 (19.0MB)
   used     = 3145728 (3.0MB)
   free     = 16777216 (16.0MB)
   15.789473684210526% used
From Space:
   capacity = 2097152 (2.0MB)
   used     = 2097152 (2.0MB)
   free     = 0 (0.0MB)
   100.0% used
To Space:
   capacity = 0 (0.0MB)
   used     = 0 (0.0MB)
   free     = 0 (0.0MB)
   0.0% used
G1 Old Generation
   capacity = 19922944 (19.0MB)
   used     = 5849192 (5.578224182128906MB)
   free     = 14073752 (13.421775817871094MB)
   29.359074642783717% used
Perm Generation:
   capacity = 16777216 (16.0MB)
   used     = 2749208 (2.6218490600585938MB)
   free     = 14028008 (13.378150939941406MB)
   16.38655662536621% used

1719 interned Strings occupying 137520 bytes.
AFTER (I marked the changed lines with a leading "*"; note that now
there's only one Survivor section, as G1 does not have the concept of
two survivor spaces that are always allocated):
using thread-local object allocation.
Garbage-First (G1) GC with 8 thread(s)

Heap Configuration:
   MinHeapFreeRatio = 40
   MaxHeapFreeRatio = 70
   MaxHeapSize      = 67108864 (64.0MB)
   NewSize          = 1048576 (1.0MB)
   MaxNewSize       = 4294967295 (4095.9999990463257MB)
   OldSize          = 4194304 (4.0MB)
   NewRatio         = 2
   SurvivorRatio    = 8
   PermSize         = 16777216 (16.0MB)
   MaxPermSize      = 67108864 (64.0MB)
*  G1HeapRegionSize = 1048576 (1.0MB)

Heap Usage:
*G1 Heap:
*  regions  = 57
*  capacity = 59768832 (57.0MB)
*  used     = 18018304 (17.18359375MB)
*  free     = 41750528 (39.81640625MB)
*  30.146655701754387% used
G1 Young Generation:
Eden Space:
*  regions  = 3
   capacity = 30408704 (29.0MB)
   used     = 3145728 (3.0MB)
   free     = 27262976 (26.0MB)
   10.344827586206897% used
*Survivor Space:
*  regions  = 2
   capacity = 2097152 (2.0MB)
   used     = 2097152 (2.0MB)
   free     = 0 (0.0MB)
   100.0% used
G1 Old Generation:
*  regions  = 13
   capacity = 27262976 (26.0MB)
   used     = 12775424 (12.18359375MB)
   free     = 14487552 (13.81640625MB)
   46.85997596153846% used
Perm Generation:
   capacity = 16777216 (16.0MB)
   used     = 2741840 (2.6148223876953125MB)
   free     = 14035376 (13.385177612304688MB)
   16.342639923095703% used

1710 interned Strings occupying 136904 bytes.