G1: higher perm gen footprint or a possible perm gen leak?
Jon Masamitsu
jon.masamitsu at oracle.com
Mon Jan 6 12:01:41 PST 2014
On 01/03/2014 12:46 PM, Wolfgang Pedot wrote:
> Looks like the mail you quoted (from Jose Otavio Carlomagno Filho) was
> in response to mine but I have not received it...
>
> Just to clarify:
> I know why permGen fills up and it's expected behaviour in this
> application. Having 1-2 full GCs a day is certainly not ideal, but it's
> also no killer, and I like how G1 handles the young/old heap. What makes
> me wonder is why after every 4th full GC permGen usage drops a good
> 250MB lower than in the 3 collects before, and there is space for
> significantly more classes afterwards (165k vs 125k). Something else in
> permGen must get cleaned up at that time...
> That rhythm has kept constant so far, no matter how much time passes
> between full GCs.
>
> I don't really think G1 causes this 3-1 rhythm specifically, but what's
> interesting is that CMS with ClassUnloading never got significantly
> below that 0.8GB, if I remember correctly.
Try
-XX:MarkSweepAlwaysCompactCount=1
which should make every full GC compact out all
the dead space.
Alternatively try
-XX:MarkSweepAlwaysCompactCount=8
and see if that changes the pattern.
product(uintx, MarkSweepAlwaysCompactCount, 4,                             \
        "How often should we fully compact the heap (ignoring the dead "   \
        "space parameters)")
Jon
>
> regards
> Wolfgang
>
> PS: my older question to this mailing list about the possibility of
> incremental permGen collection with G1 is actually linked in that
> stackoverflow thread, so we have come full circle here ;)
>
>
>
> On 03.01.2014 19:05, YU ZHANG wrote:
>> Very interesting post. Like someone mentioned in the comments, with
>> -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled, CMS can unload
>> classes from PermGen during its concurrent cycles. But G1 can only
>> unload classes during a full GC, and a full GC in G1 is slow as it is
>> single threaded.
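
For comparison, a minimal CMS configuration with class unloading on
JDK 7 might look like this sketch (the permGen sizing and app.jar are
placeholders):

java -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled \
     -XX:PermSize=1g -XX:MaxPermSize=1g \
     -jar app.jar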
>>
>> Thanks,
>> Jenny
>>
>> On 1/3/2014 7:47 AM, Jose Otavio Carlomagno Filho wrote:
>>> We recently switched to G1 in our application and started experiencing
>>> this type of behaviour too. Turns out G1 was not causing the problem,
>>> it was only exposing it to us.
>>>
>>> Our application would generate a large number of proxy classes and
>>> that would cause the Perm Gen to fill up until a full GC was performed
>>> by G1. When using ParallelOldGC, this would not happen because full
>>> GCs would be executed much more frequently (when the old gen was
>>> full), which prevented the perm gen from filling up.
>>>
>>> You can find more info about our problem and our analysis here:
>>> http://stackoverflow.com/questions/20274317/g1-garbage-collector-perm-gen-fills-up-indefinitely-until-a-full-gc-is-performe
>>>
>>> I recommend you use a profiling tool to investigate the root cause of
>>> your Perm Gen getting filled up. There's a chance it is a leak, but as
>>> I said, in our case, it was our own application's fault and G1 exposed
>>> the problem to us.
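
On JDK 7, one quick first look that might help before reaching for a
full profiler is jmap's perm gen class loader statistics; a sketch,
where <pid> is a placeholder (the tool attaches to the target JVM and
can take a while on a large perm gen):

jmap -permstat <pid>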
>>>
>>> Regards,
>>> Jose
>>>
>>>
>>> On Fri, Jan 3, 2014 at 1:33 PM, Wolfgang Pedot
>>> <wolfgang.pedot at finkzeit.at> wrote:
>>>
>>> Hi,
>>>
>>> I am using G1 on 7u45 for an application server which has a "healthy"
>>> permGen churn because it generates a lot of short-lived dynamic
>>> classes (JavaScript). Currently permGen is sized at a little over 1GB
>>> and depending on usage there can be up to 2 full GCs per day (usually
>>> only 1). I have not noticed increased permGen usage with G1 (we
>>> increased its size just before switching to G1), but I have noticed
>>> something odd about the permGen usage after a collect. The class
>>> count always falls back to the same level, currently 65k, but the
>>> permGen usage after a collect can be either ~0.8GB or ~0.55GB. There
>>> are always 3 collects resulting in 0.8GB followed by one scoring
>>> 0.55GB, so there seems to be some kind of "rhythm" going on. The full
>>> GCs are always triggered by permGen getting full, and the loaded
>>> class count goes significantly higher after a 0.55GB collect (165k vs
>>> 125k), so I guess some classes just get unloaded later...
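
If it helps to watch those two numbers between full GCs, jstat on a
7u45 JVM should show them; a sketch, where <pid> and the 10s sampling
interval are placeholders:

jstat -gcold <pid> 10s     # PC/PU columns: permGen capacity/used (KB)
jstat -class <pid> 10s     # Loaded/Unloaded columns: class counts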
>>>
>>> I cannot tell if this behaviour is due to G1 or some other factor in
>>> this application, but I do know that I have no leak because the
>>> after-collect values are fairly stable over weeks.
>>>
>>> So I have not experienced this but am sharing anyway ;)
>>>
>>> happy new year
>>> Wolfgang
>>>
>>> On 03.01.2014 10:12, Srinivas Ramakrishna wrote:
>>> > I haven't narrowed it down sufficiently yet, but has anyone noticed
>>> > if G1 causes a higher perm gen footprint or, worse, a perm gen leak
>>> > perhaps? I do realize that G1 does not today (as of 7u40 at least)
>>> > collect the perm gen concurrently, rather deferring its collection
>>> > to a stop-world full gc. However, it has just come to my attention
>>> > that despite full stop-world gc's (on account of the perm gen
>>> > getting full), G1 still uses more perm gen space (in some instances
>>> > substantially more) than ParallelOldGC even after the full
>>> > stop-world gc's, in some of our experiments. (PS: Also noticed that
>>> > the default gc logging for G1 does not print the perm gen usage at
>>> > full gc, unlike other collectors; looks like an oversight in
>>> > logging, perhaps one that has been fixed recently; I was on 7u40,
>>> > I think.)
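
As a possible workaround until the logging is fixed, printing the heap
layout around each GC should include perm gen occupancy regardless of
collector; a sketch, with gc.log as a placeholder and "..." standing in
for the rest of the command line:

java ... -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
     -XX:+PrintHeapAtGC -Xloggc:gc.log ...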
>>> >
>>> > While I need to collect more data using non-ParallelOld, non-G1
>>> > collectors (especially CMS) to see how things look and to get
>>> > closer to the root cause, I wondered if anyone else had come across
>>> > a similar issue and wanted to check if this is a known issue.
>>> >
>>> > I'll post more details after gathering more data, but in case
>>> > anyone has experienced this, please do share.
>>> >
>>> > thank you in advance, and Happy New Year!
>>> > -- ramki
>>> >
>>> >