java.lang.OutOfMemoryError: GC overhead limit exceeded

Bernd Eckenfels bernd at eckenfels.net
Fri Mar 7 02:04:04 PST 2014


Hello,

The timeouts can be caused by excessive GC pauses; it is typical to see them alongside an OOM. From your description it sounds like a classic memory leak. A heap dump analyzed in MAT will tell you where it happens. In addition to that, you can also check your gc.log file: it will most likely show a steady increase of the heap usage after GC and then, at some point, a larger number of back-to-back full GCs. Your quoted log already hints at this: the old generation occupancy after a full GC grows from 1501278K to 6031245K out of 6291456K, i.e. about 96% of the old generation is still occupied right after a full GC.
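
If you do not already have a heap dump, one simple way is to let the JVM write one automatically at the next OOM (the dump path below is only an example, pick a disk with roughly heap-sized free space):

    -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/path/to/dumps

Alternatively, "jmap -dump:live,format=b,file=heap.hprof <pid>" should also work on your JDK while the process is still up.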

BTW: MAT has a headless mode which can be used on the production server to parse the heap dump into its index files. This is the phase which needs the most RAM. If you do this, you can later copy the parsed results to your workstation and open them with a smaller heap.
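
Roughly like this (ParseHeapDump.sh ships with MAT; the dump file name is only an example):

    ./ParseHeapDump.sh /path/to/java_pid1234.hprof org.eclipse.mat.api:suspects

This writes the index files and a leak-suspects report next to the .hprof file; copy them together with the dump and the MAT UI will reuse the indexes instead of parsing again.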

-- 
http://bernd.eckenfels.net


-----Original Message-----
From: luke <luke.bike at gmail.com>
To: Jose Otavio Carlomagno Filho <jocf83 at gmail.com>, "hotspot-gc-use at openjdk.java.net" <hotspot-gc-use at openjdk.java.net>
Sent: Fri., 07 Mar 2014 9:09
Subject: Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

Thanks Jose,
I don't think my application has any "System.gc()" calls, but I'm working on
a very large application so I'll check whether an explicit call to GC has been
introduced somewhere.
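
As a first pass I'll probably just search the source tree for explicit calls, something like (the directory name is only an example):

    grep -rn "System.gc" src/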

I think the problem is, as you wrote, that "GC is running but is unable to
free space in the heap". Is it possible that the GC can't get native CPU
threads and so can't run correctly?

A few minutes before this OutOfMemoryError, in my JBoss log I see:

            SocketTimeoutException:
            Caused by: java.net.SocketTimeoutException: Read timed out

Could it be related to my problem?
thanks
luca



2014-03-06 20:53 GMT+01:00 Jose Otavio Carlomagno Filho <jocf83 at gmail.com>:

> If I'm not mistaken, "GC overhead limit exceeded" means the GC is
> running but is unable to free space in the heap.
>
> In many cases, this is caused by the application repeatedly calling
> "System.gc()", which normally triggers a full GC. You should check your
> application code and remove these calls if they exist.
>
> Additionally, you can add "-XX:+DisableExplicitGC" to your startup
> parameters; that way the JVM will not run a full GC when your application
> calls "System.gc()".
>
> Jose
>
>
> On Thu, Mar 6, 2014 at 12:13 PM, luke <luke.bike at gmail.com> wrote:
>
>> Hi,
>>
>> I'm getting java.lang.OutOfMemoryError in a Java application running on JBoss AS.
>> It's strange that the OutOfMemoryError happens when the application is not under much load.
>> In my application log I found this exception
>>
>> WARN  [org.jboss.mq.Connection] Connection failure, use
>> javax.jms.Connection.setExceptionListener() to handle this error and reconnect
>> org.jboss.mq.SpyJMSException: Exiting on IOE; - nested throwable:
>> (java.net.SocketTimeoutException: Read timed out)
>>     at org.jboss.mq.SpyJMSException.getAsJMSException(SpyJMSException.java:72)
>>     at org.jboss.mq.Connection.asynchFailure(Connection.java:423)
>>     at org.jboss.mq.il.uil2.UILClientILService.asynchFailure(UILClientILService.java:174)
>>     at org.jboss.mq.il.uil2.SocketManager$ReadTask.handleStop(SocketManager.java:466)
>>     at org.jboss.mq.il.uil2.SocketManager$ReadTask.run(SocketManager.java:395)
>>     at java.lang.Thread.run(Thread.java:619)
>> Caused by: java.net.SocketTimeoutException: Read timed out
>>
>> and after some minutes:
>>
>>
>> 2014-03-06 01:09:32,173 WARN  [org.jboss.mq.Connection] Exception listener ended abnormally:
>> java.lang.OutOfMemoryError: GC overhead limit exceeded
>>     at java.lang.ThreadLocal.createInheritedMap(ThreadLocal.java:217)
>>     at java.lang.Thread.init(Thread.java:358)
>>     at java.lang.Thread.<init>(Thread.java:445)
>>     at org.jboss.mq.SpyMessageConsumer.setMessageListener(SpyMessageConsumer.java:237)
>>     at it.oneans.iemx.qf.ejb.QueueService$QueueServiceExceptionListener.onException(QueueService.java:193)
>>     at org.jboss.mq.Connection$ExceptionListenerRunnable.run(Connection.java:1356)
>>     at java.lang.Thread.run(Thread.java:619)
>>
>> In my gc.log I can see a rapid increase in heap memory:
>>
>> 54967.049: [GC [PSYoungGen: 171815K->3032K(2024448K)]
>> 1716963K->1583328K(8315904K), 0.0466930 secs] [Times: user=0.20 sys=0.09,
>> real=0.04 secs]
>>
>> 54967.097: [Full GC (System) [PSYoungGen: 3032K->0K(2024448K)] [ParOldGen:
>> 1580296K->1501278K(6291456K)] 1583328K->1501278K(8315904K) [PSPermGen:
>> 230071K->229632K(239744K)], 4.5397660 secs] [Times: user=18.01 sys=2.81,
>> real=4.53 secs]
>>
>> ...
>>
>> 55546.522: [GC [PSYoungGen: 1883953K->129792K(1929216K)]
>> 6315956K->4689948K(8220672K), 0.7681860 secs] [Times: user=8.76 sys=0.61,
>> real=0.77 secs]
>>
>> 55561.317: [GC [PSYoungGen: 1890304K->124543K(1928448K)]
>> 6450460K->4814699K(8219904K), 1.8698640 secs] [Times: user=3.30 sys=0.26,
>> real=1.87 secs]
>>
>> ...
>>
>> 55754.485: [GC [PSYoungGen: 1753886K->116213K(1881920K)]
>> 7755780K->6232689K(8173376K), 0.5959420 secs] [Times: user=4.34 sys=0.30,
>> real=0.60 secs]
>>
>> 55755.083: [Full GC [PSYoungGen: 116213K->0K(1881920K)] [ParOldGen:
>> 6116476K->6031245K(6291456K)] 6232689K->6031245K(8173376K) [PSPermGen:
>> 229665K->222795K(231488K)], 36.6400980 secs] [Times: user=160.17 sys=8.40,
>> real=36.63 secs]
>>
>> Could the OutOfMemoryError be a side effect of not having enough free sockets on
>> the server, or something else?
>>
>>
>> thanks in advance for any suggestions
>> luca
>> P.S.: my GC flags:
>>                 -Xms6g -Xmx6g -XX:MaxPermSize=512m
>>                 -Dsun.rmi.dgc.client.gcInterval=2100000
>> -Dsun.rmi.dgc.server.gcInterval=2100000
>>                 -XX:+UseParallelOldGC -XX:+UseParallelGC
>>                 -XX:MaxHeapFreeRatio=70 -XX:MinHeapFreeRatio=40
>> -Xverify:none -XX:+BindGCTaskThreadsToCPUs
>>                 -XX:NewSize=2g -XX:MaxNewSize=2g -XX:SurvivorRatio=4
>>                  -Djava.awt.headless=true
>>
>> _______________________________________________
>> hotspot-gc-use mailing list
>> hotspot-gc-use at openjdk.java.net
>> http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use
>>
>>
>

