Scalability issues with ParallelOld

Jon Masamitsu Jon.Masamitsu at Sun.COM
Tue Mar 10 17:36:03 UTC 2009



On 03/10/09 09:36, Alex Aisinzon wrote:
> Jon
> 
> As always, your feedback is enlightening and very much
> appreciated.
> I have some additional comments/questions before closing that
> conversation thread:
> The server has 2 cores. We have servers with more cores (4 and 8) and
> will likely run some tests with those. On these servers, ParallelOldGC
> may or may not perform better, for the following reason: while I
> used a single JVM on that 2-core server, we would use 4 JVMs on an
> 8-core server (the same ratio of one JVM per 2 cores). In that case,
> the many threads used during a full GC would compete for resources
> with the other running JVMs.

Yes, multiple VMs doing GCs at the same time will
compete for resources.  That's not just true for UseParallelOldGC.
It's also the case with UseParallelGC, where parallel GC threads
are used for the young gen collection.  Consider setting
ParallelGCThreads explicitly if you see frequent instances of poor
scaling (i.e., if you look at the user time and the real time
and they show little scaling).
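
For example (a sketch only -- the heap sizes below are placeholders
for whatever you normally use, and "YourApp" just stands in for your
launcher class), each of four VMs on an 8-core box could be started
with

   java -XX:+UseParallelOldGC -XX:ParallelGCThreads=2 \
        -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
        -Xms1024m -Xmx1024m YourApp

If I remember the default right, without an explicit setting each VM
sizes its GC thread pool from the number of processors it sees, so
four VMs on 8 cores would each start 8 GC threads and compete harder
than they need to.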

> Some Sun benchmarks (for example
> http://www.spec.org/jbb2005/results/res2008q2/jbb2005-20080506-00485.html)
> suggest that ParallelOldGC may be beneficial even with 2 cores. My
> hypothesis is that this is because the application tested (the
> SPECJBB2005 code) uses far fewer long lived objects, as shown by the
> JVM tunings (-Xmx3350m -Xms3350m -Xmn2800m implies that the tenured
> generation is only 550MB).

That benchmark benefits from the young generation parallel GC
(UseParallelGC) which scales better than the old generation
parallel GC.  Also, as you note, much of the data is short
lived, so it gets collected by the young generation collections.
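
To put numbers on the layout you quoted (just restating your own
arithmetic):

   -Xmx3350m -Xms3350m -Xmn2800m
   tenured (old) gen = 3350m - 2800m = 550m

so roughly 84% of that heap is young gen, which fits a mostly
short-lived workload and is more or less the opposite of the split
you would want for an application with a large set of long lived
objects.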

> 
> Thanks again for your thoughts.

If you do any more CMS experiments, send me a log and
I'll look to see if it is worth pursuing.


> 
> Regards
> 
> Alex Aisinzon
> 
> -----Original Message-----
> From: Jon.Masamitsu at Sun.COM [mailto:Jon.Masamitsu at Sun.COM] 
> Sent: Tuesday, March 10, 2009 8:57 AM
> To: Alex Aisinzon
> Cc: hotspot-gc-use at openjdk.java.net
> Subject: Re: Scalability issues with ParallelOld
> 
> 
> 
> On 03/09/09 16:14, Alex Aisinzon wrote:
>> Hi all
>>
>> I am experimenting with ParallelOldGC in our performance testing
>> environment.
>> The server is a Dual Core Opteron 280 (old hardware with few cores by
>> today's standards).
>> With Sun JDK 1.5, full collections with ParallelGC and 2 GC threads
>> last between 5 and 7 seconds. I tried ParallelOldGC with 2 GC threads;
>> the full collections take almost twice as long.
>> I then tried Sun JDK 1.6 to see if it was any better. It is not
>> significantly better.
> 
> In the logs with UseParallelOldGC I see an average major pause of about
> 14 sec with 1.5 and about 9.3 sec with 1.6.  That's about the
> improvement I would expect.
> 
> With UseParallelOldGC, using 2 GC threads is about the break-even point,
> though that depends on the application.  I would not expect UseParallelOldGC
> with 2 GC threads to be better than UseParallelGC; UseParallelOldGC
> does do more work.  If you look at the last entry in the 1.6 log
> 
> 3034.907: [Full GC [PSYoungGen: 12157K->0K(276160K)] [ParOldGen: 
> 2324556K->828465K(2330176K)] 2336714K->828465K(2606336K) [PSPermGen: 
> 82577K->82549K(102400K)], 9.0664330 secs] [Times: user=16.81 sys=0.03, 
> real=9.07 secs]
> 
> the user time is 16.81 secs and the real time is 9 secs, so that says to
> me that we have both GC threads working in parallel and perhaps doing
> twice the work of UseParallelGC on a full collection.
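>
> A quick sanity check on any of these entries is to divide the user
> time by the real time:
>
>    user / real = 16.81 / 9.07 = ~1.85
>
> With 2 GC threads a ratio close to 2 means both threads are doing
> useful work for most of the pause; a ratio close to 1 would mean the
> collection is effectively running single threaded, which would be a
> sign of poor scaling.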
> 
> 
>> I have enclosed the logs (Sun JDK 1.5 with ParallelGC and ParallelOldGC,
>> and Sun JDK 1.6 with ParallelOldGC).
>>
>> I have also experimented with CMS, with mixed results: I could not get
>> it to work when using the 64-bit release and double the heap. With the
>> 32-bit release, it showed some rare but longer pauses than ParallelGC,
>> so it did not seem a good alternative to ParallelGC for consistently
>> short pauses.
> 
> CMS sometimes takes quite a bit of tuning to run well.  The goal with
> CMS is to avoid the full GC's (the ones analogous to the full
> GC's with UseParallelGC or UseParallelOldGC).
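>
> If you want to give it another try, a common starting point (the
> occupancy fraction here is only an illustrative value to tune from,
> not a recommendation for your app) is something like
>
>    -XX:+UseConcMarkSweepGC -XX:+UseParNewGC \
>    -XX:CMSInitiatingOccupancyFraction=70 \
>    -XX:+UseCMSInitiatingOccupancyOnly \
>    -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
>
> and then watch the log for "concurrent mode failure" entries, which
> are the cases that fall back to a long stop-the-world collection.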
> 
>> One thing to note is that our application has a large number of long
>> lived objects. My experience is that a lot of long lived objects make
>> full collections longer.
> 
> This (plus the need for low pauses) would indicate that CMS may be a good
> choice, although CMS works best when there is excess processing power
> that CMS can use concurrently with the application.  Does your
> platform have 2 or 4 hardware threads?
> 
>> What are my options to consistently reduce the longest GC pauses?
>>
>> Would our application profile (lots of long lived objects) make it a
>> good candidate for the coming low-pause collector, aka G1?
> 
> Depends on lots of things, but part of the design of G1 is
> to do collections of the heap in increments.  You would
> certainly get shorter pauses than a full GC, but you would get
> more collections (more, shorter collections).  G1 also does work
> concurrently with the application, so having available processing
> power helps.
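>
> If you want to try it once it is in a release you can run, my
> recollection is that the early builds gate it behind the experimental
> options flag, so the command line would look something like
>
>    -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC \
>    -XX:MaxGCPauseMillis=200
>
> where the 200ms pause target is only an example value and only a hint
> to the collector, not a guarantee.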
> 
> 
>> Thanks in advance
>>
>> Alex Aisinzon
>>
>> -----Original Message-----
>> From: hotspot-gc-use-bounces at openjdk.java.net
>> [mailto:hotspot-gc-use-bounces at openjdk.java.net] On Behalf Of
>> hotspot-gc-use-request at openjdk.java.net
>> Sent: Wednesday, February 25, 2009 8:41 PM
>> To: hotspot-gc-use at openjdk.java.net
>> Subject: hotspot-gc-use Digest, Vol 15, Issue 7
>>
>> Send hotspot-gc-use mailing list submissions to
>> 	hotspot-gc-use at openjdk.java.net
>>
>> To subscribe or unsubscribe via the World Wide Web, visit
>> 	http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use
>> or, via email, send a message with subject or body 'help' to
>> 	hotspot-gc-use-request at openjdk.java.net
>>
>> You can reach the person managing the list at
>> 	hotspot-gc-use-owner at openjdk.java.net
>>
>> When replying, please edit your Subject line so it is more specific
>> than "Re: Contents of hotspot-gc-use digest..."
>>
>>
>> Today's Topics:
>>
>>    1. Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64
>>       platform within data processing application (Jon Masamitsu)
>>    2. Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64
>>       platform within data processing application (Y Srinivas
>>       Ramakrishna)
>>
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Wed, 25 Feb 2009 14:42:32 -0800
>> From: Jon Masamitsu <Jon.Masamitsu at Sun.COM>
>> Subject: Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64
>> 	platform	within data processing application
>> To: David Sitsky <sits at nuix.com>
>> Cc: Y Srinivas Ramakrishna <Y.S.Ramakrishna at Sun.COM>,
>> 	hotspot-gc-use at openjdk.java.net
>> Message-ID: <49A5C958.8070704 at sun.com>
>> Content-Type: text/plain; charset=us-ascii
>>
>> David,
>>
>> This is an educated guess but I would say that no out-of-memory
>> was thrown because there was still significant space in the
>> young gen after a collection.
>>
>> 62712.664: [Full GC [PSYoungGen: 98240K->98240K(214720K)] [PSOldGen: 
>> 683678K->683678K(699072K)] 781918K->781918K(913792K) [PSPermGen:
>>
>> The young gen capacity is 214720K and the used space
>> in the young gen is 98240K.  The missing space may be in the
>> survivor spaces, which are not directly available to the
>> application with the UseParallelGC collector.  The logic
>> that would throw the out-of-memory is very conservative
>> and probably does not allow for such a case.
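>>
>> To put a number on it, the apparent headroom in the young gen in that
>> entry is
>>
>>    214720K - 98240K = 116480K  (roughly 114M)
>>
>> and, as I said, much of that is likely survivor space that ordinary
>> allocation cannot use directly.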
>>
>> Jon
>>
>>
>>
>> David Sitsky wrote On 02/25/09 13:59,:
>>
>>> Hi Ramki,
>>>
>>> The next message I posted to the hotspot list showed a GC trace when I
>>> allocated more memory (an extra 300 megs), where everything worked fine.
>>> I am well aware that the heap size for this particular application and
>>> data set was too small.  My run from last night with the extra heap is
>>> still running nicely.
>>>
>>> I reported this issue, because it seemed to me no progress was being
>>> made and no OutOfMemoryErrors were being generated.  The application was
>>> effectively "stuck" making no progress at all.  I couldn't even connect
>>> to it with jconsole, although jstack worked fine.
>>>
>>> My understanding is this condition is meant to be detected, and
>>> OutOfMemoryError is meant to be thrown, but perhaps I am mistaken?  This
>>> is with the parallel GC.
>>>
>>> Cheers,
>>> David
>>>
>>> Y Srinivas Ramakrishna wrote:
>>>  
>>>
>>>> Doesn't the heap look too full?
>>>> If it's a 64-bit JVM, why use such an oversubscribed
>>>> and small heap? Either make the old gen bigger or make the
>>>> young gen smaller (giving that space to the older gen)
>>>> so that each scavenge does not degenerate to a full gc
>>>> as in your trace below.
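>>>>
>>>> For example (the sizes here are only an illustration, not a tuned
>>>> recommendation), something along the lines of
>>>>
>>>>    -Xmx1200m -Xmn64m
>>>>
>>>> would both grow the heap and hand most of it to the old gen, which
>>>> is where the data in this trace clearly lives.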
>>>>
>>>> This discussion probably belongs on hotspot-gc-use at o.j.n list
>>>> so i have cross-posted over to that list with a bcc to
>>>> the hotspot-dev list.
>>>>
>>>> Also the GC tuning guides to be found here might be useful
>>>> reading:-
>>>>
>>>> http://java.sun.com/javase/technologies/hotspot/gc/index.jsp
>>>>
>>>> -- ramki
>>>>
>>>> ----- Original Message -----
>>>> From: Jon Masamitsu <Jon.Masamitsu at Sun.COM>
>>>> Date: Tuesday, February 24, 2009 10:00 pm
>>>> Subject: Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64
>> platform within data processing application
>>>> To: David Sitsky <sits at nuix.com>
>>>> Cc: hotspot-dev at openjdk.java.net, Tom Rodriguez
>> <Thomas.Rodriguez at Sun.COM>
>>>>    
>>>>
>>>>> David,
>>>>>
>>>>> Can you also send a GC log from a run where there
>>>>> is not a problem?  As I understand it, that would
>>>>> be a 32bit run.
>>>>>
>>>>> Jon
>>>>>
>>>>> David Sitsky wrote On 02/24/09 16:04,:
>>>>>
>>>>>      
>>>>>
>>>>>> Jon Masamitsu wrote:
>>>>>>
>>>>>>
>>>>>>        
>>>>>>
>>>>>>> Jon Masamitsu wrote On 02/23/09 17:20,:
>>>>>>>
>>>>>>>   
>>>>>>>
>>>>>>>          
>>>>>>>
>>>>>>>> ...
>>>>>>>>
>>>>>>>> Increase the heap by 30%.  Also increase the perm gen size
>>>>>>>> (-XX:MaxPermSize=<nn>).
>>>>>>>>
>>>>>>>> Please use -XX:+PrintGCDetails -XX:+PrintGCTimeStamps when
>>>>>>>> gathering the GC logs.
>>>>>>>> If you've already gathered some, send those, but in future runs
>>>>>>>> use the above.
>>>>>>>>
>>>>>> Here is a sample of output from a stuck process.  You can see it's
>>>>>> doing a full GC about every 3 seconds, and it seems as if there is
>>>>>> little progress.
>>>>>>
>>>>>> Please let me know if you need more information.
>>>>>>
>>>>>> Cheers,
>>>>>> David
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>        
>>>>>>
>>> _______________________________________________
>>> hotspot-gc-use mailing list
>>> hotspot-gc-use at openjdk.java.net
>>> http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use
>>>  
>>>
>>
>>
>> ------------------------------
>>
>> Message: 2
>> Date: Wed, 25 Feb 2009 20:41:10 -0800
>> From: Y Srinivas Ramakrishna <Y.S.Ramakrishna at Sun.COM>
>> Subject: Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64
>> 	platform	within data processing application
>> To: David Sitsky <sits at nuix.com>
>> Cc: hotspot-gc-use at openjdk.java.net
>> Message-ID: <71e0b0454a6.49a5ace6 at sun.com>
>> Content-Type: text/plain; charset=us-ascii
>>
>>
>> Yes, sorry, my bad.
>> I think Jon can explain that the trace you included
>> will show metrics (GC overhead and Space free(d)) that somehow
>> fall below the thresholds that trigger the GC overhead related
>> OOM. These thresholds can of course be modified via suitable
>> JVM options.
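>>
>> For the record, the knobs in question for the parallel collector are
>> (defaults shown; I'm going from memory on the exact names, so check
>> the docs):
>>
>>    -XX:+UseGCOverheadLimit   enables the check (on by default)
>>    -XX:GCTimeLimit=98        more than 98% of total time in GC ...
>>    -XX:GCHeapFreeLimit=2     ... while less than 2% of the heap is
>>                              recovered, before the OOM is thrown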
>>
>> -- ramki
>>
>> ----- Original Message -----
>> From: David Sitsky <sits at nuix.com>
>> Date: Wednesday, February 25, 2009 1:59 pm
>> Subject: Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64
>> platform within data processing application
>> To: Y Srinivas Ramakrishna <Y.S.Ramakrishna at Sun.COM>
>> Cc: hotspot-gc-use at openjdk.java.net
>>
>>
>>> Hi Ramki,
>>>
>>> The next message I posted to the hotspot list showed a GC trace when I
>>> allocated more memory (an extra 300 megs), where everything worked 
>>> fine.  I am well aware that the heap size for this particular 
>>> application and data set was too small.  My run from last night with 
>>> the extra heap is still running nicely.
>>>
>>> I reported this issue, because it seemed to me no progress was being 
>>> made and no OutOfMemoryErrors were being generated.  The application 
>>> was effectively "stuck" making no progress at all.  I couldn't even 
>>> connect to it with jconsole, although jstack worked fine.
>>>
>>> My understanding is this condition is meant to be detected, and 
>>> OutOfMemoryError is meant to be thrown, but perhaps I am mistaken?  
>>> This is with the parallel GC.
>>>
>>> Cheers,
>>> David
>>>
>>> Y Srinivas Ramakrishna wrote:
>>>> Doesn't the heap look too full?
>>>> If it's a 64-bit JVM, why use such an oversubscribed
>>>> and small heap? Either make the old gen bigger or make the
>>>> young gen smaller (giving that space to the older gen)
>>>> so that each scavenge does not degenerate to a full gc
>>>> as in your trace below.
>>>>
>>>> This discussion probably belongs on hotspot-gc-use at o.j.n list
>>>> so i have cross-posted over to that list with a bcc to
>>>> the hotspot-dev list.
>>>>
>>>> Also the GC tuning guides to be found here might be useful
>>>> reading:-
>>>>
>>>> http://java.sun.com/javase/technologies/hotspot/gc/index.jsp
>>>>
>>>> -- ramki
>>>>
>>>> ----- Original Message -----
>>>> From: Jon Masamitsu <Jon.Masamitsu at Sun.COM>
>>>> Date: Tuesday, February 24, 2009 10:00 pm
>>>> Subject: Re: 100% CPU usage in "VM Thread" for Hotspot 10/11 on x64 
>>> platform within data processing application
>>>> To: David Sitsky <sits at nuix.com>
>>>> Cc: hotspot-dev at openjdk.java.net, Tom Rodriguez
>> <Thomas.Rodriguez at Sun.COM>
>>>>> David,
>>>>>
>>>>> Can you also send a GC log from a run where there
>>>>> is not a problem?  As I understand it, that would
>>>>> be a 32bit run.
>>>>>
>>>>> Jon
>>>>>
>>>>> David Sitsky wrote On 02/24/09 16:04,:
>>>>>
>>>>>> Jon Masamitsu wrote:
>>>>>>
>>>>>>> Jon Masamitsu wrote On 02/23/09 17:20,:
>>>>>>>
>>>>>>>   
>>>>>>>> ...
>>>>>>>>
>>>>>>>> Increase the heap by 30%.  Also increase the perm gen size
>>>>>>>> (-XX:MaxPermSize=<nn>).
>>>>>>>>
>>>>>>>> Please use -XX:+PrintGCDetails -XX:+PrintGCTimeStamps when
>>>>>>>> gathering the GC logs.
>>>>>>>> If you've already gathered some, send those, but in future runs
>>>>>>>> use the above.
>>>>>> Here is a sample of output from a stuck process.  You can see it's
>>>>>> doing a full GC about every 3 seconds, and it seems as if there is
>>>>>> little progress.
>>>>>> Please let me know if you need more information.
>>>>>>
>>>>>> Cheers,
>>>>>> David
>>>>>>
>>>>>> 62402.320: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8063178 secs] [Times:
> user=2.81
>>> sys=0.00, real=2.81 secs]
>>>>>> 62405.128: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8029997 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.81 secs]
>>>>>> 62407.932: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7917325 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62410.725: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7891387 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62413.515: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7649110 secs] [Times:
> user=2.76
>>> sys=0.00, real=2.76 secs]
>>>>>> 62416.281: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7803983 secs] [Times:
> user=2.78
>>> sys=0.00, real=2.78 secs]
>>>>>> 62419.063: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7643979 secs] [Times:
> user=2.76
>>> sys=0.00, real=2.76 secs]
>>>>>> 62421.828: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8114336 secs] [Times:
> user=2.81
>>> sys=0.00, real=2.81 secs]
>>>>>> 62424.640: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7964912 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62427.438: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8107278 secs] [Times:
> user=2.81
>>> sys=0.00, real=2.81 secs]
>>>>>> 62430.249: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 3.2345212 secs] [Times:
> user=2.84
>>> sys=0.00, real=3.24 secs]
>>>>>> 62433.484: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8341520 secs] [Times:
> user=2.82
>>> sys=0.00, real=2.82 secs]
>>>>>> 62436.319: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8698768 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.87 secs]
>>>>>> 62439.190: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9323230 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.92 secs]
>>>>>> 62442.124: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9644960 secs] [Times:
> user=2.96
>>> sys=0.00, real=2.96 secs]
>>>>>> 62445.089: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 3.0059221 secs] [Times:
> user=3.00
>>> sys=0.00, real=3.00 secs]
>>>>>> 62448.095: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9832815 secs] [Times:
> user=3.00
>>> sys=0.00, real=2.99 secs]
>>>>>> 62451.079: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9587156 secs] [Times:
> user=2.93
>>> sys=0.00, real=2.95 secs]
>>>>>> 62454.039: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9488345 secs] [Times:
> user=2.92
>>> sys=0.00, real=2.95 secs]
>>>>>> 62456.988: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8969788 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.90 secs]
>>>>>> 62459.886: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8794991 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62462.766: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683675K->683675K(699072K)] 781915K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8842411 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.89 secs]
>>>>>> 62465.651: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683675K(699072K)] 781916K->781915K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8669173 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.85 secs]
>>>>>> 62468.519: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8664429 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.86 secs]
>>>>>> 62471.386: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8844494 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.89 secs]
>>>>>> 62474.271: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8648398 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.87 secs]
>>>>>> 62477.137: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8971068 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.90 secs]
>>>>>> 62480.034: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8655618 secs] [Times:
> user=2.86
>>> sys=0.00, real=2.86 secs]
>>>>>> 62482.901: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 3.0366140 secs] [Times:
> user=2.78
>>> sys=0.00, real=3.04 secs]
>>>>>> 62485.939: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8541753 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.85 secs]
>>>>>> 62488.794: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8582816 secs] [Times:
> user=2.86
>>> sys=0.00, real=2.86 secs]
>>>>>> 62491.653: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8673218 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.86 secs]
>>>>>> 62494.521: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9014120 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.90 secs]
>>>>>> 62497.424: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8805843 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62500.305: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8905128 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62503.196: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9052007 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.92 secs]
>>>>>> 62506.102: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9004575 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.90 secs]
>>>>>> 62509.003: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9160655 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.92 secs]
>>>>>> 62511.920: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9013277 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.90 secs]
>>>>>> 62514.822: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8982061 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62517.721: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8922437 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62520.614: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8873520 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.89 secs]
>>>>>> 62523.502: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8805296 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62526.383: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8958714 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.89 secs]
>>>>>> 62529.279: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8735384 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62532.154: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8705676 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.87 secs]
>>>>>> 62535.025: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8723947 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.87 secs]
>>>>>> 62537.898: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8624400 secs] [Times:
> user=2.86
>>> sys=0.00, real=2.86 secs]
>>>>>> 62540.761: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8245748 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62543.587: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8432269 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62546.432: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8394157 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62549.272: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8471951 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.85 secs]
>>>>>> 62552.121: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8584107 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.86 secs]
>>>>>> 62554.981: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683676K->683676K(699072K)] 781916K->781916K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8376807 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62557.820: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8402486 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62560.661: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8482704 secs] [Times:
> user=2.86
>>> sys=0.00, real=2.85 secs]
>>>>>> 62563.511: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8115973 secs] [Times:
> user=2.81
>>> sys=0.00, real=2.81 secs]
>>>>>> 62566.324: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8523278 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.86 secs]
>>>>>> 62569.177: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8128563 secs] [Times:
> user=2.81
>>> sys=0.00, real=2.81 secs]
>>>>>> 62571.990: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7830644 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62574.774: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8065106 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.81 secs]
>>>>>> 62577.582: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7892171 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62580.372: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8059306 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62583.179: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8641470 secs] [Times:
> user=2.82
>>> sys=0.00, real=2.86 secs]
>>>>>> 62586.044: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8421364 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62588.887: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8852699 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62591.773: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9164279 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.92 secs]
>>>>>> 62594.690: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9450010 secs] [Times:
> user=2.95
>>> sys=0.00, real=2.95 secs]
>>>>>> 62597.636: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9744636 secs] [Times:
> user=2.98
>>> sys=0.00, real=2.98 secs]
>>>>>> 62600.611: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9900849 secs] [Times:
> user=2.99
>>> sys=0.00, real=3.00 secs]
>>>>>> 62603.602: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.6332370 secs] [Times:
> user=2.62
>>> sys=0.00, real=2.62 secs]
>>>>>> 62606.236: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9801260 secs] [Times:
> user=2.95
>>> sys=0.00, real=2.98 secs]
>>>>>> 62609.226: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9166374 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.92 secs]
>>>>>> 62612.150: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9475729 secs] [Times:
> user=2.95
>>> sys=0.00, real=2.95 secs]
>>>>>> 62615.098: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9328670 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.93 secs]
>>>>>> 62618.040: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8963825 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.90 secs]
>>>>>> 62620.937: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8834715 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.89 secs]
>>>>>> 62623.821: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8800691 secs] [Times:
> user=2.86
>>> sys=0.00, real=2.87 secs]
>>>>>> 62626.701: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683677K(699072K)] 781918K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8642587 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.87 secs]
>>>>>> 62629.566: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683677K->683677K(699072K)] 781917K->781917K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8574615 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.86 secs]
>>>>>> 62632.424: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8383412 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>>>>> 62635.264: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8409891 secs] [Times:
> user=2.82
>>> sys=0.00, real=2.84 secs]
>>>>>> 62638.106: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7906216 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62640.898: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7891730 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62643.688: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7892940 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62646.479: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7766807 secs] [Times:
> user=2.78
>>> sys=0.00, real=2.78 secs]
>>>>>> 62649.257: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7796531 secs] [Times:
> user=2.78
>>> sys=0.00, real=2.78 secs]
>>>>>> 62652.037: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7687240 secs] [Times:
> user=2.76
>>> sys=0.00, real=2.76 secs]
>>>>>> 62654.807: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7613769 secs] [Times:
> user=2.76
>>> sys=0.00, real=2.76 secs]
>>>>>> 62657.570: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7712254 secs] [Times:
> user=2.78
>>> sys=0.00, real=2.78 secs]
>>>>>> 62660.342: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7968108 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62663.139: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.7924173 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.79 secs]
>>>>>> 62665.933: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8002912 secs] [Times:
> user=2.79
>>> sys=0.00, real=2.81 secs]
>>>>>> 62668.736: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8291434 secs] [Times:
> user=2.82
>>> sys=0.00, real=2.82 secs]
>>>>>> 62671.566: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8527186 secs] [Times:
> user=2.86
>>> sys=0.00, real=2.85 secs]
>>>>>> 62674.419: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8982825 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.90 secs]
>>>>>> 62677.318: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9254483 secs] [Times:
> user=2.93
>>> sys=0.00, real=2.93 secs]
>>>>>> 62680.244: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9707015 secs] [Times:
> user=2.95
>>> sys=0.00, real=2.96 secs]
>>>>>> 62683.216: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9894145 secs] [Times:
> user=3.00
>>> sys=0.00, real=3.00 secs]
>>>>>> 62686.206: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9870305 secs] [Times:
> user=2.98
>>> sys=0.00, real=2.98 secs]
>>>>>> 62689.193: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9884647 secs] [Times:
> user=2.98
>>> sys=0.00, real=3.00 secs]
>>>>>> 62692.183: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9635276 secs] [Times:
> user=2.96
>>> sys=0.00, real=2.96 secs]
>>>>>> 62695.147: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9407559 secs] [Times:
> user=2.93
>>> sys=0.00, real=2.93 secs]
>>>>>> 62698.088: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9299386 secs] [Times:
> user=2.93
>>> sys=0.00, real=2.93 secs]
>>>>>> 62701.019: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8994903 secs] [Times:
> user=2.90
>>> sys=0.00, real=2.90 secs]
>>>>>> 62703.919: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9163417 secs] [Times:
> user=2.92
>>> sys=0.00, real=2.92 secs]
>>>>>> 62706.836: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9216473 secs] [Times:
> user=2.93
>>> sys=0.00, real=2.93 secs]
>>>>>> 62709.758: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.9052547 secs] [Times:
> user=2.89
>>> sys=0.00, real=2.90 secs]
>>>>>> 62712.664: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8902824 secs] [Times:
> user=2.85
>>> sys=0.00, real=2.89 secs]
>>>>>> 62715.555: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8865932 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.89 secs]
>>>>>> 62718.442: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8605445 secs] [Times:
> user=2.87
>>> sys=0.00, real=2.87 secs]
>>>>>> 62721.304: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683678K->683678K(699072K)] 781918K->781918K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8662771 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.86 secs]
>>>>>> 62724.171: [Full GC [PSYoungGen: 98240K->98240K(214720K)] 
>>> [PSOldGen: 683679K->683679K(699072K)] 781919K->781919K(913792K) 
>>> [PSPermGen: 49694K->49694K(49984K)], 2.8369076 secs] [Times:
> user=2.84
>>> sys=0.00, real=2.84 secs]
>>
>> ------------------------------
>>
>> _______________________________________________
>> hotspot-gc-use mailing list
>> hotspot-gc-use at openjdk.java.net
>> http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use
>>
>>
>> End of hotspot-gc-use Digest, Vol 15, Issue 7
>> *********************************************
>>
>>
>>
> ------------------------------------------------------------------------
>> _______________________________________________
>> hotspot-gc-use mailing list
>> hotspot-gc-use at openjdk.java.net
>> http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use
_______________________________________________
hotspot-gc-use mailing list
hotspot-gc-use at openjdk.java.net
http://mail.openjdk.java.net/mailman/listinfo/hotspot-gc-use


