Bug: humongous array allocations OOME
Roman Kennke
rkennke at redhat.com
Tue Nov 29 11:00:09 UTC 2016
Ok, please push!
Roman
On Tuesday, 29.11.2016 at 11:59 +0100, Aleksey Shipilev wrote:
> Regression/acceptance test here:
> http://cr.openjdk.java.net/~shade/shenandoah/acceptance-alloc.patch
>
> -Aleksey
>
> On 11/29/2016 11:33 AM, Roman Kennke wrote:
> > Seems to be caused by a bug in the current heuristics. The heuristics
> > refactoring + fixes I'm working on seem to solve it. Stay tuned...
> >
> >
> > Roman
> >
> >
> > On Tuesday, 29.11.2016 at 11:03 +0100, Aleksey Shipilev wrote:
> > > Hi,
> > >
> > > A very simple test:
> > >
> > > public class Alloc {
> > >     static final int SIZE = Integer.getInteger("size", 1_000_000);
> > >     static Object sink;
> > >
> > >     public static void main(String... args) throws Exception {
> > >         for (int c = 0; c < 1000000; c++) {
> > >             sink = new int[SIZE];
> > >         }
> > >     }
> > > }
> > >
> > > OOMEs after some critical array size:
> > >
> > > $ java -XX:+UseShenandoahGC -Dsize=1 Alloc
> > > <OK>
> > > $ java -XX:+UseShenandoahGC -Dsize=1000 Alloc
> > > <OK>
> > > $ java -XX:+UseShenandoahGC -Dsize=1000000 Alloc
> > > Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
> > >         at Alloc.main(Alloc.java:12)
> > >
> > > Thanks,
> > > -Aleksey
> > >
>
>
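
For readers following along, here is a minimal sketch of the arithmetic behind the
"critical array size" mentioned in the quoted report: an int[1_000_000] weighs in at
roughly 4 MB, so once it exceeds the size of a single heap region it has to take the
humongous-allocation path, which is presumably where the heuristics bug bit. The 2 MB
region size and the 16-byte object header used below are illustrative assumptions,
not the exact values from this thread.

// Back-of-the-envelope check (not from the original thread): estimates when
// an int[] outgrows a single region. Region size and header size are assumed.
public class HumongousEstimate {
    // Hypothetical region size; Shenandoah sizes regions relative to the heap.
    static final long ASSUMED_REGION_BYTES = 2 * 1024 * 1024;

    public static void main(String... args) {
        for (int size : new int[] {1, 1000, 1_000_000}) {
            // Rough int[] footprint: object header (~16 bytes) + 4 bytes per element.
            long footprint = 16L + 4L * size;
            boolean overRegion = footprint > ASSUMED_REGION_BYTES;
            System.out.printf("int[%,d] ~ %,d bytes -> %s%n",
                    size, footprint,
                    overRegion ? "larger than one assumed region (humongous path)"
                               : "fits in a regular region");
        }
    }
}

Running it shows that the size=1 and size=1000 cases stay far below the assumed
region size, while size=1000000 does not, matching the OK/OOME split in the test runs above.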