RFR: 8376125: Out of memory in the CDS archive error with lot of classes [v7]

Xue-Lei Andrew Fan xuelei at openjdk.org
Mon Feb 9 21:25:16 UTC 2026


On Fri, 6 Feb 2026 06:31:45 GMT, Thomas Stuefe <stuefe at openjdk.org> wrote:

> > I tried to prototype the idea to support large CDS archives with UseCompactObjectHeaders: #29556. And it really works! There is no problem to get 10GB CDS archives (I did not test bigger archives yet, 7,500 mega classes + 1,000 small classes).
> > The prototype is based on this pull request to check out large archive, and you may look at [this commit](https://github.com/openjdk/jdk/pull/29556/changes/e7a12c372480f405d2a08a75bdabac91c7328346) only, for the idea to separate Klass objects from other metadata in the archive.
> > Again, it is just a prototype to show the idea. I will close JDK-8377137 as a duplicate of JDK-8377222.
> 
> Very nice. The re-calculating of the nKlass IDs is a beauty spot, though. Would be nice if we could avoid that.

I explored this further to try to avoid that, but did not find a good balance between start-up and peak-time performance.

BTW, I tried parallel patching with the existing AOTCacheParallelRelocation.  The AOTCacheParallelRelocation feature provides a meaningful improvement in AOT cache loading time (~15% for a 1.2GB archive and ~24% for a 3.4GB archive) by parallelizing the pointer relocation in the RW and RO regions.  This looks like a promising direction worth exploring further, even if not in this change.
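For readers unfamiliar with the approach, here is a minimal, illustrative sketch of the partition-and-patch idea behind parallel relocation. This is not HotSpot code: the class name, the modeling of a region as a long[] of "pointers", and the fixed relocation delta are all assumptions made for the example; the real implementation patches mapped archive regions inside the VM.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative sketch only (not HotSpot code): a "region" is modeled as a
// long[] of pointer values, and relocation adds a fixed delta to each entry.
// The region is split into chunks that worker threads patch independently,
// mirroring the idea of relocating the RW and RO regions in parallel.
public class ParallelRelocate {

    static void relocate(long[] region, long delta, int nThreads)
            throws InterruptedException {
        // Divide the region into nThreads roughly equal chunks.
        int chunk = (region.length + nThreads - 1) / nThreads;
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        for (int t = 0; t < nThreads; t++) {
            final int start = t * chunk;
            final int end = Math.min(start + chunk, region.length);
            // Each worker patches a disjoint slice, so no synchronization
            // is needed on the region itself.
            pool.execute(() -> {
                for (int i = start; i < end; i++) {
                    region[i] += delta;   // patch one "pointer"
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        long[] region = new long[1_000_000];
        for (int i = 0; i < region.length; i++) {
            region[i] = i;                // fake "pointers" 0..N-1
        }
        relocate(region, 0x1000, 4);
        System.out.println("first=" + region[0]
                + " last=" + region[region.length - 1]);
    }
}
```

The win comes from the chunks being disjoint, so workers never contend on the data; the cost is the thread start-up and join overhead, which is why the benefit grows with archive size.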

-------------

PR Comment: https://git.openjdk.org/jdk/pull/29494#issuecomment-3862696903


More information about the hotspot-dev mailing list