RFR: 8376125: Out of memory in the CDS archive error with lot of classes [v5]
Xue-Lei Andrew Fan
xuelei at openjdk.org
Wed Feb 4 20:19:39 UTC 2026
On Tue, 3 Feb 2026 04:25:14 GMT, Ashutosh Mehra <asmehra at openjdk.org> wrote:
>> According to https://gitlab.haskell.org/ghc/ghc/-/issues/17414
>>
>>> File reads/writes bigger than 2GB result in an "Invalid argument" exception on macOS. Files bigger than 2GB still work, but individual read/write operations bigger than 2GB fail.
>>
>> I think it's better to move this fix into `os::pd_write()` (within `#ifdef __APPLE__`) to limit the writes to less than 2GB.
>
> @iklam thanks for digging that up. It explains why the INT_MAX limit worked. But I should also mention that the above output does not show the actual reason for the failure. Here the `os::write` failed causing the `fd` to be closed in `FileMapInfo::write_bytes`. However, the error is not propagated up the call chain and we end up calling `FileMapInfo::seek_to_position` which throws EBADF (Bad file descriptor).
> So while we can keep the change in `os::write` (or `os::pd_write` as suggested), I think we should also fix `FileMapInfo::write_bytes` to 1) print the OS error, and 2) terminate the write operation gracefully. I am also fine if this is done in a follow-up PR.
I am going to move the update into `os::pd_write`. Further improvements will come in a follow-up pull request.
-------------
PR Review Comment: https://git.openjdk.org/jdk/pull/29494#discussion_r2765817603
More information about the hotspot-dev mailing list