Re: best practice for processing large binary file
Christian Schlichtherle
christian at schlichtherle.de
Tue Sep 27 01:46:44 PDT 2011
Hi Andrew,
> Assuming the file is larger than not just JVM heap but total system RAM
> also, what then?
This should be covered by the Javadoc at
http://download.oracle.com/javase/7/docs/api/java/nio/channels/FileChannel.html#map(java.nio.channels.FileChannel.MapMode,%20long,%20long)
The size parameter for the region to map cannot exceed Integer.MAX_VALUE.
This may or may not be more than the available RAM. So in any case, you
would need to prepare your code to map a file in chunks.
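
Just to illustrate the idea, a chunked mapping loop could look roughly like
the sketch below. The 64 MiB chunk size and the byte-by-byte processing loop
are placeholders, not recommendations:

    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class ChunkedMapExample {
        // Arbitrary chunk size; anything up to Integer.MAX_VALUE works.
        private static final long CHUNK_SIZE = 64L * 1024 * 1024; // 64 MiB

        public static void main(String[] args) throws Exception {
            try (FileChannel channel = FileChannel.open(
                    Paths.get(args[0]), StandardOpenOption.READ)) {
                long size = channel.size();
                for (long position = 0; position < size; position += CHUNK_SIZE) {
                    long length = Math.min(CHUNK_SIZE, size - position);
                    MappedByteBuffer buffer = channel.map(
                            FileChannel.MapMode.READ_ONLY, position, length);
                    // Process the mapped region, here simply byte by byte.
                    while (buffer.hasRemaining()) {
                        byte b = buffer.get();
                        // ... do something with b ...
                    }
                }
            }
        }
    }

Note that the mapped regions are only unmapped when the MappedByteBuffers get
garbage collected, which is another reason to be careful with this approach in
a long-running application.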
> In those cases I would like some way for the JVM to be smartly loading
> the file into RAM ahead of me accessing it with ByteBuffer's getter
> methods. Once the data has been accessed then that portion of the
> buffered data could be freed up. Is that how it works?
Given all these limitations of FileChannel.map(), I would look for something
else, in particular for a long-running application. A traditional buffering
solution might be better. If you just need read-only access to the large
file, you might want to consider this (shameless self-advertising):
http://truezip.java.net/apidocs/de/schlichtherle/truezip/rof/BufferedReadOnlyFile.html
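
The general idea of such a buffering approach, sketched here with plain
java.io rather than the TrueZIP API, is that only a small, fixed-size buffer
lives in memory no matter how large the file is. The 1 MiB buffer size below
is arbitrary:

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    public class BufferedReadExample {
        public static void main(String[] args) throws IOException {
            // Only the 1 MiB buffer is held in memory at any time,
            // regardless of the total file size.
            try (BufferedInputStream in = new BufferedInputStream(
                    new FileInputStream(args[0]), 1 << 20)) {
                int b;
                while ((b = in.read()) != -1) {
                    // ... process each byte ...
                }
            }
        }
    }
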
Regards,
Christian