best practice for processing large binary files

Andrew M andrew at oc384.net
Fri Sep 23 19:25:47 PDT 2011


I have large binary files (up to 1GB) of 56-byte records of int, float 
and char primitives.  I want to read these files and process the 
records.  The JVM can have several GB of available heap if necessary.

In the bad old days I would have a Thread that reads the file and put()s 
a byte[] in a LinkedBlockingQueue<byte[]> while a consumer Thread 
take()s and processes the byte[].
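Roughly like this (just a minimal sketch; the path, queue capacity, and 
the record decoding step are placeholders, not my real code):

import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.util.concurrent.LinkedBlockingQueue;

public class OldStyleReader {

    static final int RECORD_SIZE = 56;          // one record per byte[]
    static final byte[] POISON = new byte[0];   // end-of-input marker

    public static void main(String[] args) throws Exception {
        final LinkedBlockingQueue<byte[]> queue =
                new LinkedBlockingQueue<byte[]>(1024);   // bounded so the reader can't run away

        // Producer: read one 56-byte record at a time and hand it off.
        Thread reader = new Thread(new Runnable() {
            public void run() {
                try {
                    DataInputStream in = new DataInputStream(new BufferedInputStream(
                            new FileInputStream("records.bin")));   // placeholder path
                    try {
                        while (true) {
                            byte[] record = new byte[RECORD_SIZE];
                            in.readFully(record);
                            queue.put(record);
                        }
                    } catch (EOFException endOfFile) {
                        // normal termination
                    } finally {
                        in.close();
                        queue.put(POISON);
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        });
        reader.start();

        // Consumer: take() records and process them on this thread.
        byte[] record;
        while ((record = queue.take()) != POISON) {
            // ... decode the ints/floats/chars and process the record ...
        }
        reader.join();
    }
}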

Now I'm wondering if NIO/NIO.2 and JDK 7 allow something faster and 
more elegant.  Should I use a SeekableByteChannel, as shown here?

   http://download.oracle.com/javase/tutorial/essential/io/file.html
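
For example, I imagine something along these lines (again just a 
sketch; the path, buffer size, and field layout are made up):

import java.nio.ByteBuffer;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class ChannelReader {

    static final int RECORD_SIZE = 56;

    public static void main(String[] args) throws Exception {
        Path path = Paths.get("records.bin");                       // placeholder path
        // Buffer holds a whole number of records; the default byte order is
        // big-endian, so call buf.order(...) if the file is little-endian.
        ByteBuffer buf = ByteBuffer.allocate(RECORD_SIZE * 8192);   // or allocateDirect()?

        SeekableByteChannel ch = Files.newByteChannel(path, StandardOpenOption.READ);
        try {
            while (ch.read(buf) != -1) {
                buf.flip();
                // Consume only complete records; a partial one stays for the next read.
                while (buf.remaining() >= RECORD_SIZE) {
                    int anInt = buf.getInt();                       // assumed field layout
                    float aFloat = buf.getFloat();
                    buf.position(buf.position() + (RECORD_SIZE - 8));  // skip the rest for now
                    // ... process the record ...
                }
                buf.compact();
            }
        } finally {
            ch.close();
        }
    }
}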

Should I use a direct byte buffer for extra speed?  Should I be using a 
memory-mapped file?  Files.newBufferedReader?  Files.newInputStream?
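
I assume the memory-mapped version would look roughly like this 
(another sketch with a placeholder path and an assumed field layout; a 
1GB file fits in a single MappedByteBuffer since the mapping limit is 
Integer.MAX_VALUE bytes):

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedReader {

    static final int RECORD_SIZE = 56;

    public static void main(String[] args) throws IOException {
        FileChannel ch = FileChannel.open(Paths.get("records.bin"),    // placeholder path
                                          StandardOpenOption.READ);
        try {
            // Map the whole file read-only in one go.
            MappedByteBuffer map = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            long recordCount = ch.size() / RECORD_SIZE;
            for (long i = 0; i < recordCount; i++) {
                int base = (int) (i * RECORD_SIZE);
                int anInt = map.getInt(base);              // assumed field layout
                float aFloat = map.getFloat(base + 4);
                // ... remaining fields live at base + 8 .. base + 55 ...
                // ... process the record ...
            }
        } finally {
            ch.close();
        }
    }
}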

Thanks,
Andrew
