hprof format question
Kirk Pepperdine
kirk.pepperdine at gmail.com
Wed Oct 31 18:33:37 UTC 2018
Hi Simon,
I’ve also started a small project to try to solve the “we need to look at a very large heap” problem. My solution is to load the data into Neo4j. You can find the project on my GitHub account.
So, I believe I’ve taken the same tactic in simply abandoning the segment for the moment. It would be useful to sort that out, but I’ve listed it as a future…
Kind regards,
Kirk
> On Oct 31, 2018, at 4:07 AM, Simon Roberts <simon at dancingcloudservices.com> wrote:
>
> Hi all, I'm hoping this is the correct list for a question on the hprof file format (1.0.2)?
>
> I found this information: http://hg.openjdk.java.net/jdk6/jdk6/jdk/raw-file/tip/src/share/demo/jvmti/hprof/manual.html
>
> and have been working on a small project to read these files. (Yes, I know that NetBeans/VisualVM and Eclipse both have such libraries, and a number of other tools have been derived from them, but as far as I can tell they are all fundamentally built on the notion of fully decoding everything and creating an in-memory representation of the entire heap. I want to pull out only certain pieces of information--specifically, object counts by class--from a large, ~20 GB, dump file, and those tools just give up the ghost on my systems.)
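As an aside, the streaming approach described above can be sketched roughly like this. This is only an illustration (the class and method names are mine, not from any existing tool): it walks the top-level record stream as the manual cited above lays it out -- u1 tag, u4 timestamp, u4 body length, body -- and skips every body, so memory use stays constant no matter how big the dump is.

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Hypothetical minimal scanner for hprof top-level records.
// It never decodes record bodies, so it handles dumps of any size.
public class HprofScanner {
    public static Map<Integer, Integer> countTopLevelRecords(DataInputStream in)
            throws IOException {
        // Header: null-terminated version string, e.g. "JAVA PROFILE 1.0.2"
        while (in.readByte() != 0) { /* skip version string */ }
        in.readInt();                  // identifier size (4 or 8 bytes) -- unused here
        in.readLong();                 // dump timestamp (ms) -- unused here

        Map<Integer, Integer> counts = new HashMap<>();
        while (in.available() > 0) {
            int tag = in.readUnsignedByte();    // u1 record tag
            in.readInt();                       // u4 microseconds since header
            long len = in.readInt() & 0xFFFFFFFFL; // u4 body length (unsigned)
            in.skipBytes((int) len);            // skip body without decoding it
            counts.merge(tag, 1, Integer::sum);
        }
        return counts;
    }
}
```

A real implementation would loop on skipBytes (it may skip fewer bytes than asked) and handle bodies larger than Integer.MAX_VALUE, but the shape is the same: counting by class then only needs to decode the sub-records inside the HEAP_DUMP/HEAP_DUMP_SEGMENT bodies rather than skipping them.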
>
> Anyway, my code reads the file pretty well so far, except that the file I want to analyze seems to contradict the specification in the document mentioned above. Specifically, after processing about five HEAP_DUMP_SEGMENTs with around 1.5 million sub-records in each, I come across some ROOT_JNI_LOCAL records. The first 15 follow the format specified in the above document (one 8-byte "ID" and two 4-byte values). But the 16th omits the two 4-byte values (well, it might simply have more, but visual analysis shows that after the 8-byte ID I have a new block tag and a believable structure). I've actually noticed that several of the record types defined in this "group" seem to diverge from the paper I mentioned.
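For reference, the layout that document gives for the ROOT_JNI_LOCAL sub-record (tag 0x02 inside a heap dump segment) would read something like the sketch below. The field names are my own guesses at what the two u4 values mean, based on that manual; this is illustration, not a tested parser.

```java
import java.io.DataInputStream;
import java.io.IOException;

// Hypothetical reader for the ROOT_JNI_LOCAL sub-record as the cited
// manual describes it: one ID followed by two u4 fields.
public class JniLocalRecord {
    final long objectId;      // ID of the object held by the JNI local
    final int threadSerial;   // u4: thread serial number
    final int frameNumber;    // u4: frame number in the stack trace

    JniLocalRecord(long objectId, int threadSerial, int frameNumber) {
        this.objectId = objectId;
        this.threadSerial = threadSerial;
        this.frameNumber = frameNumber;
    }

    static JniLocalRecord read(DataInputStream in, int idSize) throws IOException {
        // IDs are 4 or 8 bytes depending on the size declared in the file header
        long id = (idSize == 8) ? in.readLong() : (in.readInt() & 0xFFFFFFFFL);
        return new JniLocalRecord(id, in.readInt(), in.readInt());
    }
}
```

If the file really does interleave variant layouts, a defensive parser could sanity-check that the byte following a sub-record is a known sub-record tag before committing to the read, and otherwise fall back to resynchronizing -- which is effectively what abandoning the rest of the segment does.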
>
> My solution is that if my parser trips, it abandons that HEAP_DUMP_SEGMENT from that point forward. It doesn't seem to matter much, since I was looking for object data, and it appears that all of that has already been handled. However, clearly this is not ideal!
>
> Is there any more detailed, newer, better, information? Or anything else I should know to pursue this tool (or indeed a simple object frequency by classname result) from an hprof 1.0.2 format file?
>
> (And yes, I'm pursuing a putative memory leak :)
>
> Thanks for any input (including "dude, this is the wrong list!")
> Cheers,
> Simon
>
>
>
> --
> Simon Roberts
> (303) 249 3613
>
More information about the serviceability-dev mailing list