Tuesday, June 24, 2008

Philip Jacob thinks that the Eclipse Memory Analyzer is a must-have tool:

I also had a little incident with a 1.5Gb heap dump yesterday. I wanted to analyze it after one of our app servers coughed it up (right before it crashed hard) to find out what the problem was. I tried jhat, which seemed to require more memory than could possibly fit into my laptop (with 4Gb). I tried Yourkit, which also stalled trying to read this large dump file (actually, Yourkit’s profiler looked pretty cool, so I shall probably revisit that). I even tried firing up jhat on an EC2 box with 15Gb of memory… but that also didn’t work. Finally, I ran across the Eclipse Memory Analyzer. Based on my previous two experiences, I didn’t expect this one to work…. but, holy cow, it did. Within just a few minutes, I had my culprit nailed (big memory leak in XStream 1.2.2) and I was much further along than I was previously.

Thanks, Philip, for the positive feedback! I didn't know that EC2 supports big multi-core boxes. That is very interesting, because the Eclipse Memory Analyzer does take advantage of multiple cores and of the available memory on 64-bit operating systems. It will "fly" on one of these boxes.

Wednesday, June 04, 2008

A very interesting leak, where WeakHashMap doesn't seem to release entries that don't seem to be referenced anymore. Actually, when interned String literals are used as keys, the entries stay in the WeakHashMap even after all hard references seem to be removed.

I say "seem to be removed" because there actually is still a reference from the Class, which is still loaded, to the interned String literal.

Unfortunately, using a heap dump to find this out does not work, because this implicit reference is (currently) not written to the heap dump file.
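The effect is easy to reproduce without a heap dump: an entry whose key is a String literal survives garbage collection, because the class's constant pool keeps a hard reference to the interned literal. A minimal, runnable sketch (the class name and key are mine):

```java
import java.util.Map;
import java.util.WeakHashMap;

public class WeakHashMapLiteralLeak {

    // Returns whether the literal-keyed entry is still present after a GC.
    static boolean literalEntrySurvivesGc() throws InterruptedException {
        Map<String, Object> map = new WeakHashMap<>();

        // "literal-key" is interned and referenced from this class's constant
        // pool, so the key can never become weakly reachable while the class
        // is loaded - the entry is never expunged.
        map.put("literal-key", new Object());

        System.gc();
        Thread.sleep(100); // give reference processing a moment
        return map.containsKey("literal-key");
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("literal entry survives GC: " + literalEntrySurvivesGc());
    }
}
```

With a key constructed at runtime, e.g. `new String("runtime-key")`, the entry would disappear after the key's last hard reference is dropped and a GC runs.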

Tuesday, June 03, 2008

Recently I tried the NetBeans UML module to sketch some simple use case diagrams. It worked pretty well, but it didn't feel very responsive all the time. I quickly checked the memory consumption and found that it was much higher than during my last test.

I therefore took another heap dump. Here comes the overview:

Finalizers?

So this time NetBeans needed 74,2 MByte, much more than last time. Surprisingly, 15,5 MByte alone are consumed by instances of the class java.lang.ref.Finalizer. Such a high memory usage caused by Finalizer instances is not normal. Usually you would see Finalizer instances using a few hundred KByte. Next I simply checked the retained set (the objects that would be reclaimed if I could remove the Finalizer instances from memory) of these Finalizer instances:

So int[] arrays are consuming most of the memory. I again used the "immediate dominator" query on those int[] arrays to see who is keeping them in memory:

So let's take a look at those sun.awt.image.IntegerInterleavedRaster instances and see who is referencing them:

Can we blame Tom Sawyer?

We see again that java.awt.image.BufferedImage is involved, as well as the surfaceData of sun.java2d.SunGraphics2D. SunGraphics2D is referenced by com.tomsawyer.editor.graphics.TSEDefaultGraphics (what a nice package name). Let's look at the code of sun.java2d.SunGraphics2D:

public void dispose() {
    surfaceData = NullSurfaceData.theInstance;
    invalidatePipe();
}

public void finalize() { }

"dispose" should clear the surfaceData, but at least to me it seems that nobody has called it. So I decompiled TSEDefaultGraphics and found its dispose to be empty:

public void dispose() { }
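For comparison, this is the pattern callers of Graphics2D are expected to follow: whoever obtains a graphics context calls dispose() on it when done, which resets the context's surfaceData so it no longer pins the drawing surface while the context waits for finalization. A minimal, runnable sketch (class and method names are mine):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class GraphicsDispose {

    // Draws into an offscreen image and releases the graphics context.
    static BufferedImage render() {
        BufferedImage image = new BufferedImage(800, 600, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = image.createGraphics(); // a SunGraphics2D under the hood
        try {
            g.drawLine(0, 0, 799, 599);
        } finally {
            // Resets the context's surfaceData to the null implementation.
            // Skipping this leaves the surface reachable through the context
            // until the finalizer has run.
            g.dispose();
        }
        return image;
    }

    public static void main(String[] args) {
        System.out.println("rendered image width: " + render().getWidth());
    }
}
```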

So my guess is (without digging deeply into the code) that TSEDefaultGraphics needs to be fixed so that it calls dispose on its surfaceData instance variable.

At the End

What this shows is that you not only need to be very careful when implementing finalize(), but you also need to check whether you use objects that implement finalize(). Objects that really need to implement finalize() should be small and should not reference large objects.
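The effect behind this advice can be reproduced in a few lines: an object with a finalize() method, even an empty one, stays reachable through java.lang.ref.Finalizer after the first garbage collection, together with everything it references. A minimal sketch (class names are mine; it relies on System.gc() actually triggering a collection, which HotSpot does by default):

```java
import java.lang.ref.WeakReference;

public class FinalizerRetention {

    // A finalizable object that pins roughly 4 MByte of pixel data.
    static class ImageLike {
        final int[] pixels = new int[1_000_000];

        @Override
        @SuppressWarnings("deprecation")
        protected void finalize() {
            // Even an empty finalize() forces the JVM to track this object
            // via java.lang.ref.Finalizer, delaying reclamation of the
            // object and of everything it references.
        }
    }

    // True if the pixel array is still reachable after one GC.
    static boolean pixelsSurviveFirstGc() {
        WeakReference<int[]> ref = new WeakReference<>(new ImageLike().pixels);
        System.gc(); // the first GC only queues the ImageLike for finalization;
                     // the Finalizer entry still strongly references the pixels
        return ref.get() != null;
    }

    public static void main(String[] args) {
        System.out.println("pixels survive first GC: " + pixelsSurviveFirstGc());
    }
}
```

Only after finalize() has run does a later collection reclaim the array, which is why large objects referenced from finalizable objects show up as the retained set of java.lang.ref.Finalizer in a heap dump.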