Java Web Application Memory Leak

Created: 6 January 2016  Modified:

Recently a Java web application I work on ran out of memory and had to be restarted. We have monitoring tools, but none that are helpful in identifying the source of a memory leak. Working for a large organization with an established bureaucracy, my options were quite limited. After some research, the only available option was the “jmap” command-line utility that ships with the JDK. This discussion assumes that you have access to the computer/server on which the Java application server is running.

To follow along you will need an application server, preferably with a web application deployed to it, and a current version of the JDK installed. In my setup I am running Liferay Portal on a Tomcat 7 application server using JDK 8. These instructions should work with any Java-based application server. From a high-level view it is a simple process.

As a developer I have multiple versions of the JDK installed on my workstation. The first time I ran jmap I received the following error. This was because the server was running under Oracle JDK 8 while OpenJDK 8 was first in my path. It was easily resolved by running jmap from the Oracle JDK installation.
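If you are unsure which JVM the target process is running under, a quick check (a minimal sketch; the class name is mine) is to print the VM version property, which corresponds to the “Target VM” string in the error below:

```java
// Prints the HotSpot VM version that jmap compares against ("Target VM"
// in the mismatch error) and the JDK installation it came from, so you
// can run the jmap that lives alongside that java.home.
public class VmVersion {
    public static void main(String[] args) {
        System.out.println("java.vm.version = " + System.getProperty("java.vm.version"));
        System.out.println("java.home       = " + System.getProperty("java.home"));
    }
}
```

Run this with the same java binary that launched your application server; the jmap shipped next to that java.home is the one to use.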

Terminal window in mysecurity directory

bash-4.3$ jmap -histo:live 12345
Error attaching to process: sun.jvm.hotspot.runtime.VMVersionMismatchException: Supported versions are 25.65-b01. Target VM is 25.45-b02
sun.jvm.hotspot.debugger.DebuggerException: sun.jvm.hotspot.runtime.VMVersionMismatchException: Supported versions are 25.65-b01. Target VM is 25.45-b02

Jmap can provide an overview of the heap, which is useful for tuning garbage collection, and a histogram listing objects' memory usage from most to least. I will provide an example of the first, but our focus will largely be on the second.

jmap -heap 22340

bash-4.3$ ./jmap -heap 22340
Attaching to process ID 22340, please wait...
Debugger attached successfully.
Server compiler detected.
JVM version is 25.45-b02

using parallel threads in the new generation.
using thread-local object allocation.
Concurrent Mark-Sweep GC

Heap Configuration:
   MinHeapFreeRatio         = 40
   MaxHeapFreeRatio         = 70
   MaxHeapSize              = 1073741824 (1024.0MB)
   NewSize                  = 134217728 (128.0MB)
   MaxNewSize               = 134217728 (128.0MB)
   OldSize                  = 939524096 (896.0MB)
   NewRatio                 = 2
   SurvivorRatio            = 8
   MetaspaceSize            = 16777216 (16.0MB)
   CompressedClassSpaceSize = 1073741824 (1024.0MB)
   MaxMetaspaceSize         = 4294963200 (4095.99609375MB)
   G1HeapRegionSize         = 0 (0.0MB)

Heap Usage:
New Generation (Eden + 1 Survivor Space):
   capacity = 120848384 (115.25MB)
   used     = 62650520 (59.748191833496094MB)
   free     = 58197864 (55.501808166503906MB)
   51.84224887938924% used
Eden Space:
   capacity = 107479040 (102.5MB)
   used     = 49281184 (46.998199462890625MB)
   free     = 58197856 (55.501800537109375MB)
   45.85190191501525% used
From Space:
   capacity = 13369344 (12.75MB)
   used     = 13369336 (12.749992370605469MB)
   free     = 8 (7.62939453125E-6MB)
   99.99994016161152% used
To Space:
   capacity = 13369344 (12.75MB)
   used     = 0 (0.0MB)
   free     = 13369344 (12.75MB)
   0.0% used
concurrent mark-sweep generation:
   capacity = 939524096 (896.0MB)
   used     = 17592173835622 MB
   free     = 12802787843112 (1.2209689944374084E7MB)
   -1362588.609863179% used

69485 interned Strings occupying 6458400 bytes.

Information such as the ratios, Eden space, and New Generation space can be used to tune your JVM's garbage collection. JVM tuning is outside the scope of this article; Java Performance Tuning delves into the subject quite thoroughly.
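The same figures jmap -heap prints can also be read from inside the process through the standard MemoryMXBean. This sketch (class name and formatting are mine) is handy for logging heap usage when attaching jmap to the process is not an option:

```java
// Reads current heap usage via the platform MemoryMXBean, the same
// figures jmap -heap reports as "used", "capacity" and "% used".
public class HeapUsage {
    public static void main(String[] args) {
        java.lang.management.MemoryUsage heap =
            java.lang.management.ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.printf("used = %d (%.2fMB)%n", heap.getUsed(), heap.getUsed() / 1048576.0);
        System.out.printf("max  = %d (%.2fMB)%n", heap.getMax(), heap.getMax() / 1048576.0);
        System.out.printf("%.2f%% used%n", 100.0 * heap.getUsed() / heap.getMax());
    }
}
```

Note that getMax() can return -1 when no maximum is defined, so guard the percentage calculation if you adapt this for production logging.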

jmap -histo:live 22340

bash-4.3$ ./jmap -histo:live 22340 | more

 num     #instances         #bytes  class name
----------------------------------------------
   1:       1540727      159689648  [C
   2:         61031       71410712  [B
   3:       2486669       59680056  java.util.HashMap$Node
   4:        147882       32565744  [Ljava.util.HashMap$Node;
   5:       1528219       24451504  java.lang.String
   6:        257708       22678304  java.lang.reflect.Method
   7:        279874        5420832  [Ljava.lang.Class;
   8:        128250        5130000  java.util.HashMap
   9:        152320        4874240  java.util.LinkedHashMap$Entry
  10:        110241        4750624  [Ljava.lang.Object;
  11:         74052        4739328  java.lang.reflect.Field
  12:         78133        4375448  java.util.LinkedHashMap
  13:        140261        3366264  java.util.Hashtable$Entry
  14:        147399        2876032  [Ljava.lang.String;
  15:         30736        2647576  [I
  16:         25009        2552912  java.lang.Class
  17:         85499        2051976  java.lang.ref.WeakReference

The “:live” part of the command restricts the listing to objects that are still referenced, which is more useful for diagnosing my memory leak. As we can see, the objects taking up the most memory are “[C”. What is this class? It turns out this is documented in the Java API documentation under Class.getName(). A copy is provided below.

Class.getName() type codes

Element Type       | Encoding
-------------------|------------
boolean            | Z
byte               | B
char               | C
class or interface | Lclassname;
double             | D
float              | F
int                | I
long               | J
short              | S
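The encodings are easy to verify from code; this small sketch (the class name is mine) prints the same names that appear in the histogram:

```java
// Array classes report their names in the encoded form shown in the
// table above, which is exactly what jmap's histogram prints.
public class TypeCodes {
    public static void main(String[] args) {
        System.out.println(char[].class.getName());   // [C  - a char array
        System.out.println(byte[].class.getName());   // [B  - a byte array
        System.out.println(int[].class.getName());    // [I  - an int array
        System.out.println(String[].class.getName()); // [Ljava.lang.String;
    }
}
```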

Character arrays taking up most of the space is believable, since the JVM stores Strings as char arrays in memory. The histogram is most useful if you take a snapshot while your application is running normally; then, when something goes wrong, you have a baseline for comparison.
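To practice taking and comparing snapshots, a contrived leak like the sketch below (class name and sizes are mine) works well: run it, attach jmap -histo:live twice a minute or so apart, and watch the [C and java.lang.String counts climb between snapshots.

```java
import java.util.ArrayList;
import java.util.List;

// Strings pinned in a static list never become unreachable, so
// successive jmap -histo:live snapshots show their counts growing.
public class LeakDemo {
    static final List<String> CACHE = new ArrayList<>();

    public static void main(String[] args) throws InterruptedException {
        // Runs forever by default; pass an iteration count to stop early.
        long rounds = args.length > 0 ? Long.parseLong(args[0]) : Long.MAX_VALUE;
        for (long i = 0; i < rounds; i++) {
            CACHE.add(new String(new char[1024])); // ~2KB of char data per pass
            Thread.sleep(10);                      // keep the process alive to attach jmap
        }
    }
}
```

Grab the running process's PID with jps, then diff the top entries of two histograms; the classes whose instance counts grow steadily are your suspects.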

tags: tomcat - performance - java - jmap - memory leak