An anti-pattern is an often-used software approach which, while it may work up to a point, is inefficient and likely to cause problems in production. Yes, you can use these solutions, but in the long term, they’re likely to cause performance issues and system crashes.
Memory leaks are a troubleshooter’s nightmare. They’re often insidious, happening in the background and not easy to diagnose.
This article describes three anti-patterns that are among the top Java memory leak causes: unbounded caches, static collections and unclosed resources.
Java Memory Leaks: Causes, Symptoms and Diagnostic Tools
A Java memory leak occurs when objects are retained when the program no longer needs them. For example, variables created to hold information related to a single online sale, such as the customer’s name and details of the products purchased, won’t be needed after the sale has completed and the information recorded in the database. If these variables aren’t cleared when the transaction completes, they build up in memory over time. This is known as a memory leak.
Objects are only released for garbage collection when no live references point to them. This happens when the objects go out of scope: local variables, for example, are released when the method or block that created them completes. It also happens when the last reference to the object is set to null. A common cause of memory leaks in Java, therefore, is declaring variables within the wrong scope.
However, if some other object is holding a reference to those variables, they will not be garbage collected until that reference is released. Memory leaks often occur when a long-living object, such as a class or instance variable, holds a reference to a local variable.
Over time, a memory leak can cause performance to degrade. The garbage collector (GC) has to work harder and harder to try to clear enough memory for new requests. This results in high CPU usage, and long pauses where all application threads are stopped during critical GC activity. Eventually, the system may crash with an OutOfMemoryError.
The image below shows a comparison between a typical healthy GC pattern and a memory leak pattern. Memory usage is graphed over time, with GC events marked as red triangles.

Fig: Healthy GC pattern vs Memory Leak Pattern
In the healthy application, the GC is consistently able to bring memory usage down to a similar level. In the leak pattern, the bottom line slopes upwards, showing that uncleared memory is increasing. GC events become more frequent, until eventually they may run back to back.
For more information on Java memory leak causes, and how to diagnose and fix them, see this article: Common Memory Leaks in Java.
A GC log analyzer such as GCeasy and a heap dump analyzer such as HeapHero are the best tools for diagnosing memory leaks.
Common Anti-Patterns That Cause Java Memory Leaks
Let’s look now in detail at the three anti-patterns we mentioned earlier.
1. Unbounded Caches
Caches are used to speed up applications when either re-fetching or recalculating data is less efficient than storing it in memory. Use cases include:
- Storing reference tables;
- Storing details of most-bought products;
- Storing resized images.
Caching can speed up applications dramatically when used correctly. However, the advantage is lost if memory is over-used and the GC has to work harder, hogging CPU time.
Unbounded caches are caches that:
- Have no size limit;
- Have no eviction policy.
This allows the cache to keep growing over time, and may lead to OutOfMemoryErrors. It’s important to have a caching algorithm that sets limits on the size of the collection that stores the cache, and removes old items to make room for new ones. We can write our own algorithm, but it’s often better to use third-party libraries such as Google Guava or Caffeine, which are efficient, configurable and tried and tested.
Let’s first look at setting a maximum size for a cache. Most caches are stored in objects created from classes that implement the Collection interface. Unfortunately, neither the interface nor most of its implementing classes provide a way to set a maximum size. However, this isn’t difficult to overcome. We can write our own class that extends the chosen Collection class and overrides the superclass’s add method with simple code such as this:
public void add(E element) {
    if (size() >= max) {
        aggressiveClear();
        if (size() >= max)
            throw new IllegalStateException("Max size reached");
    }
    super.add(element);
}
We first check if the added element is going to push the cache over a defined maximum size. If so, we call a method within our class that implements an aggressive clearance policy. If this doesn’t clear enough space, we throw an exception. Only if there is enough space do we call the add() method of the superclass to add the element.
If we’re using Guava or Caffeine, this functionality is already included. Alternatively, the Apache Commons libraries offer several bounded Collection classes. These allow us to specify a maximum size.
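If pulling in a third-party library isn’t an option, the JDK itself offers a simple way to bound a cache: LinkedHashMap lets a subclass override removeEldestEntry to evict the least recently used entry once a size limit is exceeded. The class name BoundedLruCache and the capacity below are illustrative; this is a minimal sketch, not a production cache:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A minimal size-bounded LRU cache built on LinkedHashMap.
// Passing accessOrder = true makes iteration order follow recency of use,
// so the "eldest" entry is the least recently used one. removeEldestEntry
// is consulted after each insertion; returning true evicts that entry.
public class BoundedLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    public BoundedLruCache(int maxSize) {
        super(16, 0.75f, true); // accessOrder = true for LRU behaviour
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize; // evict once the limit is exceeded
    }
}
```

With a limit of two, adding a third entry silently evicts whichever of the first two was touched least recently, so the cache can never grow without bound.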
Next let’s look at eviction policies.
We need some sort of mechanism to make sure the items in the cache that are most likely to be called for remain in memory, while the less likely are cleared periodically to make room for new items. Popular eviction policies include:
- LRU (Least Recently Used): We regularly evict items that haven’t been used lately. We include a time last used for each item in the cache;
- LFU (Least Frequently Used): We regularly evict items that aren’t often used. We include the number of requests since the last clearance for each item in the cache;
- TTL (Time to Live): We specify how long an item will remain in memory. We include the time added to the cache for each item.
We need to have a background thread that runs from time to time to implement the policy, scanning the cache for items to evict and removing them. Again, we can use our own algorithms for this, but it’s often simpler to use Guava or Caffeine.
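A TTL policy from the list above can be sketched in a few lines. The class TtlCache below is illustrative: it records the time each item was added, and a sweep() method evicts anything older than the TTL. The clock is injected as a LongSupplier so behaviour is deterministic; real code would pass System::currentTimeMillis and run sweep() periodically on a background thread, for example via a ScheduledExecutorService:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.LongSupplier;

// A minimal TTL (time-to-live) cache sketch. Each entry records the time
// it was added; sweep() removes entries whose age exceeds ttlMillis.
public class TtlCache<K, V> {
    private static final class Entry<V> {
        final V value;
        final long addedAt;
        Entry(V value, long addedAt) { this.value = value; this.addedAt = addedAt; }
    }

    private final Map<K, Entry<V>> map = new ConcurrentHashMap<>();
    private final long ttlMillis;
    private final LongSupplier clock; // injected for testability

    public TtlCache(long ttlMillis, LongSupplier clock) {
        this.ttlMillis = ttlMillis;
        this.clock = clock;
    }

    public void put(K key, V value) {
        map.put(key, new Entry<>(value, clock.getAsLong()));
    }

    public V get(K key) {
        Entry<V> e = map.get(key);
        return e == null ? null : e.value;
    }

    // Evict every entry whose age exceeds the TTL.
    public void sweep() {
        long now = clock.getAsLong();
        map.values().removeIf(e -> now - e.addedAt > ttlMillis);
    }

    public int size() { return map.size(); }
}
```

The same skeleton adapts to LRU or LFU by storing a last-used timestamp or a hit counter in the entry instead of the insertion time.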
To make our application flexible and able to respond to changing demands, it’s useful to make parameters such as the cache size and the clearance aggression levels configurable.
2. Static Collections
One of the major causes of Java memory issues is improper use of static variables. If these variables are collections, the danger of them causing problematic memory leaks is very much increased, since collections may be large.
Let’s start by reviewing the concept of scope in Java. The scope of a variable is determined by where in the code it is declared. Scope determines when an object becomes eligible for garbage collection, and its range of visibility. Once an object’s scope is no longer live, the object can be garbage collected, but only if no other live object holds a reference to it. The three main types of scope are summarized in the table below.
| Scope | Where and How Defined | When Eligible for Garbage Collection (if no other live object holds a reference) |
| --- | --- | --- |
| Local | Within a block | When the block completes |
| Instance | Outside any block, without the keyword static | When the object created from the class is released for GC |
| Class (Static) | Declared with the keyword static | When the class is unloaded |
Additionally, any variables defined within an interface are implicitly both static and final.
As we’ve seen, static variables remain in memory until the defining class is unloaded. When does this happen? Only when the class loader that created the class is itself unloaded. In most cases this never happens: the static variables remain in memory until the JVM shuts down. Only applications specifically designed to create and remove their own class loaders ever unload classes. One example is a web server that gives each deployed web app its own class loader and allows a web app to be explicitly undeployed while the JVM is running.
If we therefore define a large collection as a static variable, it will usually remain there for the duration of the program. If the collection in turn holds a reference to a local variable, that variable can never be garbage collected, unless the program specifically sets the reference to null.
Let’s look at a sample program that illustrates this.
import java.nio.ByteBuffer;
import java.util.ArrayList;

// This program demonstrates a memory leak due to a class variable holding
// a reference to a local variable
// ************************************************************************
public class BuggyProg14 {

    // Class variable holds a reference to the local variable
    static ArrayList<ByteBuffer> arr = new ArrayList<>();

    public static void main(String[] args) {
        // Create an object from the class
        BuggyProg14 obj = new BuggyProg14();
    }

    // Permanent loop, which will eventually end in an OutOfMemoryError
    public BuggyProg14() {
        while (true)
            method1();
    }

    // method1 creates a large local variable
    public void method1() {
        ByteBuffer buffer1 = ByteBuffer.allocate(32768);
        // The class variable holds a reference to it
        arr.add(buffer1);
        // Even though it is set to null, it can't be garbage collected
        buffer1 = null;
        // Delay to give time to take a heap dump
        try { Thread.sleep(20); } catch (Exception e) {}
    }
}
An ArrayList is created as a static variable. In the class’s constructor, it repeatedly loops through method1, which creates a large local variable buffer1, then adds it to the array. This variable can never be garbage collected, even when it goes out of scope after the method completes. Heap usage will continue to increase until the JVM is either killed or throws an OutOfMemoryError.
The bottom line: be sparing with the use of static variables, and never use them for collections. Static variables are one of the most common Java memory leak causes, and should always be used with caution.
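The simplest fix for a leak like BuggyProg14 is to narrow the variable’s scope. In this illustrative sketch (the class and method names are my own, not from the original program), the list is local to the method that needs it, so once the method returns, the list and every buffer it references become unreachable and eligible for garbage collection:

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// The collection is a local variable rather than a class (static) variable.
// When processBatch() returns, nothing holds a reference to the list or its
// buffers, so the GC can reclaim all of them.
public class ScopedBuffers {

    public static int processBatch(int count) {
        List<ByteBuffer> buffers = new ArrayList<>(); // local scope, not static
        for (int i = 0; i < count; i++) {
            buffers.add(ByteBuffer.allocate(32768));
        }
        // ... use the buffers here ...
        return buffers.size();
    } // buffers goes out of scope here
}
```

When a collection genuinely must outlive a single method call, bound its size and clear it when its contents are no longer needed, as discussed in the caching section above.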
3. Unclosed Resources
What are ‘resources’? In programming terms, a resource is anything managed externally to the program and available only in finite supply. A resource must be requested when needed, and given back when no longer needed. Examples include memory, file handles and database connections. In this article, we’re specifically interested in memory.
Examples of classes that manage memory resources include streams, result sets and sockets. An extreme example is the streams defined in the java.util.zip package, which hold very large chunks of memory.
This type of class should ideally have a close() method, which releases the resources. We need to make sure we always call this method when we no longer need the resources. Often, programmers forget to do this, or simply don’t bother. This can result in huge amounts of wasted memory. Another point to remember is that we must make sure the call to close() is reached in all circumstances. Have a look at the logic of this program.
import java.io.FileInputStream;
import java.io.IOException;

// This program has a memory leak due to failing to close
// file input streams when an exception is thrown
// *******************************************************
public class BuggyProg15 {

    private static String fileName = "";

    public static void main(String[] args) throws Exception {
        fileName = args[0];
        for (int i = 0; i < 50000; i++) {
            try { Thread.sleep(20); } catch (Exception e) {}
            readFileWithoutClosing();
        }
        System.out.println("Finished reading files");
    }

    // This leaks memory since the stream is never closed
    private static void readFileWithoutClosing() throws IOException {
        // Create a new input stream
        FileInputStream stream1 = new FileInputStream(fileName);
        try {
            // Read a few bytes (simulate use)
            byte[] buffer = new byte[1024];
            stream1.read(buffer);
            // The next statement throws an exception, which then skips the close
            Integer.parseInt("A");
            stream1.close();
        } catch (Exception e) {}
    }
}
The program theoretically reads a file, closes it, and repeats this 50,000 times. This should work fine, since the resources are released each time. But are they? If an exception is thrown anywhere within the try block (which in this case it always will be, due to the attempt to parse an integer from a non-numeric string), the close() statement is never reached. This creates a serious memory leak. So how do we prevent it?
Take a look at the modified method for processing the file below:
private static void readFile() throws IOException {
    // Create a new input stream
    FileInputStream stream1 = new FileInputStream(fileName);
    try {
        // Read a few bytes (simulate use)
        byte[] buffer = new byte[1024];
        stream1.read(buffer);
        // The next statement throws an exception
        Integer.parseInt("A");
    } catch (Exception e) {
    } finally {
        stream1.close();
    }
}
Because the close() method is in a finally block, it will always be executed, regardless of whether the block threw an exception or not. This should always be standard practice when closing resources.
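Since Java 7, the same guarantee can be expressed more concisely with try-with-resources, which closes any AutoCloseable automatically when the block exits, whether normally or via an exception. This is a minimal sketch; the class and method names are illustrative:

```java
import java.io.FileInputStream;
import java.io.IOException;

public class ReadWithResources {

    // try-with-resources: stream1 is declared in the try header, so it is
    // closed automatically when the block exits, even if an exception is
    // thrown while reading.
    static int readFirstBytes(String fileName) throws IOException {
        try (FileInputStream stream1 = new FileInputStream(fileName)) {
            byte[] buffer = new byte[1024];
            return stream1.read(buffer); // bytes read, or -1 at end of file
        }
    }
}
```

This removes the risk of forgetting the finally block altogether, and it also handles the case where close() itself throws.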
One final thing to think about here: can’t we just set the object that holds the resources to null, so that everything is released? Maybe, but we can’t rely on it. Many older classes rely on a finalize() method to release their resources. This method has been deprecated, partly because an object that overrides it is not actually reclaimed until its finalize() method has run. If the finalizer queue is very busy, or even hung, it may be a long time before the method executes – or it may never execute at all. This means the object’s resources may not be released in a timely manner, which can itself result in a Java memory leak.
Conclusion
We’ve looked at three anti-patterns that are common Java memory leak causes.
We always need to bear these in mind when we’re writing new code. We should also look for them and remove them if we’re refactoring existing code.
When troubleshooting memory issues, being aware of these anti-patterns can help us find and solve the problem quickly.

Share your Thoughts!