OutOfMemoryError in Java — How to Fix (2026) | Tutorials Logic
What is This Error?
The OutOfMemoryError (OOM) is thrown when the JVM cannot allocate an object because it has run out of memory and the garbage collector cannot free enough space. Despite the name of the most common variant, not every OOM concerns the heap: the Metaspace variant refers to class-metadata storage. This is a serious error that usually indicates a memory leak, unbounded data loading, or insufficient heap configuration.
Error Messages:
java.lang.OutOfMemoryError: Java heap space
java.lang.OutOfMemoryError: GC overhead limit exceeded
java.lang.OutOfMemoryError: Metaspace
Common Causes
- Loading an entire large dataset or file into memory at once
- Static collections or caches that grow without bounds (memory leaks)
- Heap size (-Xmx) set too low for the application's actual workload
Quick Fix (TL;DR)
# Increase heap size (temporary fix)
java -Xmx512m -Xms256m MyApplication
# For large applications
java -Xmx2g MyApplication
# Enable GC logging to diagnose (Java 9+; on Java 8 use -verbose:gc -XX:+PrintGCDetails)
java -Xmx512m -Xlog:gc* MyApplication
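Before raising -Xmx blindly, it helps to see how close the application actually gets to the cap. A minimal sketch using java.lang.Runtime (the class and method names here are standard JDK API; the MB formatting is just for readability):

```java
// Minimal sketch: report current heap limits via java.lang.Runtime.
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb   = rt.maxMemory()   / (1024 * 1024); // upper bound set by -Xmx
        long totalMb = rt.totalMemory() / (1024 * 1024); // currently committed heap
        long freeMb  = rt.freeMemory()  / (1024 * 1024); // free within committed heap
        long usedMb  = totalMb - freeMb;
        System.out.println("max=" + maxMb + "MB used=" + usedMb + "MB");
        // Used memory can never exceed the -Xmx cap:
        System.out.println(usedMb <= maxMb);
    }
}
```

If used memory hovers near max under normal load, the heap is genuinely undersized; if it climbs steadily and never drops after GC, suspect a leak.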
Common Scenarios & Solutions
Scenario 1: Loading All Data at Once
// ❌ Loading millions of records into memory
List<Record> allRecords = database.findAll(); // OOM for large tables!
allRecords.forEach(r -> process(r));
// ✅ Process in batches
int pageSize = 1000;
int page = 0;
List<Record> batch;
do {
    batch = database.findAll(PageRequest.of(page++, pageSize)).getContent();
    batch.forEach(r -> process(r));
    // No need to clear(): reassigning batch makes the old list GC-eligible
} while (batch.size() == pageSize);
// ✅ Or use a Stream-returning query (Spring Data); close it when done
try (Stream<Record> records = database.streamAll()) {
    records.forEach(r -> process(r));
}
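The batch loop above can be exercised without a database. In this self-contained sketch, fetchPage is a hypothetical stand-in for a paged repository call, and the in-memory TABLE list plays the role of the table; the point is that at most one page is referenced at a time:

```java
import java.util.ArrayList;
import java.util.List;

// Self-contained sketch of the batch loop; fetchPage is a hypothetical
// stand-in for a paged repository call, not a real Spring Data API.
public class BatchDemo {
    static final List<Integer> TABLE = new ArrayList<>();
    static { for (int i = 0; i < 2500; i++) TABLE.add(i); }

    // Returns one page of at most pageSize rows (empty past the end).
    static List<Integer> fetchPage(int page, int pageSize) {
        int from = page * pageSize;
        int to = Math.min(from + pageSize, TABLE.size());
        return from >= to ? List.of() : new ArrayList<>(TABLE.subList(from, to));
    }

    public static void main(String[] args) {
        int pageSize = 1000, page = 0, processed = 0;
        List<Integer> batch;
        do {
            batch = fetchPage(page++, pageSize);
            processed += batch.size(); // stand-in for process(r)
        } while (batch.size() == pageSize);
        System.out.println(processed); // all 2500 rows seen, at most 1000 held at once
    }
}
```

The loop terminates on the first short page (here 500 rows), which is why the `batch.size() == pageSize` condition works without a separate row count.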
Scenario 2: Memory Leak via Static Collection
class Cache {
    // ❌ Static map grows forever — never cleared!
    static Map<String, Object> cache = new HashMap<>();

    static void add(String key, Object value) {
        cache.put(key, value); // Memory leak!
    }
}
// ✅ Use WeakHashMap — entries removed once the key is no longer strongly referenced
static Map<String, Object> cache = new WeakHashMap<>();
// ✅ Or use a bounded cache with eviction
static Map<String, Object> cache = Collections.synchronizedMap(
    new LinkedHashMap<String, Object>(100, 0.75f, true) {
        protected boolean removeEldestEntry(Map.Entry<String, Object> eldest) {
            return size() > 100; // Max 100 entries
        }
    }
);
// ✅ Or use Caffeine/Guava cache with a size limit and TTL
Cache<String, Object> cache = Caffeine.newBuilder()
    .maximumSize(1000)
    .expireAfterWrite(10, TimeUnit.MINUTES)
    .build();
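The removeEldestEntry pattern is easy to verify in isolation. A minimal sketch with a capacity of 3 (small enough to watch eviction happen); the access-order flag `true` makes this an LRU cache, so a recently read entry survives while the least recently used one is evicted:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of the removeEldestEntry pattern with capacity 3.
public class LruDemo {
    public static void main(String[] args) {
        Map<String, Integer> cache = new LinkedHashMap<String, Integer>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
                return size() > 3; // keep at most 3 entries
            }
        };
        cache.put("a", 1);
        cache.put("b", 2);
        cache.put("c", 3);
        cache.get("a");    // touch "a", so "b" becomes least recently used
        cache.put("d", 4); // evicts "b", not "a"
        System.out.println(cache.keySet()); // [c, a, d]
    }
}
```

Note that this map is not thread-safe on its own, which is why the example above wraps it in Collections.synchronizedMap.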
Scenario 3: Reading Large Files
// ❌ Reading entire 10GB file into memory
byte[] content = Files.readAllBytes(Paths.get("huge-file.csv")); // OOM!
// ✅ Stream lines one at a time
try (Stream<String> lines = Files.lines(Paths.get("huge-file.csv"))) {
    lines.forEach(line -> processLine(line));
}
// ✅ Or use BufferedReader
try (BufferedReader reader = new BufferedReader(new FileReader("huge-file.csv"))) {
    String line;
    while ((line = reader.readLine()) != null) {
        processLine(line);
    }
}
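The Files.lines pattern can be demonstrated end to end. This self-contained sketch writes a small temporary file (a stand-in for the huge CSV), then counts its lines through the stream; only one line needs to be reachable at a time, regardless of file size:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

// Self-contained sketch of the Files.lines pattern: write a small temp
// file, then process it line by line without holding it all in memory.
public class StreamLinesDemo {
    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("demo", ".csv");
        Files.write(file, List.of("id,name", "1,alice", "2,bob"));
        long count;
        try (Stream<String> lines = Files.lines(file)) {
            count = lines.count(); // each line is GC-eligible after it is consumed
        }
        Files.deleteIfExists(file);
        System.out.println(count); // 3
    }
}
```

The try-with-resources block matters: Files.lines holds the underlying file channel open until the stream is closed.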
Best Practices to Avoid This Error
- Stream or batch large datasets instead of loading them whole
- Bound every cache with a size limit or TTL; never let static collections grow unchecked
- Close streams, readers, and connections with try-with-resources
- Monitor heap usage and profile long-running applications regularly
Key Takeaways
- OutOfMemoryError means the JVM heap is full and GC cannot free enough space
- Process large datasets in batches or streams instead of loading all at once
- Static collections that grow without bounds are a common source of memory leaks
- Use try-with-resources to ensure streams and connections are properly closed
- Profile your application with VisualVM or JProfiler to find memory leaks
- Increase heap size with -Xmx as a temporary fix while investigating the root cause