Neo4j doesn't release memory

GDS 1.7 and Neo4j 4.3.4

What I have been observing is that after a graph is initially loaded into Neo4j, memory consumption isn't that high. However, if I run a training procedure on a projected graph, or do other intensive query operations, memory usage may double, and once the training or querying is done, that memory isn't released. For example, I trained a FastRP model, and during training it used 10 GB of memory as shown by the Linux free -hm command. After training finished, the available memory stayed the same, meaning the additional 10 GB wasn't released. The only way to get the memory back is to restart the database.

Is this normal for Neo4j? I looked at the memory configuration documentation, but it doesn't address this.


This is a Java thing, not a Neo4j thing.

You're using OS tools to check free memory, and those have no insight into Java internals like JVM heap size or garbage collection. Most garbage collectors won't return memory to the OS when they've freed it up internally: once memory has been committed to the heap, the JVM won't release it or shrink the heap, even if much of it is free.
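To see why free -hm is misleading here, a minimal sketch using java.lang.Runtime (HeapDemo and measure are illustrative names; the exact numbers depend on your JDK, GC, and heap settings). The committed heap is what the OS counts as used; the live data inside it can shrink without the committed figure moving:

```java
// Sketch: the JVM's committed heap (what the OS sees as used memory) is
// tracked separately from live data; freeing objects does not necessarily
// shrink it.
public class HeapDemo {
    static long[] measure() {
        Runtime rt = Runtime.getRuntime();
        byte[][] blocks = new byte[32][];
        for (int i = 0; i < 32; i++) blocks[i] = new byte[1 << 20]; // ~32 MB live
        long committedAtPeak = rt.totalMemory();              // heap committed from the OS
        long usedAtPeak = committedAtPeak - rt.freeMemory();  // live + garbage inside it
        blocks = null;                                        // the 32 MB is now garbage
        System.gc();                                          // hint only; behavior is GC-dependent
        long committedAfter = rt.totalMemory();
        long usedAfter = committedAfter - rt.freeMemory();
        // committedAfter often stays near committedAtPeak even though usedAfter
        // drops: the freed heap is typically kept committed by the JVM, so OS
        // tools like `free -hm` still count it against the process.
        return new long[] { committedAtPeak, usedAtPeak, committedAfter, usedAfter };
    }

    public static void main(String[] args) {
        long[] m = measure();
        System.out.printf("committed peak=%d, used peak=%d, committed after=%d, used after=%d%n",
                m[0], m[1], m[2], m[3]);
    }
}
```

The same effect, scaled up to a 10 GB training run, is what the original post is observing.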

Starting with JDK 12, some collectors can uncommit unused heap back to the OS (Shenandoah, and G1 via JEP 346), so it's still GC-dependent, but at least possible...
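For example, on a JDK 12+ runtime with G1 you can ask for periodic GC cycles that uncommit idle heap (JEP 346). A sketch, assuming a Neo4j 4.x neo4j.conf, where dbms.jvm.additional passes extra JVM flags; note Neo4j 4.3 officially runs on Java 11, so this only applies if your deployment is on a newer JDK:

```properties
# neo4j.conf -- illustrative; requires G1 on JDK 12+, tune the interval to your workload
# Trigger a periodic GC (and heap uncommit) after 5 minutes of idleness
dbms.jvm.additional=-XX:G1PeriodicGCInterval=300000
```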

You can use gds.alpha.systemMonitor() to get a better perspective on resource availability & utilization.


Hi,

Please note that you can configure the heap size (the equivalents of the JVM's -Xms and -Xmx switches) in the neo4j.conf file, not apoc.conf, according to your application's needs.
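Concretely, in Neo4j 4.x the relevant neo4j.conf settings are below; the values are placeholders, so size them to your own machine and workload (GDS projections and training live on the heap, while the page cache is off-heap):

```properties
# neo4j.conf -- example values only
dbms.memory.heap.initial_size=8g   # maps to -Xms
dbms.memory.heap.max_size=8g       # maps to -Xmx
dbms.memory.pagecache.size=4g      # off-heap cache for store files
```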

Thank you,
Sameer Sudhir
