OutOfMemory due to Embedded Neo4J server

I am working on a Java-based project that embeds a Neo4j server. Its max heap size was initially 3 GB, but the application was consuming more memory than that and throwing an OutOfMemoryError, so I increased the max heap to 6 GB. Now I am getting the same error again. How can I handle this? Is there an approach I could follow to monitor its memory usage and work out why it is taking so much heap space? Is its GC not working, or is it something else?

How can I identify this issue locally?

Is there any tool available that would help me diagnose and handle this?
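For monitoring heap usage from inside a Java process like this one, a minimal sketch using only the JDK's built-in MemoryMXBean (the class and method names here are illustrative, not from the original project; for deeper analysis, a heap dump inspected with a profiler would show which objects dominate):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// Minimal sketch: report JVM heap usage from inside the application
// that embeds Neo4j, to see how close the heap is to its configured max.
public class HeapMonitor {

    // Currently used heap, in megabytes
    public static long usedHeapMb() {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        return heap.getUsed() / (1024 * 1024);
    }

    // Configured maximum heap (-Xmx), in megabytes
    public static long maxHeapMb() {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        return heap.getMax() / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.printf("heap used: %d MB of %d MB max%n", usedHeapMb(), maxHeapMb());
    }
}
```

Logging these numbers periodically (or before and after a suspect query) shows whether heap grows steadily toward the maximum, which points at large transactions or result sets rather than a GC failure.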

2 REPLIES

sameer_gijare14
Graph Buddy

Hi

There is a tool known as memrec (run as neo4j-admin memrec). Please use it to tune your server's memory settings.

Many thanks
Mr Sameer Gijare

@sandeepakkumar1999
What version of Neo4j?

The memrec tool, as described in Memory recommendations - Operations Manual, is not the be-all and end-all.

One can still run out of memory even with the parameters set 'correctly'. For example:

  • Run a query that produces a cartesian product between a set of nodes with label X and a set of nodes with label Y, where there are 100 million X-labeled nodes and 50 million Y-labeled nodes, and more than likely you will run out of memory.

  • Run a query that updates 500 million nodes in one transaction, and more than likely you will run out of memory.

Out-of-memory errors generally fall into excessive query concurrency, excessive query complexity (typically poorly written queries), or some combination of both. One hundred concurrent match (n:Person) return n limit 1; statements will consume less heap/memory than a single match (n1:Person),(n2:Person) where n1.id<>n2.id with n1, n2 match .... ..... unwind .... collect... return distinct ....

As to memory configurations, see:
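For reference, memrec's recommendations map onto heap and page cache settings in neo4j.conf. A sketch with illustrative values, assuming a Neo4j 4.x server (an embedded instance sets the equivalent settings programmatically when building the DatabaseManagementService):

```properties
# neo4j.conf -- memory sizing (values illustrative, not recommendations)
dbms.memory.heap.initial_size=6g
dbms.memory.heap.max_size=6g
dbms.memory.pagecache.size=4g
# The total should leave headroom for the OS and, in the embedded case,
# for the host application's own allocations on the same heap.
```

Note that with embedded Neo4j the database shares one JVM heap with the rest of the application, so the heap must cover both workloads.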

If this is Neo4j 4.4, you can use CALL {} (subquery) - Neo4j Cypher Manual to batch large-scale creates/updates/deletes.
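A sketch of that batching pattern (the Person label and batch size are illustrative; CALL { } IN TRANSACTIONS requires Neo4j 4.4+ and must run outside an open transaction, e.g. with an implicit/auto-commit transaction):

```cypher
// Delete a large set of nodes in committed batches
// instead of one huge transaction that can exhaust the heap
MATCH (n:Person)
CALL {
  WITH n
  DETACH DELETE n
} IN TRANSACTIONS OF 10000 ROWS;
```

Each batch of 10000 rows is committed separately, so the transaction state held in the heap stays bounded regardless of the total number of nodes touched.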