Hello,
While creating an in-memory graph using this query:
MATCH
WHERE
WITH gds.alpha.graph.project
I am getting this error: Failed to invoke procedure gds.graph.project: Caused by: java.lang.IllegalStateException: Capacity exhausted.
I observed that it always happens once memory usage exceeds 300 GB.
My neo4j.conf file has no custom configuration. I tried a lot of combinations, such as setting the max heap size, but leaving the configuration empty seemed to work best.
I also reduced the concurrency from 4 to 2, but nothing changed.
Thank you.
Hi Berkay,
As you know, the error message "Capacity exhausted" indicates that the memory available to Neo4j has been fully utilized. This error occurs when Neo4j attempts to allocate more memory than is available.
Since you mention that the error occurs when the memory usage exceeds 300GB, it is likely that this is due to a limitation of the machine on which Neo4j is running. The amount of memory available to Neo4j depends on the resources available on the machine, such as the amount of RAM. Do you know the specs of the machine you are using?
One approach to solving this issue is to optimize the memory usage of Neo4j. This can be achieved by optimizing the query, reducing the size of the data set being processed, or increasing the amount of available memory.
You can try to increase the amount of memory available to Neo4j by modifying the neo4j.conf file. You can increase the heap size by setting the "dbms.memory.heap.max_size" property. For example, you can set it to "8g" to allocate 8 GB of memory to the heap. You mention you tried a few config changes. Can you let us know what you have already attempted?
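For illustration, here is a minimal neo4j.conf sketch for a Neo4j 4.x server (in Neo4j 5 the same settings use the server.memory.* prefix instead); the values below are placeholders that you would size to the RAM actually available on your machine:

# Illustrative values only - size these to the host's available RAM
dbms.memory.heap.initial_size=64g
dbms.memory.heap.max_size=64g
dbms.memory.pagecache.size=128g

Setting the initial and max heap to the same value avoids resizing pauses, and since the GDS in-memory graph is held on the heap, the heap size is typically the setting that matters most for a large projection.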
Another approach is to reduce the size of the data set being processed by filtering the data in the MATCH clause. This can be achieved by using the WHERE clause to filter the data before projecting it; a sketch follows below. Do you know how many nodes/relationships are in the projection? You can learn more about memory estimation here - Memory Estimation in Neo4j
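As a rough sketch of a filtered Cypher aggregation projection (the Person label, KNOWS relationship type, active property, and graph name myGraph are only placeholders for whatever your data model actually uses):

MATCH (n:Person)-[r:KNOWS]->(m:Person)
WHERE n.active = true AND m.active = true
WITH gds.alpha.graph.project('myGraph', n, m) AS g
RETURN g.graphName, g.nodeCount, g.relationshipCount

To get a ballpark figure for how much memory a projection of the whole store would need, you can also run the estimation procedure before projecting (this estimates a native projection of all nodes and relationships, so it is an approximation rather than an exact figure for your Cypher projection):

CALL gds.graph.project.estimate('*', '*')
YIELD requiredMemory, nodeCount, relationshipCount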
If increasing the heap size and optimizing the query do not solve the issue, you may need to consider using a machine with more resources, such as more RAM or a higher CPU core count.
Let us know how things turn out.