Appropriate configuration mix to accelerate import of big data into a Neo4j database

Hello,

I am trying to import huge volumes of data into a Neo4j Community Edition graph database that runs as a standalone server on Linux. My approach is to load the data through the Python driver, using transactions. The Python client application I wrote works, but it performs very badly. Can anyone suggest the appropriate mix of configuration settings in the /etc/neo4j/neo4j.conf configuration file to accelerate the import process through transactions? I already use transactions in batches of 1,000,000 records. Can anyone share their technical expertise to help me out? Any idea about what slows down the execution of my batched queries would be appreciated. Thanks in advance for your time.
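For reference, a stripped-down sketch of the kind of loop I am running (the label, properties and data source are placeholders for my real ones; neo4j Python driver 5.x):

    from neo4j import GraphDatabase

    # Placeholder data; the real rows come from my source files.
    all_rows = [{"id": i, "name": f"node-{i}"} for i in range(3_000_000)]

    BATCH_SIZE = 1_000_000  # records per transaction

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def load_batch(tx, rows):
        # One Cypher call per record inside the transaction.
        for row in rows:
            tx.run("CREATE (:Record {id: $id, name: $name})",
                   id=row["id"], name=row["name"])

    with driver.session(database="neo4j") as session:
        for start in range(0, len(all_rows), BATCH_SIZE):
            session.execute_write(load_batch, all_rows[start:start + BATCH_SIZE])

    driver.close()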

What's your total data volume?

Use around 50k updates per transaction.

1M updates need 2-3 GB of heap per transaction.

On a 16 GB server I'd suggest 4 GB heap and 10 GB page cache.
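In neo4j.conf that translates to roughly the following (setting names as in Neo4j 4.x; on 5.x the prefix is server.memory.* instead of dbms.memory.*):

    dbms.memory.heap.initial_size=4g
    dbms.memory.heap.max_size=4g
    dbms.memory.pagecache.size=10g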

Use batches and parameters, but only about 50k rows at a time, otherwise you blow up the transport layer. A sketch of that pattern is below.
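Something along these lines, assuming your rows are plain dicts, your nodes are keyed on an id property, and driver 5.x (the Record label and property names are placeholders):

    from neo4j import GraphDatabase

    all_rows = [{"id": i, "name": f"node-{i}"} for i in range(3_000_000)]  # your real data here

    BATCH_SIZE = 50_000  # small enough not to overwhelm the transport layer

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def load_batch(tx, rows):
        # Send the whole chunk as a single list parameter and let Cypher iterate over it.
        tx.run(
            "UNWIND $rows AS row "
            "MERGE (n:Record {id: row.id}) "
            "SET n.name = row.name",
            rows=rows,
        )

    with driver.session(database="neo4j") as session:
        for start in range(0, len(all_rows), BATCH_SIZE):
            session.execute_write(load_batch, all_rows[start:start + BATCH_SIZE])

    driver.close()

If you MERGE, also make sure there is a unique constraint (or at least an index) on the key property, otherwise every lookup scans.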

https://medium.com/neo4j/5-tips-tricks-for-fast-batched-updates-of-graph-structures-with-neo4j-and-cypher-73c7f693c8cc