I am using Python 3.6 and Neo4j 4.0.
I am currently writing large amounts of data to a Neo4j database, and it is taking much longer than I hoped. I could really use some ideas on how to improve my current approach.
At the moment I receive large JSON files with anywhere between 300 and 3,000 values that need to be inserted into Neo4j. Some are nodes and some are relationships.
Currently I'm using the Python neo4j library to write to the database by running the following code:

```python
with driver.session() as session:
    for nodeData in data1:
        session.write_transaction(create_data_function_node, nodeData, fileData)
```
This essentially means I'm running thousands of individual MERGE queries against the database, each in its own transaction.
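For context, each transaction function runs a single MERGE, roughly like this (the label and property names here are simplified stand-ins, not my actual schema):

```python
def create_data_function_node(tx, nodeData, fileData):
    # One MERGE per call: every row costs a network round trip
    # plus its own transaction commit.
    tx.run(
        "MERGE (n:Record {id: $id}) "
        "SET n.value = $value, n.file = $file",
        id=nodeData["id"],
        value=nodeData["value"],
        file=fileData["name"],
    )
```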
My longest write so far took about 270 seconds, which is far too long, especially considering that I can't multi-thread writes to Neo4j: as far as I know, two threads writing concurrently just end up blocking each other (some mild testing on my end seemed to confirm this, as it did not help at all).
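One direction I've been looking at is batching rows and sending each batch in a single UNWIND query instead of one MERGE per row. A sketch of what I mean (untested against my real data; the label and property names are made up):

```python
from itertools import islice

def chunks(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# One parameterized query per batch instead of one query per row.
BATCH_QUERY = (
    "UNWIND $rows AS row "
    "MERGE (n:Record {id: row.id}) "
    "SET n.value = row.value"
)

def write_batched(session, data, batch_size=1000):
    for batch in chunks(data, batch_size):
        rows = [{"id": r["id"], "value": r["value"]} for r in batch]
        session.write_transaction(
            lambda tx, rows: tx.run(BATCH_QUERY, rows=rows).consume(),
            rows,
        )
```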
What can I implement in order to write this data more efficiently?