Hi,
I've successfully used the neo4j-spark-connector to read graph data from Neo4j into Spark DataFrames. I now have a use case for writing a large amount of data from Spark back into Neo4j. Using the mergeEdgeList function, I was able to write a small amount of data back into my graph. However, when I tried with a larger amount of data, I started running into the LockClient deadlock issue (I've found multiple reports of this).
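For context, my understanding of the deadlock is that creating a relationship locks both endpoint nodes, so when multiple Spark partitions write edges that touch the same nodes concurrently, transactions can deadlock. A workaround I've seen suggested (not something from the connector docs, just the general pattern) is to repartition the edge DataFrame by source node before writing, so edges sharing a source end up in the same partition and fewer concurrent transactions contend for the same locks, or to coalesce to a single partition and serialize the writes entirely. Here's a plain-Python sketch of the grouping idea; `partition_edges` is a hypothetical helper of mine, standing in for what `df.repartition(n, col("src"))` would do in Spark:

```python
from collections import defaultdict

def partition_edges(edges, num_partitions):
    """Hypothetical helper: bucket (src, dst) edges by a hash of the
    source node, so all edges with the same source land in one bucket.
    This reduces (but does not eliminate) lock overlap between
    concurrent writers, since edge (a, b) and edge (b, c) can still
    land in different buckets while both locking node b."""
    parts = defaultdict(list)
    for src, dst in edges:
        parts[hash(src) % num_partitions].append((src, dst))
    return dict(parts)

edges = [("a", "b"), ("a", "c"), ("b", "c"), ("d", "a")]
parts = partition_edges(edges, 2)
# Every edge is assigned to exactly one bucket, and all edges
# sharing a source node are in the same bucket.
```

With one bucket (the `coalesce(1)` analogue) there is no concurrency at all, which is slow but sidesteps the deadlock; I'd be interested in hearing whether people have had better luck with that or with key-based repartitioning.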
Two questions:
- Is neo4j-admin import currently the best way to import/update large graphs in Neo4j?
- Are there alternatives? I'd prefer to process the raw data with Spark and also write into Neo4j from Spark, to avoid intermediate CSV files, etc.
Thank you in advance.