This works well for creating nodes and relationships when the total count is under 1,000, but when there are 1M+ nodes/relationships it tends not to finish.
I know there are server parameters in neo4j.conf that can be tuned to help with the load. I know I could save the dataframe to CSV and load it from disk with USING PERIODIC COMMIT. I know I could split the dataframe into chunks and loop over them from within Python. But I don't want to go those routes: I want to get apoc.periodic.commit to work inside the add_data function.
I have tried several iterations in an attempt to get it to work, but to no avail. I am hoping the community can help.
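For concreteness, here is a trimmed sketch of the shape of one attempt. All identifiers here (the :Record label, row.id, the helper name) are placeholders rather than my real add_data code, and the inner Cypher is exactly the part I cannot get right: as written, each pass of apoc.periodic.commit re-runs the whole UNWIND over $rows, so the statement never returns 0 and the call never terminates.

```python
def build_attempt(rows, batch_size=10_000):
    """Assemble one of my attempted apoc.periodic.commit calls.

    `rows` is the dataframe converted with df.to_dict("records")
    (dataframe name illustrative). apoc.periodic.commit re-runs the
    inner statement in its own transaction until the statement returns
    0, and the statement must contain a LIMIT; the open question is how
    to make each pass consume only the not-yet-loaded slice of $rows.
    """
    inner = (
        "UNWIND $rows AS row "
        "MERGE (n:Record {id: row.id}) "   # MERGE so re-runs don't duplicate
        "SET n += row "
        "WITH count(*) AS done "
        "RETURN done LIMIT $limit"         # periodic.commit requires a LIMIT
    )
    # Outer call: pass the statement and its parameter map to APOC.
    cypher = "CALL apoc.periodic.commit($stmt, {rows: $rows, limit: $limit})"
    params = {"stmt": inner, "rows": rows, "limit": batch_size}
    return cypher, params

# Inside add_data this would then be executed with the driver session, e.g.:
#   query, params = build_attempt(df.to_dict("records"))
#   session.run(query, **params)   # with 1M+ rows this never finishes
```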