Hi
I successfully set up apoc.load.json to work with AWS S3, and it works beautifully except for some very large files.
The code I am using to load data looks like this:
```cypher
UNWIND ['<boto3 dynamically generated URL to file>'] AS files
CALL apoc.load.json(files) YIELD value
WITH value AS this_file
UNWIND this_file AS item
MERGE (n_source:Group { id: item.source_id })
MERGE (n_target:Application { id: item.target_id })
MERGE (n_source)-[r:HAS_RELATED_APPLICATION]->(n_target)
```
Is there a way to force this approach to batch updates as they are coming in?
I tried adding USING PERIODIC COMMIT 1000, but that was rejected outright because the query calls an APOC procedure.
I also looked at using CALL apoc.periodic.iterate(), but I couldn't work out how to split my query into the two statements that procedure expects.
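For reference, the closest shape I could come up with for apoc.periodic.iterate looks like this — an untested sketch, where the split of the query into the outer (driving) statement and the inner (batched) statement, the $url parameter, and the batchSize of 1000 are all my guesses:

```cypher
// Outer statement streams items one at a time;
// inner statement runs the MERGEs, committed in batches of batchSize.
CALL apoc.periodic.iterate(
  "CALL apoc.load.json($url) YIELD value UNWIND value AS item RETURN item",
  "MERGE (n_source:Group { id: item.source_id })
   MERGE (n_target:Application { id: item.target_id })
   MERGE (n_source)-[:HAS_RELATED_APPLICATION]->(n_target)",
  { batchSize: 1000, params: { url: '<boto3 dynamically generated URL to file>' } }
)
```

I'm not sure whether passing the S3 URL through the params config is the right way to get it into the inner statements, or whether the UNWIND belongs in the first or second statement.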
Any thoughts?