Hello,
I'm importing data into Neo4j and I'm running out of disk space because the transactions folder keeps growing.
With the database at roughly 9 GB, I already have 45 GB of transaction logs:
$ du -hs data/*
9.2G data/databases
4.0K data/dbms
0 data/dumps
4.0K data/server_id
45G data/transactions
Is there a way to clean up this folder or limit its size using Cypher? (I'm asking about Cypher specifically because I'm planning to move to AuraDB in the future.)
I use a list of queries like this one to import my data:
CALL apoc.periodic.iterate(
  'LOAD CSV WITH HEADERS FROM "https://path-to-the-csv-file.csv" AS row RETURN row',
  'MERGE (n:MyNode {some_id: row.some_id})
   ON CREATE SET
     n.created = datetime(row.created),
     n.some_other_data = row.some_other_data',
  {batchSize: 1000, parallel: false}
) YIELD batches, total, operations
RETURN batches, total, operations;
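In case it's relevant, this is roughly how each of those queries gets executed from the Python driver (a simplified sketch of my setup; the URI, credentials, and database name below are placeholders, not my real values):

import os
from neo4j import GraphDatabase

# The apoc.periodic.iterate import query shown above, as a single string.
IMPORT_QUERY = """
CALL apoc.periodic.iterate(
  'LOAD CSV WITH HEADERS FROM "https://path-to-the-csv-file.csv" AS row RETURN row',
  'MERGE (n:MyNode {some_id: row.some_id})
   ON CREATE SET
     n.created = datetime(row.created),
     n.some_other_data = row.some_other_data',
  {batchSize: 1000, parallel: false}
) YIELD batches, total, operations
RETURN batches, total, operations
"""

# Placeholder connection details; credentials come from the environment in my case.
uri = "neo4j://localhost:7687"
auth = ("neo4j", os.environ.get("NEO4J_PASSWORD", "password"))

with GraphDatabase.driver(uri, auth=auth) as driver:
    with driver.session(database="neo4j") as session:
        # Each import query in the list is run one after another like this.
        record = session.run(IMPORT_QUERY).single()
        print(record)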
Neo4j version:
5.6
Client: Neo4j Python Driver
Thank you for your help!