Python - What is the best way to avoid transaction errors when loading large data sets?

I've been trying to load large amounts of data with the Python driver for Neo4j as part of an ETL script. However, at a certain point in the script I start hitting dbms.memory.transaction.total.max errors.

I have several transactions running, similar to how the docs show:

from neo4j import Transaction

def write_users(tx: Transaction, users):
    result = tx.run(query, users=users)
    result.consume()

session.execute_write(write_users, users=[some_large_list])

I have tried batching the list into smaller sets and running each batch individually, but the error message stays the same. Is there something I'm missing about clearing transaction data out of memory? Or should I be doing something like opening a new session (with driver.session(database="neo4j") as session:) for each transaction, or creating and closing the session between batches?
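
For reference, this is roughly how I am batching at the moment. It's a simplified sketch: query, the connection details, all_users and BATCH_SIZE are placeholders for what my script actually uses.

from neo4j import GraphDatabase

BATCH_SIZE = 1000  # placeholder, I have tried several sizes

def write_users(tx, users):
    result = tx.run(query, users=users)  # query is the parameterised write statement (not shown)
    result.consume()

with GraphDatabase.driver(uri, auth=(username, password)) as driver:
    with driver.session(database="neo4j") as session:
        for i in range(0, len(all_users), BATCH_SIZE):
            # one managed write transaction per batch of users
            session.execute_write(write_users, users=all_users[i:i + BATCH_SIZE])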

I appreciate the help! :slight_smile:

How many writes are you executing?

Do they all need to be in a single transaction? I assume that, since you are happy to reduce the batch size, they don't have to be in one transaction. You could handle any failed writes via an exception process.
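
For example, something along these lines. This is just a sketch that reuses write_users, all_users and BATCH_SIZE from your snippet and simply collects the batches that fail so you can retry or inspect them afterwards.

from neo4j.exceptions import Neo4jError

failed_batches = []

for i in range(0, len(all_users), BATCH_SIZE):
    batch = all_users[i:i + BATCH_SIZE]
    try:
        session.execute_write(write_users, users=batch)
    except Neo4jError as exc:
        # keep going and deal with the failed batch later
        failed_batches.append((i, exc))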

If you are doing ETL and it's largely the same data every time, you could also check for changes to reduce the set.
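
As a rough illustration, assuming each record has a stable id and you persist previous_hashes somewhere between runs (a file, a staging table, etc.), you could fingerprint each record and only send the ones that changed:

import hashlib
import json

def record_hash(user: dict) -> str:
    # stable fingerprint of the record's contents
    return hashlib.sha256(json.dumps(user, sort_keys=True).encode()).hexdigest()

changed_users = [u for u in all_users
                 if record_hash(u) != previous_hashes.get(u["id"])]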

Can you share more of your code, including how the transaction is instantiated and the query? Also, which version of Neo4j are you running?