Uploading CSV data to a Neo4j AuraDB instance

Hello, I have some LARGE files in a Google Cloud Storage bucket. I'm already able to download them, but I can't upload them to Neo4j. Here is my script for uploading:

src_edges = "file:///" + os.path.join(current_dir, edges_blob_name).replace("\\", "/")

# build the Cypher query; the CALL body is trimmed here
script = f"""USE {bd_name}
LOAD CSV WITH HEADERS FROM '{src_edges}' AS row
WITH row WHERE row.oneway = 'True'
CALL {{
...
}}
"""
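As an aside, the `file:///` URL construction on the first line can also be done with `pathlib`, which handles the backslash replacement and drive letters on Windows for you. A minimal sketch (`edges.csv` is just a placeholder name):

```python
from pathlib import Path

def to_file_url(directory: str, name: str) -> str:
    # Path.as_uri() requires an absolute path and emits a
    # well-formed file:///... URL on both Windows and Unix.
    return (Path(directory) / name).resolve().as_uri()
```
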

This actually works if I run Neo4j locally; it just needs the file import path enabled in the configuration. But I can't do that on the AuraDB instance, because the file obviously won't be on the machine where that instance is running. How can I upload it?
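For context, the local setup only works after enabling file imports in `neo4j.conf`, roughly like this (setting names as in Neo4j 4.x; they may differ in your version, and none of this is editable on AuraDB):

```
# neo4j.conf (self-managed instance only)
dbms.security.allow_csv_import_from_file_urls=true
# LOAD CSV file URLs are resolved relative to this directory:
dbms.directories.import=import
```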

I also tried reading the CSV file on my machine into a pandas DataFrame and uploading it row by row, iterating over the DataFrame, but this is REALLY SLOW because the CSV files are so big.
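To show what I mean, this is roughly the slow path I tried, sketched with the standard `csv` module (the `oneway` column is the one from my query above; the actual `session.run` call is only indicated in a comment):

```python
import csv

def row_params(path):
    """Yield one Cypher parameter dict per CSV row (the slow path)."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["oneway"] == "True":  # same filter as in the LOAD CSV query
                yield dict(row)

# In the real loop, each dict became one query, something like
#   session.run("CREATE (e:Edge) SET e = $props", props=params)
# i.e. one network round trip per row, which is why it crawls.
```
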

The Cloud Storage bucket is private, by the way.

Thanks to you all