I've got something like this:
WITH a.id as `aId`, a.acRef as `acRef`
WITH b.id as bId, b.bcRef as `bcRef`
WHERE acRef=c.aRef and bcRef=c.bRef
RETURN aId, bId, c.id as `cId`
but of course it doesn't work because I'm no...
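For reference, a join like the one attempted above is usually written as a single MATCH with a WHERE clause rather than standalone WITH clauses (WITH must follow another clause, which is one reason the snippet won't parse). A minimal sketch, assuming the nodes carry labels A, B and C (the labels are my guess, not from the original):

```cypher
MATCH (a:A), (b:B), (c:C)
WHERE a.acRef = c.aRef AND b.bcRef = c.bRef
RETURN a.id AS aId, b.id AS bId, c.id AS cId
```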
I have a table in PostgreSQL that is ~100 GB in size. When I try to import that table:
/home/user/neo4j-etl-cli-1.2.1/bin/neo4j-etl export --url jdbc:postgresql://127.0.0.1:5432/base --user user --password pass --schema myschema --fs 100 --import-tool /us...
I've read the ETL documentation that you linked, but I still can't work out how to exclude big tables from the import.
Let's say I have a schema myschema in PostgreSQL with many tables, and I want to exclude two of them: firsttable and secondtable. I tried:
Thanks. I will have to write a custom exporter to split those records into several CSV files.
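A custom exporter along those lines could look something like this: a helper that takes any iterable of rows and writes it out as a sequence of numbered CSV files, each capped at a fixed row count. This is only a sketch of the splitting logic; the function name, chunk size and file-naming scheme are my own choices, not from the original thread.

```python
import csv
import itertools
import os


def export_in_chunks(rows, header, out_dir, prefix="part", chunk_size=1_000_000):
    """Write an iterable of rows into numbered CSV files, at most
    `chunk_size` data rows per file. Returns the paths written."""
    os.makedirs(out_dir, exist_ok=True)
    written = []
    it = iter(rows)
    for index in itertools.count():
        # Pull the next slice of rows; stop when the source is exhausted.
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            break
        path = os.path.join(out_dir, f"{prefix}-{index:05d}.csv")
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(header)  # repeat the header in every file
            writer.writerows(chunk)
        written.append(path)
    return written
```

To keep memory flat on a ~100 GB table, the `rows` iterable would presumably come from a server-side (named) cursor in psycopg2, so rows stream from PostgreSQL instead of being loaded all at once.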
Could you tell me how to use the exclude-tables flag in neo4j-etl? I need to skip that one large table during the export and import.