I have a Spark DataFrame with a key column and multiple other columns (column1, column2, column3, etc.). I need to write each row to Neo4j so that the key column becomes a node, and every other column value in that row becomes its own node with a relationship back to that key node. The result would be a set of key nodes, each with several connected nodes representing the other columns in the dataset. How would I achieve this using the Neo4j Spark Connector?