I have a Spark DataFrame that has a key column and multiple other columns (column1, column2, column3, etc.). I need to write each row of the DataFrame to Neo4j, where the key column in each row becomes a node, and every other column in that row becomes a node with a relationship to that key node. The result would be a set of key nodes, each with several connected nodes representing the columns in the dataset. How would I achieve this using the Neo4j Spark Connector?
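One possible approach (a sketch, not an answer from the thread): unpivot the wide DataFrame into one `(key, property, value)` row per column value, then let the Neo4j Spark Connector write one relationship per row. The option names follow the connector's documented relationship-write API; the Bolt URL, node labels (`:Key`, `:Value`), and relationship type (`HAS_VALUE`) are placeholders to adapt.

```python
def stack_expr(value_cols):
    """Build a Spark SQL `stack` expression that unpivots the given
    columns into (property, value) pairs, casting each value to string
    so heterogeneous column types fit in a single value column."""
    pairs = ", ".join(f"'{c}', cast(`{c}` as string)" for c in value_cols)
    return f"stack({len(value_cols)}, {pairs}) as (property, value)"


def write_rows_to_neo4j(df, key_col="key", url="bolt://localhost:7687"):
    """Unpivot `df` and write each (key, property, value) row to Neo4j
    as (:Key)-[:HAS_VALUE]->(:Value). Requires the Neo4j Spark
    Connector package on the Spark classpath and a reachable server."""
    value_cols = [c for c in df.columns if c != key_col]

    # One row per (key, property, value), e.g. (k1, 'column1', 'v1')
    long_df = df.selectExpr(key_col, stack_expr(value_cols))

    (long_df.write
        .format("org.neo4j.spark.DataSource")
        .mode("Overwrite")
        .option("url", url)
        .option("relationship", "HAS_VALUE")           # placeholder type
        .option("relationship.save.strategy", "keys")
        .option("relationship.source.labels", ":Key")  # key-column node
        .option("relationship.source.save.mode", "Overwrite")
        .option("relationship.source.node.keys", f"{key_col}:key")
        .option("relationship.target.labels", ":Value")  # one node per value
        .option("relationship.target.save.mode", "Overwrite")
        .option("relationship.target.node.keys", "value:value,property:property")
        .save())
```

With `Overwrite` save modes the connector uses Cypher `MERGE`, so re-running the job should not duplicate nodes or relationships. Including `property` in the target node keys keeps equal values from different columns as distinct nodes; drop it if you want values deduplicated across columns.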