Load multiple CSV rows into same node

I have the following CSV:

id  attr  value
1   abc   1.1
1   eww   -9.4
1   ssv   likj
2   we2   1
2   eww   900
3   kuku  -91
3   lulu  383
3   ssv   bubu

I would like to create 3 nodes that consist of:

Node 1: {id:1, abc: 1.1, eww: -9.4, ssv: "likj"}
Node 2: {id:2, we2: 1, eww: 900}
Node 3: {id:3, kuku: -91, lulu: 383, ssv: "bubu"}

How can I build this in Neo4j Cypher?

Hello @steve5 🙂

This should work with the apoc.load.csv() procedure:

CALL apoc.load.csv('file:///data.csv')
YIELD map
MERGE (n:Node {id: map.id})
SET n += map
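One thing worth noting: SET n += map stores the literal attr and value columns on each node. If the goal is property keys named after each attr (as in the question), a minimal sketch using apoc.create.setProperty could do the pivot — this assumes APOC's create procedures are available, and file:///data.csv is a placeholder path:

CALL apoc.load.csv('file:///data.csv')
YIELD map
MERGE (n:Node {id: map.id})
WITH n, map
// use the attr column as a dynamic property key;
// note: CSV values arrive as strings, so convert with toFloat()/toInteger() if needed
CALL apoc.create.setProperty(n, map.attr, map.value)
YIELD node
RETURN count(node)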

Regards,
Cobra


That's because you have to use APOC to load your CSV here: I used the apoc.load.csv() procedure, not LOAD CSV.
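If apoc.load.csv fails with "Import from files not enabled", APOC's file access has to be switched on first. A minimal apoc.conf setting, assuming a default install (depending on the APOC version this can gate URL loads as well):

# in apoc.conf
apoc.import.file.enabled=true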


But how can I load it using my URL? @cobra

This should work:

WITH "https://drive.google.com/u/0/uc?id=1kcNZm0A2I3k9xN1IfNRITOHEzdnqgG7e&export=download" AS requests_url
CALL apoc.load.csv(requests_url)
YIELD map
MERGE (n:Node {id: map.id})
SET n += map

In your CSV, the first column doesn't have a header, so the resulting map contains a key whose name is "", and that's what causes the error you're seeing.

So this query first removes the unnamed column and then loads the nodes:

WITH "https://drive.google.com/u/0/uc?id=1kcNZm0A2I3k9xN1IfNRITOHEzdnqgG7e&export=download" AS requests_url
CALL apoc.load.csv(requests_url)
YIELD map
WITH apoc.map.clean(map, [""], []) AS map
MERGE (n:Node {request_id: map.request_id})
SET n += map
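To see what apoc.map.clean does here, a tiny illustrative example (the keys are made up; in the query above it strips the unnamed "" column the same way):

RETURN apoc.map.clean({request_id: '1', unwanted: 'x'}, ['unwanted'], [])
// returns {request_id: '1'}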

@cobra
I have changed the URL, but it still doesn't work. Please have a look:

WITH "xxxxxxxxx" AS requests_url
CALL apoc.load.csv(requests_url)
YIELD map
WITH apoc.map.clean(map, [""], []) AS map
MERGE (n:Node {request_id: map.request_id})
SET n += map

Please advise.

The request itself works, but the file is no longer accessible. That's why it's not working.

I'm sorry, but I'll repeat myself: your link is not working, and that's why the query fails with it; the query itself is fine. For it to work, you need to make sure your file is externally accessible.
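A quick way to check from the database side that the URL is reachable at all (a sketch reusing the placeholder URL from above):

WITH "xxxxxxxxx" AS requests_url
CALL apoc.load.csv(requests_url)
YIELD lineNo
RETURN lineNo LIMIT 1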

Weird, but it worked; your solution is working. Thanks!