Hello, thank you very much for your answer!
I just checked that it works before marking the topic as solved.
However, isn't it redundant to set a bus line id attribute on :NEXT_TO
when I plan to create a relationship between BusStop nodes and BusLine nodes (which contain, among other things, the bus line id)?
EDIT:
As I said, I have a JSON with two fields. My example shows a 'LineString' type, but many rows have a 'MultiString' type, representing bus lines that have alternative routes.
For 'MultiString' types, coordinates stores a 3-level nested array: an array of lines, each line being an array of bus stops, each stop being an array of floats: [ [ [float, float], [float, float], ... ], [ ... ], ... ]
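To illustrate what I mean, here is a rough Python sketch (outside Neo4j, not part of my import, and the sample coordinates are made up) of how I read the two shapes and the consecutive stop pairs I want to link:

```python
# Rough sketch of the two coordinate shapes in my JSON.
# 'LineString': a 2-level array (one line of stops).
# 'MultiString': a 3-level array (several alternative lines of stops).

def segments(geo_shape):
    """Yield consecutive (stop, next_stop) coordinate pairs for either shape."""
    if geo_shape["type"] == "LineString":
        lines = [geo_shape["coordinates"]]   # wrap so both shapes iterate the same way
    else:                                    # 'MultiString' in my data
        lines = geo_shape["coordinates"]
    for line in lines:
        for i in range(len(line) - 1):
            yield line[i], line[i + 1]

line_string = {"type": "LineString",
               "coordinates": [[2.35, 48.85], [2.36, 48.86], [2.37, 48.87]]}
multi = {"type": "MultiString",
         "coordinates": [[[2.35, 48.85], [2.36, 48.86]],
                         [[2.40, 48.80], [2.41, 48.81], [2.42, 48.82]]]}

print(list(segments(line_string)))  # 2 segments from 3 stops
print(list(segments(multi)))        # 1 + 2 segments from the two sublines
```

In Cypher I am trying to express this same pairwise iteration with RANGE and FOREACH, once per shape.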
Given this context, I'm using the FOREACH(CASE)
syntax to discriminate between those two cases. Unfortunately, having the coordinates
aliases outside the FOREACH
statement throws the following error:
Variable `geoShape` not defined (line 9, column 32 (offset: 356) )
" FOREACH(trash in CASE WHEN geoShape.type = 'LineString' THEN[1] ELSE[] END |"
^
With the following code :
LOAD CSV WITH HEADERS FROM "file:///bus_lignes.csv" AS line FIELDTERMINATOR ";"
MATCH (l:BusLine) WHERE l.id_ligne = line.ID
CALL apoc.convert.fromJsonMap(line.`Geo Shape`) YIELD value AS geoShape
WITH geoShape.coordinates as coordinates
WITH coordinates , RANGE(0, SIZE(coordinates)-2) as iterList
FOREACH(i in iterList |
FOREACH(trash in CASE WHEN geoShape.type = 'LineString' THEN[1] ELSE[] END |
MERGE (s0:BusStop:GeographicPoint {
point : Point({
longitude : coordinates[i][0],
latitude : coordinates[i][1]
})
})
MERGE (s1:BusStop:GeographicPoint {
point : Point({
longitude : coordinates[i+1][0],
latitude : coordinates[i+1][1]
})
})
MERGE (s0)-[:SERVED_BY]->(l)
MERGE (s1)-[:SERVED_BY]->(l)
MERGE (s0)-[:NEXT_TO {
busLineId : l.id_ligne
}]->(s1)
)
FOREACH(trash in CASE WHEN geoShape.type = 'MultiString' THEN[1] ELSE[] END |
FOREACH(subline in coordinates |
WITH subline , RANGE(0, SIZE(subline)-2) as iterList
FOREACH(i in iterList |
MERGE (s0:BusStop:GeographicPoint {
point : Point({
longitude : subline[i][0],
latitude : subline[i][1]
})
})
MERGE (s1:BusStop:GeographicPoint {
point : Point({
longitude : subline[i+1][0],
latitude : subline[i+1][1]
})
})
MERGE (s0)-[:SERVED_BY]->(l)
MERGE (s1)-[:SERVED_BY]->(l)
MERGE (s0)-[:NEXT_TO {
busLineId : l.id_ligne
}]->(s1)
)
)
)
)
However, putting the aliasing inside the FOREACH
gives the following error:
Invalid use of WITH inside FOREACH (line 8, column 3 (offset: 279))
" WITH geoShape.coordinates as coordinates"
^
caused by the following code :
LOAD CSV WITH HEADERS FROM "file:///bus_lignes.csv" AS line FIELDTERMINATOR ";"
MATCH (l:BusLine) WHERE l.id_ligne = line.ID
CALL apoc.convert.fromJsonMap(line.`Geo Shape`) YIELD value AS geoShape
FOREACH(trash in CASE WHEN geoShape.type = 'LineString' THEN[1] ELSE[] END |
WITH geoShape.coordinates as coordinates
WITH coordinates , RANGE(0, SIZE(coordinates)-2) as iterList
FOREACH(i in iterList |
MERGE (s0:BusStop:GeographicPoint {
point : Point({
longitude : coordinates[i][0],
latitude : coordinates[i][1]
})
})
MERGE (s1:BusStop:GeographicPoint {
point : Point({
longitude : coordinates[i+1][0],
latitude : coordinates[i+1][1]
})
})
MERGE (s0)-[:SERVED_BY]->(l)
MERGE (s1)-[:SERVED_BY]->(l)
MERGE (s0)-[:NEXT_TO {
busLineId : l.id_ligne
}]->(s1)
)
)
FOREACH(trash in CASE WHEN geoShape.type = 'MultiString' THEN[1] ELSE[] END |
WITH geoShape.coordinates as coordinates
FOREACH(subline in coordinates |
WITH subline , RANGE(0, SIZE(subline)-2) as iterList
FOREACH(i in iterList |
MERGE (s0:BusStop:GeographicPoint {
point : Point({
longitude : subline[i][0],
latitude : subline[i][1]
})
})
MERGE (s1:BusStop:GeographicPoint {
point : Point({
longitude : subline[i+1][0],
latitude : subline[i+1][1]
})
})
MERGE (s0)-[:SERVED_BY]->(l)
MERGE (s1)-[:SERVED_BY]->(l)
MERGE (s0)-[:NEXT_TO {
busLineId : l.id_ligne
}]->(s1)
)
)
)
How can I work around this limitation? Maybe I have a bad understanding of the statements I'm using?
I'm sorry for the amount of code; I wanted to be exhaustive so my work is easier to understand.
Thank you for your help !