Hi,
I built a data-processing job using neo4j-spark-connector (v2.1.0-M4). Until now I was on Neo4j 3.3 and everything worked fine.
I now need to improve this process, and to do that I need a Neo4j 3.4.5 feature (the datetime field).
Part of the code:

```scala
import java.util.Calendar
import java.time.{ZonedDateTime, ZoneId}

val date = Calendar.getInstance()
date.add(Calendar.DATE, -30)
date.set(Calendar.HOUR_OF_DAY, 0)
date.set(Calendar.MINUTE, 0)
date.set(Calendar.SECOND, 0)
date.set(Calendar.MILLISECOND, 0)

val d = ZonedDateTime.ofInstant(date.toInstant(), ZoneId.of("UTC"))
val map = Map("date" -> d)
neo4j.cypher(relevanceQuery, map)
```
The map itself works: if I put Long parameters in it, the query runs fine.
The error I get:

```
org.neo4j.driver.v1.exceptions.ClientException: Unable to convert java.time.ZonedDateTime to Neo4j Value.
```

I already changed all the date properties to the new date type (before 3.4 they were plain Long values), so I'm a bit stuck now :\ and can't ship the new sprint to production.
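In the meantime, the workaround I'm considering (a sketch only, not yet tested against the connector) is to keep sending a primitive parameter the driver can serialize and convert it to a temporal value server-side with Cypher's `datetime()` function. The property name `created` in the query comments below is made up for illustration:

```scala
import java.time.{ZonedDateTime, ZoneId}
import java.time.format.DateTimeFormatter

// Same value as the Calendar code above: midnight UTC, 30 days back.
val d = ZonedDateTime.now(ZoneId.of("UTC"))
  .minusDays(30)
  .withHour(0).withMinute(0).withSecond(0).withNano(0)

// Option 1: send an ISO-8601 string and parse it in the query:
//   WHERE n.created >= datetime($date)
val isoParams = Map("date" -> d.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME))

// Option 2: send epoch millis as a Long (Long parameters are known to
// serialize) and convert in the query:
//   WHERE n.created >= datetime({epochMillis: $date})
val millisParams = Map("date" -> d.toInstant.toEpochMilli)
```

Either way the conversion happens inside Cypher, so the driver only ever sees a String or a Long.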
Something that would help me tremendously would be some other folks describing / documenting how they use the connector. So if you have a few minutes at some time I would very much appreciate a blog post explaining how you use it and also feedback on what could be improved.