Can Neo4j Community Edition integrate with Apache Spark?
I created a SparkSession like this and a Neo4j(sc) object, but when I run Cypher queries it always gives me 0 output, saying that the Cypher query returned nothing.
// neo4j-spark-connector 2.x API
val spark = SparkSession.builder().getOrCreate()
val neo = Neo4j(spark.sparkContext)
val rdd = neo.cypher("Query").loadRowRdd
We just announced the new connector today (as in, hours ago): https://neo4j.com/blog/announcing-neo4j-connector-for-apache-spark/
GraphFrames are not going to be supported going forward: that part of the Spark ecosystem is not well maintained, and Neo4j itself is a better environment for graph operations and algorithms. If you want to know how to load anything from the graph into a DataFrame, consult this documentation:
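For example, with the current Neo4j Connector for Apache Spark you can load nodes into a DataFrame by label. A minimal sketch, assuming a local Neo4j instance; the URL, credentials, and label here are placeholders:

```scala
import org.apache.spark.sql.SparkSession

object ReadNodesByLabel {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("neo4j-read")
      .getOrCreate()

    // "org.neo4j.spark.DataSource" is the data source name of the
    // Neo4j Connector for Apache Spark (4.x). Read all :Person nodes
    // into a DataFrame; no GraphFrames involved.
    val df = spark.read
      .format("org.neo4j.spark.DataSource")
      .option("url", "bolt://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password") // placeholder
      .option("labels", ":Person")
      .load()

    df.show()
  }
}
```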
David, this is what I see on the Neo4j website:
val graphFrame = neo.pattern(("Person","id"),("KNOWS",null), ("Person","id")).partitions(3).rows(1000).loadGraphFrame
but when I use it, it fails with an error message like:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/graphx/VertexRDD
The documentation you're looking at that references GraphFrames is the wrong, old documentation.
Neo4j Connector for Apache Spark version 4.0.0 refers to the version of the connector, not the version of Spark itself. The compatibility information is listed here: https://neo4j.com/developer/spark/overview/#_spark_compatibility
Use the documentation linked in this post, which is for connector version 4.0, and as mentioned above, it does not involve GraphFrames.
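With the 4.0 connector, the equivalent of the old `neo.cypher(...)` call is a DataFrame read with a `query` option. A hedged sketch; the connection details and the Cypher query itself are placeholders:

```scala
import org.apache.spark.sql.SparkSession

object ReadWithCypher {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("neo4j-cypher-read")
      .getOrCreate()

    // Run an arbitrary Cypher query and receive the result directly
    // as a DataFrame (connector 4.x; url/credentials are placeholders).
    val df = spark.read
      .format("org.neo4j.spark.DataSource")
      .option("url", "bolt://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("query",
        "MATCH (p:Person)-[:KNOWS]->(q:Person) RETURN p.id AS source, q.id AS target")
      .load()

    df.show()
  }
}
```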
@VGVG I can't help you with 2.4.5-M1, partially because I don't know. I recommend you upgrade to 4.0.0 and use the "write" instructions in the documentation that I linked to save a dataframe to Neo4j. At this point, 2.4.5 is old code that I myself don't use that much, and it won't be receiving much attention going forward.
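The "write" path in the 4.0 documentation looks roughly like this. A sketch only, assuming the connector 4.x data source; the labels, key column, and connection options are placeholders:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object WriteToNeo4j {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("neo4j-write")
      .getOrCreate()
    import spark.implicits._

    // A toy DataFrame to save as :Person nodes.
    val people = Seq(("alice", 33), ("bob", 44)).toDF("name", "age")

    // Overwrite mode upserts nodes matched on "node.keys";
    // url and credentials are placeholders.
    people.write
      .format("org.neo4j.spark.DataSource")
      .mode(SaveMode.Overwrite)
      .option("url", "bolt://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("labels", ":Person")
      .option("node.keys", "name")
      .save()
  }
}
```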