Error while reading from Neo4j into Spark

I followed the steps in the Quickstart (Neo4j Spark Connector) and used the following package:

$SPARK_HOME/bin/spark-shell --packages neo4j-contrib:neo4j-connector-apache-spark_2.12:4.0.1_for_spark_3

Now, in the Spark shell, when I try to execute the following (taken from the same site):

 import org.apache.spark.sql.{SaveMode, SparkSession}

 val spark = SparkSession.builder().getOrCreate()

 val df = spark.read.format("org.neo4j.spark.DataSource")
   .option("url", "bolt://localhost:7687")
   .option("authentication.basic.username", "neo4j")
   .option("authentication.basic.password", "neo4j")
   .option("labels", "Person")
   .load()

the following error occurs:

 java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/ReadSupport
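
As far as I understand, org.apache.spark.sql.sources.v2.ReadSupport belongs to the Spark 2.x DataSource V2 API and no longer exists in Spark 3, so I suspect the connector and my Spark installation do not match. Here is a minimal check I can run in the same shell (assuming a plain spark-shell session) to see which Spark and Scala versions it is actually built with, to compare against the _2.12 / _for_spark_3 parts of the connector coordinate:

 // Print the runtime Spark and Scala versions to compare with the connector artifact
 println(s"Spark version: ${spark.version}")
 println(s"Scala version: ${scala.util.Properties.versionString}")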

What shall I do to fix the problem?