Neo4j / Spark / Databricks - connection fails

I'm having issues connecting to Neo4j from Databricks. I installed the driver on the cluster successfully, but my connection is failing:

```python
df = (
    spark.read.format("org.neo4j.spark.DataSource")
    .option("url", NEO_DEV_URL)
    .option("authentication.type", "basic")
    .option("authentication.basic.username", NEO_USER)
    .option("authentication.basic.password", NEO_PASSWORD)
    .option("labels", "GrootCampaignActual")
    .load()
)
```
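
To rule out the Spark connector itself, this is roughly the kind of standalone check I can run from a notebook cell on the same cluster (a minimal sketch, assuming the plain neo4j Python driver is installed there and the same NEO_DEV_URL / NEO_USER / NEO_PASSWORD variables apply):

```python
# Sketch: verify connectivity with the plain neo4j Python driver, outside Spark.
# Assumes `pip install neo4j` on the cluster and the same connection variables as above.
from neo4j import GraphDatabase

driver = GraphDatabase.driver(NEO_DEV_URL, auth=(NEO_USER, NEO_PASSWORD))
try:
    # Raises neo4j.exceptions.ServiceUnavailable if the server (or routing table) is unreachable.
    driver.verify_connectivity()
    print("Driver-level connection OK")
finally:
    driver.close()
```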

Using neo4j://neo4j-poc-bolt.ds-xxx.xxx.io:7687, I get:

> org.[REDACTED].driver.exceptions.ServiceUnavailableException: Could not perform discovery for database '<default database>'. No routing server available.

Using the bolt scheme, bolt://neo4j-poc-bolt.ds-nonprod.wds.io:7687, I get:

> org.[REDACTED].driver.exceptions.ServiceUnavailableException: Unable to connect to [REDACTED]-poc-bolt.ds-nonprod.wds.io:7687, ensure the database is running and that there is a working network connection to it.
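
Since that message points at the network, the sketch below is the kind of bare TCP check I can run from a Databricks notebook cell to tell a network/firewall problem apart from a driver or config problem; it only uses the standard library and the same host/port as above.

```python
# Sketch: raw TCP reachability test from the cluster to the Bolt port.
# Only checks that the port is open; it does not validate auth or the Bolt handshake.
import socket

host, port = "neo4j-poc-bolt.ds-nonprod.wds.io", 7687
try:
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as exc:
    print(f"TCP connection to {host}:{port} failed: {exc}")
```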

Any help would be appreciated! Thank you!