Nodes with a PointValue property cannot be successfully synced to another Neo4j instance

Hi team, I have 2 questions.

First, I'm using neo4j-streams to sync data from one Neo4j instance to another, and I found that nodes with a PointValue property cannot be successfully sunk; the sink throws an error like:

ErrorData(originalTopic=my-topic, timestamp=1684462474785, partition=0, offset=29675, exception=org.neo4j.graphdb.QueryExecutionException: Property values can only be of primitive types or arrays thereof. Encountered: Map{latitude -> Double(5.670000e+01), longitude -> Double(1.278000e+01), height -> Double(8.000000e+00)}., key=null, value={"meta":{"timestamp":1684379795919,"username":"test","txId":10101,"txEventId":0,"txEventsCount":1,"operation":"created","source":{"hostname":"xxxxx"}},"payload":{"id":"2","befo, executingClass=class streams.kafka.KafkaAutoCommitEventConsumer)
	at streams.service.errors.KafkaErrorService.report(KafkaErrorService.kt:37) ~[neo4j-streams-4.1.2.jar:?]
	at streams.kafka.KafkaAutoCommitEventConsumer.executeAction(KafkaAutoCommitEventConsumer.kt:97) ~[neo4j-streams-4.1.2.jar:?]
	at streams.kafka.KafkaAutoCommitEventConsumer.readSimple(KafkaAutoCommitEventConsumer.kt:89) ~[neo4j-streams-4.1.2.jar:?]
	at streams.kafka.KafkaAutoCommitEventConsumer.read(KafkaAutoCommitEventConsumer.kt:118) ~[neo4j-streams-4.1.2.jar:?]
	at streams.kafka.KafkaEventSink$createJob$1.invokeSuspend(KafkaEventSink.kt:163) [neo4j-streams-4.1.2.jar:?]
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33) [neo4j-streams-4.1.2.jar:?]
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106) [neo4j-streams-4.1.2.jar:?]
	at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:39) [neo4j-streams-4.1.2.jar:?]
	at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95) [neo4j-streams-4.1.2.jar:?]
	at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571) [neo4j-streams-4.1.2.jar:?]
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750) [neo4j-streams-4.1.2.jar:?]
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678) [neo4j-streams-4.1.2.jar:?]
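For reference, the property on the source side is a spatial point. I assume it was created with something like the following (the property name and values here are illustrative, matching the latitude/longitude/height map in the error):

```cypher
// Hypothetical repro on the source instance: a node with a Point property.
// The CDC event appears to serialize the point as a plain map of
// latitude/longitude/height, which the sink then tries to write back
// as a map property -- hence the "primitive types or arrays" error above.
CREATE (n:Test {location: point({latitude: 56.7, longitude: 12.78, height: 8.0})})
```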

Here is my sink configuration:

streams.sink.topic.cdc.sourceId=my-topic
streams.sink.topic.cdc.sourceId.labelName=Test
streams.sink.topic.cdc.sourceId.idName=neo4jSyncId
streams.sink.enabled=true
streams.source.enabled=false
kafka.bootstrap.servers=localhost:9092
kafka.group.id=ss-8

Is there something wrong with my configuration?
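One workaround I'm considering (I'm not sure it's the intended approach) is switching that topic from the sourceId strategy to the Cypher template strategy and rebuilding the point from the map in the event payload. A rough sketch, assuming the standard `event` binding of the Cypher strategy and a CDC payload shape like the one in the error message (the `location` key and the exact payload paths are assumptions on my part):

```
# Hypothetical: rebuild the Point property from the serialized map.
# event.payload.after.properties and the 'location' key are assumed here.
streams.sink.topic.cypher.my-topic=WITH event.payload.after.properties AS props \
  MERGE (n:Test {neo4jSyncId: event.payload.id}) \
  SET n += apoc.map.removeKey(props, 'location'), \
      n.location = point(props.location)
```

Is there a simpler way to make the CDC sourceId strategy handle Point properties directly?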

The second question is that I noticed neo4j-streams has been deprecated in favor of the Kafka Connect Neo4j Connector, but according to my tests, neo4j-streams cannot be completely replaced by Kafka Connect. For example, the Kafka connector cannot capture deletes. Are there plans to revive neo4j-streams in the future so that it can be integrated with Neo4j 5?

Thanks!