Unexpected LIMIT results with PySpark connector

EDIT: query.count can actually be either a long or a Cypher query, my bad

Hello, query.count must be a Cypher query that returns an integer (e.g. RETURN 42 or something more dynamic), as documented towards the end of the following section.
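
For reference, here is a minimal PySpark sketch of how that option is typically wired up. The URL, credentials, labels, and queries below are placeholders (not taken from your setup), and it assumes the Neo4j Connector for Apache Spark is already on the classpath:

```python
from pyspark.sql import SparkSession

# Assumes the Neo4j Spark connector package is available (e.g. via --packages).
spark = SparkSession.builder.appName("neo4j-query-count").getOrCreate()

df = (
    spark.read.format("org.neo4j.spark.DataSource")
    .option("url", "neo4j://localhost:7687")                # placeholder
    .option("authentication.basic.username", "neo4j")       # placeholder
    .option("authentication.basic.password", "password")    # placeholder
    .option("query", "MATCH (p:Person) RETURN p.name AS name")
    # query.count: either a plain number of rows, or a Cypher query
    # returning a single integer, as in the count query below.
    .option("query.count", "MATCH (p:Person) RETURN count(p) AS count")
    .load()
)

df.show(5)
```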
Can you change your query.count setting and see what happens?