Support for running SPARQL queries against Neo4j graphs

Hi, everyone!

I would like to know if there are any plans to develop support for running SPARQL queries against Neo4j graphs.

I'm looking for a topic for my graduation thesis, and one of the things I'm currently interested in is databases and query languages. I was wondering whether something like a translation layer from SPARQL to Cypher would be of interest to the community. I found very few works on this (actually just one, and not very expressive), so before starting the work with my advisor I want to know whether it would be useful to people other than me and my degree haha

Note: sorry for any English errors, I'm still not very good at it. Pt-BR speaker here.

According to this post by Jesus Barrasa (the developer behind neosemantics), they have dropped support for running SPARQL directly against Neo4j via the neosemantics library. That's sad when one wants to do aggregation over other Linked Data sources, as (1) those sources don't understand Cypher and (2) doing an "export all" as RDF from a large Neo4j database takes time.

Building a full-blown SPARQL endpoint, including a SPARQL interpreter and a converter to/from Cypher, would be very challenging. A more suitable (and scalable) approach is to implement a so-called Triple Pattern Fragments (TPF) endpoint instead. Translating such simple TPF requests to Cypher should not be very difficult, and it gives more flexibility when doing federated queries over a combination of Linked Data sources and Neo4j data.
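To make the idea concrete, here is a minimal sketch (function and mapping names are purely illustrative, not part of any real API) of how a single TPF triple pattern could be translated into a Cypher query, assuming an n10s-style mapping where RDF resources become nodes carrying a `uri` property and predicates become relationship types. A real implementation would also need parameterized queries, literal objects, and proper escaping:

```python
# Illustrative sketch: map a TPF triple pattern (?s ?p ?o) to Cypher.
# Assumes an LPG mapping in the style of neosemantics (n10s): resources
# are (:Resource {uri: ...}) nodes, predicates are relationship types.

def triple_pattern_to_cypher(s=None, p=None, o=None):
    """Build a Cypher query for the triple pattern (s, p, o),
    where None means an unbound variable."""
    # The relationship type can only be fixed in the MATCH pattern when
    # the predicate is bound; otherwise match any type and return it.
    rel = f"[r:`{p}`]" if p else "[r]"
    where = []
    if s:
        where.append(f"s.uri = '{s}'")  # real code: use query parameters
    if o:
        where.append(f"o.uri = '{o}'")
    where_clause = f"WHERE {' AND '.join(where)} " if where else ""
    return (f"MATCH (s:Resource)-{rel}->(o:Resource) "
            f"{where_clause}"
            f"RETURN s.uri AS subject, type(r) AS predicate, o.uri AS object")
```

For example, binding only the predicate (`triple_pattern_to_cypher(p="http://xmlns.com/foaf/0.1/knows")`) yields a plain MATCH over that relationship type, which is exactly the shape of query a TPF endpoint needs to answer.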

If you want to run SPARQL queries over such an endpoint (potentially combined with other Linked Data sources, i.e. SPARQL endpoints, RDF files, JSON-LD in web pages and other TPF endpoints), you can use the Comunica query engine. I believe this engine has pretty good coverage of SPARQL.

Anyway, I believe there would be a lot of value in sharing and discussing your work and progress here. There might be good input from the community. Maybe @jesus.barrasa also wants to give his opinion on the above :)


Hi @mathib and @sh.thiago, that's quite interesting, actually.
I had looked at TPFs in the past but did not pursue them in the end (probably for lack of time/resources). Looking at the HTTP-based implementation in the doc you shared, building something like that on top of the n10s.rdf.export.spo procedure really looks like low-hanging fruit and, like you say, would cover 'to some extent' the need for a SPARQL endpoint on top of n10s.
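For what it's worth, the paging side of such an endpoint is also simple. A TPF fragment is just one page of matching triples plus hypermedia metadata: an (estimated) total count and a link to the next page if there is one. A rough sketch of that logic, independent of how the triples are actually fetched (e.g. from n10s.rdf.export.spo via a driver call, omitted here; all names are illustrative):

```python
# Illustrative sketch of TPF-style paging: slice matched triples into one
# fragment page and attach the hypermedia metadata a TPF client expects.

def build_fragment(triples, page=1, page_size=100):
    """Return one page of triples plus control metadata for a TPF response."""
    start = (page - 1) * page_size
    chunk = triples[start:start + page_size]
    return {
        "triples": chunk,
        # A real TPF server may report an estimate rather than an exact count.
        "totalItems": len(triples),
        "nextPage": page + 1 if start + page_size < len(triples) else None,
    }
```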

Again, it would all come down to priorities and available resources, but it's definitely worth a thought. And @sh.thiago, contributions to the project are very welcome, so if you want to consider it...

Thanks for bringing it up :slight_smile: @mathib. Out of curiosity, is this something you've used/implemented before?


Hi @jesus.barrasa , great to read your reply and nice to see you're enthusiastic about it :slightly_smiling_face:

It does indeed look like low-hanging fruit to implement. If available, it could become a pretty powerful feature when combined with the open-source Comunica query engine, which covers almost the entire SPARQL 1.1 spec.

I did not develop a TPF endpoint from scratch myself, but I have actively used the endpoints enabled by setting up a Triple Pattern Fragments server with the openly available reference implementation.

Hi all,

I'm part of the Linked Data Fragments/Comunica team.
Just FYI: if someone is willing to take up this Neo4j <-> TPF layer, be sure to let me know, as we're definitely willing to help out where possible.

Thanks for your offer @Ruben_Taelman :slight_smile:
I'll definitely take you up on that.