Creating an RDF/n10s-conformant Neo4j graph (without loading in external triples) & rewriting this SPARQL query in Neo4j with n10s

PREFIX pr: <http://purl.org/ontology/prv/core#>
PREFIX prefix: <http://prefix.cc/>
PREFIX br: <http://vocab.deri.ie/br#>
PREFIX unit: <http://qudt.org/vocab/unit/>
PREFIX quantitykind: <http://qudt.org/vocab/quantitykind/>
PREFIX qudt: <http://qudt.org/schema/qudt/>
PREFIX sh: <http://www.w3.org/ns/shacl#>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX brick: <https://brickschema.org/schema/1.1/Brick#>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX ff_120: <https://test.com/ff_120#>

SELECT ?s ?o ?uuid WHERE {
  VALUES ?subclass_constraint { brick:Reset_Setpoint brick:Limit }
  ?s a brick:HeatingCurve .
  ?s brick:hasPoint ?o .
  ?o rdf:type/rdfs:subClassOf* ?subclass_constraint .
  ?o brick:TimeseriesUUID ?uuid .
}
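For reference, here is one guess at how that SPARQL might translate to Cypher with n10s inferencing. This is an unverified sketch: it assumes the graph was imported with handleVocabUris: 'IGNORE' (so brick:hasPoint becomes :hasPoint, rdfs:subClassOf becomes :SCO, and classes become :Class nodes with a 'name' property), and that brick:TimeseriesUUID landed as a plain node property. The VALUES clause with two classes is emulated with a UNION.

```cypher
// Rough n10s equivalent of the SPARQL query above (unverified sketch).
// Assumes handleVocabUris:'IGNORE', class hierarchy as :Class nodes
// linked by :SCO, and TimeseriesUUID stored as a node property.
CALL n10s.inference.nodesLabelled('Reset_Setpoint',
  { catLabel: 'Class', catNameProp: 'name', subCatRel: 'SCO' })
YIELD node
MATCH (s:HeatingCurve)-[:hasPoint]->(node)
RETURN s, node, node.TimeseriesUUID AS uuid
UNION
CALL n10s.inference.nodesLabelled('Limit',
  { catLabel: 'Class', catNameProp: 'name', subCatRel: 'SCO' })
YIELD node
MATCH (s:HeatingCurve)-[:hasPoint]->(node)
RETURN s, node, node.TimeseriesUUID AS uuid
```

n10s.inference.nodesLabelled does the transitive subclass walk that rdfs:subClassOf* does in the SPARQL version.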

So I have loaded in an RDF ontology file:
https://brickschema.org/ontology

In one scenario:
I create a graph using Python RDFLib and the classes and relationships defined in the ontology... then I serialize the triples and load them into Neo4j...

Then I need to come up with something like the query above... but I am struggling.

I assume the answer is somewhere in here: Chapter 11. Inferencing/Reasoning - Neosemantics(n10s) User Guide

@jesus.barrasa
Another big question I have is tooling... I don't really want to have to create my semantic graph with some other toolchain and then load it into Neo4j from a file...

I have started experimenting with creating the graph natively in Neo4j... but I find myself having to "check" the way n10s parses the serialized .ttl graph I've made in Python RDFLib... and then structure my Neo4j SDK calls to "build" the triples the same way... Is this required? Is there a better approach?
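To illustrate what "building the triples the same way" could look like, here is a sketch of hand-creating nodes that mimic what n10s.rdf.import produces with handleVocabUris: 'IGNORE'. The URIs and the UUID value below are made up for the example; the grounded parts are that every n10s resource node carries the :Resource label plus a unique uri property, rdf:type becomes an extra label, and properties/relationships use the local name of the vocabulary term.

```cypher
// Sketch: hand-building n10s-conformant nodes (hypothetical uris/values).
MERGE (hc:Resource {uri: 'https://test.com/ff_120#heatingCurve1'})
SET hc:HeatingCurve
MERGE (pt:Resource {uri: 'https://test.com/ff_120#setpoint1'})
SET pt:Reset_Setpoint,
    pt.TimeseriesUUID = 'example-uuid'   // made-up value
MERGE (hc)-[:hasPoint]->(pt)
```

If nodes created this way match the shape of an n10s import, the same inference procedures should work over both.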

Finally

So for a follow-up... I've loaded in the ontology... and my Turtle file...

CALL n10s.graphconfig.init({handleVocabUris: 'IGNORE', handleMultival: 'ARRAY'})
CREATE CONSTRAINT n10s_unique_uri ON (r:Resource) ASSERT r.uri IS UNIQUE
CALL n10s.rdf.import.fetch("https://raw.githubusercontent.com/iamliamc/neo4j-brick/main/spectral_brick_graph.ttl", "Turtle", {verifyUriSyntax: false})
CALL n10s.onto.import.fetch("https://brickschema.org/schema/1.2/Brick.ttl", "Turtle");
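After running the commands above, a couple of sanity checks can confirm the import landed as expected. The label and property names below assume the IGNORE vocab handling configured earlier:

```cypher
// Confirm the active graph config:
CALL n10s.graphconfig.show();

// Spot-check a few subclass edges created by onto.import:
MATCH (c:Class)-[:SCO]->(parent:Class)
RETURN c.name, parent.name LIMIT 10;
```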

After the import, the ontology uses an 'SCO' relationship type to represent "subClassOf".

Here is my graph...

But the ontology I've loaded doesn't appear to have the catNameProp described here by default:
https://neo4j.com/docs/labs/nsmntx/current/inference/#_n10s_inference_haslabel

:param inferenceParams =>  ({ catNameProp: "dbLabel", catLabel: "LCSHTopic", subCatRel: "NARROWER_THAN" });

My inferenceParams might look like the following... but is a catNameProp required?

:param inferenceParams => ({catLabel:"Reset_Setpoint", subCatRel:"SCO"})

I get no results from this:

MATCH (r:Resource)<-[rel:hasPoint]-(hc:HeatingCurve)
WHERE n10s.inference.hasLabel(r, "Reset_Setpoint", {catLabel:"Reset_Setpoint", subCatRel:"SCO"})
RETURN r, rel, hc;
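One guess at why this returns nothing is the parameter values: with an IGNORE-style import, the category nodes are usually :Class nodes whose name property holds the class name, so catLabel should point at the node label of the categories (not at the class name itself) and catNameProp at the property holding it. A sketch, under that assumption:

```cypher
// Sketch: same query, but with params pointing at the :Class hierarchy
// that onto.import creates (assumes IGNORE vocab handling).
MATCH (r:Resource)<-[rel:hasPoint]-(hc:HeatingCurve)
WHERE n10s.inference.hasLabel(r, 'Reset_Setpoint',
  { catLabel: 'Class', catNameProp: 'name', subCatRel: 'SCO' })
RETURN r, rel, hc;
```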

And I get results not limited to the correct subcategories from this:

Hi @liamconsidine, thanks for the detailed description and apologies for the late response.

We're actively working to simplify the ontology/inferencing APIs, and also to make it very easy to create an ontology/taxonomy directly in Neo4j without having to build it first as an RDF/OWL one and then import it. So to your question: we want to be able to make use of existing ontologies, hence the integration with the RDF/OWL world via onto.import, but we also want to make it easy to start from scratch and create a semantic model directly in Neo4j. It's not going to happen in one day, but I hope the next release includes some steps in this direction.
It's still an open question how far we want to go in terms of primitives. Definitely yes to both class and relationship hierarchies, definitely yes to domains and ranges, but it's still unclear how much of owl:Restriction we'll support.

JB


Let us try to reproduce your example. It will be a useful one to build tests around for the changes to the inferencing API in the next release.

watch this space for comments...

JB
