I still can't understand your model. Could you reduce your relationship types down to a smaller, less granular list and differentiate the relations with a relationship property that specifies a sub-type?
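Something along these lines, for example (a minimal sketch using the official Python driver; the :TreeNode label, the single :CHILD relationship type, the name property and the connection details are made-up names, not anything from your model):

    from neo4j import GraphDatabase

    # Instead of one relationship type per value (e.g. :NAME_alice, :NAME_bob),
    # use a single generic type and push the distinguishing value into a property.
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    with driver.session() as session:
        session.run(
            """
            MERGE (p:TreeNode {id: $parent_id})
            MERGE (c:TreeNode {id: $child_id})
            MERGE (p)-[r:CHILD]->(c)
            SET r.name = $name
            """,
            parent_id="root", child_id="n1", name="alice",
        )

    driver.close()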
Sorry, I don't know the answer to your question. I doubt it, though. It is probably determined by the field size in a database record schema.
Imagine a tree where each node has a varying number of children, and each node has one property called name. I would like to store the nodes without the property and instead encode the name in the relationship type to get faster lookup speed.
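Roughly what I have in mind (just a sketch to show the idea; the NAME_ prefix, the :TreeNode label and the connection details are made up):

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def add_child(session, parent_id, child_id, name):
        # Relationship types cannot be passed as Cypher parameters,
        # so the encoded type has to be spliced into the query string.
        rel_type = "NAME_" + name
        session.run(
            f"""
            MERGE (p:TreeNode {{id: $parent_id}})
            MERGE (c:TreeNode {{id: $child_id}})
            MERGE (p)-[:`{rel_type}`]->(c)
            """,
            parent_id=parent_id, child_id=child_id,
        )

    with driver.session() as session:
        add_child(session, "root", "n1", "alice")

    driver.close()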
So in reality you have hit the limit of exactly 65,536 relationship types and would like to increase it, correct? I think that's a lot.
I understand the benefit of this approach; in my current project we have worked to reduce the number of relationship types through generalization, for example by adding a relationship property. Furthermore, I do not see an official note stating this limit, only the limits on the number of relationships depending on the Neo4j version.
Imagine a tree where each node has a varying number of children, and each node has one property called name. I would like to store the nodes without the property and instead encode the name in the relationship type to get faster lookup speed.
But if your nodes contained a property named name and you created an index on that node label and property, index lookups are going to be really fast.
Have you done so and observed slow performance?
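For example (again just a sketch; the :TreeNode label, the :CHILD type and the connection details are illustrative):

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    with driver.session() as session:
        # One index covers every node with this label, so no per-name
        # relationship types are needed.
        session.run(
            "CREATE INDEX tree_node_name IF NOT EXISTS "
            "FOR (n:TreeNode) ON (n.name)"
        )
        # Look up a child by name under a given parent.
        result = session.run(
            """
            MATCH (p:TreeNode {id: $parent_id})-[:CHILD]->(c:TreeNode {name: $name})
            RETURN c
            """,
            parent_id="root", name="alice",
        )
        print(result.single())

    driver.close()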
You don't state what version of Neo4j this is, but the Introduction - Operations Manual and the Neo4j v5 documentation indicate limits based on whether you are using Neo4j Community or Enterprise Edition (do you have an Enterprise license?).
But is 16 million unique relationship types even enough? For what it's worth, I'm not aware of many customers who use high_limit. Either I'm not aware of them, or their model is such that they don't need 65k+ relationship types.
I haven't found this limit explicitly stated in the documentation; I only found out when Neo4j crashed with this error. I'm using the newest Neo4j, 5.17.0 (bullseye).
Well, for my use case Neo4j represents an index over much bigger data that I'm storing on S3 and then querying.
It has a tree structure, and in the leaves it holds information about the path traveled, which I process further.
It has two purposes:
persistence: I would like to hold all of this in memory, but after a crash I would lose it all.
too much data: even if it never crashed, it would eventually run out of memory because the index only keeps growing.
But is 16 million unique relationship types even enough?
Probably not, because these relationship types will keep increasing forever.
But there's one more format in the link you sent me that seems like it could do the job (the block format, which can hold up to 2^30, i.e. 1,073,741,824, relationship types), but I don't know how it differs from the other formats.
No, but I'm considering it. Yes, it's 5.17.0 (Community Edition).
Neo4j doesn't contain the data I'm actually storing, but rather the locations where to find that data.
I'm using the graph database properties of Neo4j to create my own index over the actual data I'm storing and querying.
It's really hard for me to explain because the concept is quite involved.
So imagine node N has 100 million edges. Each neighbour has just one property, let's call it 'name', and this property is unique across all neighbours.
So instead of retrieving all 100 million neighbours and searching for the one with the property value I'm looking for, I wanted to encode that value into the relationship type, so I could retrieve it in O(1) instead of O(n).
And regarding the index you suggested on the property: it would get really big over time and take a lot of additional space, which is undesirable.
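To make the comparison concrete, the two lookups I'm weighing roughly look like this (a sketch with made-up labels and connection details; the NAME_ prefix stands for my encoding scheme, :CHILD for the generic-type alternative):

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def find_by_rel_type(session, parent_id, name):
        # Name encoded in the relationship type: the traversal only follows
        # relationships of that one type instead of scanning all neighbours.
        rel_type = "NAME_" + name  # types cannot be Cypher parameters
        result = session.run(
            f"MATCH (:TreeNode {{id: $parent_id}})-[:`{rel_type}`]->(c) RETURN c",
            parent_id=parent_id,
        )
        return result.single()

    def find_by_property(session, parent_id, name):
        # Generic relationship type plus a property filter on the neighbour;
        # this is where the suggested index on (:TreeNode, name) would help.
        result = session.run(
            "MATCH (:TreeNode {id: $parent_id})-[:CHILD]->(c:TreeNode {name: $name}) RETURN c",
            parent_id=parent_id, name=name,
        )
        return result.single()

    with driver.session() as session:
        print(find_by_rel_type(session, "root", "alice"))
        print(find_by_property(session, "root", "alice"))

    driver.close()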