Hi, we have a somewhat complex graph structure with many small entities and nested relationships. Everything usually works fine, but we recently hit a problem: creating one particularly large input (a couple of hundred nodes) through the GraphQL OGM took the database several minutes to process. As I understand it, Neo4j should be able to create hundreds of nodes and relationships in seconds, if not less. We are willing to move towards executing Cypher queries directly for this case (a rough sketch of what we have in mind is at the end of this post), but I wanted to know whether we are doing something wrong here and whether there are ways to optimize it. Here is an example of the input we pass to model.create:
{
  "code": "xyz",
  "emails": ["a@b.com"],
  "type": "sometype",
  "createdThrough": {
    "create": {
      "node": {
        "userPlatform": "Web"
      }
    }
  },
  "rateQuote": {
    "create": {
      "node": {
        "amount": "5555555",
        "source": "someone"
      }
    }
  },
  "Client": {
    "connect": {
      "where": {
        "node": {
          "_id": "some_id"
        }
      }
    }
  },
  "Company": {
    "connect": {
      "where": {
        "node": {
          "code": "xyz"
        }
      }
    }
  },
  "TopLevel": {
    "create": {
      "node": {
        "Type": {
          "create": {
            "node": {
              "data1": null,
              "data2": null,
              "agents": {
                "create": [
                  {
                    "node": {
                      "name": "hello",
                      "phoneNumber": "1234567890"
                    }
                  }
                ]
              },
              "preferences": {
                "create": {
                  "node": {
                    "data1": "val",
                    "data2": false,
                    "data3": 3,
                    "data4": "CLNT-99905"
                  }
                }
              },
              "anotherassociation": null,
              "doc": "test-1",
              "line": "testline",
              "orderNumber": 3534,
              "StatusHistory": {
                "create": {
                  "node": {
                    "status": "confirmed"
                  }
                }
              },
              "status": "confirmed",
              "SubType": {
                "create": [
                  {
                    "node": {
                      "str1": "val1",
                      "str2": "val2",
                      "str3": "val3",
                      "str4": "val4",
                      "str5": "val5",
                      "_id": "49f635a5-d3c0-4cdf-a175-79c8d8a9d16c",
                      "date": "2023-01-10T13:36:55Z",
                      "locs": {
                        "create": [
                          {
                            "node": {
                              "Location": {
                                "connect": {
                                  "where": {
                                    "node": {
                                      "_id": "knyjqvBH1DpMTIZrdwau"
                                    }
                                  }
                                }
                              },
                              "type": "",
                              "steps": 1
                            }
                          },
                          {
                            "node": {
                              "Location": {
                                "connect": {
                                  "where": {
                                    "node": {
                                      "_id": "4t7bKSsuYMg32yjbEfkc"
                                    }
                                  }
                                }
                              },
                              "type": "dropoff",
                              "contacts": {
                                "create": [
                                  {
                                    "node": {
                                      "name": "asc",
                                      "phoneNumber": "1234"
                                    }
                                  }
                                ]
                              },
                              "steps": 3
                            }
                          },
                          {
                            "node": {
                              "Location": {
                                "connect": {
                                  "where": {
                                    "node": {
                                      "_id": "4WywjlY8G5HCOroS0ING"
                                    }
                                  }
                                }
                              },
                              "locationType": "qwe",
                              "steps": 2
                            }
                          }
                        ]
                      },
                      "status": "confirmed",
                      "subOrderNumber": "3534-A",
                      "StatusHistory": {
                        "create": {
                          "node": {
                            "status": "confirmed"
                          }
                        }
                      }
                    }
                  }
                  ... 50 similar items
                ]
              }
            }
          }
        },
        "MiddleNode": {
          "create": [
            {
              "node": {
                "ConnectorNode": {
                  "create": {
                    "node": {
                      "SubType": {
                        "connect": {
                          "where": {
                            "node": {
                              "_id": "49f635a5-d3c0-4cdf-a175-79c8d8a9d16c"
                            }
                          }
                        }
                      }
                    }
                  }
                },
                "middledata1": "random1",
                "middledata2": "random2"
              }
            },
            ... 50 - 100 similar items based on certain business logic
          ]
        }
      }
    }
  }
}
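For context, this is roughly how we invoke the OGM with that payload. The "Order" model name, the type definitions placeholder, and the connection details below are illustrative, not our real schema:

// Sketch of our OGM usage; "Order" stands in for our real top-level type
const { OGM } = require("@neo4j/graphql-ogm");
const neo4j = require("neo4j-driver");

const driver = neo4j.driver(
  "neo4j+s://<aura-instance>.databases.neo4j.io",
  neo4j.auth.basic("<user>", "<password>")
);

const typeDefs = /* GraphQL */ `
  # ...our actual type definitions for Order, TopLevel, SubType, Location, etc.
`;

const ogm = new OGM({ typeDefs, driver });

async function createOrder(payload) {
  await ogm.init();
  const Order = ogm.model("Order");
  // The entire nested create/connect tree shown above goes in as a single call
  return Order.create({ input: [payload] });
}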
We have full-text indexes on some of the properties for search, if that makes a difference. Other relevant information:
Node.js server running on EC2, Neo4j 4.x running on AuraDB.
Packages:
"@neo4j/graphql": "^3.7.0",
"@neo4j/graphql-ogm": "^3.7.0",
"neo4j-driver": "^5.0.1",
"graphql": "^16.6.0",
"apollo-server": "^3.10.2",
"apollo-server-core": "^3.10.2",
"apollo-server-express": "^3.10.2",
We have tried bumping our @neo4j/graphql and @neo4j/graphql-ogm packages up to ^3.14.2 in local testing, but that did not help much.
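If we do end up going the raw Cypher route, the rough idea would be to batch the repeated child items (e.g. the ~50 SubType entries) with UNWIND inside a single write transaction. The labels, relationship types, and property names (HAS_SUB_TYPE, AT_LOCATION, props, locationId) below are illustrative, not our actual model:

const neo4j = require("neo4j-driver");

const driver = neo4j.driver(
  "neo4j+s://<aura-instance>.databases.neo4j.io",
  neo4j.auth.basic("<user>", "<password>")
);

// Create all SubType items for one parent in a single round trip,
// instead of one nested create/connect per item.
async function createSubTypes(parentId, subTypes) {
  const session = driver.session();
  try {
    await session.executeWrite((tx) =>
      tx.run(
        `MATCH (p:Type {_id: $parentId})
         UNWIND $subTypes AS st
         CREATE (s:SubType)
         SET s += st.props
         CREATE (p)-[:HAS_SUB_TYPE]->(s)
         WITH s, st
         UNWIND st.locs AS loc
         MATCH (l:Location {_id: loc.locationId})
         CREATE (s)-[:AT_LOCATION {type: loc.type, steps: loc.steps}]->(l)`,
        { parentId, subTypes }
      )
    );
  } finally {
    await session.close();
  }
}

The idea would be one parameterized query per batch of similar items rather than letting the generated Cypher expand every nested create/connect individually, but I would like to know whether that is actually necessary or whether the OGM call itself can be made fast enough.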