Graph Algorithms Link prediction memory error

Hello! I'm doing research on a social network. I found this repository from Mark Needham and adapted the model to my data, but I'm getting a memory error, and I think it's caused by Neo4j. How can I solve this problem?

What is the heap size set in your neo4j.conf file?
Also, was the memory error raised by Neo4j, right?

This is the heap size:

[screenshot: heap size]

And the error appears in the Python console, although Neo4j has also shut down several times while running my script. Should I change the configuration file to keep the memory error at bay?

Oh, your Jupyter is crashing? Can you copy-paste the error here?
Just to double-check whether it is a Jupyter issue or a Neo4j problem.
Also, can you try launching Jupyter with this command?

jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10

train_missing_links = """
MATCH (u:User)
WHERE (u)-[:TRADES]-()
MATCH (u)-[:TRADES*2..3]-(other)
WHERE NOT((u)-[:TRADES]-(other))
RETURN id(u) AS node1, id(other) AS node2, 0 AS label
"""
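One way to keep memory in check on the Python side is to stream the query results and build the DataFrame in fixed-size chunks instead of materializing everything at once. A minimal sketch under that assumption; the real input would be the record cursor from the Neo4j Python driver's `session.run()` (here a plain generator stands in for it so the snippet runs on its own):

```python
from itertools import islice

import pandas as pd


def records_to_dataframe(records, chunk_size=10_000):
    """Accumulate an iterable of record dicts into one DataFrame, chunk by chunk.

    `records` only needs to be iterable, so a Neo4j driver result cursor
    (an assumption here) or any generator works; chunking avoids holding
    one huge intermediate list of dicts alongside the final frame.
    """
    records = iter(records)
    chunks = []
    while True:
        chunk = list(islice(records, chunk_size))
        if not chunk:
            break
        chunks.append(pd.DataFrame(chunk))
    if not chunks:
        return pd.DataFrame()
    return pd.concat(chunks, ignore_index=True)


# Stand-in for the query output: node-id pairs with a 0 label
fake_records = ({"node1": i, "node2": i + 1, "label": 0} for i in range(25_000))
df = records_to_dataframe(fake_records, chunk_size=10_000)
print(df.shape)  # (25000, 3)
```

In practice you could also page the Cypher query itself with `SKIP`/`LIMIT` so Neo4j never has to hold the full result set either.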

Although I do not see any memory error in your console, I would bump up the heap memory a bit.
It's strange that this does not work; can you revisit your Cypher query? I am not sure I understand what you are trying to achieve with it.

Look at the left picture: in the last line there is a memory error when returning the train_missing_links DataFrame! Also, I am trying to generate links between nodes that don't exist, because for a machine learning model to predict something you need different labels:
for example, 1 if there is an edge between nodes, and 0 if there isn't.
This is because I want to predict prospective edges in a time series. We need negative examples, so I use this query to produce links that don't exist, and because of its complexity I believe Neo4j stops. Can I change the heap size, and to what value? I know how to change it, but I don't know what size to use. Also, do you know if I could send concurrent queries?

Heap size would depend on the size of your database store.
Try using the `neo4j-admin memrec` tool.
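For reference, `neo4j-admin memrec` prints recommended values for the memory settings in neo4j.conf based on your store. The values below are purely illustrative for an 8 GB machine; run the tool against your own installation for real numbers:

```
# run from the Neo4j installation directory
bin/neo4j-admin memrec

# neo4j.conf -- illustrative values only, not ones to copy blindly
dbms.memory.heap.initial_size=2g
dbms.memory.heap.max_size=2g
dbms.memory.pagecache.size=2g
```

Keeping initial and max heap equal avoids resize pauses, and whatever you assign to heap plus page cache needs to leave room for the OS and your Python process.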

Hello again! I have just configured the memory of my database like this:

because I have 8 GB of RAM. But the memory error still exists. Do you know why?

This does not look like a Neo4j error. One possibility is that the result is unable to fit in your pandas DataFrame, maybe due to its dimensions.
An idea would be to save the data frame and do a `.shape` check to get its dimensions; maybe the size of the data frame you are returning is simply too large.
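To check whether the DataFrame itself is the problem, inspect its dimensions and approximate memory footprint before doing anything else with it. A quick sketch, using a small synthetic frame in place of your actual query result:

```python
import pandas as pd

# Stand-in for the train_missing_links query result
df = pd.DataFrame({
    "node1": range(100_000),
    "node2": range(100_000),
    "label": 0,
})

print(df.shape)  # (rows, columns)
print(df.memory_usage(deep=True).sum() / 1e6, "MB")  # approximate in-memory size
```

If the row count is in the tens of millions, the query is returning far more candidate pairs than the machine can hold, and tightening the Cypher query (or paging it) is the fix rather than more heap.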