Giving memory to GraphCypherQAChain in a multi-input chain

I am struggling a lot to add ConversationBufferMemory to this chain:

from langchain_openai import ChatOpenAI
from langchain_community.graphs import Neo4jGraph
from langchain.chains import GraphCypherQAChain
from langchain.prompts import PromptTemplate

llm = ChatOpenAI(
    openai_api_key="sk-..."
)

graph = Neo4jGraph(
    url="bolt://localhost:7687",
    username="neo4j",
    password="pleaseletmein",
)

CYPHER_GENERATION_TEMPLATE = """
You are an expert Neo4j Developer translating user questions into Cypher to answer questions about movies and provide recommendations.
Convert the user's question based on the schema.

Schema: {schema}
Question: {question}
"""

cypher_generation_prompt = PromptTemplate(
    template=CYPHER_GENERATION_TEMPLATE,
    input_variables=["schema", "question"],
)

cypher_chain = GraphCypherQAChain.from_llm(
    llm,
    graph=graph,
    cypher_prompt=cypher_generation_prompt,
    verbose=True,
)

cypher_chain.invoke({"query": "What role did Tom Hanks play in Toy Story?"})

Wouldn’t you just want to learn Cypher so you can answer your question, and many more questions, yourself?

MATCH (movie:Movie {title: "Toy Story"})
MATCH (person:Person {name: "Tom Hanks"})
MATCH (movie)<-[rel:ACTED_IN]-(person)
RETURN movie.title AS title, person.name, rel.roles AS roles

Can you elaborate on what you are struggling with?

If I were looking to do this, I would probably create an agent with memory and then have the Cypher generation chain as one of its tools.

I just submitted an issue to the GitHub repo: