Solution to the challenge in Neo4j & LLM Fundamentals

Very good tutorial!
However, the current answer does not use the dialogue history. The get_memory function is defined, so an answer that uses it would be better.

On this page (Adding the Neo4j Vector Retriever - Neo4j & LLM Fundamentals), the prompt could include the chat history like this:

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

# Prompt that passes the conversation history in via a placeholder
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a movie expert. You find movies from a genre or plot.",
        ),
        MessagesPlaceholder(variable_name="chat_history"),
        ("human", "{input}"),
    ]
)

Apologies if the dialogue history can already be kept without using MessagesPlaceholder.

Hi Chishu,

Welcome to the Neo4j community :slight_smile:

In this example, the message history is managed by the agent:

from langchain_core.runnables.history import RunnableWithMessageHistory

# Wrap the agent executor so it loads and saves the chat history itself
chat_agent = RunnableWithMessageHistory(
    agent_executor,
    get_memory,
    input_messages_key="input",
    history_messages_key="chat_history",
)

The agent rephrases the questions and responses for the individual tools, adding additional context where needed.

However... you could give individual tools their own message history, which they would use when generating responses. It is worth noting, though, that a tool's message history would contain the rephrased question from the agent, not the original conversation history with the user. A rough sketch of that idea follows.
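
As a sketch only (movie_chat, the session id, and the reuse of get_memory are illustrative assumptions, not part of the course code), a tool's own chain could be wrapped in its own RunnableWithMessageHistory before being registered as a tool:

from langchain.tools import Tool
from langchain_core.runnables.history import RunnableWithMessageHistory

# Hypothetical: give the tool's chain its own history. This history would hold
# the agent's rephrased questions, not the user's original messages.
movie_chat_with_history = RunnableWithMessageHistory(
    movie_chat,          # the chain this tool normally calls (assumed to exist)
    get_memory,          # could also be a separate history factory for the tool
    input_messages_key="input",
    history_messages_key="chat_history",
)

tools = [
    Tool.from_function(
        name="Movie Chat",
        description="For general chat about movies",
        func=lambda q: movie_chat_with_history.invoke(
            {"input": q},
            {"configurable": {"session_id": "example-session"}},
        ),
    ),
]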

I hope this helps,

Martin