Hi, I am developing GraphRAG with Neo4j and Ollama. How do I create the final answer from the tool result?

Hi,

I want to create an AI agent that queries a Neo4j knowledge graph and generates the final answer with Ollama.

I am sending the result from Neo4j back to Ollama as a tool message.

However, I cannot get an answer from llama3.2 — the assistant message comes back with empty content. Is there any way to get a final answer from the LLM?

```
Message(role='assistant', content='', thinking=None, images=None, tool_name=None,
        tool_calls=[ToolCall(function=Function(name='get_asset_list_by_tag',
                                               arguments={'input': 'b30ods'}))]),
{'role': 'tool', 'content': 'CallToolResult(content=[TextContent(type='text',
  text='[[["<Node element_id=\'4:93612259-79d6-40fd-85a4-9cc3bd8c9d1a:26\'
  labels=frozenset({\'Asset\'}) properties={\'asset_title\': \'ABCDEF123456\',
  \'asset_id\': \'9efcbca4-02b4-4448-b592-fe0ff7729934\'}>"],
  ["<Node element_id=\'4:93612259-79d6-40fd-85a4-9cc3bd8c9d1a:28\'
  labels=frozenset({\'Asset\'}) properties={\'asset_title\': \'abcdef123456\',
  \'asset_id\': \'472f9eac-0fb1-4cff-a121-bf61c450698a\'}>"]],
  "<neo4j._work.summary.ResultSummary object at 0x0000022FEB6819D0>",["a"]]',
  annotations=None, meta=None)], structured_content=None, meta=None, data=None,
  is_error=False)'}]
```
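For context, this is the round trip I am trying to get working. The sketch below is a minimal version of it, with the chat function injected as a parameter (in real use it would be `ollama.chat`) so the loop can be tried without a running server; `get_asset_list_by_tag` and `format_records` are my own names, and the dict-shaped responses assume the classic Ollama message format. One thing I noticed from the log above is that the tool message contains raw `<Node element_id=...>` reprs, so I also flatten the nodes to plain JSON before handing them to the model:

```python
import json

def format_records(records):
    """Flatten Neo4j node rows into JSON text so the model sees readable
    properties instead of '<Node element_id=...>' reprs. dict(node) works
    on neo4j.graph.Node objects (and on plain dicts, for testing)."""
    return json.dumps([dict(node) for row in records for node in row])

def answer(question, chat, tools, tool_fns, model='llama3.2'):
    """Run the tool-call loop: ask the model, execute any requested tools,
    append the results as role='tool' messages, and ask again until the
    model produces plain content. `chat` is expected to behave like
    ollama.chat; `tool_fns` maps tool names to Python callables."""
    messages = [{'role': 'user', 'content': question}]
    response = chat(model=model, messages=messages, tools=tools)
    while response['message'].get('tool_calls'):
        # Keep the assistant turn that contains the tool_calls, so the
        # model can see which call the tool result belongs to.
        messages.append(response['message'])
        for call in response['message']['tool_calls']:
            fn = call['function']
            result = tool_fns[fn['name']](**fn['arguments'])
            messages.append({'role': 'tool',
                             'tool_name': fn['name'],
                             'content': result})
        response = chat(model=model, messages=messages, tools=tools)
    return response['message']['content']  # the final answer
```

In my real code the `content` of the tool message would be `format_records(...)` applied to the Neo4j result rather than the raw `CallToolResult` repr shown above.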

Hi @doyblackcat! I'm not an LLM expert, but this looks like an issue with Ollama rather than Neo4j.

The Stack Overflow question here may have some insight for you; it seems that some smaller Llama models have trouble handling both conversation and tool calls.