GraphRAG using OllamaLLM: error when run SimpleKGPipeline

Hello,
I'm trying to follow the blog "GraphRAG Python Package: Accelerating GenAI With Knowledge Graphs" and the code from its repository. I don't want to use the OpenAI services because I want to run everything locally, so I'm using the Ollama classes instead.

This is the main content of the GraphRAG script:

import neo4j
from neo4j_graphrag.llm import OllamaLLM 
from neo4j_graphrag.embeddings.ollama import OllamaEmbeddings
from neo4j_graphrag.experimental.pipeline.kg_builder import SimpleKGPipeline

llm = OllamaLLM(model_name="mistral")
embedder = OllamaEmbeddings(model="snowflake-arctic-embed2")

driver = neo4j.GraphDatabase.driver(NEO4J_URI, auth=(NEO4J_USERNAME, NEO4J_PASSWORD), database=NEO4J_DATABASE)

kg_builder_pdf = SimpleKGPipeline(
    driver=driver,
    llm=llm,
    embedder=embedder,
    entities=node_labels,
    relations=rel_types,
    prompt_template=prompt_template,
    from_pdf=True,
)

for path in FILE_PATHS:
    print(f"Processing : {path}")
    pdf_result = await kg_builder_pdf.run_async(file_path=path)
    print(f"Result: {pdf_result}")
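One detail I should mention: I run this loop in a Jupyter notebook, where top-level await works. In a plain script I believe the loop would need an asyncio entry point, something like the sketch below (ingest_all is my own helper name, not part of the package):

```python
import asyncio

async def ingest_all(builder, paths):
    # Run the pipeline over each PDF sequentially and collect the results.
    results = []
    for path in paths:
        print(f"Processing : {path}")
        result = await builder.run_async(file_path=path)
        results.append(result)
    return results

# In a plain script:
# asyncio.run(ingest_all(kg_builder_pdf, FILE_PATHS))
```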

The script runs smoothly until it tries to ingest the first PDF file. After a few seconds I get the following validation error:

pydantic_core._pydantic_core.ValidationError: 1 validation error for Neo4jNode
embedding_properties.embedding.0
Input should be a valid number [type=float_type, input_value=[-0.021932466, 0.01252911...0.03033293, 0.014525634], input_type=list]
For further information visit https://errors.pydantic.dev/2.10/v/float_type
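From the error it looks like the embedding arrives as a nested list ([[...]] instead of [...]), so the first element of `embedding` is itself a list and Neo4jNode rejects it. As a workaround I'm experimenting with a small wrapper that flattens the vector before it reaches the pipeline (FlattenedEmbedder is my own name, not part of the package, and I haven't confirmed this is the root cause):

```python
class FlattenedEmbedder:
    """Delegates to an inner embedder and unwraps a nested embedding list.

    Hypothetical workaround: the validation error suggests the embedder
    returns [[floats...]] where the pipeline expects [floats...].
    """

    def __init__(self, inner):
        self.inner = inner

    def embed_query(self, text):
        vector = self.inner.embed_query(text)
        # Unwrap one level of nesting if the first element is itself a list.
        if vector and isinstance(vector[0], list):
            vector = vector[0]
        return [float(x) for x in vector]

# usage (untested):
# embedder = FlattenedEmbedder(OllamaEmbeddings(model="snowflake-arctic-embed2"))
```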

I'm working under Windows 11:

  • Neo4j version: 5.27.0
  • Neo4j Desktop version: 1.6.1

I'm still having problems with this script, so I ended up paying for OpenAI and trying the original code with OpenAILLM and OpenAIEmbeddings:

from neo4j_graphrag.llm import OpenAILLM
from neo4j_graphrag.embeddings.openai import OpenAIEmbeddings

llm = OpenAILLM(
    model_name='gpt-4o-mini',
    model_params={
        'response-format': {'type': 'json_object'},
        'temperature': 0
    }
)
emb = OpenAIEmbeddings()
...

But the error is different now:

.venv\Lib\site-packages\neo4j_graphrag\llm\openai_llm.py", line 138, in ainvoke
    response = await self.async_client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\bduran\Documents\Coding\KnowledgeGraphs\Neo4j\graphRAG\.venv\Lib\site-packages\openai\_utils\_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncCompletions.create() got an unexpected keyword argument 'response-format'
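Looking at this second TypeError again, I suspect the key in my model_params is simply misspelled: the OpenAI client expects snake_case keyword arguments, so it should presumably be 'response_format' rather than 'response-format'. Something like:

```python
# Corrected model_params: 'response_format' (with an underscore) is the
# keyword that chat.completions.create() accepts; 'response-format' is not.
model_params = {
    'response_format': {'type': 'json_object'},
    'temperature': 0,
}
```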


I'd appreciate any direction on how to implement a working GraphRAG script, especially one using OllamaLLM and OllamaEmbeddings.
Thanks in advance,
Boris