Neo4j GraphRag issue: LLM response error and timeouts when calling local LLM

I am using the neo4j_graphrag package for a graph RAG implementation (using SimpleKGPipeline) where the LLM and the embeddings model both run locally on the same machine as the graph RAG service. The LLM is a locally served instance of mistral-7B or granite-7B using llama-cpp-python. The work-in-progress code is here: Graph RAG recipe by srampal · Pull Request #801 · containers/ai-lab-recipes · GitHub; in particular, see the file recipes/natural_language_processing/graph_rag/app/manage_graphdb.py
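To sanity-check the endpoint independently of neo4j_graphrag, I hit it directly with a minimal stdlib-only probe along these lines (a sketch of my setup: the base URL is the one from my logs, and the model id "mistral-7b" is a placeholder, since llama.cpp's server largely ignores it):

```python
# Minimal stdlib probe of the local OpenAI-compatible endpoint,
# independent of neo4j_graphrag, to separate server issues from pipeline issues.
import json
import urllib.request

BASE_URL = "http://10.88.0.1:8001/v1"  # same base_url the pipeline uses


def build_probe_request(base_url: str, model: str) -> urllib.request.Request:
    """Build a one-token chat completion request so we can time the server directly."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_probe_request(BASE_URL, "mistral-7b")
    # A short timeout makes slow generation show up immediately instead of
    # surfacing much later as an httpcore.ReadTimeout inside the pipeline.
    with urllib.request.urlopen(req, timeout=60) as resp:
        print(resp.status, resp.read()[:200])
```

Single requests like this respond, which is why I believe the server setup itself is sound.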

When I call SimpleKGPipeline() to populate the database, using the local LLM to create the nodes and relationships, I get LLM response JSON errors like the one below. Processing a small text file (under 10 KB) takes a very long time (30-60 minutes) while attempting to create the knowledge graph, but the graph is never created: the run produces a number of LLM errors and API timeouts, and finally some exceptions during timeout handling. I am fairly confident the local LLM setup is correct and that it supports the OpenAI API, so I would appreciate some help in debugging what is going on.

Calling OpenAILLM with base_url http://10.88.0.1:8001/v1 model_name
Starting the KG insertion pipeline at time 2024-11-18 22:41:35.264092
/opt/app-root/lib64/python3.11/site-packages/neo4j_graphrag/experimental/components/entity_relation_extractor.py:428: UserWarning: No document metadata provided, the document node won't be created in the lexical graph
  warnings.warn(
LLM response is not valid JSON {"entities": { "0": {"label": "Executive 1", "type": "Person"} }, "relations": [ {"type": "SPOKE_AT", "start_node_id": "0", "end_node_id": null, "properties": {}}, {"type": "PARTICIPATED_IN", "start_node_id": "1", "end_node_id": "0", "properties": {}}, {"type": "DISCUSSED", "start_node_id": "0", "end_node_id": "2", "properties": {}} ], "potential_schema": ["Person", "Concept"] } for chunk_index=1
2024-11-18 23:11:40.461 Uncaught app exception
Traceback (most recent call last):
  File "/opt/app-root/lib64/python3.11/site-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
    yield
  File "/opt/app-root/lib64/python3.11/site-packages/httpx/_transports/default.py", line 377, in handle_async_request
    resp = await self._pool.handle_async_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
    raise exc from None
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
    response = await connection.handle_async_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/connection.py", line 103, in handle_async_request
    return await self._connection.handle_async_request(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/http11.py", line 136, in handle_async_request
    raise exc
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/http11.py", line 106, in handle_async_request
    ) = await self._receive_response_headers(**kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/http11.py", line 177, in _receive_response_headers
    event = await self._receive_event(timeout=timeout)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_async/http11.py", line 217, in _receive_event
    data = await self._network_stream.read(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_backends/anyio.py", line 32, in read
    with map_exceptions(exc_map):
  File "/usr/lib64/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/app-root/lib64/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ReadTimeout

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/app-root/lib64/python3.11/site-packages/openai/_base_client.py", line 1572, in _request
response = await self._client.send(
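One thing I noticed while digging into the "LLM response is not valid JSON" message: the payload in the log actually parses fine with json.loads, so the complaint is presumably about its shape rather than its syntax. "entities" comes back as an object keyed by index instead of a list, and the relations reference node ids that were never emitted as entities. A quick stdlib check (the response string below is copied from the error message above):

```python
import json

# Response copied verbatim (reindented) from the error message in my log
raw = """{"entities": { "0": {"label": "Executive 1", "type": "Person"} },
"relations": [
  {"type": "SPOKE_AT", "start_node_id": "0", "end_node_id": null, "properties": {}},
  {"type": "PARTICIPATED_IN", "start_node_id": "1", "end_node_id": "0", "properties": {}},
  {"type": "DISCUSSED", "start_node_id": "0", "end_node_id": "2", "properties": {}}
],
"potential_schema": ["Person", "Concept"]}"""

data = json.loads(raw)  # parses without error, so the JSON syntax itself is fine
print(type(data["entities"]).__name__)  # dict, not the list the extractor presumably expects

# The relations also point at node ids that were never emitted as entities
known_ids = set(data["entities"])  # {"0"}
dangling = [
    r for r in data["relations"]
    if r["start_node_id"] not in known_ids or r["end_node_id"] not in known_ids
]
print(len(dangling))  # 3: null, "1" and "2" are all unknown ids
```

So it looks like the 7B model is producing structurally wrong extraction output, and each failed/retried chunk then runs into the read timeout above. I'd still like to understand which of the two is the root cause here.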