Troubleshooting Neo4jVector and AzureOpenAIEmbeddings Integration

• I am taking the Retrievers course in the Neo4j Graph Academy.
• I created the following script but am encountering an error:

import os

from langchain_community.vectorstores.neo4j_vector import Neo4jVector
from langchain_openai import AzureOpenAIEmbeddings

embedding_provider = AzureOpenAIEmbeddings(
    openai_api_base=os.environ["OPENAI_API_ENDPOINT"],
    openai_api_version=os.environ["OPENAI_API_VERSION"],
    openai_api_key=os.environ["OPENAI_API_KEY"],
)

movie_plot_vector = Neo4jVector.from_existing_index(
    embedding_provider,
    url="bolt://localhost:7687", 
    username="neo4j",
    password="pleaseletmein",
    index_name="moviePlots",
    embedding_node_property="embedding",
    text_node_property="plot",
)

result = movie_plot_vector.similarity_search("A movie where aliens land and attack earth.")
print(result)

• The error I'm getting is:
BadRequestError: Error code: 400 - {'error': {'message': "'messages' is a required property", 'type': 'invalid_request_error', 'param': None, 'code': None}}

• Could the issue be that my Azure OpenAI deployment is gpt-3.5-turbo?

It looks like you are pointing to an LLM endpoint (gpt-3.5-turbo), which requires a messages attribute. Note that embedding models and LLMs are not the same thing; you should point to an embedding endpoint on Azure instead.

Here is a nice image by Leonie Monigatti that explains the difference.
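As a rough sketch of the change, the embeddings client could be pointed at an embedding deployment instead of the chat deployment. The deployment name used here (text-embedding-ada-002) and the assumption that OPENAI_API_ENDPOINT holds your Azure resource's base URL rather than a chat-completions URL are placeholders for your own setup:

import os
from langchain_openai import AzureOpenAIEmbeddings

embedding_provider = AzureOpenAIEmbeddings(
    # Base URL of the Azure OpenAI resource, e.g. https://<resource>.openai.azure.com/
    azure_endpoint=os.environ["OPENAI_API_ENDPOINT"],
    openai_api_version=os.environ["OPENAI_API_VERSION"],
    openai_api_key=os.environ["OPENAI_API_KEY"],
    # Deployment of an embedding model, not a chat model like gpt-3.5-turbo
    azure_deployment="text-embedding-ada-002",  # assumed deployment name
)

Note that the vector dimensions of the model must match the existing moviePlots index, so use the same embedding model that produced the stored embeddings.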


Thank you for the clear explanation.
I will try implementing it using an embedding model.