I am serving an embeddings model with Hugging Face's text-embeddings-inference. How do I set the OpenAI endpoint and model name for genai.vector.encode? I tried the following, but it didn't work:
OPENAI_API_KEY = 'empty'
OPENAI_ENDPOINT = 'http://127.0.0.1:8080/v1/embeddings'
kg.query("""
    MATCH (movie:Movie) WHERE movie.tagline IS NOT NULL
    WITH movie, genai.vector.encode(
        movie.tagline,
        "OpenAI",
        {
            token: $openAiApiKey,
            endpoint: $openAiEndpoint,
            model: $openAiModel
        }
    ) AS vector
    CALL db.create.setNodeVectorProperty(movie, "taglineEmbedding1", vector)
    """,
    params={
        "openAiApiKey": OPENAI_API_KEY,
        "openAiEndpoint": OPENAI_ENDPOINT,
        "openAiModel": "BAAI/bge-small-en-v1.5",
    },
)
It gives the following error:
ClientError: {code: Neo.ClientError.Procedure.ProcedureCallFailed} {message: Failed to invoke function `genai.vector.encode`: Caused by: org.neo4j.genai.util.GenAIProcedureException: Not authorized to make API request; check your credentials.}
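In case a smaller reproduction is useful, here is the call reduced to a single literal string, with the same provider and config map:

kg.query("""
    RETURN genai.vector.encode(
        "Your text string goes here",
        "OpenAI",
        {
            token: $openAiApiKey,
            endpoint: $openAiEndpoint,
            model: $openAiModel
        }
    ) AS vector
    """,
    params={
        "openAiApiKey": OPENAI_API_KEY,
        "openAiEndpoint": OPENAI_ENDPOINT,
        "openAiModel": "BAAI/bge-small-en-v1.5",
    },
)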
Note that the following curl command works:
curl -v -X POST http://127.0.0.1:8080/v1/embeddings -H "Content-Type: application/json" -d '{
"input": "Your text string goes here",
"model": "BAAI/bge-small-en-v1.5"
}'
so the endpoint itself is reachable and returns embeddings.
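As a client-side fallback I could compute the embeddings against the TEI server myself and write them back with a parameterized query. Below is a rough sketch, assuming the openai>=1.0 Python client and the same kg wrapper as above (the base_url and model name mirror the curl call), but I would prefer to understand what endpoint/model configuration genai.vector.encode actually expects:

from openai import OpenAI

# Point the OpenAI-compatible client at the local TEI server; the key is unused but required.
tei = OpenAI(base_url="http://127.0.0.1:8080/v1", api_key="empty")

rows = kg.query("""
    MATCH (movie:Movie) WHERE movie.tagline IS NOT NULL
    RETURN elementId(movie) AS id, movie.tagline AS tagline
    """)

for row in rows:
    # Same request as the curl above, just issued through the openai client.
    vector = tei.embeddings.create(
        model="BAAI/bge-small-en-v1.5",
        input=row["tagline"],
    ).data[0].embedding

    kg.query("""
        MATCH (movie:Movie) WHERE elementId(movie) = $id
        CALL db.create.setNodeVectorProperty(movie, "taglineEmbedding1", $vector)
        """,
        params={"id": row["id"], "vector": vector},
    )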