About the GenAI, LLM, Vector Search category

With the breakthrough of large language models, every developer can now add generative AI capabilities to their applications. Neo4j’s knowledge graph grounds LLM responses in validated facts using retrieval-augmented generation (RAG).
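As a rough sketch of what RAG over Neo4j can look like, the snippet below uses the LangChain community integrations to query an existing Neo4j vector index and pass the retrieved context to a locally served Ollama model. The connection details, index name, and model names are placeholders chosen for illustration, not part of any official configuration.

```python
# A minimal RAG sketch: retrieve context from a Neo4j vector index,
# then ask an Ollama-served LLM to answer grounded in that context.
# Connection details, index name, and model names are assumptions.
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Neo4jVector

embeddings = OllamaEmbeddings(model="llama2")          # embedding model (assumed)
vector_store = Neo4jVector.from_existing_index(
    embedding=embeddings,
    url="bolt://localhost:7687",                       # Neo4j connection (assumed local defaults)
    username="neo4j",
    password="password",
    index_name="document_index",                       # hypothetical index name
)

question = "How does the GenAI Stack use Neo4j?"
docs = vector_store.similarity_search(question, k=4)   # vector search for relevant chunks
context = "\n\n".join(doc.page_content for doc in docs)

llm = ChatOllama(model="llama2")                       # chat model served by Ollama
answer = llm.invoke(
    f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```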

Neo4j, Docker, LangChain, and Ollama have announced the GenAI Stack, designed to help developers build generative AI applications quickly and easily.

It’s pre-configured, ready to code, and secure, with large language models from Ollama, vector and graph databases from Neo4j, and the LangChain framework. The GenAI Stack, available now in the Learning Center in Docker Desktop and in the docker/genai-stack repository on GitHub, addresses popular generative AI use cases using trusted open source content on Docker Hub.
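Once the stack is running locally, a small script like the one below can confirm that its Neo4j and Ollama services are reachable; the endpoints and credentials shown are the usual local defaults and may differ from your own compose configuration.

```python
# Quick connectivity check for a locally running GenAI Stack.
# Endpoints and credentials are assumed local defaults, not required values.
import requests
from neo4j import GraphDatabase

# Neo4j: verify the Bolt endpoint accepts connections.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
driver.verify_connectivity()
print("Neo4j is reachable")
driver.close()

# Ollama: list the models currently pulled on the local server.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
print("Ollama models:", [m["name"] for m in resp.json().get("models", [])])
```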

To learn more: