Insights from 300 million rows of MongoDB data

Hi folks,
I have 300 million rows of MongoDB data.

I am looking for ways to extract insights from this structured data based on a user's natural-language question.

- Some user queries can be answered via a traditional DB query.
- Some can be answered via semantic search.
- And some can be answered via a graph DB query (rough examples of each type are sketched below).
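For concreteness, here is a hypothetical illustration of what the three query styles look like against the same imaginary "orders" data; all names (orders, Customer, PLACED, etc.) are made up for illustration, not from my actual schema:

```python
# Hypothetical illustration only: the collection/label/field names are made up.

# 1. Traditional DB query (MongoDB aggregation): exact filters and aggregates.
traditional = [
    {"$match": {"status": "delivered", "year": 2023}},
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
]

# 2. Semantic search: nearest neighbours to an embedded question.
semantic = {
    "query": "complaints about late delivery",  # gets embedded, then ANN search
    "top_k": 10,
}

# 3. Graph query (Cypher): multi-hop relationship traversal.
graph = """
MATCH (c:Customer)-[:PLACED]->(:Order)-[:CONTAINS]->(p:Product)
WHERE p.category = 'electronics'
RETURN c.name AS customer, count(*) AS orders
ORDER BY orders DESC LIMIT 10
"""
```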

My question is: what is the best practice for this?

Option-1:

1. Create a SEPARATE Neo4j graph DB and a vector DB for the same data
2. Use an LLM to decompose the user query into sub-queries, each typed as a vector or a graph query
3. Query both databases separately
4. Combine the query results with some smart logic
5. And finally use an LLM to generate a natural-language response (a minimal sketch of this pipeline follows the list)
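Here is a minimal Python sketch of that pipeline, assuming an OpenAI-compatible chat API and the official neo4j driver; the model name, router prompt, and the `text_to_cypher` / `vector_search` helpers are assumptions for illustration, not a reference implementation:

```python
import json

from neo4j import GraphDatabase
from openai import OpenAI

llm = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
neo4j_driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

ROUTER_PROMPT = (
    "Decompose the user question into sub-queries and return JSON: "
    '{"sub_queries": [{"type": "graph" | "vector", "text": "..."}]}'
)

def decompose(question: str) -> list[dict]:
    """Step 2: let the LLM split the question and tag each part by query type."""
    resp = llm.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": ROUTER_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return json.loads(resp.choices[0].message.content)["sub_queries"]

def text_to_cypher(text: str) -> str:
    """Hypothetical helper: e.g. another LLM call or templates that emit Cypher."""
    raise NotImplementedError

def run_graph_query(text: str) -> list[dict]:
    """Step 3a: translate a sub-query to Cypher and run it against Neo4j."""
    with neo4j_driver.session() as session:
        return [r.data() for r in session.run(text_to_cypher(text))]

def vector_search(text: str) -> list[dict]:
    """Step 3b: stub for whichever vector DB you pick (Qdrant, pgvector, ...)."""
    raise NotImplementedError

def answer(question: str) -> str:
    """Steps 2-5 end to end: route, query, combine, and verbalise."""
    parts = decompose(question)
    evidence = [
        run_graph_query(p["text"]) if p["type"] == "graph" else vector_search(p["text"])
        for p in parts
    ]
    resp = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": (
                f"Question: {question}\n"
                f"Evidence: {json.dumps(evidence, default=str)}\n"
                "Answer in plain language using only this evidence."
            ),
        }],
    )
    return resp.choices[0].message.content
```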

Option-2:

1. Use ONLY a Neo4j DB
2. Use an LLM to decompose the user query into sub-queries, each typed as a vector or a graph query
3. Query Neo4j for both types
4. Combine the query results with some smart logic
5. And finally use an LLM to generate a natural-language response (sketch of the query step below)
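For Option-2 the pipeline is the same except step 3, since Neo4j 5.11+ can also serve the semantic side through its native vector indexes. A sketch of that query step, with a made-up index name and node properties:

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# One-time setup (index/label/property names are made up):
# CREATE VECTOR INDEX doc_embeddings FOR (d:Document) ON (d.embedding)
# OPTIONS {indexConfig: {`vector.dimensions`: 1536,
#                        `vector.similarity_function`: 'cosine'}}

def graph_sub_query(cypher: str) -> list[dict]:
    """Plain graph traversal, exactly as in Option-1."""
    with driver.session() as session:
        return [r.data() for r in session.run(cypher)]

def vector_sub_query(embedding: list[float], k: int = 10) -> list[dict]:
    """Semantic search inside Neo4j: ANN over node embeddings."""
    cypher = """
    CALL db.index.vector.queryNodes('doc_embeddings', $k, $embedding)
    YIELD node, score
    RETURN node.text AS text, score
    """
    with driver.session() as session:
        return [r.data() for r in session.run(cypher, k=k, embedding=embedding)]
```

The trade-off as I see it: one store to load and keep in sync with Mongo instead of two, in exchange for depending on Neo4j's vector index rather than a dedicated vector DB.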

Thank you

Why not ask ChatGPT? ;)