Is Neo4j best for a Dictionary or Foreign Language App

I am building a foreign language app. I want to build a database which will store connections and relationships between foreign language words, audio, pictures, icons, and word meanings.

How do I know if Neo4j is the best option for this?
Is there a website where this question could be asked to get unbiased opinions on this?

Phillippe Talbot
Founder & CEO
Fonetic About Language
Language at the Speed of a Synaptic Spark

Yes, you can, as long as you have data that maps foreign-language words to their translations in the local language. For example:

MERGE (a:Italian {word: "xyz"})
MERGE (b:LocalLanguage {meaning: "pst"})
MERGE (a)-[:IN_LOCAL_LANGUAGE]->(b)
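
Once the nodes and relationship exist, you can look up a translation with a simple pattern match (same labels and relationship type as above):

MATCH (a:Italian {word: "xyz"})-[:IN_LOCAL_LANGUAGE]->(b:LocalLanguage)
RETURN a.word, b.meaning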

Hi @phillip.talbot

Languages are distributed differently in different cultures.
Right now, what I am most interested in is translation.
The same facts, expressed in words, cannot simply be translated on a one-to-one basis.
Using English, Japanese, and Klingon as examples, I am experimenting with translation in Neo4j.
I believe that a graph-based implementation would produce better translations.
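
As a rough sketch of that idea (the labels, words, and the context property here are just my own assumptions for illustration), one English word can be linked to several target-language candidates, with the relationship carrying the context in which each applies:

// Hypothetical model: one source word, several translations, context on the relationship
MERGE (e:English {word: "spring"})
MERGE (j1:Japanese {word: "haru"})
MERGE (j2:Japanese {word: "bane"})
MERGE (e)-[:TRANSLATES_TO {context: "season"}]->(j1)
MERGE (e)-[:TRANSLATES_TO {context: "mechanical"}]->(j2)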

There are not many examples of Neo4j in this field, but I think GraphAware's NLP and Hume can be helpful.

Interesting idea. You might be right, but I do think you also need to look at AI-based NLP.
The "graph" has got "gates". Neo4j has fast reads, but Elasticsearch might be better for the job. Neo4j lags behind on writes, and I think that's pretty serious; it's simply not good enough for common use. It's good enough for almost-static data, where everything has already been inserted and you have a day or two for it. So if you have a never-changing dictionary, go for it, but if you want your users to insert data themselves, it might be a big no.

Hi @fssrepository

I think Elasticsearch is a good product.
I've used it in combination with Neo4j.

Since Neo4j provides ACID guarantees, it is at a disadvantage in write speed compared with NoSQL databases that do not offer them.
However, I think ACID is necessary for safe writes.
I have tried Neo4j for IoT and network-device log collection.
There were tens of millions of records, with new data arriving every few milliseconds.
Still, it handled the load without any problems.
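
One thing that helps at that kind of rate is batching writes instead of sending one query per record. A minimal sketch (the :LogEvent label and properties are placeholders, and $events is a driver-side parameter holding a list of maps):

// Batched insert: one transaction per batch instead of one per event
UNWIND $events AS e
CREATE (:LogEvent {deviceId: e.deviceId, timestamp: e.timestamp, message: e.message})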

I think you can make it run faster if you tune your Cypher, indexes, and memory settings well.
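
For example, an index on the property you look words up by makes a big difference for MATCH and MERGE. The syntax below is for Neo4j 4.x, and the label and property names are just examples:

// Index the property used for word lookups
CREATE INDEX italian_word IF NOT EXISTS FOR (w:Italian) ON (w.word)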