Using sparse matrices in this platform

(Hphysik1987) #1

Hi, I am new here. I am not sure whether this is a good place to ask this or not.
I have thousands of matrices with different dimensions. All of them are square matrices. I want to convert them into graphs, or into embeddings that can be used as input to a neural network.
Could anyone please guide me on this? Thanks in advance.

(Jiropole) #2

Could you provide more details on how you intend to model the data for input to a NN?

I rarely say this, but this may be better served by a relational DB. Otherwise, you could choose to store a matrix as a property on a node (a 1-D array of primitives L) and interpret it by indexing it as an NxN matrix (where N = sqrt(size(L))). Or you could store a single column per node, then connect the series of columns via a -[:NEXT]-> relationship. Or you could fan out your matrix into NxN nodes with optimal traversal relationships.
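To illustrate the first option, here is a minimal Python/NumPy sketch of the flat-array round trip (the Cypher/driver side is omitted; `flat` stands for the 1-D array you would store as the node property):

```python
import numpy as np

# A small square matrix to store as a node property.
m = np.array([[1, 2], [3, 4]])

# Flatten row-major into a 1-D list of primitives (what the node would hold).
flat = m.flatten().tolist()

# Recover the NxN matrix: N = sqrt(size(L)).
n = int(np.sqrt(len(flat)))
restored = np.array(flat).reshape(n, n)

# Element (i, j) of the matrix lives at flat[i * n + j].
assert restored[1, 0] == flat[1 * n + 0]
```

The same indexing rule works on the database side: you can read a single element out of the stored array without reconstructing the whole matrix.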

But it really depends on how often you are updating versus reading the data, what metadata you might wish to store, and other considerations for your use case.

(Hphysik1987) #3

Hi Jiropole,

I am new to this topic. I am looking for the best way to use such matrices as an input at each node.
One way of thinking: when I calculate the eigenvalues of such a matrix, I get a vector of the same length as the number of rows or columns, since these are square matrices. So for a 5 by 5 matrix I would have 5 eigenvalues as 5 features, plus some additional features; this way I can match a single matrix to the other features. However, I might lose some information in this process.

I want to convert the matrices into vector form, so that I can use that vector as one feature alongside the other available features to train my NN.
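One common way to get a single fixed-length vector from square matrices of different sizes is to zero-pad each matrix to the largest size and flatten it. A minimal sketch (the `MAX_N` target is an assumption; pick it as your largest matrix dimension):

```python
import numpy as np

MAX_N = 4  # assumed padding target: the largest matrix size in your data


def to_vector(m, max_n=MAX_N):
    """Zero-pad a square matrix to max_n x max_n and flatten row-major."""
    n = m.shape[0]
    padded = np.zeros((max_n, max_n))
    padded[:n, :n] = m  # original matrix sits in the top-left corner
    return padded.ravel()


# A 2x2 matrix becomes a length-16 vector (4 * 4) with zeros elsewhere.
v = to_vector(np.ones((2, 2)))
```

The resulting vector can then be concatenated with the other per-matrix features before feeding the NN; the downside is that padding makes the input dimension grow quadratically with the largest matrix size.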

Could you please explain what a relational DB is?

Thanks again.

(Jiropole) #4

Ah, I understand, you wish to run the NN within the graph. In that case you've probably made the right choice, but to answer your question, a relational DB is your typical row/column database like PostgreSQL. I have not done this myself so I'll let those with experience offer you better info.

(Hphysik1987) #5

Yes, that's true and correctly summarizes my problem. Hoping to get some ideas here. Thanks again.

(Michael Hunger) #6

How dense or sparse are your matrices?
How many dimensions do you currently have?
What kind of values?

(Hphysik1987) #7

Hi Michael,

Most of the matrices are dense and some of them are sparse. The values mostly lie between 0 and 5 (all are integers). The matrices range in size from 2 by 2 up to about 200 by 200 at most, with others in between. For each of these matrices I also have some other features, which are integer or floating-point numbers. I have almost 16,000 such matrices.