Managing Table Data

We have several tables of static data used as a reference. It's nothing complicated.

To start, I went ahead and turned these into nodes:

Set row = 1, Set col = A, set value = 6.7
Set row = 1, Set col = B, set value = 8.2
Set row = 2, Set col = A, set value = 4.8

and so on.
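Written out as actual Cypher, that per-cell approach might look like this (a sketch; the `TableData` label is just a placeholder name):

```cypher
// One node per table cell: row number, column letter, and value
CREATE (:TableData {row: 1, col: 'A', value: 6.7})
CREATE (:TableData {row: 1, col: 'B', value: 8.2})
CREATE (:TableData {row: 2, col: 'A', value: 4.8})
```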

It's workable but looks clumsy. Are there better ways to handle this kind of thing? The largest table is about 200 entries; most are around 20.

Thanks

Hi @bill_dickenson

For now, I used LOAD CSV.
This isn't a good approach if the item names or counts are going to change, though,
since the number of columns is hard-coded.

LOAD CSV WITH HEADERS FROM 'file:///table.csv' AS line
CREATE (:TableData {
  row: linenumber() - 1,
  col: 'A',
  value: toFloat(line.A)
})
CREATE (:TableData {
  row: linenumber() - 1,
  col: 'B',
  value: toFloat(line.B)
})
CREATE (:TableData {
  row: linenumber() - 1,
  col: 'C',
  value: toFloat(line.C)
})

If you add a column like 'Row' to the header, the code would look like this:

Row, A, B, C
1, 6.7, 8.2, 2.4

LOAD CSV WITH HEADERS FROM 'file:///table.csv' AS line
CREATE (:TableData {
  row: toInteger(line.Row),
  col: 'A',
  value: toFloat(line.A)
})
CREATE (:TableData {
  row: toInteger(line.Row),
  col: 'B',
  value: toFloat(line.B)
})
CREATE (:TableData {
  row: toInteger(line.Row),
  col: 'C',
  value: toFloat(line.C)
})
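One way to avoid repeating a CREATE per column is to UNWIND over the column names and look each one up in the CSV line map dynamically — a sketch, assuming the same `table.csv` layout with a `Row` header column:

```cypher
// A single CREATE for all columns: iterate over the column names
// and use dynamic map access (line[col]) to fetch each value.
LOAD CSV WITH HEADERS FROM 'file:///table.csv' AS line
UNWIND ['A', 'B', 'C'] AS col
CREATE (:TableData {
  row: toInteger(line.Row),
  col: col,
  value: toFloat(line[col])
})
```

The column names still appear once in the list, but adding a column is a one-token change instead of a whole new CREATE block.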

Yeah, but there has to be a more elegant structure. It works, but dang is it clumsy.
Thanks though

What format is the table data in: RDB, Excel, CSV?

Right now it comes in as CSV, but I convert it to a series of scalar nodes. One of our tables has about 180 nodes (61×3), but most are fairly small.

Thanks

How about a single node per table, with properties A, B and C,
and each property (table column) storing the column data as an array?

I stored a 3-dimensional table of meteo data (precipitation) with dimensions month, longitude, latitude as a single array property with a simple user-defined function to access element (i,j,k).
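For that layout, the flat-index arithmetic might look like this in Cypher — a sketch only; the `Meteo` label, the `values` property, and the dimension sizes are hypothetical stand-ins:

```cypher
// 3D table (month, longitude, latitude) flattened row-major into
// a single array property. Element (i, j, k) sits at flat index
// (i * nLon + j) * nLat + k, with all indices 0-based.
WITH 2 AS i, 5 AS j, 7 AS k, 36 AS nLon, 18 AS nLat
MATCH (t:Meteo {name: 'precipitation'})
RETURN t.values[(i * nLon + j) * nLat + k]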

That's a good thought. It's probably my best choice so far. Columns are usually small enough, although I suppose I will need one Cypher query per column unless there is a way to index properties. That would have made this approach a big winner.

Or I may have to break down and get Neo4j to talk to an external data store. That's pesky but possible.

I'll try yours first and see if it works well enough. Thanks

What do you mean by "I will need one Cypher query per column"?
Can you give a usage example for your table?

I imagined a function like

return bd.get(table.A, 5)

to return the 5th element of column A in the selected table.
The Java function would be trivial.

Been looking for decent examples of a Java function like that for a while. If you have one you could share, I am all in. That particular function would be great.

World Bank! I did a presentation for them on emerging software quality standards. It was an interesting group, one of the better ones, and the questions were all pretty interesting. I wish they could see what we've done since then!

Thanks

Actually you don't even need a user-defined function.
You can access the 5th element of column A like this:

match (t:table {name: "table123"}) return t.A[4]

(In your example you index your rows starting at 1, but Neo4j/Java arrays are 0-based, so the index for row 5 is 4.)

BTW, I'm an independent consultant, I just work as a consultant for the World Bank (and other institutions).

Ahhhhh. No! I didn't know that at all. Damn, that might make it much more tolerable. Between you and Koji, that's probably good enough for now! Thank you