Handling huge tables - memory insufficient

Typically: "How do I... ", "How can I... " questions
Post Reply
Prometheus87
Posts: 14
Joined: 02 Oct 2013, 15:33

Handling huge tables - memory insufficient

Post by Prometheus87 »

Hello,

we created a simulation which generates huge tables during runtime.
To be more specific:
Table a is a table which holds about 40 numeric values.
Table b is a table which contains ~1 million elements of type table a.

Now we would like to expand it to >4 million elements, but we encounter "not enough memory" errors (16 GB RAM, Win10).

We found various approaches, such as LuaJIT, storing JSON-formatted files on the hard disk, or an SQL interface. All of them seem rather complex and would require a significant time investment.
Does anyone have a suggestion for a workaround, or maybe even some experience with this?


Thanks in advance!

fferri
Posts: 1216
Joined: 09 Sep 2013, 19:28

Re: Handling huge tables - memory insufficient

Post by fferri »

How about a C++ plugin with functions for reading/writing to your huge dataset?
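Such a plugin could keep the whole dataset in one contiguous native-memory block and only hand Lua a single record at a time. A minimal sketch of the C++ data store a plugin could wrap (the class and method names here are purely illustrative, not part of any CoppeliaSim API; it assumes fixed-size records, e.g. 40 floats per element):

```cpp
#include <vector>
#include <cstddef>

// Flat store of fixed-size records (e.g. 40 floats per element).
// 4 million records * 40 floats * 4 bytes is roughly 640 MB in one
// block, avoiding the per-table overhead of millions of Lua tables.
class RecordStore {
public:
    explicit RecordStore(std::size_t recordSize) : recordSize_(recordSize) {}

    // Append one record; returns its index.
    std::size_t append(const std::vector<float>& record) {
        data_.insert(data_.end(), record.begin(), record.end());
        return count() - 1;
    }

    // Copy record `index` out as a vector of recordSize floats.
    std::vector<float> get(std::size_t index) const {
        auto it = data_.begin() + index * recordSize_;
        return std::vector<float>(it, it + recordSize_);
    }

    std::size_t count() const { return data_.size() / recordSize_; }

private:
    std::size_t recordSize_;
    std::vector<float> data_; // one contiguous block, no per-record overhead
};
```

The plugin would then expose thin script functions (e.g. appendRecord/getRecord) that marshal a single record between this store and a Lua table.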

coppelia
Site Admin
Posts: 10361
Joined: 14 Dec 2012, 00:25

Re: Handling huge tables - memory insufficient

Post by coppelia »

Hello,

yes, the plugin approach sounds good. But there is probably a simpler option, depending on the type of data you are storing/manipulating.

e.g. instead of using tables, you could use packed values in a string, e.g.:

Code: Select all

    buffer=buffer..sim.packFloatTable(tableData)
And reading your buffer data, e.g.:

Code: Select all

    local tableSubData=sim.unpackFloatTable(buffer,startIndex,valuesCount)
The above could possibly increase speed, and probably also save a fair amount of memory.
I'd probably try to use a single string/buffer that contains all the data in a packed manner (concatenated records of 40 packed floats each). This way you'd be the most efficient in terms of memory usage.
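For example, with fixed-size records of 40 floats each, record i starts at float index (i-1)*40, so individual records can be read back directly (a small sketch; recordSize and the variable names are just illustrative):

Code: Select all

    local recordSize=40 -- floats per element (one "table a")
    local buffer=''

    -- append one element (a Lua table of 40 numbers):
    buffer=buffer..sim.packFloatTable(tableData)

    -- read back element i as a Lua table of 40 numbers:
    local record=sim.unpackFloatTable(buffer,(i-1)*recordSize,recordSize)

Note that growing a Lua string with .. in a loop re-copies the whole buffer each time; it is much faster to collect the packed chunks in a table and call table.concat once at the end.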

Cheers
