Hello,
we created a simulation which generates huge tables during runtime.
To be more specific:
Table a is a table which holds about 40 numeric values.
Table b is a table which contains ~1 million elements of type table a.
Now we would like to expand it to >4 million elements, but we encounter "not enough memory" errors (16 GB RAM, Win10).
We found various approaches, like LuaJIT, streaming to JSON-formatted files on the hard disk, or an SQL interface. All of them seem a bit complex and would require a significant time investment.
Does anyone have a suggestion for a workaround, or maybe even experience with this?
Thanks in advance!
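For scale, a rough back-of-the-envelope estimate (assuming 8-byte doubles; the per-table overhead figures are typical values, not measured on this simulation):

```lua
-- Raw payload if stored as one packed buffer of doubles:
local elements=4*1000*1000        -- target number of records
local valuesPerElement=40         -- numbers per record
local packedBytes=elements*valuesPerElement*8
print(packedBytes/2^30)           -- about 1.19 (GiB)
-- As nested Lua tables, each number costs roughly 16 bytes, and every
-- inner table adds several dozen bytes of bookkeeping on top, so the
-- same data can easily occupy many times that and exhaust 16 GB of RAM.
```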
Handling huge tables - memory insufficient
Re: Handling huge tables - memory insufficient
How about a C++ plugin with functions for reading/writing to your huge dataset?
Re: Handling huge tables - memory insufficient
Hello,
yes, the plugin approach sounds good. But there is probably a simpler solution, depending on the type of data you are storing/manipulating.
E.g. instead of using tables, you could use packed values in a string, e.g.:
buffer=buffer..sim.packFloatTable(tableData)
And for reading back your buffer data, e.g.:
local tableSubData=sim.unpackFloatTable(buffer,startIndex,valuesCount)
The above could also increase speed, and probably save a bit of storage.
I'd probably try to use a single string/buffer that contains all the data in a packed manner (concatenated packed 40-value float records). This way you'd be the most memory-efficient.
Cheers
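A minimal sketch of that fixed-stride packed buffer, assuming CoppeliaSim's `sim.packFloatTable`/`sim.unpackFloatTable` (4-byte floats, so 40 values = 160 bytes per record; the helper names are hypothetical):

```lua
local VALUES=40  -- numbers per record, as in the question

-- Collect packed chunks in a table and concatenate once at the end:
-- repeated buffer=buffer..chunk reallocates the whole string each time
-- and becomes very slow for millions of records.
local chunks={}
function appendRecord(record)        -- record: table of 40 numbers
    chunks[#chunks+1]=sim.packFloatTable(record)
end
function finalize()
    return table.concat(chunks)      -- one contiguous string buffer
end

-- Random access by 1-based record index; startIndex is counted in
-- float values (0-based), not bytes:
function readRecord(buffer,i)
    return sim.unpackFloatTable(buffer,(i-1)*VALUES,VALUES)
end
```

Note that packing as single-precision floats halves the payload (4 million × 40 × 4 bytes ≈ 640 MB) at the cost of some precision; if full double precision is needed, the analogous `sim.packDoubleTable`/`sim.unpackDoubleTable` calls exist as well.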