[mlpack] Out of memory error

Anisa Llaveshi llaveshianisa at gmail.com
Mon Aug 20 04:52:15 EDT 2018


Greetings,

I have recently started using mlpack for a C++ application and I came
across a problem that I haven't been able to solve. I am using Linear
Regression to learn the parameters of a linear model. My training data is a
vector of 1-dimensional points. It consists of a vector of type double
(64-bit). I initialize the data points matrix from a std::vector structure
(where I have the data) using this constructor: arma::mat(std::vector)
Depending on the datasize of the dataset that I use to create the Linear
Regression model I get this error:


> error: arma::memory::acquire(): out of memory
> terminate called after throwing an instance of 'std::bad_alloc'
>   what():  std::bad_alloc
>
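
For reference, here is a minimal sketch of what I am doing. The sizes and
values are placeholders standing in for my real data, and I am assuming
mlpack 3's LinearRegression interface (predictors matrix plus an
arma::rowvec of responses):

#include <mlpack/core.hpp>
#include <mlpack/methods/linear_regression/linear_regression.hpp>

#include <vector>

int main()
{
  // Placeholder sizes and values standing in for the real data.
  const size_t n = 100000;
  std::vector<double> points(n), targets(n);
  for (size_t i = 0; i < n; ++i)
  {
    points[i] = (double) i;
    targets[i] = 2.0 * i + 1.0;
  }

  // Armadillo's std::vector constructor yields an n x 1 column vector.
  arma::mat data(points);
  arma::rowvec responses(targets);

  // The out-of-memory error is thrown while this constructor trains
  // the model.
  mlpack::regression::LinearRegression lr(data, responses);

  return 0;
}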

I am running the application on a machine with 250 GB of memory. With 100k
points (100,000 doubles at 8 bytes each is under 1 MB of data) I observe
that ~28% of the memory is used while building the model. When I increase
this to 160k points, ~50% of the memory is used and the process is then
killed. When I increase it a bit more, the error above is thrown
immediately when trying to build the model.
I was wondering whether it is normal for the model to consume this much
memory for such a small amount of data, and if so, what one can do to use
a larger dataset.

Any help would be appreciated.

Best regards,
Anisa Llaveshi