In a C++ dynamic library I solve a least-squares problem using the Eigen library. The DLL is called from a Python application, where the problem configuration is set up. For small problems the code works properly and returns the correct solution. When the number of points increases, the library throws std::bad_alloc.
More precisely, the code that triggers the error, reduced to its essentials, is
try {
    matrixA = new Eigen::MatrixXd(sizeX, NvalidBtuple); // initialize A
    for (int i = 0; i < sizeX; ++i) {
        int secondIndex = 0;
        for (int k = 0; k < btermSize; ++k) {
            if (bterm[k] == 1) { // select btuples validated by the density exclusion
                // product of terms
                (*matrixA)(i, secondIndex) = 1.0;
                secondIndex += 1;
            }
        }
    }
} catch (std::bad_alloc& e) {
    errorString = "Error 3: bad allocation in computation of coefficients!";
    std::cout << errorString << " " << e.what() << std::endl;
    return;
} catch (...) {
    errorString = "Error 4: construction of matrix A failed! Unknown error.";
    std::cout << errorString << std::endl;
    return;
}
where matrixA is declared in the header file as Eigen::MatrixXd *matrixA;.
If sizeX and NvalidBtuple stay below roughly 20'000 x 3'000, the matrix allocation works. Beyond that size, it crashes.
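For scale, a dense Eigen::MatrixXd stores its coefficients in one contiguous heap block of rows * cols * sizeof(double) bytes, so the threshold above corresponds to roughly 460 MiB in a single allocation. A minimal sketch of that footprint calculation (sizes taken from the numbers above):

#include <cstddef>
#include <iostream>

int main() {
    // One contiguous block of doubles, as Eigen::MatrixXd allocates it.
    std::size_t rows = 20000, cols = 3000;
    std::size_t bytes = rows * cols * sizeof(double);
    std::cout << bytes / (1024.0 * 1024.0) << " MiB" << std::endl; // ~457.8 MiB
    return 0;
}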
The computer on which I ran the tests has plenty of memory available, about 15 GB free.
Is this a heap/stack problem? How can I make the library accept bigger matrices?
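To check whether the limit comes from Eigen or from the allocator itself, I could run a standalone test that requests the same amount of memory with plain new (a hypothetical test program, not part of the library; the element count matches matrixA above):

#include <cstddef>
#include <iostream>
#include <new>

int main() {
    const std::size_t n = std::size_t(20000) * 3000; // same element count as matrixA
    try {
        double* p = new double[n]; // one contiguous block, like Eigen's storage
        std::cout << "raw allocation of " << n * sizeof(double) << " bytes succeeded" << std::endl;
        delete[] p;
    } catch (const std::bad_alloc& e) {
        std::cout << "raw allocation failed as well: " << e.what() << std::endl;
    }
    return 0;
}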
Any comment is welcome. Thanks.
Edit:
As remarked in an answer below, I was not clear about the definition of NvalidBtuple:
NvalidBtuple = 0;
for (int i = 0; i < btermSize; ++i) { NvalidBtuple += bterm[i]; }
where bterm is a boolean vector. Thus, since the loop checks if (bterm[k] == 1) before writing, secondIndex is always smaller than NvalidBtuple.
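To make that invariant explicit, the fill loop could carry an assert (a sketch reusing the names from the question; bterm is assumed to be a std::vector<bool> here):

#include <cassert>
#include <vector>
#include <Eigen/Dense>

void fillMatrixA(Eigen::MatrixXd& A, const std::vector<bool>& bterm,
                 int sizeX, int NvalidBtuple) {
    for (int i = 0; i < sizeX; ++i) {
        int secondIndex = 0;
        for (std::size_t k = 0; k < bterm.size(); ++k) {
            if (bterm[k]) {
                assert(secondIndex < NvalidBtuple); // holds by construction of NvalidBtuple
                A(i, secondIndex) = 1.0;
                secondIndex += 1;
            }
        }
    }
}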