Update: I found the cause of my problem. I need to classify among 25,000 different classes, and this results in high bucketIdx values that in turn make everything crash.
-- My bucketIdx values had a large offset, so adjusting them to fall only between 1 and 25,000 fixed it.
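For reference, the adjustment amounts to remapping each raw bucket index to a dense 0-based index before handing it to the classifier. Below is only an illustrative sketch of that remapping; the function name, the `UInt` alias, and the use of a `std::map` are my own choices and not part of nupic.core:

```cpp
#include <cstdint>
#include <map>

using UInt = std::uint32_t;

// Map an arbitrary (possibly large or offset) raw bucket index to a dense
// index in [0, numberOfDistinctBucketsSeenSoFar), so the classifier never
// receives a bucketIdx far beyond the number of actual classes.
UInt denseBucketIdx(UInt rawBucketIdx, std::map<UInt, UInt>& mapping)
{
  auto it = mapping.find(rawBucketIdx);
  if (it == mapping.end())
  {
    // First time this raw index appears: assign the next free dense index.
    const UInt dense = static_cast<UInt>(mapping.size());
    mapping[rawBucketIdx] = dense;
    return dense;
  }
  return it->second;
}
```

The dense index would then be passed as the bucketIdx argument of compute()/fastCompute() instead of the raw value.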
The segmentation fault happens every time the classifier's compute() function writes more than twice to the same ClassifierResult instance.
For example, as in the unit test, adding two more compute() calls on the same result causes not only some assertion failures, but also:
```cpp
c.fastCompute(0, input1, 4, 34.7, false, true, true, &result1);
c.fastCompute(1, input1, 4, 34.7, false, true, true, &result1);
c.fastCompute(2, input1, 4, 34.7, false, true, true, &result1);
```
```
*** Error in `./unit_tests': free(): invalid next size (fast): 0x00007f5e58003290 ***
Aborted (core dumped)
```
If this is run outside gtest, then the error is just a simple "segmentation fault".
The way I got around this is by destroying the result object and creating a new one before every compute(); however, this defeats the whole purpose of having an associative array inside the result.
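Concretely, the workaround looks like the fragment below. This is a minimal sketch that assumes the same classifier `c`, input vector `input1`, and argument values as in the snippet above, and that `ClassifierResult` can be default-constructed as in the unit tests:

```cpp
// Workaround sketch: give every fastCompute() call its own fresh
// ClassifierResult instead of reusing result1 across calls.
for (UInt recordNum = 0; recordNum < 3; recordNum++)
{
  ClassifierResult result;  // destroyed at the end of each iteration
  c.fastCompute(recordNum, input1, 4, 34.7, false, true, true, &result);
  // ... read predictions out of `result` here, before it goes out of scope
}
```

Because each `result` is thrown away at the end of its iteration, nothing accumulates across calls, which is exactly why this defeats the purpose of the associative array inside the result.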
See the full discussion of this issue on the Discourse forum.