SDRClassifier compute causes segmentation fault #1134

Open
ywcui1990 opened this issue Oct 20, 2016 · 5 comments

@ywcui1990
Contributor

The segmentation fault happens every time the classifier's compute() function writes to the same ClassifierResult instance more than twice.

For example, as in the unit test, adding two more compute() calls on the same result causes not only some assertion errors, but also:

```cpp
c.fastCompute(0, input1, 4, 34.7, false, true, true, &result1);
c.fastCompute(1, input1, 4, 34.7, false, true, true, &result1);
c.fastCompute(2, input1, 4, 34.7, false, true, true, &result1);
```

```
*** Error in `./unit_tests': free(): invalid next size (fast): 0x00007f5e58003290 ***
Aborted (core dumped)
```

If this is run outside gtest, the error is just a plain "segmentation fault".
The way I got around this is by destroying the result object and creating a new one before every compute(); however, this defeats the whole purpose of having an associative array inside the result.
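A minimal sketch of that workaround, reusing the fastCompute() signature and variable names from the reproduction above; the loop bounds and the nupic.core types are assumptions, not part of the report:

```cpp
// Workaround sketch: give fastCompute() a fresh ClassifierResult each time,
// so it never writes twice into the same instance.
for (UInt recordNum = 0; recordNum < 3; recordNum++) {
  ClassifierResult result;  // fresh result object per call
  c.fastCompute(recordNum, input1, 4, 34.7, false, true, true, &result);
  // read whatever is needed out of `result` before it goes out of scope
}
```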

See the full discussion of this issue on the Discourse forum.

@dorinclisu

Also, this happens with both the CLAClassifier and the SDRClassifier, because they write to ClassifierResult in the same way.

@rhyolight
Member

@natoromano Are you interested in working on this?

@RuneKR

RuneKR commented May 2, 2017

@rhyolight is this something that is currently being worked on?

@rhyolight
Member

@Medsolve I don't think anyone is working on this right now.

@RuneKR

RuneKR commented May 2, 2017

I found the cause of my problem. I need to classify among 25,000 different classes, and this results in high bucketIdx values that in turn make everything crash.

-- My bucketIdx values had a high offset, so adjusting them to fall between 1 and 25,000 worked.
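A sketch of that adjustment, assuming the raw bucket indices carry a large constant offset; the helper name and the minRawBucketIdx parameter are illustrative and not part of the SDRClassifier API:

```cpp
// Illustrative helper: shift raw bucket indices into a compact range before
// passing them to the classifier, instead of keeping a large constant offset.
UInt remapBucketIdx(UInt rawBucketIdx, UInt minRawBucketIdx) {
  // e.g. maps [minRawBucketIdx, minRawBucketIdx + 25000) onto [0, 25000)
  return rawBucketIdx - minRawBucketIdx;
}
```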
