Memory leak using RandomForestClassifier and PCA #1881
Comments
Hi @cannolis, thank you for the report!
Here are the details about my environment: Python version 3.9.19. Thank you for looking into this issue; I appreciate your help and support. If you need any further information, please let me know.
Hi @cannolis, thank you for raising the issue. Can you please provide a reproducer for your specific case? My initial investigation based on your description doesn't show anything noticeable.
Hi @md-shafiul-alam, I had the same problem in my environment (Python version 3.8.19). Here is a code example with which you can reproduce the problem. Running this script for a few minutes, my memory usage goes up from 700 MB to 1 GB and keeps increasing. Reproducing the problem may take a long time, but I'm sure it exists, as my training process has been interrupted several times by out-of-memory errors. Once I remove sklearnex, it works fine.
Dataset used in the code is here.
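Since the numbers above (700 MB growing to 1 GB) describe whole-process memory, and sklearnex delegates work to native oneDAL code, a leak may never appear on the Python heap at all; watching the process resident set size (RSS) catches native growth too. A minimal, Linux-only sketch using only the standard library (the training call it would wrap is hypothetical, not from the original report):

```python
import os


def rss_kib():
    """Return this process's resident set size in KiB (Linux only).

    Reads the VmRSS field from /proc/self/status, which includes
    native (non-Python-heap) allocations such as oneDAL buffers.
    """
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                # Line looks like: "VmRSS:     12345 kB"
                return int(line.split()[1])
    raise RuntimeError("VmRSS not found in /proc/self/status")


if __name__ == "__main__":
    # Hypothetical usage around a training loop: log RSS each iteration
    # and look for monotonic growth, e.g.
    #   for i in range(100):
    #       model.fit(X, y)          # sklearnex-patched estimator
    #       print(i, rss_kib(), "KiB")
    print("current RSS:", rss_kib(), "KiB")
```

A steadily climbing RSS across iterations on fixed-size input is the signature the commenter describes; a flat RSS after the first few iterations would suggest no native leak.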
Describe the bug
I am encountering a persistent memory leak when using RandomForestClassifier and PCA from the sklearnex library. With each iteration of my loop, the memory usage increases by approximately 20MB, which significantly impacts performance during large-scale data processing.
To Reproduce
Steps to reproduce the behavior:
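The concrete reproduction steps were not included in the report. A stdlib-only harness like the following can quantify per-iteration Python-heap growth around any training step; note that `tracemalloc` sees only Python-level allocations, so a leak inside sklearnex's native oneDAL code would need process-level (RSS) monitoring instead. The `step` callable shown in the comments is a hypothetical stand-in for the reporter's RandomForestClassifier/PCA loop:

```python
import gc
import tracemalloc


def measure_growth(step, iterations=5):
    """Run `step` repeatedly; return Python-heap growth (bytes) vs. baseline."""
    tracemalloc.start()
    gc.collect()
    baseline, _ = tracemalloc.get_traced_memory()
    sizes = []
    for _ in range(iterations):
        step()
        gc.collect()  # rule out garbage awaiting collection
        current, _ = tracemalloc.get_traced_memory()
        sizes.append(current - baseline)
    tracemalloc.stop()
    return sizes


if __name__ == "__main__":
    # In the real reproducer, `step` would fit a sklearnex-patched
    # estimator on the same data every time, e.g. (hypothetical):
    #   from sklearnex import patch_sklearn; patch_sklearn()
    #   from sklearn.ensemble import RandomForestClassifier
    #   step = lambda: RandomForestClassifier(n_estimators=10).fit(X, y)
    leak = []
    def leaky_step():
        leak.append(bytearray(1024))  # simulates an allocation never released
    print(measure_growth(leaky_step))
```

For a genuine leak the reported sizes keep growing with each iteration; for well-behaved code they plateau after the first iteration's caches warm up.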
Expected behavior
I expect the memory usage to remain stable or return to the baseline after each iteration, ensuring efficient performance during large-scale data processing.
Environment:
• OS: Windows 10
• IDE: PyCharm
• Version: 2024.1.2 Professional Edition