[BUG] Memory usage keeps increasing when switching models #233
Hmm. I can recreate this with both WaveNet and LSTM models. It does seem that the destructor is getting called. My understanding was also that the Eigen vectors & matrices used would automatically release their memory, so I don't think it's that.

I've let the plugin run for a while (about an hour or so) on my MacBook, and I've seen the memory usage top out around 23% (of 16 GB...yikes!). But, after unloading the model and letting it sit for a while longer, usage is down to 9.3%.

[Disclaimer: I'm basically unaware of good memory-debugging tools for Windows & Mac. I love Valgrind, but I haven't personally run this project on Linux...] The closest thing this looks like is memory being reclaimed lazily: something is letting the program hold onto memory even though the rest of the system doesn't seem to need it. Ugh, it feels awful poking around in the dark on this without good tools.

I'm going to add the "help wanted" label to this and would dearly appreciate any references to tools that can make this more straightforward to debug :/ Next thing I'm going to do is grab another iPlug2 plugin just to try and make sure the problem is on NAM's end.
Seems like memory usage goes up while the NAM model is loaded. Might look inside for hints...
I've always been uncomfortable about the dynamic resizing of vectors/matrices in the NAM code. Mostly because it is happening on the audio thread, but also because I don't know how Eigen memory management works. Eigen also has the potential to allocate temporary memory during matrix operations.
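Eigen does ship a debug hook for exactly this worry: if the code is built with `EIGEN_RUNTIME_NO_MALLOC` defined, heap allocation can be forbidden around the audio callback, and any hidden temporary trips an assert. A minimal sketch (the `processBlock` function and its arguments are hypothetical, not NAM's actual code):

```cpp
// Sketch only: EIGEN_RUNTIME_NO_MALLOC must be defined before any
// Eigen header is included (typically as a flag in debug builds).
#define EIGEN_RUNTIME_NO_MALLOC
#include <Eigen/Dense>

// Hypothetical processing function standing in for a NAM layer.
// `output` is assumed to be preallocated to the right size.
void processBlock(const Eigen::MatrixXf& weights, const Eigen::VectorXf& input,
                  Eigen::VectorXf& output)
{
  // With EIGEN_RUNTIME_NO_MALLOC defined, Eigen asserts if it
  // heap-allocates while this flag is false, so hidden temporaries
  // fail loudly in a debug build instead of churning the allocator.
  Eigen::internal::set_is_malloc_allowed(false);

  // .noalias() tells Eigen the result doesn't alias the operands,
  // letting it skip the temporary it would otherwise allocate.
  output.noalias() = weights * input;

  Eigen::internal::set_is_malloc_allowed(true);
}
```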
Btw, this PR for the Core code makes it a lot easier to test stuff like this: sdatkinson/NeuralAmpModelerCore#34. I've been using it for benchmarking changes. I just did a bit of testing, and I'm not seeing any memory leaking when running a model for an extended period of time.
Repeatedly loading models, however, leaks like a sieve - even using std::move, which should destroy the previous model.
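For what it's worth, move-assigning into a std::unique_ptr really does run the old object's destructor on the spot, so if memory still grows on reload, the destructor itself must not be releasing everything it owns. A self-contained sketch with a hypothetical Model type (not NAM's actual class):

```cpp
#include <cstddef>
#include <cstdio>
#include <memory>
#include <vector>

// Hypothetical stand-in for a loaded model; not NAM's actual class.
struct Model
{
  std::vector<float> weights;
  explicit Model(std::size_t n) : weights(n) {}
  ~Model() { std::puts("Model destroyed"); }
};

int main()
{
  auto model = std::make_unique<Model>(1 << 20); // ~4 MB of weights
  // Move-assignment resets the pointer, destroying the old Model here:
  model = std::make_unique<Model>(1 << 20); // prints "Model destroyed"
  // So a leak on reload means the destructor (or something a member
  // owns outside RAII) isn't actually returning the memory.
  return 0;
}
```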
You can use the macOS Instruments/Leaks profiling tool.
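If Instruments feels heavyweight, the same checker is also exposed through the command-line `leaks` tool that ships with macOS; if I remember the flags right, something like `leaks --atExit -- /path/to/standalone` (path hypothetical) runs the app and reports any allocations with no remaining references once it exits.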
FYI: the slightly obscene baseline memory usage when the UI is opened on Windows would be fixed by #232.
From testing, and from asking others, some observations on my end:
@DomMcsweeney can't confirm your second point.
@fichl thanks for replying. I just checked now with both 0.7.2 and 0.7.3, and it seems you are correct: it is not happening on my end now either... but yesterday it certainly was, as I tried it several times. So that is at least a positive, but a little bit confusing at the same time.
Just went back and confirmed that this happens waaay back on b54c7f7 when loading new models. Also replaced the
However, there is one hint in that it's at least stable in memory usage (while not changing models), whereas the most recent version continuously accumulates as you play 😢 So things have been getting worse 😅
Running with leaks, only opening and closing the standalone app (without loading any models): it looks like there are leaks in that alone; full output to terminal below. Will try loading a model next.
When testing directly (i.e., not using the plugin code) I didn't see any continuous leakage when running models - that is likely something in the plugin code. The leak when loading new models looks to be a core code issue.
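A repro for that can stay tiny; here's a sketch of the kind of loop to run against the core library alone, assuming the get_dsp factory from NeuralAmpModelerCore (the header, namespace, and exact signature vary between versions):

```cpp
// Leak-repro sketch against the core library only (no plugin code).
// Assumes NeuralAmpModelerCore's get_dsp() factory returning a
// std::unique_ptr<DSP>; header/namespace/signature vary by version.
#include "NAM/dsp.h"

#include <memory>
#include <string>

int main(int argc, char* argv[])
{
  if (argc < 2)
    return 1;
  const std::string modelPath = argv[1]; // path to a .nam model file

  for (int i = 0; i < 1000; ++i)
  {
    // Each unique_ptr goes out of scope at the end of the iteration,
    // so resident memory should stay flat; if it climbs steadily in
    // Instruments (or Activity Monitor), loading itself is leaking.
    std::unique_ptr<DSP> model = get_dsp(modelPath);
  }
  return 0;
}
```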
Second experiment:
Terminal output: output.txt
Notes:
Summary of Eigen leaks:
Holy smokes... the destructors for ... Fix coming right up! 😅
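For anyone following along: the usual way C++ destructors silently fail to run is deleting a derived object through a base pointer whose destructor isn't virtual. An illustrative sketch of that failure mode (hypothetical Layer/Conv1x1 types, not necessarily the exact NAM bug):

```cpp
#include <memory>
#include <vector>

struct Layer
{
  ~Layer() {} // non-virtual: derived destructors never run via Layer*
};

struct Conv1x1 : Layer
{
  std::vector<float> weights = std::vector<float>(1 << 20); // ~4 MB
  // ~Conv1x1() would free `weights`, but it is never called below.
};

int main()
{
  std::unique_ptr<Layer> layer = std::make_unique<Conv1x1>();
  // Formally undefined behavior; in practice only ~Layer() runs,
  // so the vector's heap buffer is leaked on every reload cycle.
  layer.reset();
  return 0;
}
```

Marking the base destructor virtual (or holding the concrete type in the smart pointer) makes the reload cycle free everything.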
Neural Amp Modeler's memory usage keeps increasing when switching from model to model (or even reloading the same model).
Steps to reproduce the behaviour:
Expected behaviour:
Desktop:
Regards,
Shameless Plugs