I am running into a memory leak when I loop through a set of netCDF files, read their contents, and plot them, but only when I run this loop in a Jupyter notebook. The same memory leak happens whether I use xarray, netCDF4-python, or h5py as the underlying library to read the files. I have tried adding explicit `del` and `gc.collect()` calls in the loop but still see the same amount of memory growth. I also put the loop inside a function, so all variables should be garbage-collected and none of them have global scope. Does anyone have any idea what might be going on under the hood in Jupyter that could be causing memory retention or a leak that is invisible to the Python garbage collector?
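For reference, here is a stdlib-only sketch of the pattern I'm describing. The file names are placeholders and the `bytearray` stands in for the arrays a reader like xarray/netCDF4/h5py would load (in the real loop, matplotlib does the plotting); the point is the structure: loop inside a function, explicit `del`, explicit `gc.collect()`:

```python
import gc
import tracemalloc

def process_files(paths):
    """Loop over files, read, plot, and explicitly free -- the pattern I use."""
    for path in paths:
        # Stand-in for reading one netCDF file's variables into memory.
        data = bytearray(10_000_000)
        # ... plotting of `data` would happen here ...
        del data      # explicit delete, as described above
        gc.collect()  # force a collection pass each iteration

tracemalloc.start()
process_files(["file_a.nc", "file_b.nc", "file_c.nc"])
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
# Run as a plain script, `current` drops back near zero after the function
# returns -- but the same structure keeps growing in a notebook.
print(current, peak)
```

Running this as a plain script shows the memory being released; it is only under Jupyter that the real version of this loop retains memory.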