I am running into a memory leak when I loop through a set of netCDF files, read their contents, and plot them, but only when I run this loop in a Jupyter notebook. The same leak appears whether I use xarray, netCDF4-python, or h5py as the underlying library to read the files. I have tried adding explicit `del` statements and `gc.collect()` calls inside the loop, but I still see the same amount of memory growth. I also wrapped the loop in a function, so all variables go out of scope and should be garbage collected rather than persisting at global scope. Does anyone have any idea what might be going on under the hood in Jupyter that could cause memory retention invisible to the Python garbage collector?
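For reference, here is a minimal sketch of the kind of loop I mean, assuming xarray for reading and matplotlib for plotting; the glob pattern `data/*.nc`, the variable name `temperature`, and the function name `plot_all` are placeholders, not my actual code:

```python
import gc
import glob

import matplotlib.pyplot as plt
import xarray as xr

def plot_all(pattern="data/*.nc"):
    for path in sorted(glob.glob(pattern)):
        ds = xr.open_dataset(path)   # same growth with netCDF4-python or h5py
        ds["temperature"].plot()     # placeholder variable name
        plt.close("all")             # close figures so pyplot releases them
        ds.close()                   # release the underlying file handle
        del ds
        gc.collect()                 # explicit collection makes no difference

plot_all()
```

Running this as a plain script shows flat memory use; running the same function in a notebook cell shows memory growing with each file.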