Jupyter notebook taking forever to load


I have been using Jupyter Notebook to run some Python code on a large data set.

It is possible I created an infinite loop. The notebook size has expanded from 0.5 MB to 65 MB.

I get a blank page with the spinning loading wheel, and the tab is largely unresponsive when I try to shut it down.

I left it open overnight and it more or less loaded, but as soon as I started to scroll through, it went into loading mode again.

I am on a tight deadline and have already lost three nights trying to figure it out.

I wonder if I could stop it from rendering the output? nbstripout?
I have installed it, but I am not sure how to run it on my file, or whether it would solve the problem.

Thanks for any help you can give me!

I don’t use nbstripout, but notebooks are just JSON (and there’s a schema), so you can DIY whatever you need with the stdlib.

So in a second notebook, e.g. cleaning.ipynb:

import json
from pathlib import Path

raw = json.loads(Path("notebook.ipynb").read_text())

# Drop every code cell's outputs and reset its execution count
for cell in raw["cells"]:
    if "outputs" in cell:
        cell["outputs"] = []
    if "execution_count" in cell:
        cell["execution_count"] = None

# Write the stripped copy to a new file so the original is untouched
Path("notebook.cleaned.ipynb").write_text(json.dumps(raw))


Then, once you open and save the new notebook.cleaned.ipynb, the server will take care of fixing up the file on disk to be proper nbformat.
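If it helps to see which cells are responsible for the bloat before stripping, here is a minimal sketch in the same DIY spirit (the `output_sizes` helper is my own, not part of any library):

```python
import json
from pathlib import Path


def output_sizes(nb):
    """Return (cell_index, serialized_output_bytes) pairs, largest first."""
    sizes = []
    for i, cell in enumerate(nb.get("cells", [])):
        outs = cell.get("outputs", [])
        sizes.append((i, len(json.dumps(outs))))
    return sorted(sizes, key=lambda t: t[1], reverse=True)


# Usage, assuming the notebook is named notebook.ipynb:
# nb = json.loads(Path("notebook.ipynb").read_text())
# for idx, nbytes in output_sizes(nb)[:5]:
#     print(f"cell {idx}: {nbytes} bytes of output")
```

A cell with tens of megabytes of output (e.g. a print inside a runaway loop) will show up at the top of that list, which tells you which cell to rewrite before re-running.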