Speed up notebook saving/loading when working on remote servers (compression?)

Hello,
I am not 100% sure if this is the right place to post this or if an issue in one of the Jupyter repos would be better.

Anyway, we have JupyterLab running on a remote server and the users connect to it from their local machines.
However, with larger notebooks, saving (and also loading) becomes really slow.
The main problem is saving, since it of course happens far more often.
For example, notebooks with a size of 20 MB (lots of plots) take around 3 minutes to save on average.

Now, if I understand everything correctly, saving transfers the JSON representation (accessible through the contents API) to the server. Is there any way of compressing this? There is also a related issue, #5760.
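
Just to illustrate how much there might be to gain, here is a quick local check of how compressible the notebook JSON is (the filename is a placeholder for one of our large notebooks):

```python
# Rough check of how much the contents API payload could shrink with gzip.
# "big_notebook.ipynb" is a placeholder -- substitute one of the large notebooks.
import gzip
import pathlib

raw = pathlib.Path("big_notebook.ipynb").read_bytes()  # the JSON that gets sent on save
compressed = gzip.compress(raw, compresslevel=6)

print(f"raw:     {len(raw) / 1e6:.1f} MB")
print(f"gzipped: {len(compressed) / 1e6:.1f} MB")
print(f"ratio:   {len(raw) / len(compressed):.1f}x smaller")
```

Since the JSON is plain text (with plots embedded as base64), I would expect at least some reduction from this alone.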

Initially, I suspected that --NotebookApp.websocket_compression_options does exactly this. However, the option has no influence here: the save requests in the network monitor are still plain JSON. It does change the websocket response headers, adding Sec-WebSocket-Extensions: permessage-deflate.
Reading issue #2490, after which the default compression was disabled, it sounds like the old behavior did exactly what I am looking for. I also tested the option on my local Windows and Linux machines.
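
For reference, the config-file equivalent of the flag I tried (my understanding from the docs is that an empty dict enables Tornado's default permessage-deflate settings, while None, the default, disables compression):

```python
# jupyter_notebook_config.py
# An empty dict should enable Tornado's default websocket compression;
# None (the default) disables it. As far as I can tell, this only affects
# kernel websocket traffic, not the contents API save requests.
c.NotebookApp.websocket_compression_options = {}
```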

Especially with auto-saving and bad network connections, this becomes really frustrating for some users, some of whom work with even larger files.

I would be more than happy for any hints on how to speed this up, be it compression, chunked processing, or anything else.
