We’ve noticed that even basic notebooks are slow to download as PDF in our JupyterHub installation (deployed with z2jh, if that matters).
For example, a notebook with a single cell that just runs !pip freeze can take upwards of 40 seconds before the PDF download prompt appears.
Downloading the same notebook as Markdown or reStructuredText is a lot faster, so I’m guessing the slowness comes from the LaTeX conversion.
I noticed this thread [1], which is intriguing if we can get the same basic functionality without LaTeX, but I’d need to test it out in our hub deployment.
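In case it helps others reading this: if that thread is about nbconvert’s browser-based webpdf exporter (shipped since nbconvert 6), it renders the notebook’s HTML in headless Chromium instead of running LaTeX at all, which could sidestep the slow TeX step entirely. A rough sketch of what I’d try (notebook.ipynb is just a placeholder filename):

```shell
# Sketch: nbconvert >= 6 has a "webpdf" exporter that renders the notebook
# via headless Chromium instead of LaTeX. notebook.ipynb is a placeholder;
# the guard makes this a no-op where jupyter or the file isn't present.
if command -v jupyter >/dev/null && [ -f notebook.ipynb ]; then
  jupyter nbconvert --to webpdf notebook.ipynb
fi
```

Note the exporter needs its extra dependencies installed (pip install "nbconvert[webpdf]") plus a Chromium download, so those would have to be baked into the image.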
I also see that nbconvert has a lot of configuration options, but I’m not sure whether any of them could help tune performance or profile the conversion — does anyone have thoughts or experience there? I’m thinking of setting the log level to DEBUG to see where the time is spent, or maybe attaching py-spy.
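Concretely, what I was planning to try is something like the following (a sketch only — notebook.ipynb is a placeholder filename, and both commands assume a shell inside the single-user server so hub/proxy overhead is excluded):

```shell
# Time one conversion with debug logging to see which stage dominates,
# then sample the same run with py-spy. notebook.ipynb is a placeholder;
# the guards make this a no-op if the tools or the file are missing.
if command -v jupyter >/dev/null && [ -f notebook.ipynb ]; then
  time jupyter nbconvert --to pdf --log-level DEBUG notebook.ipynb
fi
if command -v py-spy >/dev/null && [ -f notebook.ipynb ]; then
  # py-spy only samples the Python side; time spent waiting on the
  # xelatex subprocess will show up in nbconvert's own frames.
  py-spy record -o nbconvert-profile.svg -- \
    jupyter nbconvert --to pdf notebook.ipynb
fi
```

If I remember right, nbconvert logs the individual xelatex runs, so comparing log timestamps against the total wall time should at least show whether the time goes to template processing or to the TeX passes.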
For reference, these are the relevant packages we install for nbconvert in the notebook server image:
RUN apt-get -y install --reinstall \
    texlive-xetex texlive-fonts-recommended texlive-generic-recommended \
    texlive-latex-extra texlive-publishers texlive-science \
    texlive-pstricks texlive-pictures pandoc
...
RUN conda update nbconvert
The base container image is jupyter/scipy-notebook.