Precompiling IJulia times out

I’m having an issue starting Julia kernels and the Pluto interface when I use the jupyter/datascience-notebook image as a base to build my own image. The issue occurs both when running locally on my MacBook (M1) and when the image is used on a JupyterHub hosted on GCP.

When I use the jupyter/datascience-notebook image “as is”, the Julia kernel starts fine in JupyterLab, as does the Pluto interface (although IJulia does take a little while to precompile when a notebook attaches to a Julia kernel, and the Pluto interface takes even longer to fully load).

However, when I add some additional Python libraries to the environment, neither of these works because of timeout errors. Any idea what might be the culprit? I also install RStudio, but the problem was present before I started including RStudio in my image, so I don’t believe it’s related.

I’m including my Dockerfile below in case it sheds any light on what may be in conflict. I’m planning to work through it section by section to see which component causes the timeout, but that could take some time; a rough sketch of how I plan to test each section is included after the Dockerfile. Any insight would be greatly appreciated!

FROM jupyter/datascience-notebook:latest

# Set user as root to do system updates so everything is up-to-date
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
    latexmk \
    lmodern \
    zip \
    unzip

# Install RStudio Server
RUN wget https://download2.rstudio.org/server/jammy/amd64/rstudio-server-2023.09.1-494-amd64.deb && \
    apt-get install -y ./rstudio-server-2023.09.1-494-amd64.deb && \
    rm rstudio-server-2023.09.1-494-amd64.deb

# RStudio needs to run as the notebook user
RUN chown -R ${NB_USER}:rstudio-server /var/lib/rstudio-server && \
    chmod -R g=u /var/lib/rstudio-server

# Add RStudio to the Path
ENV PATH=$PATH:/usr/lib/rstudio-server/bin

# Set user as the notebook user to install packages
USER ${NB_UID}

# Install otter-grader and git lab extensions for easily creating assignments
RUN pip install --no-cache-dir otter_grader_labextension
RUN pip install --pre "jupyterlab-git==0.50.0a0"

# Install additional packages
RUN mamba install --quiet --yes \
    'datascience' \
    'sympy' \
    'jupyter-book' \
    'otter-grader' \
    'astropy' \
    'nbgitpuller' \
    'opencv' \
    'jupyter-resource-usage' \
    'jupyter-rsession-proxy' \
    'jupyterlab-spellchecker' \
    'jupyterlab-myst' \
#    'jupyterlab-git' \ # Waiting for update to work with JLab 4
    'r-ggformula' \
    'r-essentials' \
    'plotly' && \
    mamba clean --all -f -y && \
    fix-permissions "${CONDA_DIR}" && \
    fix-permissions "/home/${NB_USER}"

# Turn off MyST extension by default
RUN jupyter labextension disable jupyterlab-myst

# Install MyST backend
RUN npm install -g mystmd
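
For reference, here’s a rough sketch of how I plan to test each section (the nb-test tag is just a placeholder): build the image up to that point, then time how long IJulia takes to load inside the container and confirm the kernelspec is still registered.

docker build -t nb-test .

# Time the IJulia load inside the freshly built image
docker run --rm nb-test julia -e '@time using IJulia'

# Confirm the Julia kernelspec is still registered
docker run --rm nb-test jupyter kernelspec list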

A quick update on this: it seems like even a minimal addition to the Dockerfile causes this issue to emerge. For example, just adding the zip and unzip packages causes the Julia components to “break”.

FROM jupyter/datascience-notebook:latest

USER root

RUN apt-get update --yes && \
    apt-get install -y --no-install-recommends \
    zip \
    unzip

USER ${NB_UID}

Yet another update: it seems like any change to the image causes this problem. Even adding a single package via mamba triggers it:

FROM jupyter/datascience-notebook:latest
USER ${NB_UID}
RUN mamba install --quiet --yes \
    'datascience' && \
    mamba clean --all -f -y && \
    fix-permissions "${CONDA_DIR}" && \
    fix-permissions "/home/${NB_USER}"

Building this image causes Julia kernels and the Pluto interface to stop working.
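
One workaround I want to try (untested, just a sketch) is forcing the precompilation to happen at build time instead of at kernel startup, by appending something like this to the Dockerfile above:

# Untested: trigger IJulia/Pluto precompilation during the build so the
# kernel doesn't pay that cost (and possibly time out) on first start
RUN julia -e 'using Pkg; Pkg.precompile()' && \
    julia -e 'using IJulia; using Pluto'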

I think this may be an architecture mismatch: I’m building the images on x64 but running them on ARM (Apple M1). The base image runs fine on the MacBook Pro (M1) because Docker pulls the variant for the correct host architecture. But since I’m building on top of the image on a Linux machine (a GCP server), it builds on top of the x64 variant and produces my new image as x64 only.
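
A quick way to check this (image names here are placeholders for my actual tags):

# Compare the architecture of the base image and my custom image
docker image inspect jupyter/datascience-notebook:latest --format '{{.Os}}/{{.Architecture}}'
docker image inspect my-custom-notebook:latest --format '{{.Os}}/{{.Architecture}}'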

The M1 Mac I’m testing on can run the x64 image under emulation, but I’m guessing the slowdown from running a non-native image is what’s causing it to hit the timeouts I’m observing. I’ll keep testing and see what I can discover.
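
If the mismatch is confirmed, I’ll try building explicitly for the target platforms with buildx (again just a sketch; the tag is a placeholder, and building the arm64 variant on an x64 host needs QEMU/binfmt emulation set up):

# Build and push a multi-arch image instead of letting the x64 build host
# default to linux/amd64 only
docker buildx build --platform linux/amd64,linux/arm64 \
    -t myrepo/my-notebook:latest --push .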