I am setting up a JupyterHub that will be available to all members of an institution. I am using the dockerspawner with the stack jupyter/all-spark-notebook. I have installed JupyterLab and it comes up as the default interface.
My question is this: I would like to install several extensions to offer more tools to my users. These include server extensions, lab extensions and notebook extensions. When I log in to the hub and install them via the terminal into my own environment, it works. But I'd like to "commit" those features to the base image so that they're available to all users when they log in. Is there a recommended way to do that?
Similarly, I'd like to have a collection of sample notebooks installed into the image that will appear in all users' file systems, so they can browse through and try them out. Is there a recommended way to do that?
In this example the first image just installs some dependencies on top of jupyter/base-notebook; the second image adds more tools plus some notebooks. It's split like this because we have other notebook images derived from our own jupyter-docker "base image", though you can of course use just a single image.
You can either docker build and docker push manually, or set up an automated Docker Hub build (free if your repositories are public); these are the corresponding images for the two repos:
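A minimal sketch of that two-image layering (image names, tags and package choices here are placeholders, not the actual repos):

```dockerfile
# Dockerfile for the "base" image (hypothetical name: myorg/jupyter-base)
FROM jupyter/base-notebook

# Install shared dependencies on top of the stock stack image.
RUN conda install --quiet --yes numpy pandas && \
    conda clean -afy
```

```dockerfile
# Dockerfile for the derived image, built FROM the one above
FROM myorg/jupyter-base:latest

# Add more tools, and bake the example notebooks into the image.
COPY notebooks/ /opt/share/notebooks/
```

Then `docker build -t` and `docker push` each one to your registry, or let an automated build do it.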
Make sure the extensions are installed and enabled with `--sys-prefix` rather than into the user's home directory. Most extensions provided as conda packages do this by default. That way, any user running with that environment will have those extensions enabled.
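For extensions that aren't conda packages, a sketch of installing and enabling them at image-build time (the specific extensions below are just examples, not requirements):

```dockerfile
# Build as the notebook user (docker-stacks convention).
USER ${NB_UID}

# Server extension: enable into sys.prefix so it applies env-wide.
RUN pip install --no-cache-dir jupyterlab-git && \
    jupyter serverextension enable --py --sys-prefix jupyterlab_git

# Classic notebook extension: install and enable with --sys-prefix.
RUN pip install --no-cache-dir ipywidgets && \
    jupyter nbextension install --py --sys-prefix widgetsnbextension && \
    jupyter nbextension enable --py --sys-prefix widgetsnbextension

# Lab extension: installed into the lab application directory, which
# already lives under the environment prefix, so no flag is needed.
RUN jupyter labextension install @jupyterlab/git
```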
For files that are part of the image, you can put them in a location in the image that's symlinked into the user's home directory (I'm presuming home is mounted as a volume for user persistence). E.g. `ln -s /opt/share/notebooks ~/examples`
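One way to wire that up is a startup hook. The `/opt/share/notebooks` path is an assumption from the example above; jupyter/docker-stacks images run any scripts placed in `/usr/local/bin/before-notebook.d/` before the server starts, so a sketch like this could live there:

```shell
#!/bin/bash
# Hypothetical hook: link the image's shared notebooks into the
# (volume-mounted) home directory, if the link isn't there yet.
if [ ! -L "${HOME}/examples" ] && [ ! -e "${HOME}/examples" ]; then
    ln -s /opt/share/notebooks "${HOME}/examples"
fi
```

Because home is a volume, the symlink persists across container restarts; the guard keeps the hook from failing when the link already exists.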