Building Docker image from jupyter/base-notebook but with a different conda environment

My team and I share an environment.yml file so we can all work in the same environment. I'm trying to make it the default for our JupyterHub, but I can't get the hub to spawn notebooks in this environment, even though it exists among the conda envs inside the Docker image. I tried modifying the CMD in the Dockerfile to:

CMD /bin/bash -c ". activate myenv &&"

This works if I run the image directly, but if I try to set it as the image for JupyterHub, notebooks are run in the base environment. Any tips on getting this up and running?
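One common alternative (a sketch, assuming the env is named myenv and that your environment.yml lists ipykernel as a dependency; paths follow the jupyter docker-stacks layout) is to leave the server running in the base environment and register the shared env as an extra kernel instead of activating it:

```dockerfile
FROM jupyter/base-notebook

# Create the shared environment from the team's environment.yml
COPY environment.yml /tmp/environment.yml
RUN conda env create -f /tmp/environment.yml && \
    conda clean --all -y

# Register myenv as a selectable kernel in the default server.
# This assumes environment.yml includes ipykernel.
RUN /opt/conda/envs/myenv/bin/python -m ipykernel install \
    --prefix=/opt/conda --name myenv --display-name "Python (myenv)"
```

This sidesteps activation entirely: JupyterHub spawns the stock server, and notebooks run in myenv when users pick "Python (myenv)" from the kernel list.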

Have you had a look at the post_start_exec hook at ? There is some explanation in the docstring, but I am not sure whether the generated documentation is hosted anywhere on the web.

I did try adding . activate myenv there with no luck. However, I was able to get my notebooks running in my environment with this in config.yaml:

  defaultUrl: "/lab"
  cmd: ["/bin/bash", "-c", ". activate myenv && jupyterhub-singleuser --SingleUserNotebookApp.default_url=/lab"]

Strangely, the defaultUrl entry doesn’t seem to have any effect, hence the --SingleUserNotebookApp.default_url=/lab.


defaultUrl is converted into a notebook argument that’s passed to cmd.
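A quick, self-contained illustration of why the inline -c form drops those arguments (nothing JupyterHub-specific; the --default-url flag here is just a stand-in for whatever the spawner appends):

```shell
# Extra argv entries after the -c script string become $0, $1, ... inside it;
# an inline script that never references them silently drops them.
/bin/bash -c 'echo "args ignored"' bash --default-url=/lab

# To forward them, the inline script must reference "$@":
/bin/bash -c 'echo forwarded: "$@"' bash --default-url=/lab
# prints: forwarded: --default-url=/lab
```

Note the explicit "bash" placeholder: the first argument after the -c string becomes $0, not $1, so even a script using "$@" loses the very first appended argument unless a placeholder is supplied.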

Your inline script doesn't handle those extra arguments. If you replace it with a script that forwards "$@", I think it will work as expected.
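A minimal wrapper along these lines (the path /usr/local/bin/start-singleuser-myenv.sh and the env name myenv are placeholders; this is a launch-script sketch, not a tested recipe) would be:

```shell
#!/bin/bash
# Hypothetical wrapper baked into the image, e.g. at
# /usr/local/bin/start-singleuser-myenv.sh. Activate the shared
# environment, then exec the singleuser server, forwarding every
# argument the spawner appends (including the one generated from
# defaultUrl) via "$@".
set -e
. activate myenv
exec jupyterhub-singleuser "$@"
```

With that in place, cmd in config.yaml can point at the script, e.g. cmd: ["/usr/local/bin/start-singleuser-myenv.sh"], and the defaultUrl setting should take effect again.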


@petebachant, have you seen this:
It's about how to use DockerSpawner with JupyterHub.

If you have the right configuration, you can also switch between different kernels in the Jupyter notebook.