Creating conda environment for installation of packages

I have been using the jupyter/minimal-notebook Docker image as the base for my own image, which adds various geospatial packages. One of those packages, proj, relies on an environment variable that conda sets to point at a directory the package needs at runtime. The package fails if I install it into the base conda environment (which the Jupyter stack appears to use for all of its packages), because that variable is never set unless a conda environment is activated. However, I have not been able to get the conda environment to be active when the image runs as a notebook server. Any advice on how to get this running would be greatly appreciated!

The following works when I run the image interactively with a bash shell, but not when the image runs as a notebook server:

FROM jupyter/minimal-notebook:7a0c7325e470

ARG conda_env=notebooks
COPY environment.yml .
RUN conda env create --quiet -f environment.yml && \
    conda clean --all -f -y

RUN conda init bash
RUN echo "conda activate ${conda_env}" >> ~/.bashrc

WORKDIR work
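
For reference, the environment.yml itself is not shown above; a minimal hypothetical version would look something like the sketch below (mine also lists the geospatial packages mentioned earlier, including proj/pyproj). The name field has to match the conda_env build arg, otherwise the conda activate line above has nothing to activate:

name: notebooks        # must match the conda_env build arg
channels:
  - conda-forge
dependencies:
  - python
  - pyproj             # pulls in proj and its data files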

The following does activate the conda environment in the notebooks, and the necessary environment variable is set, but the code then fails with:

ERROR 1: PROJ: pj_obj_create: Open of /opt/conda/envs/notebooks/share/proj failed

FROM jupyter/minimal-notebook:7a0c7325e470

ARG conda_env=notebooks
COPY environment.yml .
RUN conda env create --quiet -f environment.yml && \
    conda clean --all -f -y

ENV PATH $CONDA_DIR/envs/${conda_env}/bin:$PATH
ENV CONDA_DEFAULT_ENV ${conda_env}

# this appears to do nothing but was worth a try in case it was a permissions issue
RUN fix-permissions $CONDA_DIR
WORKDIR work
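
For context, proj normally finds its data directory through PROJ_LIB, which conda's activate.d scripts export when the environment is activated. One untested check (my own assumption, not something from the docker-stacks docs) would be to export it explicitly in the Dockerfile and to confirm that proj.db is actually present under that share/proj path:

# Untested sketch: point PROJ at the env's data directory directly,
# rather than relying on conda's activate.d scripts running.
ENV PROJ_LIB $CONDA_DIR/envs/${conda_env}/share/proj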

I’m trying to do something similar and had some luck by adding this line:

CMD /bin/bash -c ". activate ${conda_env} && start-notebook.sh"

However, this didn’t work with JupyterHub.
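
If I'm reading the docker-stacks scripts correctly (an assumption I haven't verified against the pinned tag), both start-notebook.sh and the start-singleuser.sh that JupyterHub uses go through start.sh, which sources any *.sh files in /usr/local/bin/before-notebook.d/ before launching the server. Moving the activation into such a hook might therefore cover both cases; a rough sketch:

USER root
# Hypothetical hook script: start.sh sources *.sh files from before-notebook.d,
# so the activation (and any activate.d exports such as PROJ_LIB) applies to the
# server process whether it is started directly or by JupyterHub.
RUN mkdir -p /usr/local/bin/before-notebook.d && \
    echo "source activate ${conda_env}" > /usr/local/bin/before-notebook.d/activate-env.sh
USER ${NB_UID}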