Our approach for serving JupyterHub & JupyterLab on k8s as a public service

I am Frodo (@inscite), yet another Ph.D. student from the Republic of Korea.
I’m happy to share our experience of running JupyterHub as a public service!

The research center where I currently work plans to offer a JupyterLab analytics service to both citizen and professional researchers as a nationwide public research toolkit.
Our goal is to promote active participation in sharing and analyzing public research data produced by nationally funded research projects.

Have a quick look at our spawned JupyterLab image on k8s!

To achieve this goal, I had to resolve several incompatibilities between our requirements and JupyterHub, especially around the base-notebook Docker image.

  1. Our service manages each user’s identity through FreeIPA and one large shared storage system (e.g. Lustre or NFS). JupyterHub can authenticate and separate FreeIPA users with some configuration work. However, every container spawned from the stock images (on k8s, and on plain Docker as well) runs with the identical UID/GID of 1000/100, which makes it hopeless for users to access shared storage with their own UID/GID.
    I solved this by passing the username, UID, and GID into the container as environment variables. Combined with k8s Job creation, a backend can then spawn each JupyterLab container with its own UID/GID.

  2. Our service also plans to offer literally ‘public conda environments’: pre-defined, ready-made conda environments mounted on shared storage that any user on our system can reference. As a result, users who want a customized or pre-defined environment (such as rapids.ai) can use these envs both from the Jupyter terminal and as notebook kernels.
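One way to wire this up is sketched below; it is an assumption about the setup, not the author's exact commands, and the path `/shared/conda/envs` and env name `rapids` are hypothetical.

```shell
# Hypothetical config sketch: let conda discover read-only "public"
# envs kept on shared storage by appending their directory to envs_dirs.
conda config --system --append envs_dirs /shared/conda/envs

# Register one public env as a notebook kernel for the current user,
# so it shows up in the JupyterLab launcher.
/shared/conda/envs/rapids/bin/python -m ipykernel install \
    --user --name rapids --display-name "Python (RAPIDS, public)"
```

After this, `conda activate rapids` works in the Jupyter terminal, and the kernel appears in the notebook kernel list.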

  3. GPUs!
    I also switched the base image from a plain Ubuntu image to NVIDIA’s CUDA image. Due to a compatibility issue, I changed the default Ubuntu version from focal (20.04) to bionic (18.04). My previous GPU jobs written in Python run fine, and nvidia-smi works inside the container!
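As a rough sketch of that base-image swap: the upstream jupyter/docker-stacks base-notebook Dockerfile exposes its base image as the `ROOT_CONTAINER` build arg, so one plausible build (the CUDA tag shown is an assumption, not the author's exact choice) looks like:

```shell
# Hypothetical build sketch: rebuild base-notebook on top of an NVIDIA
# CUDA runtime image for Ubuntu 18.04 via the ROOT_CONTAINER build arg.
docker build \
    --build-arg ROOT_CONTAINER=nvidia/cuda:11.4.3-cudnn8-runtime-ubuntu18.04 \
    -t gpu-notebook ./base-notebook

# Sanity check: the GPU and driver should be visible in the container
# (requires the NVIDIA Container Toolkit on the host).
docker run --rm --gpus all gpu-notebook nvidia-smi
```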

That’s a brief introduction to the features I implemented on top of the wonderful Jupyter Docker images.
I’ll keep sharing the implementation in my repository: inscite/jupyter-docker-stacks: Ready-to-run Docker images containing Jupyter applications (github.com)

If you have any questions about the implementation, feel free to contact me. Thank you for reading!


Can you tell me how to make JupyterHub jump to JupyterLab? Thanks