There are use cases where other applications would like the services of JupyterHub without a JupyterLab or classic Notebook interface. What we’d like to do is use the Hub spawner to spawn a server that doesn’t present either the Notebook or JupyterLab UI, but (say) an RStudio UI. Has anyone had experience with doing something like this? What protocols should a server spun up from JupyterHub follow?
Any pointers/thoughts appreciated
There are two ways:
- Use jupyter-server-proxy to proxy a web app via Jupyter Server/JupyterLab (see the Jupyter Server Proxy documentation)
- Use jhsingle-native-proxy to replace the JupyterHub single-user process and proxy a web app directly (see the forum post "New package to run arbitrary web service in JupyterHub (jhsingle-native-proxy)")
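For the first option, jupyter-server-proxy can be configured declaratively in a traitlets config file. A minimal sketch, assuming RStudio's open source `rserver` as the proxied app (the `rserver` flags and the `"rstudio"` name here are illustrative; jupyter-rsession-proxy packages a tested version of exactly this):

```python
# jupyter_server_config.py -- jupyter-server-proxy sketch.
# The server name and rserver flags are assumptions for illustration.
c.ServerProxy.servers = {
    "rstudio": {
        # {port} is substituted by jupyter-server-proxy at launch time
        "command": ["rserver", "--www-port={port}"],
        "timeout": 30,
        # Adds an "RStudio" tile to the JupyterLab launcher
        "launcher_entry": {"title": "RStudio"},
    }
}
```

With this in place, the app is reachable under the notebook server at `/rstudio/`, behind JupyterHub's normal authentication.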
It may not be quite what you're asking, but I run RStudio Server as a server spawned from JupyterHub. The user still gets the basic JupyterHub UI at first after login, but can then click through and launch an RStudio instance.
Once the user selects RStudio, it launches RStudio Server (open source) in the container, and points the user to it, so they get their very own RStudio interface to work in.
It uses jupyterhub/jupyter-rsession-proxy (on GitHub) under the hood. The exact image I use is on Docker Hub: wesleyburr/trent-rstudio:rstudio0907. It's based on the Berkeley Data8 variant. You're welcome to point your JupyterHub at it if you like.
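For reference, the general shape of such an image is roughly the following. This is a sketch, not the actual Dockerfile behind that image; the base image, the RStudio Server download (left deliberately incomplete), and the package manager steps are assumptions you'd adapt to your base OS:

```dockerfile
# Sketch of an RStudio-on-JupyterHub image; versions and paths are assumptions.
FROM jupyter/r-notebook

USER root
# Install RStudio Server (open source); substitute a real .deb URL for your OS
RUN apt-get update && \
    apt-get install -y --no-install-recommends gdebi-core wget && \
    wget -q https://download2.rstudio.org/server/... -O /tmp/rstudio.deb && \
    gdebi -n /tmp/rstudio.deb && rm /tmp/rstudio.deb

USER ${NB_UID}
# The proxy extension that exposes RStudio through the notebook server
RUN pip install --no-cache-dir jupyter-rsession-proxy
```

Installing jupyter-rsession-proxy is the key step: it registers the proxy with jupyter-server-proxy so `/rstudio/` appears automatically.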
That is EXACTLY what we’re looking for. Many, many thanks.
Were you using z2jh, or did you plug this into a different Hub?
Yes, z2jh on bare metal, using the nfs-csi storage solution from the MicroK8s docs if you follow their NFS pathway (MicroK8s - Use NFS for Persistent Volumes). Here's the helm jhub yaml, with personal details removed. Note that, as far as I can tell, if you follow the current z2jh docs for bare metal, the Configure Storage part (step 4, about OpenEBS) does not work; I was not able to get it working in four days of trying. But the nfs-csi approach from the MicroK8s setup guide worked a treat, provided you have an NFS server to point to.
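For anyone following along, the storage piece in the z2jh values file comes down to pointing single-user storage at whatever StorageClass the MicroK8s NFS guide had you create. A sketch, assuming the class was named `nfs-csi` as in that guide (the capacity figure is illustrative):

```yaml
# config.yaml (z2jh helm values) -- storage sketch, assuming the
# nfs-csi StorageClass from "MicroK8s - Use NFS for Persistent Volumes"
singleuser:
  storage:
    dynamic:
      storageClass: nfs-csi
    capacity: 10Gi   # illustrative per-user quota
```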
I can share the rest of my config if this is actually something you're trying to get set up. As soon as I get a spare day, I'm going to write up the full process anyway, because I really struggled to get it going, so it's good to document it for future me (or similar folks). Also, if you need a properly spaced version of the below, just ping me - email is firstname.lastname@example.org.
```yaml
# authenticator_class: azuread
- display_name: "SageMath - for all Mathematics"
  description: "Open source alternative to Maple, MATLAB and Mathematica"
- display_name: "RStudio Interface - for Statistics and Data Science"
  description: "A data science environment, but loaded through RStudio's interface instead."
- display_name: "Jupyter Notebook - Datascience environment"
  description: "Jupyter Notebook environment, with R, Julia and Python."
- display_name: "Jupyter Notebook - Python Only"
  description: "To avoid too many bells and whistles: Python."
```
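For context, entries like those live under `singleuser.profileList` in the z2jh values, and the thing that actually switches the UI is a `kubespawner_override` pointing each profile at a different image. A trimmed sketch; aside from the rstudio image mentioned above, the image tags are placeholders, and the `default_url` setting (a KubeSpawner trait that lands users straight in RStudio) is an assumption about how the profile is wired:

```yaml
singleuser:
  profileList:
    - display_name: "RStudio Interface - for Statistics and Data Science"
      description: "A data science environment, but loaded through RStudio's interface instead."
      kubespawner_override:
        image: wesleyburr/trent-rstudio:rstudio0907
        # Send users straight to RStudio instead of the Jupyter file browser
        default_url: /rstudio
    - display_name: "Jupyter Notebook - Python Only"
      description: "To avoid too many bells and whistles: Python."
      kubespawner_override:
        image: jupyter/minimal-notebook:latest   # placeholder tag
```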
Thanks a ton. I’m checking with a colleague to see if this is sufficient for us; if not, he or I will be in touch on the weekend or Monday.
This is really exciting.
This still requires that you run the full python + jupyter stack in the container, correct?
Depends which container you’re talking about, I guess. It’s a k8s framework, so there are a lot of containers around. The main set is JupyterHub, which … yes, is a full stack. Then the images you get are typically (at least for my applications) a Jupyter notebook server with some specified kernels, so typically have Python and Jupyter. However, as manics noted above, if you want, you can do almost anything - launch a container and use jhsingle-native-proxy and run whatever you want.
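As a concrete illustration of that last point: jhsingle-native-proxy wraps an arbitrary web process and speaks the JupyterHub single-user protocol on its behalf, so the container needs no Jupyter server at all. A sketch of a z2jh profile doing this; the image name, the Streamlit app, and its flags are all placeholders (check the jhsingle-native-proxy README for the exact option syntax):

```yaml
# z2jh profile sketch: run a non-Jupyter webapp via jhsingle-native-proxy.
# Image and command are illustrative placeholders, not a tested config.
singleuser:
  profileList:
    - display_name: "Streamlit dashboard (no Jupyter UI)"
      kubespawner_override:
        image: myregistry/streamlit-dashboard:latest
        cmd:
          - jhsingle-native-proxy
          - --destport=8505          # port the wrapped app listens on
          - streamlit
          - run
          - app.py
          - --server.port=8505
          - --server.headless=true
```

The proxy forwards Hub-authenticated traffic to the app's port and handles the Hub's activity reporting, which is what lets "whatever you want" pass for a single-user server.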
I want you to know that we tried it and it works! This is really terrific, and we think it points the way to hosting a large number of other applications on JupyterHub. Thanks very much for this, and we’d like to assist you in documenting it (for our own sake as well as yours).