I have a challenge.
What I have:
a) docker (Linux)
b) two networks: [frontend] and [backend]
c) Jupyterhub working in [frontend] network
d) Jupyterlab spawned (by DockerSpawner) into [frontend] network
Info: The above configuration works
The database runs in the [backend] network.
A notebook program created inside the spawned Jupyterlab doesn't see the database
(because Jupyterlab runs in a different network).
The challenge (problem)
Is it possible for spawned Jupyterlab to be connected to both the frontend and backend networks?
Or maybe there is another way for Jupyterlab to see a database placed in another docker network?
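One possible approach (a sketch only, not something confirmed in this thread): subclass DockerSpawner so that, after the container starts on the frontend network, it is also connected to the backend network via the Docker SDK. The class name `MultiNetworkSpawner`, the network name `"backend"`, and the use of `self.object_name` are assumptions — attribute names vary between DockerSpawner versions, so check yours.

```python
# jupyterhub_config.py -- hypothetical sketch, assuming docker-py is installed
# and a pre-existing Docker network named "backend".
import docker
from dockerspawner import DockerSpawner

class MultiNetworkSpawner(DockerSpawner):
    """Spawner that joins the container to a second network after start."""

    async def start(self):
        # Start the container on the primary (frontend) network as usual.
        result = await super().start()
        # Then attach the same container to the backend network.
        # NOTE: self.object_name is an assumption; older DockerSpawner
        # versions expose the name as self.container_name instead.
        client = docker.from_env()
        backend = client.networks.get("backend")
        backend.connect(self.object_name)
        return result

c.JupyterHub.spawner_class = MultiNetworkSpawner
c.DockerSpawner.network_name = "frontend"
```

This keeps `network_name` pointing at [frontend] for the spawn itself, and only adds [backend] as a second attachment once the container exists.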
You might be able to use a direct WebSocket connection to have the two networks communicate; that would be fast. I don't know of an existing tool, or a JupyterHub/JupyterLab server extension, that does what you want right now. Are you using 1 or 2 Docker containers? Docker should allow inter-process communication. This is only a suggestion; I have not done this myself.
I run containers via docker-compose.
Jupyterhub runs as a Docker container in the [frontend] network.
Jupyterhub then creates (spawns) Jupyterlab as a container, but I could only set one network name,
so I had to use the [frontend] network:
c.DockerSpawner.network_name = "frontend"
The problem is that my database runs in the [backend] network.
The goal is to run Jupyterlab attached to two networks.
How can I do it?
Is it possible?
And one more thing.
When I run Jupyterlab by itself (giving up Jupyterhub) via docker-compose with these options:
I can connect to the database from code running in Jupyterlab.
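For comparison, the standalone docker-compose case works because a compose service can simply list both networks. A minimal sketch (service name, image, and network definitions are assumptions, not the poster's actual file):

```
# docker-compose.yml -- hypothetical sketch
services:
  jupyterlab:
    image: jupyter/base-notebook
    networks:
      - frontend
      - backend

networks:
  frontend:
  backend:
```

DockerSpawner's `network_name` accepts only a single network, which is why the spawned container does not get this dual attachment automatically.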