How to connect a local jupyterlab notebook to a remote kernel?

Hi everyone! I need some advice, please.

What would be the simplest way to connect a running JupyterLab to a remote kernel?

I’ve briefly looked into the kernel and enterprise gateways and some extensions, but Jupyter Enterprise Gateway (JEG) seems like overkill, and I didn’t understand how to use the Kernel Gateway or whether that project is still alive at all.


I would appreciate any pointers, links and ideas.



IIUC, Jupyter has introduced the concept of “kernel provisioners”, which can potentially be remote.

@kevin-bates has an implementation over at GitHub - kevin-bates/remote_provisioners, but I’m not too sure how ready for prime-time they are as yet.
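For background, provisioners are wired in per kernelspec: a kernel.json names the provisioner (and its config) in its metadata, and jupyter_client looks the provisioner up by its entry point name. A minimal sketch, where the provisioner name and config keys are hypothetical placeholders:

```json
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Python (remote)",
  "language": "python",
  "metadata": {
    "kernel_provisioner": {
      "provisioner_name": "my-remote-provisioner",
      "config": { "host": "gpu-box.example.com" }
    }
  }
}
```

When no `kernel_provisioner` block is present, jupyter_client falls back to the default local provisioner, which is why existing kernelspecs keep working unchanged.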


If you share your goals/requirements, it might be possible to accomplish this another way, even if remote kernels would be the ideal method.

For example, naively: why couldn’t you store the data from host 1, open the notebook for host 2 in another tab, read that data back in, and hit the Run button?



> how ready for prime-time they are as yet.

Hi @dhirschfeld - unfortunately, these are no more than a proof of concept. I’m sorry, but I’m unable to find the necessary time to move these forward.

Thanks for your question.

I have a xeus-based Python kernel (source code) integrated into a biomedical imaging application (3D Slicer).

I would like to use that kernel from an existing JupyterLab that is provided to me by a cloud service provider.

I have root access to the VM that’s running the JupyterLab server, but I cannot (and don’t want to) install all the Qt/GUI dependencies that are required to run Slicer natively on the host. I am aiming to have Slicer running either in a container (preferable) or on a dedicated host.

I figured as much, but thanks for the confirmation/update.

I think there’s huge value in the kernel provisioners concept so I hope the community picks it up…


Hi, @kevin-bates :wave:
Do you mind if I ping you with some questions if I decide to hack on the remote provisioners repo?


Hi, I have similar goals: allowing a client (JupyterLab) to connect to an existing remote kernel.

I have successfully used the remote provisioner developed by @kevin-bates, but, by design, it lets a client start a fresh remote kernel, not connect to an already running one.
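For reference, outside of provisioners, a local client can attach to an already running kernel if it can reach the five ZMQ ports listed in the kernel’s connection file, e.g. by copying the file and tunnelling over SSH. A sketch (host names and paths below are illustrative, not from this thread):

```shell
# Copy the running kernel's connection file from the remote host
# (the runtime path varies by platform; find it with `jupyter --runtime-dir`)
scp remote-host:.local/share/jupyter/runtime/kernel-1234.json /tmp/

# Attach a local console; --ssh tunnels the ports (shell, iopub,
# stdin, control, hb) named in the connection file through remote-host
jupyter console --existing /tmp/kernel-1234.json --ssh remote-host
```

This attaches a console rather than a notebook, but it illustrates the connection-file mechanism that any “connect to an existing kernel” solution would build on.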

@pll_llq Happy to connect and discuss to move forward with this need.


No, not at all. You can find my email address on my GitHub profile.


> I think there’s huge value in the kernel provisioners concept so I hope the community picks it up

Yes, I agree and hope so as well.


Wow, this is great! @echarles would you be up for a call sometime this week?

Was any progress made on this? The ability to manage and connect to kernels remotely would be ideal for my use case.

Not on my end :man_shrugging:t2:
For my specific use case I have taken a different approach, using GitHub - jupyterhub/jupyter-remote-desktop-proxy (Run a Linux Desktop on a JupyterHub) to connect to an application running inside a container on the same host as my Lab instance.

I see, that’s what I have been trying to avoid. I started a new thread on the topic before I saw yours; hopefully between the two threads we can figure something out!
