How to connect a local JupyterLab notebook to a remote kernel?

Hi everyone! I need some advice, please.

What would be the simplest way to connect a running JupyterLab to a remote kernel?

I’ve briefly looked into the Kernel Gateway, the Enterprise Gateway, and some extensions, but JEG seems like overkill, and I didn’t understand how to use the Kernel Gateway or whether that project is still alive at all.


I would appreciate any pointers, links and ideas.

Thanks


IIUC, Jupyter has introduced the concept of “kernel provisioners”, which can potentially be remote.

@kevin-bates has an implementation over in GitHub - kevin-bates/remote_provisioners but I’m not too sure how ready for prime-time they are as yet.
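For context, kernel provisioners are selected per kernel spec: a kernel.json can name a provisioner in its metadata, and jupyter_client then delegates that kernel’s lifecycle to it. A minimal sketch of what such a spec looks like (the `provisioner_name` and `config` values here are purely illustrative, not taken from the remote_provisioners repo):

```json
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Remote Python",
  "language": "python",
  "metadata": {
    "kernel_provisioner": {
      "provisioner_name": "remote-provisioner",
      "config": {"remote_host": "gpu-box.example.com"}
    }
  }
}
```

If no `kernel_provisioner` entry is present, jupyter_client falls back to the default local provisioner, which just spawns the kernel as a subprocess.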


If you share your goals/requirements, it might be possible to accomplish this another way, even if remote kernels would be the ideal method.

For example, naively: could you store the data from Host 1, then open another tab with a notebook on Host 2 that reads that data, and hit the run button to resume the calculation?
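A minimal sketch of that workaround, assuming both hosts can see a shared path (the path and variable names here are made up for illustration):

```python
import pickle
from pathlib import Path

# Notebook on Host 1: persist the intermediate result to shared storage.
results = {"samples": [1, 2, 3], "mean": 2.0}
checkpoint = Path("shared") / "checkpoint.pkl"  # hypothetical shared mount
checkpoint.parent.mkdir(exist_ok=True)
checkpoint.write_bytes(pickle.dumps(results))

# Notebook on Host 2: read it back and resume the computation.
restored = pickle.loads(checkpoint.read_bytes())
```

Crude, but it avoids any remote-kernel machinery as long as the data (rather than live kernel state) is what needs to move between hosts.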

John


how ready for prime-time they are as yet.

Hi @dhirschfeld - unfortunately, these are no more than a proof of concept. I’m sorry, but I’m unable to find the necessary time to move these forward.

Thanks for your question.

I have a xeus-based Python kernel (source code) integrated into a biomedical imaging application (3D Slicer).

I would like to use that kernel in an existing JupyterLab that is provided to me by a cloud service provider.

I have root access to the VM that’s running the JupyterLab server, but I cannot (and don’t want to) install all the Qt/GUI dependencies that are required to run Slicer natively on the host. I am aiming to have Slicer running either in a container (preferably) or on a dedicated host.

I figured as much, but thanks for the confirmation/update.

I think there’s huge value in the kernel provisioners concept so I hope the community picks it up…


Hi, @kevin-bates :wave:
Do you mind if I ping you with some questions if I decide to hack on the remote provisioners repo?


Hi, I have a similar goal: allowing a client (JupyterLab) to connect to an existing remote kernel.

I have successfully used the remote provisioner developed by @kevin-bates, but, by design, it lets a client start a fresh remote kernel rather than connect to an already running one.
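For reference, the low-level way to attach to an already running kernel is via its connection file: copy it from the remote host, forward its five ZMQ ports over SSH, and point a local client at the copy. A stdlib-only sketch that builds the `ssh -L` command from a connection file (the host name and file path in the usage note are illustrative):

```python
import json

def ssh_forward_command(connection_file: str, remote_host: str) -> list[str]:
    """Build an `ssh -L` command forwarding a kernel's five ZMQ ports."""
    with open(connection_file) as f:
        info = json.load(f)
    cmd = ["ssh", "-N"]  # -N: forward ports only, don't run a remote shell
    for channel in ("shell_port", "iopub_port", "stdin_port",
                    "control_port", "hb_port"):
        port = info[channel]
        # Forward each local port to the same port on the kernel host.
        cmd += ["-L", f"{port}:localhost:{port}"]
    cmd.append(remote_host)
    return cmd
```

With the tunnels up, something like `jupyter console --existing /path/to/copied-kernel.json` (with the file’s `ip` rewritten to `127.0.0.1`) should attach; if I recall correctly, `jupyter console` also has an `--ssh` option that can automate much of this.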

@pll_llq Happy to connect and discuss to move forward with this need.


No, not at all. You can find my email address on my GitHub profile.


I think there’s huge value in the kernel provisioners concept so I hope the community picks it up

Yes, I agree and hope so as well.


Wow, this is great! @echarles would you be up for a call sometime this week?

Was any progress made on this? The ability to manage and connect to kernels remotely would be ideal for my use case.

Not on my end :man_shrugging:t2:
For my specific use case I have taken a different approach, using GitHub - jupyterhub/jupyter-remote-desktop-proxy: Run a Linux Desktop on a JupyterHub to connect to an application running inside a container on the same host as my lab instance.

I see, that’s what I have been trying to avoid. I started a new thread on the topic before I saw yours: https://discourse.jupyter.org/t/single-user-setups-for-local-client-remote-kernel/13882, hopefully between the two threads we can figure something out!


A bit late to the party but for what it is worth:

I built a Python package that integrates with Jupyter (via a custom kernel provisioner) for launching and connecting to Jupyter kernels on remote systems via SSH. Kernels can be spawned directly from the JupyterLab UI or from the command line.

Python package: https://pypi.org/project/sshpyk/
Docs and GitHub: https://github.com/casangi/sshpyk/
(I donated my work to be maintained by the Common Astronomy Software Applications project, in an attempt to have a tool that does not need to be rebuilt every few years; that search is what initially led me to this forum.)

Cheers
