What would be the simplest way to connect a running JupyterLab to a remote kernel?
I've briefly looked into the Jupyter Kernel Gateway and Enterprise Gateway and some extensions, but JEG seems like overkill, and I didn't understand how to use the Kernel Gateway or whether that project is still maintained at all.
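For context, at the protocol level a "remote kernel" is just five ZMQ ports plus an auth key, all described in a connection file under the remote host's `jupyter --runtime-dir`. The classic manual approach is to copy that file locally and SSH-forward the ports. A minimal sketch (the port numbers and host name below are made-up examples, not from this thread) that builds the forwarding command from a connection file's contents:

```python
import json

# Contents of a kernel connection file, e.g. kernel-1234.json on the
# remote host (values here are illustrative placeholders).
conn = {
    "shell_port": 53794, "iopub_port": 53795, "stdin_port": 53796,
    "control_port": 53797, "hb_port": 53798,
    "ip": "127.0.0.1", "transport": "tcp", "key": "secret-hmac-key",
}

def tunnel_command(conn: dict, remote_host: str) -> str:
    """Build an ssh command that forwards all five ZMQ kernel ports locally."""
    ports = [v for k, v in conn.items() if k.endswith("_port")]
    forwards = " ".join(f"-L {p}:{conn['ip']}:{p}" for p in ports)
    return f"ssh -N {forwards} {remote_host}"

print(tunnel_command(conn, "user@remote-host"))
```

With the tunnels up and the connection file copied locally, tools that speak the kernel protocol (e.g. `jupyter console --existing kernel-1234.json`) can talk to the remote kernel; getting JupyterLab itself to list such a kernel is the part that still needs a gateway or provisioner.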
If you share your goals/requirements, it might be possible to accomplish this another way, even if remote kernels would be the ideal method.
For example, naively: could you store the data from host 1, open another tab with the notebook for host 2, have that notebook read the data, and then hit the Run button?
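That hand-off can be as simple as pickling results to storage both hosts can reach. A minimal sketch; a temp file stands in for the shared path here, and the shared-storage setup (NFS, object store, etc.) is an assumption about your environment:

```python
import pathlib
import pickle
import tempfile

# In practice this path would be on storage visible to both hosts
# (an NFS mount, a cloud bucket via fsspec, ...); a temp file stands in.
shared = pathlib.Path(tempfile.gettempdir()) / "handoff.pkl"

# --- Notebook on host 1: compute and persist ---
results = {"samples": 1000, "accuracy": 0.93}
shared.write_bytes(pickle.dumps(results))

# --- Notebook on host 2: load and continue where host 1 left off ---
resumed = pickle.loads(shared.read_bytes())
print(resumed["accuracy"])  # → 0.93
```

Only use pickle for data you trust; for tabular data a format like Parquet or CSV avoids pickle's security and portability caveats.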
Hi @dhirschfeld - unfortunately, these are no more than a proof of concept. I’m sorry, but I’m unable to find the necessary time to move these forward.
I have a xeus-based Python kernel (source code) integrated into a biomedical imaging application (3D Slicer).
I would like to use that kernel in an existing Jupyter Lab that is provided to me by a cloud service provider.
I have root access to the VM that runs the JupyterLab server, but I cannot/don't want to install all the Qt/GUI dependencies required to run Slicer natively on that host. I'm aiming to run Slicer either in a container (preferably) or on a dedicated host.
Hi, I have a similar goal: allowing a client (JupyterLab) to connect to an existing remote kernel.
I have successfully used the remote provisioner developed by @kevin-bates, but by design it lets a client start a fresh remote kernel rather than connect to an already running one.
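For reference, a provisioner is wired in per kernelspec via the `metadata.kernel_provisioner` stanza of `kernel.json` (supported since jupyter_client 7); the provisioner name and config keys below are illustrative, not from any specific package:

```json
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Remote Python",
  "language": "python",
  "metadata": {
    "kernel_provisioner": {
      "provisioner_name": "my-remote-provisioner",
      "config": {"host": "remote-host"}
    }
  }
}
```

An "attach to existing kernel" flow would presumably need a provisioner whose config points at an existing connection file instead of launching `argv`.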
@pll_llq Happy to connect and discuss to move forward with this need.