How to use SSH to connect to Z2JH (Zero to JupyterHub) in a local cluster

This is useful for data scientists: they can point VS Code at a single-user notebook server over SSH, which makes debugging much easier.

Now, let me describe how I set it up.

1. The latest helm chart version, 1.2.0, seems stable.

2. Use jupyterhub-ssh to build a jupyterhub-ssh service. Note that you have to choose a chart version first.

3. Set these values for zero-to-jupyterhub:
proxy.https.enabled must be true (to turn on TLS).
proxy.https.type must be letsencrypt (the autohttps deployment won't start if it is empty).
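In values.yaml terms, step 3 is a fragment like the following (a minimal sketch; the contactEmail value is a placeholder you must replace):

```yaml
proxy:
  https:
    enabled: true          # turn on TLS termination in the proxy
    type: letsencrypt      # autohttps needs a non-empty type to start
    letsencrypt:
      contactEmail: you@example.com   # placeholder: Let's Encrypt contact address
```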

4. Change the traefik extraStaticConfig. I think I need a dnsChallenge in a local cluster.
I'm stuck at this step and am learning traefik now; does anybody have suggestions?
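A dnsChallenge static config for traefik would look roughly like this (a sketch only: the resolver name, email, and DNS provider are placeholders, and the chosen provider also needs credentials supplied via environment variables):

```yaml
proxy:
  traefik:
    extraStaticConfig:
      certificatesResolvers:
        le:                               # placeholder resolver name
          acme:
            email: you@example.com        # placeholder contact address
            storage: /etc/acme/acme.json  # where traefik persists ACME state
            dnsChallenge:
              provider: cloudflare        # placeholder: use your DNS provider
```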

5. In the end traefik turned out to be unnecessary for me. I gave up on TLS, simply pointed the service port at jupyterhub-ssh instead, and deleted the now-unused network policy.
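Exposing jupyterhub-ssh directly could look something like this (a minimal sketch: the namespace, selector labels, and Service type are assumptions; check the pod labels your chart actually applies):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: jupyterhub-ssh
  namespace: jhub          # assumption: your hub's namespace
spec:
  type: LoadBalancer       # or NodePort in a local cluster
  selector:
    app: jupyterhub-ssh    # assumption: must match the chart's pod labels
  ports:
    - name: ssh
      port: 22
      targetPort: ssh
```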

I also had cert-manager build certificates, but I don't know how to use them. The cert.yaml looks like this:

apiVersion: cert-manager.io/v1
kind: ClusterIssuer
metadata:
  name: jupyterhub-self-signing-issuer
spec:
  selfSigned: {}
---
apiVersion: cert-manager.io/v1
kind: Certificate
metadata:
  name: jupyterhub-cert
  namespace: jupyterhub-system
spec:
  dnsNames:
    - hub.jupyterhub-system.svc.cluster.local
  isCA: true
  secretName: jupyterhub-tls
  issuerRef:
    name: jupyterhub-self-signing-issuer
    kind: ClusterIssuer
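If the cert-manager route is ever revisited: the Certificate writes its output into the jupyterhub-tls secret, and z2jh can consume a pre-made secret by setting proxy.https.type to secret (a sketch, assuming the standard z2jh proxy.https options):

```yaml
proxy:
  https:
    enabled: true
    type: secret
    secret:
      name: jupyterhub-tls   # the secretName from the Certificate resource
```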

OK, in the end I gave up on traefik because it was too difficult to use. I deleted the unused pieces and skipped TLS entirely. JupyterHub is great to use this way. SSH done!

Hi! I installed it according to the configuration on GitHub, but I still can't connect over SSH. Is it necessary to modify some configuration?


It depends on how you have your ingress set up.

Are you using an external ingress controller like the NGINX Ingress Controller?

If so, you need to do three things:

  1. Expose a TCP Service
  2. Label your ingress controller
  3. Configure the jupyterhub-ssh NetworkPolicy so that it allows traffic from the ingress controller to your jupyterhub-ssh service

1 and 2 can be accomplished with an ingress-nginx helm configuration like:

  controller:
    podLabels:
      <your-label>: "true"

  # Enable the tcp-services configmap that adds additional port-to-service mappings
  tcp:
    22022: jhub/jupyterhub-ssh:22

For number 3:

  networkPolicy:
    enabled: true
    ingress:
      - ports:
          - protocol: TCP
            port: ssh
        from:
          - namespaceSelector:
              matchLabels:
                <label selecting your ingress controller's namespace>
You should then be able to connect with ssh <username>@<your ingress address> -p 22022

This is super exciting.
I have tried to get it working, but I keep getting a timeout when connecting to the pods via SSH.
I rebuilt the docker image from the repo mentioned above, but that unfortunately led to no logs being written to stdout.
Our setup is a pretty standard k8s cluster. I create a connection to the svc deployed by the helm chart, like so: kubectl -n jhub port-forward svc/jupyterhub-ssh 8022:22, and then I do ssh <user name>@localhost -p 8022. But all I get is a timeout. Not sure what I'm missing. Do the pods we are connecting to need sshd or some service daemon? Is there any logging I can check to see what takes place? Any help is appreciated, thanks.

Hi and welcome!

Did you check the NetworkPolicy side of things?

The default one from jupyterhub-ssh is restrictive (and correctly so).


Thanks for the reply. It was the network policy that was causing this hiccup. Now I get another error though… I can see from the code that it happens when it calls the following endpoint:

[asyncssh] [conn=0, chan=0] Set write buffer limits: low-water=16384, high-water=65536
[asyncssh] [conn=0, chan=0] New SSH session requested
[asyncssh] [conn=0, chan=0]   Env: LANG=en_US.UTF-8
[asyncssh] [conn=0, chan=0]   Env: LC_ALL=en_US.UTF-8
[asyncssh] [conn=0, chan=0]   Interactive shell requested
[asyncssh] [conn=0, chan=0] Uncaught exception
Traceback (most recent call last):
  File "/home/jovyan/.local/lib/python3.8/site-packages/asyncssh/", line 829, in _reap_task
  File "/srv/jupyterhub-ssh/jupyterhub_ssh/", line 155, in _handle_client
    async with ClientSession() as client, Terminado(
  File "/srv/jupyterhub-ssh/jupyterhub_ssh/", line 22, in __aenter__
    data = await resp.json()
  File "/home/jovyan/.local/lib/python3.8/site-packages/aiohttp/", line 1097, in json
    raise ContentTypeError(
aiohttp.client_exceptions.ContentTypeError: 0, message='Attempt to decode JSON with unexpected mimetype: text/html', url=URL('http://hub:8081/user/<username>/api/terminals')
[asyncssh] [conn=0, chan=0] Closing channel due to connection close
[asyncssh] [conn=0, chan=0] Channel closed: 0, message='Attempt to decode JSON with unexpected mimetype: text/html', url=URL('http://hub:8081/user/<username>/api/terminals')
[asyncssh] [conn=1] Accepted SSH client connection

It seems that something is missing in my setup? What version of JupyterHub are you using? I assume the hub in that URL refers to the hub service; somehow I don't have that service.

I have hooked into my setup, and found that I get a 405 Method Not Allowed when hitting that endpoint.

Not sure what to do about it… I kind of need to know what was supposed to be returned from that endpoint :stuck_out_tongue:
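For what it's worth, here is a sketch of what a healthy response from that terminals API looks like, and what the ContentTypeError in the traceback implies (the JSON shape follows the jupyter_server terminals API; the values are illustrative):

```python
import json

# A healthy request to /user/<name>/api/terminals returns JSON, e.g. a list of
# terminal objects such as [{"name": "1"}] (values illustrative).
healthy_body = '[{"name": "1"}]'
terminals = json.loads(healthy_body)
print(terminals[0]["name"])  # the terminal name jupyterhub-ssh would attach to

# The ContentTypeError ("unexpected mimetype: text/html") means the bridge got an
# HTML page instead of JSON: typically the hub's login or error page, i.e. the
# request reached a server but was not an authenticated hit on the real endpoint.
```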

Thanks for the help.

I have it working with JupyterHub 3.0.0 deployed using the helm chart version 2.0.0.

What do you have on your side ?

Thanks for the fast reply.
We are using JupyterHub 1.5.0 with chart version 1.2.0, so one version before yours. What I don't get is that I can't seem to find any reference in the code base to the service endpoint http://hub:8081/user/<username>/api/terminals
I found this GitHub issue: Unable to establish a ssh connection · Issue #33 · yuvipanda/jupyterhub-ssh · GitHub
The author there actually has the same error. He talks about self-signed certs, but I'm not sure what he is referring to. Maybe you know?
Thanks again for your help.


So I use the following setting to make use of the “labextensions” → JUPYTERHUB_SINGLEUSER_APP: "jupyter_server.serverapp.ServerApp"
Could that have an impact?

Any chance you did not set hubUrl when deploying jupyterhub-ssh ?

Thanks for your reply. I did set it to http://hub:8081. I have tried to change this, but then I get different errors, which seem to be related to it being set incorrectly. I have also tried to create a port-forward to hub:8081 in my cluster, and if I use Postman or the Chrome developer tools, I'm not able to access the hub:8081/user/<username>/api/terminals endpoint. I get a 405 back, which is Method Not Allowed; normally that means something is not implemented yet or access rights are wrong. I can see that this endpoint comes from the jupyter_server library, and the version we use has the code in the right place, so I'm all out of ideas.
In the meantime VSCode released an update to their code CLI that can actually create a tunnel out of the pod for SSH, just like using localtunnel or ngrok or similar tools.

So right now we have a workaround for our users who use VSCode. But I would still like to get this running, since we have a few people using PyCharm, where we do not have a solution yet.

Maybe a silly question, but are you using the actual FQDN for your hub? And is it really answering on 8081?
By default k8s ingresses are on ports 80 and 443; if you want to expose different ports you have to do it explicitly, as I wrote in my example above.

Thanks for your reply. I really appreciate you taking the time to help with my questions :slight_smile: .
OK, I'm not sure I understand your question. My setup is a private k8s cluster on GKE. I use http://hub:8081 as it is the internal svc for the hub (hub pod/svc).
When I said I tried accessing the hub pod directly, I meant that I created a tunnel using kubectl to svc/hub: HTTPS_PROXY=localhost:9060 kubectl -n jhub port-forward svc/hub 8081:8081 (where the HTTPS_PROXY=localhost:9060 part is for a bastion server in front).
When creating the SSH tunnel via the jupyterhub-ssh pod, we take a similar approach: create a tunnel using kubectl to svc/jupyterhub-ssh with HTTPS_PROXY=localhost:9060 kubectl -n jhub port-forward svc/jupyterhub-ssh 8022:22, and then from VSCode or PyCharm access port 8022 on localhost: ssh kaaquist@localhost -p 8022. Hope that makes more sense, maybe also in regards to my setup :).
I did try to use hub.jhub.svc.cluster.local:8081 as my URL too, but that did not work.