Error 'Initial list of events failed' or 'Initial list of pods failed' when accessing JupyterHub

Hey!

I’ve set up BinderHub for our university, and it works with version “0.2.0-n1011.hb49edf6” but not with newer ones (“1.0.0-0.dev.git.3018.he00ec49” is the latest I’ve tried).
When I try to access ‘hub.binder.mydomain.com’, I receive a “Service Unavailable” error.
The logs for the hub pod say:

[I 2023-02-13 13:14:10.447 JupyterHub oauth2:102] OAuth redirect: 'https://hub.binder.edu.liu.se/hub/oauth_callback'
[I 2023-02-13 13:14:10.448 JupyterHub log:186] 302 GET /hub/oauth_login?next=%2Fhub%2F -> https://login.microsoftonline.com/913f18ec-7f26-4c5f-a816-784fe9a58edd/oauth2/authorize?response_type=code&redirect_uri=https%3A%2F%2Fhub.binder.edu.liu.se%2Fhub%2Foauth_callback&client_id=5d07725f-e78b-4cf8-bf9c-df7df65b34bb&state=[secret] (@130.236.18.90) 1.01ms
[I 2023-02-13 13:14:10.965 JupyterHub roles:238] Adding role user for User: henbj71@liu.se
[I 2023-02-13 13:14:11.440 JupyterHub base:810] User logged in: henbj71@liu.se
[I 2023-02-13 13:14:11.441 JupyterHub log:186] 302 GET /hub/oauth_callback?code=[secret]&state=[secret]&session_state=[secret] -> /hub/ (@130.236.18.90) 720.65ms
[I 2023-02-13 13:14:11.625 JupyterHub log:186] 302 GET /hub/ -> /hub/home (henbj71@liu.se@130.236.18.90) 159.45ms
[I 2023-02-13 13:14:11.673 JupyterHub log:186] 200 GET /hub/home (henbj71@liu.se@130.236.18.90) 32.06ms
[E 2023-02-13 13:14:11.685 JupyterHub reflector:385] Initial list of events failed
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 383, in start
    await self._list_and_update()
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 233, in _list_and_update
    for p in initial_resources["items"]
KeyError: 'items'
[E 2023-02-13 13:14:11.685 JupyterHub spawner:2402] Reflector for events failed to start.
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/kubespawner/spawner.py", line 2400, in catch_reflector_start
    await f
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 383, in start
    await self._list_and_update()
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 233, in _list_and_update
    for p in initial_resources["items"]
KeyError: 'items'
Task was destroyed but it is pending!
task: <Task pending name='Task-394' coro=<shared_client.<locals>.close_client_task() running at /usr/local/lib/python3.9/site-packages/kubespawner/clients.py:58> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7f4e50aa4e80>()]>>
Task exception was never retrieved
future: <Task finished name='Task-398' coro=<KubeSpawner._start_reflector.<locals>.catch_reflector_start() done, defined at /usr/local/lib/python3.9/site-packages/kubespawner/spawner.py:2398> exception=SystemExit(1)>
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/kubespawner/spawner.py", line 2400, in catch_reflector_start
    await f
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 383, in start
    await self._list_and_update()
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 233, in _list_and_update
    for p in initial_resources["items"]
KeyError: 'items'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/jupyterhub/app.py", line 3313, in launch_instance
    loop.start()
  File "/usr/local/lib/python3.9/site-packages/tornado/platform/asyncio.py", line 215, in start
    self.asyncio_loop.run_forever()
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 601, in run_forever
    self._run_once()
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1905, in _run_once
    handle._run()
  File "/usr/local/lib/python3.9/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/local/lib/python3.9/site-packages/kubespawner/spawner.py", line 2403, in catch_reflector_start
    sys.exit(1)
SystemExit: 1
Exception ignored in: <coroutine object shared_client.<locals>.close_client_task at 0x7f4e50a58e40>
RuntimeError: coroutine ignored GeneratorExit

My config file looks like this (passwords and other secrets removed):

config:
  BinderHub:
    debug: true
    cors_allow_origin: '*'
    use_registry: true
    image_prefix: "gitlab.it.liu.se:5000/drs/binderhub/prod-"
    hub_url: "https://hub.binder.edu.liu.se" #If no cert, use http. If cert, use https
    
    auth_enabled: true
    use_named_servers: true
    
    # To display the university logo
    template_path: /etc/binderhub/custom/templates
    extra_static_path: /etc/binderhub/custom/static
    extra_static_url_prefix: /extra_static/
    template_variables:
      EXTRA_STATIC_URL_PREFIX: "/extra_static/"

registry:
  url: "https://gitlab.it.liu.se:5000/drs/binderhub"
  username: "SECRET"

jupyterhub:
  custom:
    binderauth_enabled: true

  singleuser:
    cmd: jupyter-labhub
    memory:
      guarantee: 128M
      limit: 4G
    cpu:
      guarantee: .1
      limit: 1

  prePuller:
    continuous:
      enabled: true

  scheduling:
    userScheduler:
      enabled: false # Requires Cluster Roles if set to true
      
  cull:
    enabled: true
    users: false
    removeNamedServers: true
    every: 60
    timeout: 900

  hub:
    db:
      pvc:
        storageClassName: ceph
    config:
      BinderSpawner:
        cors_allow_origin: '*'
        auth_enabled: false
      Authenticator:
        admin_users:
          - "henbj71@ad.liu.se"
        auto_login: true
      AzureAdOAuthenticator:
        client_id: "SECRET"
        oauth_callback_url: "https://hub.binder.edu.liu.se/hub/oauth_callback"
        tenant_id: "SECRET"
        username_claim: upn
      JupyterHub:
        authenticator_class: azuread
    allowNamedServers: true
    namedServerLimitPerUser: 50
    redirectToServer: false
    loadRoles:
      user:
        scopes:
          - self
          - "access:services"
    services:
      binder:
        oauth_no_confirm: true
        oauth_redirect_uri: "https://binder.edu.liu.se/oauth_callback"
        oauth_client_id: "service-SECRET" #cookie

  rbac:
    create: false

  proxy:
    service:
      type: ClusterIP
      
  ingress:
    enabled: true
    hosts:
      - hub.binder.edu.liu.se
    annotations:
      kubernetes.io/ingress.class: nginx-public
      cert-manager.io/issuer: binder-letsencrypt-issuer
    tls:
    - secretName: hub-binder-edu-liu-se-tls
      hosts:
      - hub.binder.edu.liu.se

    
service:
  type: ClusterIP
  
ingress:
  enabled: true
  hosts:
    - binder.edu.liu.se
  annotations:
    kubernetes.io/ingress.class: nginx-public
    cert-manager.io/issuer: binder-letsencrypt-issuer
    nginx.ingress.kubernetes.io/auth-url: "https://$host/login/"
    nginx.ingress.kubernetes.io/auth-signin: "https://$host/_oauth"
  tls:
  - secretName: binder-edu-liu-se-tls
    hosts:
    - binder.edu.liu.se

imageBuilderType: dind
dind:
  hostLibDir: "/var/lib/dind/henbj71"
  hostSocketDir: "/var/run/dind/henbj71"
  daemonset:
    image:
      name: docker
      tag: 20.10.12-dind


# These require clusterroles. So we disable them
imageCleaner:
  enabled: false
    
pdb:
  enabled: false
  
  
# Only for displaying the LiU logo
initContainers:
  - name: git-clone-templates
    image: alpine/git
    args:
      - clone
      - --single-branch
      - --branch=master
      - --depth=1
      - https://github.com/henricbjork2liu/binderhub-custom-files
      - /etc/binderhub/custom
    securityContext:
      runAsUser: 0
    volumeMounts:
      - name: custom-templates
        mountPath: /etc/binderhub/custom

extraVolumes:
  - name: custom-templates
    emptyDir: {}

extraVolumeMounts:
  - name: custom-templates
    mountPath: /etc/binderhub/custom

There were a few things I had to change in the config file for the newer version, so I might have missed something.

I’ve not updated my default roles, but from what I can see they shouldn’t be the issue.

# Source: binderhub/templates/rbac.yaml
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: default-roles
rules:
- apiGroups: [""] # "" indicates the core API group
  resources: ["pods"]
  verbs: ["get", "watch", "list", "create", "delete"]
- apiGroups: [""]
  resources: ["pods/log"]
  verbs: ["get"]
- apiGroups: [""]
  resources: ["events"]
  verbs: ["list"]
---
# Source: binderhub/templates/rbac.yaml
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: default-roles
subjects:
- kind: ServiceAccount
  namespace: henbj71-binderhub
  name: default
roleRef:
  kind: Role
  name: default-roles
  apiGroup: rbac.authorization.k8s.io
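
To double-check that these are actually present in the namespace (henbj71-binderhub, per the RoleBinding above), they can be listed with, for example:

kubectl -n henbj71-binderhub get role,rolebinding
kubectl -n henbj71-binderhub describe rolebinding default-roles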

Hi, may I know how you fixed this issue? I have encountered it too.


I’m still running on the old version. I haven’t really had much time to troubleshoot this further.

It looks like the error is occurring in JupyterHub rather than BinderHub.

To help narrow down the problem further, could you turn on debug logging and share the full logs?

I have already turned on debug logging; the log above is all of the error output there is.

If debug logging is enabled for JupyterHub you should see a lot more than what’s shown above.

To investigate further please can you:

  • try deploying JupyterHub on its own, without BinderHub (see the example command below)
  • show us your full JupyterHub configuration
  • give us some information on your deployment, e.g. how you’ve set up k8s, the version, etc.?
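
For the first point, a minimal standalone install of the JupyterHub Helm chart could look something like this (release name, namespace, chart version, and values file name below are just placeholders — adjust them to your setup):

helm repo add jupyterhub https://hub.jupyter.org/helm-chart/
helm repo update
helm upgrade --install jhub-test jupyterhub/jupyterhub \
  --namespace jhub-test --create-namespace \
  --version 2.0.0 \
  --values jupyterhub-only.yaml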

I finally found some time to test this out.
I uninstalled BinderHub and installed the newest version of JupyterHub.
This is my configuration:

custom:
  cors: 
    cors_allow_origin: '*'

debug:
  enabled: true

singleuser:
  image:
    # You should replace the "latest" tag with a fixed version from:
    # https://hub.docker.com/r/jupyter/datascience-notebook/tags/
    # Inspect the Dockerfile at:
    # https://github.com/jupyter/docker-stacks/tree/HEAD/datascience-notebook/Dockerfile
    name: jupyter/datascience-notebook
    tag: "cde8b4389a"
  # `cmd: null` allows the custom CMD of the Jupyter docker-stacks to be used
  # which performs further customization on startup.
  cmd: null

hub:
  pdb:
    enabled: false
  db:
    pvc:
      storageClassName: ceph

    
scheduling:
  userScheduler:
    enabled: false # Requires Cluster Roles if set to true
    
prePuller:
  hook:
    enabled: false
  continuous:
    enabled: false

rbac:
  create: false

When trying to log in, the hub pod crashes. The logs say:

[D 2023-03-09 15:03:26.818 JupyterHub application:837] Looking for /usr/local/etc/jupyterhub/jupyterhub_config in /srv/jupyterhub
Loading /usr/local/etc/jupyterhub/secret/values.yaml
No config at /usr/local/etc/jupyterhub/existing-secret/values.yaml
[D 2023-03-09 15:03:27.196 JupyterHub application:858] Loaded config file: /usr/local/etc/jupyterhub/jupyterhub_config.py
[I 2023-03-09 15:03:27.276 JupyterHub app:2775] Running JupyterHub version 3.0.0
[I 2023-03-09 15:03:27.276 JupyterHub app:2805] Using Authenticator: jupyterhub.auth.DummyAuthenticator-3.0.0
[I 2023-03-09 15:03:27.276 JupyterHub app:2805] Using Spawner: kubespawner.spawner.KubeSpawner-4.2.0
[I 2023-03-09 15:03:27.276 JupyterHub app:2805] Using Proxy: jupyterhub.proxy.ConfigurableHTTPProxy-3.0.0
[D 2023-03-09 15:03:27.277 JupyterHub app:1783] Connecting to db: sqlite:///jupyterhub.sqlite
[D 2023-03-09 15:03:27.296 JupyterHub orm:967] Stamping empty database with alembic revision 651f5419b74d
[I 2023-03-09 15:03:27.297 alembic.runtime.migration migration:204] Context impl SQLiteImpl.
[I 2023-03-09 15:03:27.297 alembic.runtime.migration migration:207] Will assume non-transactional DDL.
[I 2023-03-09 15:03:27.514 alembic.runtime.migration migration:618] Running stamp_revision -> 651f5419b74d
[D 2023-03-09 15:03:27.514 alembic.runtime.migration migration:818] new branch insert 651f5419b74d
[D 2023-03-09 15:03:27.591 JupyterHub orm:967] Stamping empty database with alembic revision 651f5419b74d
[I 2023-03-09 15:03:27.592 alembic.runtime.migration migration:204] Context impl SQLiteImpl.
[I 2023-03-09 15:03:27.592 alembic.runtime.migration migration:207] Will assume non-transactional DDL.
[D 2023-03-09 15:03:29.542 JupyterHub app:2035] Loading roles into database
[I 2023-03-09 15:03:29.861 JupyterHub roles:173] Role jupyterhub-idle-culler added to database
[I 2023-03-09 15:03:29.920 JupyterHub app:1934] Not using allowed_users. Any authenticated user will be allowed.
[D 2023-03-09 15:03:30.081 JupyterHub app:2274] Purging expired APITokens
[D 2023-03-09 15:03:30.084 JupyterHub app:2274] Purging expired OAuthCodes
[D 2023-03-09 15:03:30.086 JupyterHub app:2110] Loading role assignments from config
[D 2023-03-09 15:03:30.179 JupyterHub app:2433] Initializing spawners
[D 2023-03-09 15:03:30.182 JupyterHub app:2564] Loaded users:
[I 2023-03-09 15:03:30.182 JupyterHub app:2844] Initialized 0 spawners in 0.003 seconds
[I 2023-03-09 15:03:30.184 JupyterHub app:3057] Not starting proxy
[D 2023-03-09 15:03:30.184 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-09 15:03:30.189 JupyterHub proxy:957] Omitting non-jupyterhub route '/'
[I 2023-03-09 15:03:30.190 JupyterHub app:3093] Hub API listening on http://:8081/hub/
[I 2023-03-09 15:03:30.190 JupyterHub app:3095] Private Hub API connect url http://hub:8081/hub/
[I 2023-03-09 15:03:30.190 JupyterHub app:3104] Starting managed service jupyterhub-idle-culler
[I 2023-03-09 15:03:30.190 JupyterHub service:385] Starting service 'jupyterhub-idle-culler': ['python3', '-m', 'jupyterhub_idle_culler', '--url=http://localhost:8081/hub/api', '--timeout=3600', '--cull-every=600', '--concurrency=10']
[I 2023-03-09 15:03:30.193 JupyterHub service:133] Spawning python3 -m jupyterhub_idle_culler --url=http://localhost:8081/hub/api --timeout=3600 --cull-every=600 --concurrency=10
[D 2023-03-09 15:03:30.200 JupyterHub spawner:1369] Polling subprocess every 30s
[D 2023-03-09 15:03:30.200 JupyterHub proxy:392] Fetching routes to check
[D 2023-03-09 15:03:30.200 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-09 15:03:30.202 JupyterHub proxy:957] Omitting non-jupyterhub route '/'
[D 2023-03-09 15:03:30.202 JupyterHub proxy:395] Checking routes
[I 2023-03-09 15:03:30.203 JupyterHub proxy:480] Adding route for Hub: / => http://hub:8081
[D 2023-03-09 15:03:30.203 JupyterHub proxy:884] Proxy: Fetching POST http://proxy-api:8001/api/routes/
[I 2023-03-09 15:03:30.205 JupyterHub app:3162] JupyterHub is now running, internal Hub API at http://hub:8081/hub/
[D 2023-03-09 15:03:30.206 JupyterHub app:2768] It took 3.395 seconds for the Hub to start
[D 2023-03-09 15:03:30.316 JupyterHub base:275] Recording first activity for <APIToken('f365...', service='jupyterhub-idle-culler', client_id='jupyterhub')>
[I 2023-03-09 15:03:30.489 JupyterHub log:186] 200 GET /hub/api/ (jupyterhub-idle-culler@::1) 179.15ms
[D 2023-03-09 15:03:30.493 JupyterHub scopes:796] Checking access via scope list:users
[D 2023-03-09 15:03:30.493 JupyterHub scopes:610] Unrestricted access to /hub/api/users via list:users
[I 2023-03-09 15:03:30.500 JupyterHub log:186] 200 GET /hub/api/users?state=[secret] (jupyterhub-idle-culler@::1) 9.47ms
[D 2023-03-09 15:03:31.394 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.99ms
[W 2023-03-09 15:03:31.852 JupyterHub web:1796] 403 GET /hub/metrics (10.65.29.48): Access to metrics requires scope 'read:metrics'
[D 2023-03-09 15:03:31.853 JupyterHub base:1342] No template for 403
[W 2023-03-09 15:03:31.877 JupyterHub log:186] 403 GET /hub/metrics (@10.65.29.48) 26.10ms
[D 2023-03-09 15:03:33.394 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.81ms
[D 2023-03-09 15:03:35.394 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.83ms
[D 2023-03-09 15:03:37.393 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.88ms
[D 2023-03-09 15:04:19.394 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.93ms
[D 2023-03-09 15:04:21.394 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.88ms
[I 2023-03-09 15:04:22.242 JupyterHub log:186] 302 GET / -> /hub/ (@::ffff:130.236.59.86) 0.94ms
[I 2023-03-09 15:04:22.261 JupyterHub log:186] 302 GET /hub/ -> /hub/login?next=%2Fhub%2F (@::ffff:130.236.59.86) 0.74ms
[I 2023-03-09 15:04:22.306 JupyterHub log:186] 200 GET /hub/login?next=%2Fhub%2F (@::ffff:130.236.59.86) 27.64ms
[D 2023-03-09 15:04:22.387 JupyterHub log:186] 200 GET /hub/static/css/style.min.css?v=bff49b4a161afb17ee3b71927ce7d6c4e5b0e4b9ef6f18ca3e356a05f29e69776d3a76aee167060dd2ae2ee62d3cfdcf203b4b0090b1423f7d629ea7daa3f9da (@::ffff:130.236.59.86) 1.70ms
[D 2023-03-09 15:04:22.388 JupyterHub log:186] 200 GET /hub/static/components/jquery/dist/jquery.min.js?v=69528a4518bf43f615fb89a3a0a06c138c771fe0647a0a0cfde9b8e8d3650aa3539946000e305b78d79f371615ee0894a74571202b6a76b6ea53b89569e64d5c (@::ffff:130.236.59.86) 0.99ms
[D 2023-03-09 15:04:22.390 JupyterHub log:186] 200 GET /hub/static/components/requirejs/require.js?v=bd1aa102bdb0b27fbf712b32cfcd29b016c272acf3d864ee8469376eaddd032cadcf827ff17c05a8c8e20061418fe58cf79947049f5c0dff3b4f73fcc8cad8ec (@::ffff:130.236.59.86) 1.18ms
[D 2023-03-09 15:04:22.390 JupyterHub log:186] 200 GET /hub/static/components/bootstrap/dist/js/bootstrap.min.js?v=a014e9acc78d10a0a7a9fbaa29deac6ef17398542d9574b77b40bf446155d210fa43384757e3837da41b025998ebfab4b9b6f094033f9c226392b800df068bce (@::ffff:130.236.59.86) 1.47ms
[D 2023-03-09 15:04:22.405 JupyterHub log:186] 200 GET /hub/logo (@::ffff:130.236.59.86) 0.71ms
[D 2023-03-09 15:04:22.477 JupyterHub log:186] 200 GET /hub/static/favicon.ico?v=fde5757cd3892b979919d3b1faa88a410f28829feb5ba22b6cf069f2c6c98675fceef90f932e49b510e74d65c681d5846b943e7f7cc1b41867422f0481085c1f (@::ffff:130.236.59.86) 0.57ms
[D 2023-03-09 15:04:23.394 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.84ms
[D 2023-03-09 15:04:25.395 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.85ms
[D 2023-03-09 15:04:25.724 JupyterHub roles:281] Assigning default role to User ad
[I 2023-03-09 15:04:25.728 JupyterHub roles:238] Adding role user for User: ad
[D 2023-03-09 15:04:25.844 JupyterHub roles:281] Assigning default role to User ad
[D 2023-03-09 15:04:25.848 JupyterHub base:559] Setting cookie jupyterhub-session-id: {'httponly': True, 'path': '/'}
[D 2023-03-09 15:04:25.849 JupyterHub base:563] Setting cookie for ad: jupyterhub-hub-login
[D 2023-03-09 15:04:25.849 JupyterHub base:559] Setting cookie jupyterhub-hub-login: {'httponly': True, 'path': '/hub/'}
[I 2023-03-09 15:04:25.849 JupyterHub base:810] User logged in: ad
[I 2023-03-09 15:04:25.850 JupyterHub log:186] 302 POST /hub/login?next=%2Fhub%2F -> /hub/ (ad@::ffff:130.236.59.86) 129.53ms
[D 2023-03-09 15:04:25.851 JupyterHub log:186] 200 GET /hub/static/components/font-awesome/fonts/fontawesome-webfont.woff2?v=4.7.0 (@::ffff:130.236.59.86) 1.04ms
[D 2023-03-09 15:04:25.872 JupyterHub base:275] Recording first activity for <User(ad 0/1 running)>
[D 2023-03-09 15:04:25.921 JupyterHub user:430] Creating <class 'kubespawner.spawner.KubeSpawner'> for ad:
[I 2023-03-09 15:04:25.927 JupyterHub log:186] 302 GET /hub/ -> /hub/spawn (ad@::ffff:130.236.59.86) 57.97ms
[E 2023-03-09 15:04:25.937 JupyterHub reflector:385] Initial list of pods failed
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 383, in start
    await self._list_and_update()
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 233, in _list_and_update
    for p in initial_resources["items"]
KeyError: 'items'
[E 2023-03-09 15:04:25.938 JupyterHub spawner:2402] Reflector for pods failed to start.
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/kubespawner/spawner.py", line 2400, in catch_reflector_start
    await f
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 383, in start
    await self._list_and_update()
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 233, in _list_and_update
    for p in initial_resources["items"]
KeyError: 'items'
Task was destroyed but it is pending!
task: <Task pending name='Task-151' coro=<shared_client.<locals>.close_client_task() running at /usr/local/lib/python3.9/site-packages/kubespawner/clients.py:58> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7fe96f12e490>()]>>
Exception ignored in: <coroutine object shared_client.<locals>.close_client_task at 0x7fe96f07fbc0>
RuntimeError: coroutine ignored GeneratorExit
Task exception was never retrieved
future: <Task finished name='Task-153' coro=<KubeSpawner._start_reflector.<locals>.catch_reflector_start() done, defined at /usr/local/lib/python3.9/site-packages/kubespawner/spawner.py:2398> exception=SystemExit(1)>
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/kubespawner/spawner.py", line 2400, in catch_reflector_start
    await f
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 383, in start
    await self._list_and_update()
  File "/usr/local/lib/python3.9/site-packages/kubespawner/reflector.py", line 233, in _list_and_update
    for p in initial_resources["items"]
KeyError: 'items'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/jupyterhub/app.py", line 3313, in launch_instance
    loop.start()
  File "/usr/local/lib/python3.9/site-packages/tornado/platform/asyncio.py", line 215, in start
    self.asyncio_loop.run_forever()
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 601, in run_forever
    self._run_once()
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1905, in _run_once
    handle._run()
  File "/usr/local/lib/python3.9/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/local/lib/python3.9/site-packages/kubespawner/spawner.py", line 2403, in catch_reflector_start
    sys.exit(1)
SystemExit: 1

What service account are you using for the hub? The hub needs permissions to manage pods.
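You can check what the hub pod runs as, and whether that service account is allowed to list pods and events, with something like the following (the namespace and the service-account name "hub" are assumptions here, and the impersonation check requires that your own account is allowed to impersonate service accounts — adjust to your deployment):

kubectl -n henbj71-binderhub get pod -l component=hub -o jsonpath='{.items[*].spec.serviceAccountName}'
kubectl -n henbj71-binderhub auth can-i list pods --as=system:serviceaccount:henbj71-binderhub:hub
kubectl -n henbj71-binderhub auth can-i list events --as=system:serviceaccount:henbj71-binderhub:hub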

I used rbac.create: false with BinderHub, so I thought I needed it here as well.
I set rbac.create: true for JupyterHub and it gets slightly further now, but the newly created server pod fails after I log in.

This is how my pods look:

hub-56667f97cf-pjk9d    1/1     Running                 0            7m29s
jupyter-test            0/1     Init:CrashLoopBackOff   1 (5s ago)   16s  
proxy-748cb7478-wzk4c   1/1     Running                 0            7m29s

The hub pod says:

[D 2023-03-10 09:18:36.132 JupyterHub application:837] Looking for /usr/local/etc/jupyterhub/jupyterhub_config in /srv/jupyterhub
Loading /usr/local/etc/jupyterhub/secret/values.yaml
No config at /usr/local/etc/jupyterhub/existing-secret/values.yaml
[D 2023-03-10 09:18:36.487 JupyterHub application:858] Loaded config file: /usr/local/etc/jupyterhub/jupyterhub_config.py
[I 2023-03-10 09:18:36.559 JupyterHub app:2775] Running JupyterHub version 3.0.0
[I 2023-03-10 09:18:36.559 JupyterHub app:2805] Using Authenticator: jupyterhub.auth.DummyAuthenticator-3.0.0
[I 2023-03-10 09:18:36.559 JupyterHub app:2805] Using Spawner: kubespawner.spawner.KubeSpawner-4.2.0
[I 2023-03-10 09:18:36.559 JupyterHub app:2805] Using Proxy: jupyterhub.proxy.ConfigurableHTTPProxy-3.0.0
[D 2023-03-10 09:18:36.560 JupyterHub app:1783] Connecting to db: sqlite:///jupyterhub.sqlite
[D 2023-03-10 09:18:36.582 JupyterHub orm:967] Stamping empty database with alembic revision 651f5419b74d
[I 2023-03-10 09:18:36.583 alembic.runtime.migration migration:204] Context impl SQLiteImpl.
[I 2023-03-10 09:18:36.583 alembic.runtime.migration migration:207] Will assume non-transactional DDL.
[I 2023-03-10 09:18:36.604 alembic.runtime.migration migration:618] Running stamp_revision  -> 651f5419b74d
[D 2023-03-10 09:18:36.604 alembic.runtime.migration migration:818] new branch insert 651f5419b74d
[D 2023-03-10 09:18:36.623 JupyterHub orm:967] Stamping empty database with alembic revision 651f5419b74d
[I 2023-03-10 09:18:36.624 alembic.runtime.migration migration:204] Context impl SQLiteImpl.
[I 2023-03-10 09:18:36.624 alembic.runtime.migration migration:207] Will assume non-transactional DDL.
[D 2023-03-10 09:18:36.960 JupyterHub app:2035] Loading roles into database
[I 2023-03-10 09:18:37.015 JupyterHub roles:173] Role jupyterhub-idle-culler added to database
[I 2023-03-10 09:18:37.028 JupyterHub app:1934] Not using allowed_users. Any authenticated user will be allowed.
[D 2023-03-10 09:18:37.068 JupyterHub app:2274] Purging expired APITokens
[D 2023-03-10 09:18:37.070 JupyterHub app:2274] Purging expired OAuthCodes
[D 2023-03-10 09:18:37.072 JupyterHub app:2110] Loading role assignments from config
[D 2023-03-10 09:18:37.119 JupyterHub app:2433] Initializing spawners
[D 2023-03-10 09:18:37.121 JupyterHub app:2564] Loaded users:
    
[I 2023-03-10 09:18:37.121 JupyterHub app:2844] Initialized 0 spawners in 0.002 seconds
[I 2023-03-10 09:18:37.122 JupyterHub app:3057] Not starting proxy
[D 2023-03-10 09:18:37.123 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-10 09:18:37.128 JupyterHub proxy:957] Omitting non-jupyterhub route '/'
[I 2023-03-10 09:18:37.128 JupyterHub app:3093] Hub API listening on http://:8081/hub/
[I 2023-03-10 09:18:37.128 JupyterHub app:3095] Private Hub API connect url http://hub:8081/hub/
[I 2023-03-10 09:18:37.128 JupyterHub app:3104] Starting managed service jupyterhub-idle-culler
[I 2023-03-10 09:18:37.128 JupyterHub service:385] Starting service 'jupyterhub-idle-culler': ['python3', '-m', 'jupyterhub_idle_culler', '--url=http://localhost:8081/hub/api', '--timeout=3600', '--cull-every=600', '--concurrency=10']
[I 2023-03-10 09:18:37.131 JupyterHub service:133] Spawning python3 -m jupyterhub_idle_culler --url=http://localhost:8081/hub/api --timeout=3600 --cull-every=600 --concurrency=10
[D 2023-03-10 09:18:37.137 JupyterHub spawner:1369] Polling subprocess every 30s
[D 2023-03-10 09:18:37.137 JupyterHub proxy:392] Fetching routes to check
[D 2023-03-10 09:18:37.137 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-10 09:18:37.139 JupyterHub proxy:957] Omitting non-jupyterhub route '/'
[D 2023-03-10 09:18:37.139 JupyterHub proxy:395] Checking routes
[I 2023-03-10 09:18:37.140 JupyterHub proxy:480] Adding route for Hub: / => http://hub:8081
[D 2023-03-10 09:18:37.140 JupyterHub proxy:884] Proxy: Fetching POST http://proxy-api:8001/api/routes/
[I 2023-03-10 09:18:37.142 JupyterHub app:3162] JupyterHub is now running, internal Hub API at http://hub:8081/hub/
[D 2023-03-10 09:18:37.143 JupyterHub app:2768] It took 1.016 seconds for the Hub to start
[D 2023-03-10 09:18:37.256 JupyterHub base:275] Recording first activity for <APIToken('d81b...', service='jupyterhub-idle-culler', client_id='jupyterhub')>
[I 2023-03-10 09:18:37.275 JupyterHub log:186] 200 GET /hub/api/ (jupyterhub-idle-culler@::1) 23.99ms
[D 2023-03-10 09:18:37.278 JupyterHub scopes:796] Checking access via scope list:users
[D 2023-03-10 09:18:37.278 JupyterHub scopes:610] Unrestricted access to /hub/api/users via list:users
[I 2023-03-10 09:18:37.284 JupyterHub log:186] 200 GET /hub/api/users?state=[secret] (jupyterhub-idle-culler@::1) 7.90ms
[D 2023-03-10 09:18:37.814 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.76ms
[D 2023-03-10 09:18:38.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.67ms
[D 2023-03-10 09:18:40.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:18:42.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.73ms
[D 2023-03-10 09:18:44.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.67ms
[D 2023-03-10 09:18:46.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:18:48.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:18:50.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.75ms
[W 2023-03-10 09:18:51.796 JupyterHub web:1796] 403 GET /hub/metrics (10.65.29.48): Access to metrics requires scope 'read:metrics'
[D 2023-03-10 09:18:51.796 JupyterHub base:1342] No template for 403
[W 2023-03-10 09:18:51.827 JupyterHub log:186] 403 GET /hub/metrics (@10.65.29.48) 31.44ms
[D 2023-03-10 09:18:52.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:18:54.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.92ms
[D 2023-03-10 09:18:56.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.71ms
[D 2023-03-10 09:18:58.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.71ms
[D 2023-03-10 09:19:00.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.67ms
[D 2023-03-10 09:19:02.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.73ms
[D 2023-03-10 09:19:04.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:19:06.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[D 2023-03-10 09:19:08.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:19:10.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.74ms
[I 2023-03-10 09:19:11.544 JupyterHub log:186] 302 GET / -> /hub/ (@::ffff:130.236.59.86) 0.88ms
[W 2023-03-10 09:19:11.568 JupyterHub base:387] Invalid or expired cookie token
[I 2023-03-10 09:19:11.569 JupyterHub log:186] 302 GET /hub/ -> /hub/login?next=%2Fhub%2F (@::ffff:130.236.59.86) 1.26ms
[I 2023-03-10 09:19:11.617 JupyterHub log:186] 200 GET /hub/login?next=%2Fhub%2F (@::ffff:130.236.59.86) 22.10ms
[D 2023-03-10 09:19:12.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:19:14.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:19:16.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[D 2023-03-10 09:19:18.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.71ms
[D 2023-03-10 09:19:19.719 JupyterHub roles:281] Assigning default role to User test
[I 2023-03-10 09:19:19.723 JupyterHub roles:238] Adding role user for User: test
[D 2023-03-10 09:19:19.754 JupyterHub roles:281] Assigning default role to User test
[D 2023-03-10 09:19:19.760 JupyterHub base:563] Setting cookie for test: jupyterhub-hub-login
[D 2023-03-10 09:19:19.760 JupyterHub base:559] Setting cookie jupyterhub-hub-login: {'httponly': True, 'path': '/hub/'}
[I 2023-03-10 09:19:19.760 JupyterHub base:810] User logged in: test
[I 2023-03-10 09:19:19.761 JupyterHub log:186] 302 POST /hub/login?next=%2Fhub%2F -> /hub/ (test@::ffff:130.236.59.86) 45.51ms
[D 2023-03-10 09:19:19.776 JupyterHub base:275] Recording first activity for <User(test 0/1 running)>
[D 2023-03-10 09:19:19.793 JupyterHub user:430] Creating <class 'kubespawner.spawner.KubeSpawner'> for test:
[I 2023-03-10 09:19:19.800 JupyterHub log:186] 302 GET /hub/ -> /hub/spawn (test@::ffff:130.236.59.86) 25.94ms
[I 2023-03-10 09:19:19.813 JupyterHub reflector:274] watching for pods with label selector='component=singleuser-server' in namespace henbj71-binderhub
[D 2023-03-10 09:19:19.813 JupyterHub reflector:281] Connecting pods watcher
[I 2023-03-10 09:19:19.817 JupyterHub reflector:274] watching for events with field selector='involvedObject.kind=Pod' in namespace henbj71-binderhub
[D 2023-03-10 09:19:19.817 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:19:19.820 JupyterHub scopes:796] Checking access via scope servers
[D 2023-03-10 09:19:19.820 JupyterHub scopes:623] Argument-based access to /hub/spawn via servers
[D 2023-03-10 09:19:19.820 JupyterHub pages:213] Triggering spawn with default options for test
[D 2023-03-10 09:19:19.821 JupyterHub base:934] Initiating spawn for test
[D 2023-03-10 09:19:19.821 JupyterHub base:938] 0/64 concurrent spawns
[D 2023-03-10 09:19:19.821 JupyterHub base:943] 0 active servers
[I 2023-03-10 09:19:19.845 JupyterHub provider:651] Creating oauth client jupyterhub-user-test
[D 2023-03-10 09:19:19.888 JupyterHub user:743] Calling Spawner.start for test
[I 2023-03-10 09:19:19.891 JupyterHub spawner:2509] Attempting to create pvc claim-test, with timeout 3
[I 2023-03-10 09:19:19.895 JupyterHub log:186] 302 GET /hub/spawn -> /hub/spawn-pending/test (test@::ffff:130.236.59.86) 78.60ms
[D 2023-03-10 09:19:19.918 JupyterHub scopes:796] Checking access via scope servers
[D 2023-03-10 09:19:19.918 JupyterHub scopes:623] Argument-based access to /hub/spawn-pending/test via servers
[I 2023-03-10 09:19:19.918 JupyterHub pages:394] test is pending spawn
[I 2023-03-10 09:19:19.921 JupyterHub log:186] 200 GET /hub/spawn-pending/test (test@::ffff:130.236.59.86) 12.17ms
[I 2023-03-10 09:19:19.922 JupyterHub spawner:2469] Attempting to create pod jupyter-test, with timeout 3
[D 2023-03-10 09:19:20.044 JupyterHub scopes:796] Checking access via scope read:servers
[D 2023-03-10 09:19:20.044 JupyterHub scopes:623] Argument-based access to /hub/api/users/test/server/progress via read:servers
[D 2023-03-10 09:19:20.045 JupyterHub spawner:2308] progress generator: jupyter-test
[D 2023-03-10 09:19:20.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.77ms
[D 2023-03-10 09:19:22.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.73ms
[D 2023-03-10 09:19:24.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.77ms
[D 2023-03-10 09:19:26.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.95ms
[D 2023-03-10 09:19:28.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.92ms
[D 2023-03-10 09:19:29.892 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:19:29.892 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:19:29.895 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:19:29.895 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:19:30.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.98ms
[D 2023-03-10 09:19:32.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.97ms
[D 2023-03-10 09:19:34.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.81ms
[D 2023-03-10 09:19:36.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.76ms
[D 2023-03-10 09:19:37.142 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-10 09:19:37.146 JupyterHub proxy:395] Checking routes
[D 2023-03-10 09:19:38.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:19:39.903 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:19:39.903 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:19:39.906 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:19:39.906 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:19:40.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:19:42.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.70ms
[D 2023-03-10 09:19:44.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.77ms
[D 2023-03-10 09:19:46.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.64ms
[D 2023-03-10 09:19:48.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:19:49.913 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:19:49.913 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:19:49.916 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:19:49.916 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:19:50.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[W 2023-03-10 09:19:51.796 JupyterHub web:1796] 403 GET /hub/metrics (10.65.29.48): Access to metrics requires scope 'read:metrics'
[D 2023-03-10 09:19:51.796 JupyterHub base:1342] No template for 403
[W 2023-03-10 09:19:51.797 JupyterHub log:186] 403 GET /hub/metrics (@10.65.29.48) 1.41ms
[D 2023-03-10 09:19:52.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.58ms
[D 2023-03-10 09:19:54.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.65ms
[D 2023-03-10 09:19:56.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:19:58.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.71ms
[D 2023-03-10 09:19:59.921 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:19:59.922 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:19:59.925 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:19:59.926 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:20:00.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.70ms
[D 2023-03-10 09:20:02.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.71ms
[D 2023-03-10 09:20:04.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[D 2023-03-10 09:20:06.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 1.09ms
[D 2023-03-10 09:20:08.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:20:09.931 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:20:09.931 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:20:09.935 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:20:09.935 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:20:10.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.64ms
[D 2023-03-10 09:20:12.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:20:14.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.91ms
[D 2023-03-10 09:20:16.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.99ms
[D 2023-03-10 09:20:18.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.88ms
[D 2023-03-10 09:20:19.949 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:20:19.949 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:20:19.953 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:20:19.953 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:20:20.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.97ms
[D 2023-03-10 09:20:22.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.84ms
[D 2023-03-10 09:20:24.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.67ms
[D 2023-03-10 09:20:26.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.70ms
[D 2023-03-10 09:20:28.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:20:29.959 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:20:29.960 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:20:29.967 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:20:29.967 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:20:30.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.77ms
[D 2023-03-10 09:20:32.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.70ms
[D 2023-03-10 09:20:34.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.77ms
[D 2023-03-10 09:20:36.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[D 2023-03-10 09:20:37.143 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-10 09:20:37.146 JupyterHub proxy:395] Checking routes
[D 2023-03-10 09:20:38.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:20:39.969 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:20:39.969 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:20:39.979 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:20:39.979 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:20:40.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.64ms
[D 2023-03-10 09:20:42.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.74ms
[D 2023-03-10 09:20:44.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.70ms
[D 2023-03-10 09:20:46.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:20:48.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.62ms
[D 2023-03-10 09:20:49.979 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:20:49.979 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:20:49.989 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:20:49.989 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:20:50.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[W 2023-03-10 09:20:51.796 JupyterHub web:1796] 403 GET /hub/metrics (10.65.29.48): Access to metrics requires scope 'read:metrics'
[D 2023-03-10 09:20:51.796 JupyterHub base:1342] No template for 403
[W 2023-03-10 09:20:51.797 JupyterHub log:186] 403 GET /hub/metrics (@10.65.29.48) 1.38ms
[D 2023-03-10 09:20:52.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.65ms
[D 2023-03-10 09:20:54.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.73ms
[D 2023-03-10 09:20:56.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.64ms
[D 2023-03-10 09:20:58.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.64ms
[D 2023-03-10 09:20:59.990 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:20:59.990 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:21:00.001 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:21:00.001 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:21:00.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.79ms
[D 2023-03-10 09:21:02.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:21:04.775 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.65ms
[D 2023-03-10 09:21:06.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:21:08.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.62ms
[D 2023-03-10 09:21:10.000 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:21:10.001 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:21:10.013 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:21:10.013 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:21:10.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:21:12.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.65ms
[D 2023-03-10 09:21:14.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.68ms
[D 2023-03-10 09:21:16.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:21:18.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.65ms
[D 2023-03-10 09:21:20.011 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:21:20.011 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:21:20.023 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:21:20.023 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:21:20.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.67ms
[D 2023-03-10 09:21:22.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.70ms
[D 2023-03-10 09:21:24.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:21:26.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:21:28.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.75ms
[D 2023-03-10 09:21:30.022 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:21:30.022 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:21:30.034 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:21:30.034 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:21:30.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:21:32.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.73ms
[D 2023-03-10 09:21:34.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:21:36.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms
[D 2023-03-10 09:21:37.143 JupyterHub proxy:884] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2023-03-10 09:21:37.147 JupyterHub proxy:395] Checking routes
[D 2023-03-10 09:21:38.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.65ms
[D 2023-03-10 09:21:40.032 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:21:40.032 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:21:40.046 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:21:40.047 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:21:40.773 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.79ms
[D 2023-03-10 09:21:42.776 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.97ms
[D 2023-03-10 09:21:44.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.69ms
[D 2023-03-10 09:21:46.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.75ms
[D 2023-03-10 09:21:48.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.66ms
[D 2023-03-10 09:21:50.053 JupyterHub reflector:362] pods watcher timeout
[D 2023-03-10 09:21:50.053 JupyterHub reflector:281] Connecting pods watcher
[D 2023-03-10 09:21:50.057 JupyterHub reflector:362] events watcher timeout
[D 2023-03-10 09:21:50.057 JupyterHub reflector:281] Connecting events watcher
[D 2023-03-10 09:21:50.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.63ms
[W 2023-03-10 09:21:51.796 JupyterHub web:1796] 403 GET /hub/metrics (10.65.29.48): Access to metrics requires scope 'read:metrics'
[D 2023-03-10 09:21:51.796 JupyterHub base:1342] No template for 403
[W 2023-03-10 09:21:51.797 JupyterHub log:186] 403 GET /hub/metrics (@10.65.29.48) 1.43ms
[D 2023-03-10 09:21:52.774 JupyterHub log:186] 200 GET /hub/health (@130.236.59.152) 0.72ms

I also removed singleuser.image from my YAML file to make sure that wasn’t the issue, but it didn’t help.

What do the logs for jupyter-test show?

I never get any logs. When I run kubectl -n mynamespace logs jupyter-test, I get this error:

Defaulted container "notebook" out of: notebook, block-cloud-metadata (init)
Error from server (BadRequest): container "notebook" in pod "jupyter-test" is waiting to start: PodInitializing

It sounds like you caught it in the middle of a restart. If you keep retrying you should be able to get the logs after an attempted restart. Some tools such as stern will attempt to automatically keep track of restarting pods.
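For example, with the pod and container names from your earlier output, something like this should show why the init container keeps failing:

kubectl -n mynamespace describe pod jupyter-test
kubectl -n mynamespace logs jupyter-test -c block-cloud-metadata
kubectl -n mynamespace logs jupyter-test -c block-cloud-metadata --previous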

I did find some logs after digging some more:

modprobe: can't change directory to '/lib/modules': No such file or directory
iptables v1.8.8 (legacy): can't initialize iptables table `filter': Table does not exist (do you need to insmod?)
Perhaps iptables or your kernel needs to be upgraded.

I solved this by adding the following to my config.yaml:

jupyterhub:
  singleuser:
    cloudMetadata:
      blockWithIptables: false
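
If I understand it correctly, this setting removes the block-cloud-metadata init container (the one that was crashing with the iptables error above), which can be verified with something like:

kubectl -n mynamespace get pod jupyter-test -o jsonpath='{.spec.initContainers[*].name}'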