Not able to start the server

I had a working installation of Z2JH on microk8s on Fedora 31. After a problem with my system I reinstalled everything from scratch using the same configuration files, and now the spawn fails when I try to start the server. I have enabled debug mode, and there is a warning in the spawner log that I’m not sure is related to the problem.
The message is:
[W 2020-04-22 07:38:15.404 SingleUserNotebookApp _version:50] jupyterhub version 1.1.0 != jupyterhub-singleuser version 0.9.1. This could cause failure to authenticate and result in redirect loops!
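
In case it is useful, I enabled the debug logging through the chart’s debug.enabled option, roughly like this (the release name, namespace, and config file name are just placeholders for mine):

# Z2JH debug logging; release, namespace, and config file names are placeholders
helm upgrade jhub jupyterhub/jupyterhub --namespace jhub --values config.yaml --set debug.enabled=true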

That message is a warning. It’s not necessarily a problem, but to be safe you could upgrade the version of jupyterhub inside your singleuser server.
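
If you build your own singleuser image, a minimal sketch of that upgrade would be to extend whatever notebook image you currently use and pin jupyterhub to match the hub (1.1.0 in your log); the base image, tag, and registry below are only placeholders:

# Placeholder base image and registry; substitute your actual singleuser image
cat > Dockerfile <<'EOF'
FROM jupyter/base-notebook:latest
RUN pip install --no-cache-dir jupyterhub==1.1.0
EOF
docker build -t my-registry/singleuser:hub-1.1.0 .
# then point singleuser.image.name and singleuser.image.tag in your Z2JH config at the pushed image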

Do you have any other log messages from your singleuser-server and jupyterhub pods?
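
Something like this should grab both (namespace and username are placeholders):

kubectl -n <NAMESPACE> logs deploy/hub
kubectl -n <NAMESPACE> get pods                  # find the jupyter-<username> pod
kubectl -n <NAMESPACE> logs jupyter-<username>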

@manics, thanks for your answer. I was able to get rid of that warning by reinstalling again with the latest versions, but the problem remains.

I have not been able to find any meaningful error in the logs. I thought that maybe I had a problem with the routes, so following the microk8s documentation I executed:
sudo iptables -P FORWARD ACCEPT
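
In case it helps, these are the kinds of checks I can use to confirm the policy is in place and that microk8s networking looks sane (sudo microk8s inspect is its built-in diagnostics report):

sudo iptables -L FORWARD | head -n 1   # should report: Chain FORWARD (policy ACCEPT)
sudo microk8s inspect                  # collects a diagnostics report for the cluster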

This is what I see with kubectl logs hub-6bc99b4476-6c25f -n jhub:

[D 2020-04-25 10:53:40.891 JupyterHub application:556] Looking for /etc/jupyterhub/jupyterhub_config in /srv/jupyterhub
Loading /etc/jupyterhub/config/values.yaml
Loading /etc/jupyterhub/secret/values.yaml
[D 2020-04-25 10:53:41.055 JupyterHub application:578] Loaded config file: /etc/jupyterhub/jupyterhub_config.py
[I 2020-04-25 10:53:41.070 JupyterHub app:2240] Running JupyterHub version 1.1.0
[I 2020-04-25 10:53:41.070 JupyterHub app:2271] Using Authenticator: dummyauthenticator.dummyauthenticator.DummyAuthenticator
[I 2020-04-25 10:53:41.070 JupyterHub app:2271] Using Spawner: kubespawner.spawner.KubeSpawner
[I 2020-04-25 10:53:41.070 JupyterHub app:2271] Using Proxy: jupyterhub.proxy.ConfigurableHTTPProxy-1.1.0
[I 2020-04-25 10:53:41.071 JupyterHub app:1349] Loading cookie_secret from /srv/jupyterhub/jupyterhub_cookie_secret
[D 2020-04-25 10:53:41.071 JupyterHub app:1516] Connecting to db: sqlite:///jupyterhub.sqlite
[D 2020-04-25 10:53:41.079 JupyterHub orm:749] database schema version found: 4dc2d5a8c53c
[D 2020-04-25 10:53:41.082 JupyterHub orm:749] database schema version found: 4dc2d5a8c53c
[W 2020-04-25 10:53:41.084 JupyterHub app:1579] JupyterHub.hub_connect_port is deprecated as of 0.9. Use JupyterHub.hub_connect_url to fully specify the URL for connecting to the Hub.
[W 2020-04-25 10:53:41.085 JupyterHub app:1624] No admin users, admin interface will be unavailable.
[W 2020-04-25 10:53:41.085 JupyterHub app:1626] Add any administrative users to `c.Authenticator.admin_users` in config.
10
[D 2020-04-25 10:53:41.114 JupyterHub app:1914] Initializing spawners
[D 2020-04-25 10:53:41.116 JupyterHub app:2011] Loading state for cmxela from db
[D 2020-04-25 10:53:41.116 JupyterHub app:2011] Loading state for alexmc from db
[D 2020-04-25 10:53:41.116 JupyterHub app:2038] Loaded users:
      cmxela
      alexmc
[I 2020-04-25 10:53:41.116 JupyterHub app:2311] Initialized 0 spawners in 0.003 seconds
[I 2020-04-25 10:53:41.118 JupyterHub app:2520] Not starting proxy
[D 2020-04-25 10:53:41.118 JupyterHub proxy:766] Proxy: Fetching GET http://10.152.183.157:8001/api/routes
[I 2020-04-25 10:53:41.119 JupyterHub app:2556] Hub API listening on http://0.0.0.0:8081/hub/
[I 2020-04-25 10:53:41.119 JupyterHub app:2558] Private Hub API connect url http://10.152.183.43:8081/hub/
[I 2020-04-25 10:53:41.119 JupyterHub app:2571] Starting managed service cull-idle
[I 2020-04-25 10:53:41.120 JupyterHub service:335] Starting service 'cull-idle': ['python3', '/etc/jupyterhub/cull_idle_servers.py', '--url=http://127.0.0.1:8081/hub/api', '--timeout=3600', '--cull-every=600', '--concurrency=10']
[I 2020-04-25 10:53:41.121 JupyterHub service:121] Spawning python3 /etc/jupyterhub/cull_idle_servers.py --url=http://127.0.0.1:8081/hub/api --timeout=3600 --cull-every=600 --concurrency=10
[D 2020-04-25 10:53:41.125 JupyterHub spawner:1113] Polling subprocess every 30s
[D 2020-04-25 10:53:41.125 JupyterHub proxy:315] Fetching routes to check
[D 2020-04-25 10:53:41.126 JupyterHub proxy:766] Proxy: Fetching GET http://10.152.183.157:8001/api/routes
[I 2020-04-25 10:53:41.127 JupyterHub proxy:320] Checking routes
[I 2020-04-25 10:53:41.127 JupyterHub app:2631] JupyterHub is now running at http://10.152.183.4:443/
[D 2020-04-25 10:53:41.127 JupyterHub app:2233] It took 0.239 seconds for the Hub to start
[D 2020-04-25 10:53:41.254 JupyterHub base:285] Recording first activity for <APIToken('fb85...', service='cull-idle')>
[I 2020-04-25 10:53:41.261 JupyterHub log:174] 200 GET /hub/api/users (cull-idle@127.0.0.1) 10.61ms
[I 2020-04-25 10:53:41.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.77ms
[I 2020-04-25 10:53:51.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.71ms
[I 2020-04-25 10:54:01.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.72ms
[I 2020-04-25 10:54:11.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.72ms
[I 2020-04-25 10:54:21.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.72ms
[I 2020-04-25 10:54:31.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.68ms
[D 2020-04-25 10:54:41.128 JupyterHub proxy:766] Proxy: Fetching GET http://10.152.183.157:8001/api/routes
[I 2020-04-25 10:54:41.129 JupyterHub proxy:320] Checking routes
[I 2020-04-25 10:54:41.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.68ms
[I 2020-04-25 10:54:51.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.70ms
[I 2020-04-25 10:55:01.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.71ms
[I 2020-04-25 10:55:11.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.70ms
[I 2020-04-25 10:55:21.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.68ms
[I 2020-04-25 10:55:31.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.79ms
[D 2020-04-25 10:55:41.128 JupyterHub proxy:766] Proxy: Fetching GET http://10.152.183.157:8001/api/routes
[I 2020-04-25 10:55:41.130 JupyterHub proxy:320] Checking routes
[I 2020-04-25 10:55:41.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.70ms
[I 2020-04-25 10:55:51.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.41ms
[I 2020-04-25 10:56:01.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.67ms
[I 2020-04-25 10:56:11.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.70ms
[D 2020-04-25 10:56:18.414 JupyterHub base:330] Refreshing auth for alexmc
[D 2020-04-25 10:56:18.414 JupyterHub user:242] Creating <class 'kubespawner.spawner.KubeSpawner'> for alexmc:
[I 2020-04-25 10:56:18.430 JupyterHub reflector:199] watching for pods with label selector='component=singleuser-server' in namespace jhub
[D 2020-04-25 10:56:18.431 JupyterHub reflector:202] Connecting pods watcher
[I 2020-04-25 10:56:18.449 JupyterHub reflector:199] watching for events with field selector='involvedObject.kind=Pod' in namespace jhub
[D 2020-04-25 10:56:18.450 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:56:18.489 JupyterHub log:174] 200 GET /hub/spawn-pending/alexmc (alexmc@10.1.40.1) 86.54ms
[D 2020-04-25 10:56:18.575 JupyterHub log:174] 200 GET /favicon.ico (@10.1.40.1) 1.28ms
[D 2020-04-25 10:56:19.905 JupyterHub pages:195] Triggering spawn with default options for alexmc
[D 2020-04-25 10:56:19.905 JupyterHub base:825] Initiating spawn for alexmc
[D 2020-04-25 10:56:19.905 JupyterHub base:832] 0/64 concurrent spawns
[D 2020-04-25 10:56:19.905 JupyterHub base:837] 0 active servers
[D 2020-04-25 10:56:19.933 JupyterHub user:556] Calling Spawner.start for alexmc
[W 2020-04-25 10:56:19.935 JupyterHub base:950] User alexmc is slow to start (timeout=0)
[I 2020-04-25 10:56:19.935 JupyterHub log:174] 302 GET /hub/spawn/alexmc -> /hub/spawn-pending/alexmc (alexmc@10.1.40.1) 33.39ms
[I 2020-04-25 10:56:19.942 JupyterHub pages:347] alexmc is pending spawn
[I 2020-04-25 10:56:19.946 JupyterHub log:174] 200 GET /hub/spawn-pending/alexmc (alexmc@10.1.40.1) 6.72ms
[I 2020-04-25 10:56:19.951 JupyterHub spawner:1769] PVC claim-alexmc already exists, so did not create new pvc.
[D 2020-04-25 10:56:20.018 JupyterHub spawner:1601] progress generator: jupyter-alexmc
[I 2020-04-25 10:56:21.002 JupyterHub log:174] 200 GET /hub/api (@10.1.40.1) 0.76ms
[D 2020-04-25 10:56:21.731 JupyterHub spawner:1850] pod jupyter-alexmc events before launch: 2020-04-25 10:56:19+00:00 [Normal] Successfully assigned jhub/jupyter-alexmc to localhost.localdomain
    2020-04-25 10:56:20+00:00 [Normal] Container image "jupyter/datascience-notebook:177037d09156" already present on machine
    2020-04-25 10:56:20+00:00 [Normal] Created container notebook
    2020-04-25 10:56:20+00:00 [Normal] Started container notebook
[D 2020-04-25 10:56:21.735 JupyterHub spawner:1113] Polling subprocess every 30s
[I 2020-04-25 10:56:21.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.68ms
[D 2020-04-25 10:56:28.436 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:56:28.436 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:56:28.488 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:56:28.488 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:56:31.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.78ms
[D 2020-04-25 10:56:38.442 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:56:38.442 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:56:38.510 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:56:38.510 JupyterHub reflector:202] Connecting events watcher
[D 2020-04-25 10:56:41.128 JupyterHub proxy:766] Proxy: Fetching GET http://10.152.183.157:8001/api/routes
[I 2020-04-25 10:56:41.129 JupyterHub proxy:320] Checking routes
[I 2020-04-25 10:56:41.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.74ms
[D 2020-04-25 10:56:48.449 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:56:48.449 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:56:48.531 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:56:48.531 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:56:51.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.73ms
[W 2020-04-25 10:56:55.061 JupyterHub user:692] alexmc's server never showed up at http://10.1.40.30:8888/user/alexmc/ after 30 seconds. Giving up
[D 2020-04-25 10:56:55.061 JupyterHub user:739] Stopping alexmc
[I 2020-04-25 10:56:55.061 JupyterHub spawner:1866] Deleting pod jupyter-alexmc
[D 2020-04-25 10:56:58.456 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:56:58.456 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:56:58.551 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:56:58.551 JupyterHub reflector:202] Connecting events watcher
[D 2020-04-25 10:57:00.112 JupyterHub user:767] Deleting oauth client jupyterhub-user-alexmc
[D 2020-04-25 10:57:00.121 JupyterHub user:770] Finished stopping alexmc
[E 2020-04-25 10:57:00.125 JupyterHub gen:599] Exception in Future <Task finished coro=<BaseHandler.spawn_single_user.<locals>.finish_user_spawn() done, defined at /usr/local/lib/python3.6/dist-packages/jupyterhub/handlers/base.py:845> exception=TimeoutError("Server at http://10.1.40.30:8888/user/alexmc/ didn't respond in 30 seconds",)> after timeout
    Traceback (most recent call last):
      File "/usr/local/lib/python3.6/dist-packages/tornado/gen.py", line 593, in error_callback
        future.result()
      File "/usr/local/lib/python3.6/dist-packages/jupyterhub/handlers/base.py", line 852, in finish_user_spawn
        await spawn_future
      File "/usr/local/lib/python3.6/dist-packages/jupyterhub/user.py", line 668, in spawn
        await self._wait_up(spawner)
      File "/usr/local/lib/python3.6/dist-packages/jupyterhub/user.py", line 715, in _wait_up
        raise e
      File "/usr/local/lib/python3.6/dist-packages/jupyterhub/user.py", line 683, in _wait_up
        http=True, timeout=spawner.http_timeout, ssl_context=ssl_context
      File "/usr/local/lib/python3.6/dist-packages/jupyterhub/utils.py", line 234, in wait_for_http_server
        timeout=timeout,
      File "/usr/local/lib/python3.6/dist-packages/jupyterhub/utils.py", line 177, in exponential_backoff
        raise TimeoutError(fail_message)
    TimeoutError: Server at http://10.1.40.30:8888/user/alexmc/ didn't respond in 30 seconds
    
[I 2020-04-25 10:57:00.127 JupyterHub log:174] 200 GET /hub/api/users/alexmc/server/progress (alexmc@10.1.40.1) 40114.87ms
[I 2020-04-25 10:57:01.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.72ms
[D 2020-04-25 10:57:08.463 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:57:08.463 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:57:08.568 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:57:08.568 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:57:11.848 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.72ms
[D 2020-04-25 10:57:18.468 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:57:18.468 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:57:18.588 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:57:18.588 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:57:21.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.90ms
[D 2020-04-25 10:57:28.473 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:57:28.473 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:57:28.605 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:57:28.606 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:57:31.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.84ms
[D 2020-04-25 10:57:38.478 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:57:38.478 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:57:38.631 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:57:38.631 JupyterHub reflector:202] Connecting events watcher
[D 2020-04-25 10:57:41.128 JupyterHub proxy:766] Proxy: Fetching GET http://10.152.183.157:8001/api/routes
[I 2020-04-25 10:57:41.129 JupyterHub proxy:320] Checking routes
[I 2020-04-25 10:57:41.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.87ms
[D 2020-04-25 10:57:48.483 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:57:48.483 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:57:48.653 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:57:48.653 JupyterHub reflector:202] Connecting events watcher
[I 2020-04-25 10:57:51.849 JupyterHub log:174] 200 GET /hub/health (@10.1.40.1) 0.87ms
[D 2020-04-25 10:57:58.489 JupyterHub reflector:270] pods watcher timeout
[D 2020-04-25 10:57:58.489 JupyterHub reflector:202] Connecting pods watcher
[D 2020-04-25 10:57:58.674 JupyterHub reflector:270] events watcher timeout
[D 2020-04-25 10:57:58.674 JupyterHub reflector:202] Connecting events watcher

Is there anything in your K8s logs for the failed singleuser pod jupyter-alexmc? Does kubectl -n <NAMESPACE> describe pod jupyter-alexmc show anything? If it’s already deleted you may need to try to catch it in the middle of spawning, for example with the commands below.
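
Something like this, run while the spawn is still pending (namespace and pod name taken from the log you posted), should show the pod’s events and the notebook server’s own output before JupyterHub deletes it:

kubectl -n jhub get pods -w                   # watch the jupyter-alexmc pod appear and change state
kubectl -n jhub describe pod jupyter-alexmc   # events, image pull, readiness/probe failures
kubectl -n jhub logs -f jupyter-alexmc        # stream the singleuser-server output while it runs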