Z2JH: Invalid client_id parameter value

Hi All,

This issue is observed intermittently for a few users. After successful OAuth2 authentication, the following errors appear in the hub pod logs:

[D 2023-11-26 10:47:41.771 JupyterHub base:342] Refreshing auth for <username>
[D 2023-11-26 10:47:41.772 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[W 2023-11-26 10:47:41.776 JupyterHub web:1869] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=eyJ1dWlkIjogIjJmZjQ...hYiJ9 (10.2.136.231): Invalid client_id parameter value.
[W 2023-11-26 10:47:41.777 JupyterHub log:191] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=[secret] (<username>@10.2.136.231) 18.87ms
[I 2023-11-26 10:47:50.399 JupyterHub log:191] 302 GET /hub/ -> /user/<username>/ (<username>@10.2.136.231) 3.06ms
[D 2023-11-26 10:47:51.192 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[W 2023-11-26 10:47:51.193 JupyterHub web:1869] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=eyJ1dWlkIj...wYmcifQ (10.2.136.231): Invalid client_id parameter value.
[W 2023-11-26 10:47:51.195 JupyterHub log:191] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=[secret] (<username>@10.2.136.231) 5.54ms
[D 2023-11-26 10:50:29.937 JupyterHub scopes:690] Argument-based access to /hub/api/users/<username>/activity via users:activity
[D 2023-11-26 10:50:29.942 JupyterHub users:879] Not updating activity for <User(<username> 1/1 running)>: 2023-11-26T06:58:09.623259Z < 2023-11-26T10:47:41.766690Z
[D 2023-11-26 10:50:29.942 JupyterHub users:900] Not updating server activity on <username>/: 2023-11-26T06:58:09.623259Z < 2023-11-26T10:47:41.766690Z
[I 2023-11-26 10:50:29.943 JupyterHub log:191] 200 POST /hub/api/users/<username>/activity (<username>@172.16.2.82) 16.99ms
[D 2023-11-26 10:55:37.672 JupyterHub base:342] Refreshing auth for <username>

The user is locked out of their lab. After a hub pod restart, the user is able to work with their lab pod for some time before the issue recurs.

This error means that there is no oauth client registered with that client id in the database. For example, this could happen if the user’s server has been shut down. Can you find any other earlier logs for that user’s server and confirm whether it’s running or not? Can you also find any information about how long the server was running?

Check for earlier log lines like the following; a sketch for querying the hub database directly is included after the list:

  • Removed client jupyterhub-user-<username>
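
If it helps, here is a minimal sketch for checking the hub database directly for the user’s oauth client. It assumes the default z2jh sqlite database at /srv/jupyterhub/jupyterhub.sqlite on the hub pod and a hypothetical placeholder username; adapt the connection for Postgres or other backends:

import sqlite3

username = "username"  # hypothetical placeholder for the affected user
conn = sqlite3.connect("/srv/jupyterhub/jupyterhub.sqlite")
rows = conn.execute(
    "SELECT identifier FROM oauth_clients WHERE identifier = ?",
    (f"jupyterhub-user-{username}",),
).fetchall()
# An empty result means no oauth client is registered for this user,
# consistent with the "Invalid client_id parameter value" error above.
print(rows or f"no oauth client registered for jupyterhub-user-{username}")
conn.close()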

Hi @minrk, thank you for taking the time to answer this.

Here are the logs for <username>:


[I 2023-11-16 08:37:33.633 JupyterHub base:835] User logged in: <username>
[I 2023-11-16 08:37:33.886 JupyterHub log:191] 302 GET /hub/ -> /user/<username>/ (<username>@10.2.136.168) 10.14ms
[D 2023-11-16 08:37:34.674 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[D 2023-11-16 08:37:34.675 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:362] Validating redirection uri /user/<username>/oauth_callback for client jupyterhub-user-<username>.
[D 2023-11-16 08:37:34.675 oauthlib.oauth2.rfc6749.grant_types.base base:231] Using provided redirect_uri /user/<username>/oauth_callback
[D 2023-11-16 08:37:34.675 JupyterHub provider:495] validate_redirect_uri: client_id=jupyterhub-user-<username>, redirect_uri=/user/<username>/oauth_callback
[D 2023-11-16 08:37:34.678 oauthlib.oauth2.rfc6749.grant_types.base base:172] Validating access to scopes ['read:users:name!user', 'read:users:groups!user', 'access:servers!server=<username>/'] for client 'jupyterhub-user-<username>' (<OAuthClient(identifier='jupyterhub-user-<username>')>).
[D 2023-11-16 08:37:34.679 JupyterHub provider:622] Allowing request for scope(s) for jupyterhub-user-<username>:  access:servers!server=<username>/,read:users:name!user,read:users:groups!user
[D 2023-11-16 08:37:34.680 JupyterHub auth:320] Skipping oauth confirmation for <User(<username> 1/1 running)> accessing Server at /user/<username>/
[D 2023-11-16 08:37:34.681 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[D 2023-11-16 08:37:34.682 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:362] Validating redirection uri /user/<username>/oauth_callback for client jupyterhub-user-<username>.
[D 2023-11-16 08:37:34.682 oauthlib.oauth2.rfc6749.grant_types.base base:231] Using provided redirect_uri /user/<username>/oauth_callback
[D 2023-11-16 08:37:34.682 JupyterHub provider:495] validate_redirect_uri: client_id=jupyterhub-user-<username>, redirect_uri=/user/<username>/oauth_callback
[D 2023-11-16 08:37:34.683 oauthlib.oauth2.rfc6749.grant_types.base base:172] Validating access to scopes {'read:users:name!user', 'read:users:groups!user', 'access:servers!server=<username>/'} for client 'jupyterhub-user-<username>' (<OAuthClient(identifier='jupyterhub-user-<username>')>).
[D 2023-11-16 08:37:34.684 JupyterHub provider:622] Allowing request for scope(s) for jupyterhub-user-<username>:  access:servers!server=<username>/,read:users:name!user,read:users:groups!user
[D 2023-11-16 08:37:34.685 JupyterHub provider:246] Saving authorization code jupyterhub-user-<username>, TbG..., (), {}
[I 2023-11-16 08:37:34.692 JupyterHub log:191] 302 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=[secret] -> /user/<username>/oauth_callback?code=[secret]&state=[secret] (<username>@10.2.136.168) 20.58ms
[D 2023-11-16 08:37:35.149 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:533] Using provided redirect_uri /user/<username>/oauth_callback
[D 2023-11-16 08:37:35.149 JupyterHub provider:116] confirm_redirect_uri: client_id=jupyterhub-user-<username>, redirect_uri=/user/<username>/oauth_callback
[D 2023-11-16 08:37:35.150 JupyterHub provider:344] Saving bearer token {'access_token': 'REDACTED', 'expires_in': 1209600, 'token_type': 'Bearer', 'scope': 'access:servers!server=<username>/ read:users:name!user read:users:groups!user', 'refresh_token': 'REDACTED'}
[D 2023-11-16 08:37:35.158 JupyterHub provider:204] Deleting oauth code TbG... for jupyterhub-user-<username>
[I 2023-11-16 08:37:35.168 JupyterHub log:191] 200 POST /hub/api/oauth2/token (<username>@172.16.2.8) 36.43ms
[D 2023-11-16 08:37:35.177 JupyterHub base:297] Recording first activity for <APIToken('Vve1...', user='<username>', client_id='jupyterhub-user-<username>')>
[I 2023-11-16 09:05:08.143 JupyterHub log:191] 200 GET /hub/api/user (<username>@172.16.2.8) 12.10ms
[D 2023-11-16 09:20:21.088 JupyterHub base:342] Refreshing auth for <username>
[I 2023-11-16 09:20:21.089 JupyterHub log:191] 302 GET /hub/ -> /user/<username>/ (<username>@10.2.136.168) 12.35ms
[D 2023-11-16 09:20:21.869 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[W 2023-11-16 09:20:21.870 JupyterHub web:1869] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=eyJ1dWlkIjogImU5...9zaW5nc2lkL2xhYj8ifQ (10.2.136.168): Invalid client_id parameter value.
[W 2023-11-16 09:20:21.872 JupyterHub log:191] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=[secret] (<username>@10.2.136.168) 5.69ms

Are there no other log lines containing <username> between 08:37 and 09:20?

Had JupyterHub restarted during that time? What Spawner are you using?

Can you share pip list or conda list for your hub environment?

Hi @minrk, yes, there are more log lines for <username>:


[I 2023-11-16 08:37:33.633 JupyterHub base:835] User logged in: <username>
[I 2023-11-16 08:37:33.886 JupyterHub log:191] 302 GET /hub/ -> /user/<username>/ (<username>@10.2.136.168) 10.14ms
[D 2023-11-16 08:37:34.674 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[D 2023-11-16 08:37:34.675 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:362] Validating redirection uri /user/<username>/oauth_callback for client jupyterhub-user-<username>.
[D 2023-11-16 08:37:34.675 oauthlib.oauth2.rfc6749.grant_types.base base:231] Using provided redirect_uri /user/<username>/oauth_callback
[D 2023-11-16 08:37:34.675 JupyterHub provider:495] validate_redirect_uri: client_id=jupyterhub-user-<username>, redirect_uri=/user/<username>/oauth_callback
[D 2023-11-16 08:37:34.678 oauthlib.oauth2.rfc6749.grant_types.base base:172] Validating access to scopes ['read:users:name!user', 'read:users:groups!user', 'access:servers!server=<username>/'] for client 'jupyterhub-user-<username>' (<OAuthClient(identifier='jupyterhub-user-<username>')>).
[D 2023-11-16 08:37:34.679 JupyterHub provider:622] Allowing request for scope(s) for jupyterhub-user-<username>:  access:servers!server=<username>/,read:users:name!user,read:users:groups!user
[D 2023-11-16 08:37:34.680 JupyterHub auth:320] Skipping oauth confirmation for <User(<username> 1/1 running)> accessing Server at /user/<username>/
[D 2023-11-16 08:37:34.681 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[D 2023-11-16 08:37:34.682 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:362] Validating redirection uri /user/<username>/oauth_callback for client jupyterhub-user-<username>.
[D 2023-11-16 08:37:34.682 oauthlib.oauth2.rfc6749.grant_types.base base:231] Using provided redirect_uri /user/<username>/oauth_callback
[D 2023-11-16 08:37:34.682 JupyterHub provider:495] validate_redirect_uri: client_id=jupyterhub-user-<username>, redirect_uri=/user/<username>/oauth_callback
[D 2023-11-16 08:37:34.683 oauthlib.oauth2.rfc6749.grant_types.base base:172] Validating access to scopes {'read:users:name!user', 'read:users:groups!user', 'access:servers!server=<username>/'} for client 'jupyterhub-user-<username>' (<OAuthClient(identifier='jupyterhub-user-<username>')>).
[D 2023-11-16 08:37:34.684 JupyterHub provider:622] Allowing request for scope(s) for jupyterhub-user-<username>:  access:servers!server=<username>/,read:users:name!user,read:users:groups!user
[D 2023-11-16 08:37:34.685 JupyterHub provider:246] Saving authorization code jupyterhub-user-<username>, TbG..., (), {}
[I 2023-11-16 08:37:34.692 JupyterHub log:191] 302 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=[secret] -> /user/<username>/oauth_callback?code=[secret]&state=[secret] (<username>@10.2.136.168) 20.58ms
[D 2023-11-16 08:37:35.149 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:533] Using provided redirect_uri /user/<username>/oauth_callback
[D 2023-11-16 08:37:35.149 JupyterHub provider:116] confirm_redirect_uri: client_id=jupyterhub-user-<username>, redirect_uri=/user/<username>/oauth_callback
[D 2023-11-16 08:37:35.150 JupyterHub provider:344] Saving bearer token {'access_token': 'REDACTED', 'expires_in': 1209600, 'token_type': 'Bearer', 'scope': 'access:servers!server=<username>/ read:users:name!user read:users:groups!user', 'refresh_token': 'REDACTED'}
[D 2023-11-16 08:37:35.158 JupyterHub provider:204] Deleting oauth code TbG... for jupyterhub-user-<username>
[I 2023-11-16 08:37:35.168 JupyterHub log:191] 200 POST /hub/api/oauth2/token (<username>@172.16.2.8) 36.43ms
[D 2023-11-16 08:37:35.177 JupyterHub base:297] Recording first activity for <APIToken('Vve1...', user='<username>', client_id='jupyterhub-user-<username>')>
[I 2023-11-16 08:37:35.187 JupyterHub log:191] 200 GET /hub/api/user (<username>@172.16.2.8) 14.73ms
[D 2023-11-16 08:42:40.140 JupyterHub base:342] Refreshing auth for <username>
[I 2023-11-16 08:42:40.141 JupyterHub log:191] 200 GET /hub/api/user (<username>@172.16.2.8) 11.48ms
[D 2023-11-16 08:47:41.403 JupyterHub base:342] Refreshing auth for <username>
[I 2023-11-16 08:47:41.403 JupyterHub log:191] 200 GET /hub/api/user (<username>@172.16.2.8) 11.62ms
[D 2023-11-16 08:50:12.499 JupyterHub scopes:677] Unrestricted access to /hub/api/users/<username> via read:users
[I 2023-11-16 08:50:12.501 JupyterHub log:191] 200 GET /hub/api/users/<username> (refresher@172.16.4.6) 9.64ms
[D 2023-11-16 08:52:45.609 JupyterHub base:342] Refreshing auth for <username>
[I 2023-11-16 08:52:45.609 JupyterHub log:191] 200 GET /hub/api/user (<username>@172.16.2.8) 13.50ms
[D 2023-11-16 09:05:08.142 JupyterHub base:342] Refreshing auth for <username>
[I 2023-11-16 09:05:08.143 JupyterHub log:191] 200 GET /hub/api/user (<username>@172.16.2.8) 12.10ms
[D 2023-11-16 09:20:21.088 JupyterHub base:342] Refreshing auth for <username>
[I 2023-11-16 09:20:21.089 JupyterHub log:191] 302 GET /hub/ -> /user/<username>/ (<username>@10.2.136.168) 12.35ms
[D 2023-11-16 09:20:21.869 JupyterHub provider:420] Validating client id jupyterhub-user-<username>
[W 2023-11-16 09:20:21.870 JupyterHub web:1869] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=eyJ1dWlkIjogImU5MDY4Mm...9zaW5nc2lkL2xhYj8ifQ (10.2.136.168): Invalid client_id parameter value.
[W 2023-11-16 09:20:21.872 JupyterHub log:191] 400 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-<username>&redirect_uri=%2Fuser%2F<username>%2Foauth_callback&response_type=code&state=[secret] (<username>@10.2.136.168) 5.69ms

We are using KubeSpawner, and JupyterHub did not restart.

pip list:

Package                          Version
-------------------------------- ------------
aiohttp                          3.8.1
aiosignal                        1.2.0
alembic                          1.9.1
async-generator                  1.10
async-timeout                    4.0.2
attrs                            22.2.0
azure-core                       1.23.1
azure-identity                   1.9.0
bcrypt                           4.0.0
cachetools                       5.3.2
certifi                          2022.12.7
certipy                          0.1.3
cffi                             1.15.1
charset-normalizer               2.1.1
cryptography                     39.0.0
cx-Oracle                        8.3.0
escapism                         1.0.1
frozenlist                       1.3.1
google-auth                      2.23.3
greenlet                         2.0.1
idna                             3.4
importlib-metadata               6.0.0
importlib-resources              6.1.0
Jinja2                           3.1.2
jsonschema                       4.17.3
jupyter-telemetry                0.1.0
jupyterhub                       4.0.0
jupyterhub-firstuseauthenticator 1.0.0
jupyterhub-hmacauthenticator     1.0
jupyterhub-idle-culler           1.2.1
jupyterhub-kubespawner           4.2.0
jupyterhub-ldapauthenticator     1.3.2
jupyterhub-ltiauthenticator      1.2.0
jupyterhub-nativeauthenticator   1.1.0
jupyterhub-tmpauthenticator      0.6
kubernetes                       28.1.0
kubernetes-asyncio               24.2.2
ldap3                            2.9.1
Mako                             1.2.4
MarkupSafe                       2.1.1
msal                             1.24.1
msal-extensions                  0.3.1
our_pvt_auth_class               0.1.4
multidict                        6.0.4
mwoauth                          0.3.8
nullauthenticator                1.0.0
numpy                            1.24.4
oauthenticator                   15.1.0
oauthlib                         3.2.2
onetimepass                      1.0.1
packaging                        23.0
pamela                           1.0.0
pandas                           2.0.3
pip                              23.3.1
pkgutil_resolve_name             1.3.10
portalocker                      2.8.2
prometheus-client                0.15.0
psycopg2-binary                  2.9.3
py-spy                           0.3.14
pyasn1                           0.4.8
pyasn1-modules                   0.3.0
pycparser                        2.21
pycurl                           7.45.2
PyJWT                            2.4.0
PyMySQL                          1.0.2
pyOpenSSL                        23.0.0
pyparsing                        3.0.9
pyrsistent                       0.19.3
python-dateutil                  2.8.2
python-json-logger               2.0.4
python-slugify                   8.0.1
pytz                             2023.3.post1
PyYAML                           6.0
requests                         2.28.1
requests-oauthlib                1.3.1
rsa                              4.9
ruamel.yaml                      0.17.21
ruamel.yaml.clib                 0.2.7
setuptools                       56.0.0
six                              1.16.0
SQLAlchemy                       1.4.46
sqlalchemy-cockroachdb           1.4.4
statsd                           3.3.0
text-unidecode                   1.3
tornado                          6.3.3
traitlets                        5.8.1
typing_extensions                4.8.0
tzdata                           2023.3
urllib3                          1.26.14
websocket-client                 1.6.4
yarl                             1.9.2
zipp                             3.11.0

I’m afraid I don’t have a good idea of what to check next. You could try upgrading, since some packages (especially kubespawner and jupyterhub) are a bit out of date, but I don’t currently see how the changes since your versions would affect this.

Somehow the oauth client is getting deleted, but it’s strange that that would happen without shutting down the user’s server.

Sure, let me try upgrading everything. Thanks @minrk for your help! I will post here again soon with further updates :slight_smile:

@minrk Do you think this error might occur on one of the deployments if we have registered two URLs (for two different K8s deployments of Z2JH) with Auth0 under the same auth0_client_id, and a user logs in and out of both deployments at around the same time in some order that breaks things? Both deployments have separate hub DBs.

That is conceivable; I’ll have to think more about how it might happen. This is specifically the oauth between the Hub and the single-user server, which is not connected to the oauth between the hub and your oauth provider (more info in the docs), but having two sessions for hubs on sufficiently similar domains could enter into it.

Do the two hubs run on the same or similar domains? Can you share either their actual URLs or two URLs that have the same relationship (e.g. hub1.example.com and hub2.example.com vs hubs.example.com/hub1 and hubs.example.com/hub2)?

@minrk, the domains are sufficiently different: hub1.example.com and hub2.example.com, but they share the same auth0_client_id. Some points to note:

  • The error occurs on hub2 only. hub1 is on K8s 1.24 and hub2 on K8s 1.27; both deployments use the same images.
  • Restarting the hub or the user server fixes the issue for some time before it happens again.
  • hub1 and hub2 have separate hub DBs.

Hi @minrk , any thoughts on this?

If they both have the same Auth0 client, does that mean they both have the same redirect URI registered with Auth0?

I don’t have any great ideas right now, but if you can send complete debug-level logs containing the username (possibly even from both the hub and the single-user server), I might be able to make sense of it.

It really should mean that the oauth client is being deleted, which should generally mean that the server has been stopped. But I haven’t seen any log lines to that effect.

Hi @minrk,
We have a few new observations with respect to this issue.

Under what circumstances would inactive user pods be culled, yet the user pod still be up and the database look as follows?

spawners table:

12  12  {"pod_name": "jupyter-username"}  "2024-02-06 09:45:30.758641"  "2024-02-06 15:18:32.638"  {"profile": "my-profile"}

The servers table is empty.

The oauth_clients table is also empty.

But the user pod is still up :hushed: and here are its logs:

[D 2024-02-15 12:44:50.869 SingleUserLabApp mixins:491] Notifying Hub of activity 2024-02-06T10:51:15.413727Z
[E 2024-02-15 12:44:50.873 SingleUserLabApp mixins:511] Error notifying Hub of activity
    Traceback (most recent call last):
      File "/opt/conda/lib/python3.8/site-packages/jupyterhub/singleuser/mixins.py", line 509, in notify
        await client.fetch(req)
      File "/opt/conda/lib/python3.8/asyncio/tasks.py", line 349, in __wakeup
        future.result()
    ConnectionRefusedError: [Errno 111] Connection refused
[E 2024-02-15 12:44:50.873 SingleUserLabApp mixins:536] Error notifying Hub of activity
    Traceback (most recent call last):
      File "/opt/conda/lib/python3.8/site-packages/jupyterhub/singleuser/mixins.py", line 534, in keep_activity_updated
        await self.notify_activity()
      File "/opt/conda/lib/python3.8/site-packages/jupyterhub/singleuser/mixins.py", line 516, in notify_activity
        await exponential_backoff(
      File "/opt/conda/lib/python3.8/site-packages/jupyterhub/utils.py", line 237, in exponential_backoff
        raise asyncio.TimeoutError(fail_message)
    asyncio.exceptions.TimeoutError: Failed to notify Hub of activity
[D 2024-02-15 12:49:20.982 SingleUserLabApp mixins:491] Notifying Hub of activity 2024-02-06T10:51:15.413727Z
[D 2024-02-15 12:53:55.058 SingleUserLabApp mixins:491] Notifying Hub of activity 2024-02-06T10:51:15.413727Z
[D 2024-02-15 12:58:39.312 SingleUserLabApp mixins:491] Notifying Hub of activity 2024-02-06T10:51:15.413727Z
[D 2024-02-15 13:03:30.615 SingleUserLabApp mixins:491] Notifying Hub of activity 2024-02-06T10:51:15.413727Z

Here are the hub logs for this user-pod:

[D 2024-02-15 12:43:14.183 JupyterHub proxy:880] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2024-02-15 12:43:14.184 JupyterHub proxy:953] Omitting non-jupyterhub route '/'
[D 2024-02-15 12:43:14.184 JupyterHub proxy:392] Checking routes
[W 2024-02-15 12:43:14.185 JupyterHub proxy:468] Deleting stale route /user/username/
[D 2024-02-15 12:43:14.185 JupyterHub proxy:880] Proxy: Fetching DELETE http://proxy-api:8001/api/routes/user/username
[I 2024-02-15 12:43:14.187 JupyterHub app:3242] JupyterHub is now running, internal Hub API at http://hub:8081/hub/
[D 2024-02-15 12:43:14.188 JupyterHub app:2847] It took 3.102 seconds for the Hub to start
[D 2024-02-15 12:43:14.279 JupyterHub base:297] Recording first activity for <APIToken('66bd...', service='jupyterhub-idle-culler', client_id='jupyterhub')>
[I 2024-02-15 12:43:14.286 JupyterHub log:191] 200 GET /hub/api/ (jupyterhub-idle-culler@::1) 9.60ms
[D 2024-02-15 12:43:14.292 JupyterHub scopes:863] Checking access via scope list:users
--
[D 2024-02-15 12:45:04.323 JupyterHub reflector:362] events watcher timeout
[D 2024-02-15 12:45:04.323 JupyterHub reflector:281] Connecting events watcher
[D 2024-02-15 12:45:04.457 JupyterHub reflector:362] pods watcher timeout
[D 2024-02-15 12:45:04.457 JupyterHub reflector:281] Connecting pods watcher
[D 2024-02-15 12:45:06.604 JupyterHub log:191] 200 GET /hub/health (@172.16.4.1) 0.62ms
[D 2024-02-15 12:45:08.310 JupyterHub base:342] Refreshing auth for username
[D 2024-02-15 12:45:08.317 JupyterHub scopes:863] Checking access via scope users:activity
[D 2024-02-15 12:45:08.317 JupyterHub scopes:690] Argument-based access to /hub/api/users/username/activity via users:activity
[D 2024-02-15 12:45:08.319 JupyterHub users:879] Not updating activity for <User(username 0/2 running)>: 2024-02-06T14:27:15.018783Z < 2024-02-14T06:57:21.708712Z
[D 2024-02-15 12:45:08.319 JupyterHub users:900] Not updating server activity on username/: 2024-02-06T14:27:15.018783Z < 2024-02-14T06:57:21.708712Z
[I 2024-02-15 12:45:08.320 JupyterHub log:191] 200 POST /hub/api/users/username/activity (username@172.16.3.35) 28.14ms
[D 2024-02-15 12:45:14.188 JupyterHub proxy:880] Proxy: Fetching GET http://proxy-api:8001/api/routes
[D 2024-02-15 12:45:14.191 JupyterHub proxy:953] Omitting non-jupyterhub route '/'
[D 2024-02-15 12:45:14.199 JupyterHub proxy:392] Checking routes
[D 2024-02-15 12:45:14.200 JupyterHub utils:278] Server at http://refresher:8081/services/refresher/ responded with 404
[D 2024-02-15 12:45:14.200 JupyterHub app:2493] External service refresher running at http://refresher:8081/

Culler configuration in the deployment-values file looks as follows:

cull:
  adminUsers: false # --cull-admin-users
  removeNamedServers: true # --remove-named-servers
  timeout: 72000
  every: 600
  concurrency: 10 # --concurrency
  maxAge: 0 # --max-age

The culler service is defined as follows (this is an excerpt from the z2jh hub config, which provides get_config, c, and url_path_join):

if get_config("cull.enabled", False):
    jupyterhub_idle_culler_role = {
        "name": "jupyterhub-idle-culler",
        "scopes": [
            "list:users",
            "read:users:activity",
            "read:servers",
            "delete:servers",
            # "admin:users", # dynamically added if --cull-users is passed
        ],
        # assign the role to a jupyterhub service, so it gains these permissions
        "services": ["jupyterhub-idle-culler"],
    }

    cull_cmd = ["python3", "-m", "jupyterhub_idle_culler"]
    base_url = c.JupyterHub.get("base_url", "/")
    cull_cmd.append("--url=http://localhost:8081" + url_path_join(base_url, "hub/api"))

    cull_timeout = get_config("cull.timeout")
    if cull_timeout:
        cull_cmd.append(f"--timeout={cull_timeout}")

    cull_every = get_config("cull.every")
    if cull_every:
        cull_cmd.append(f"--cull-every={cull_every}")

    cull_concurrency = get_config("cull.concurrency")
    if cull_concurrency:
        cull_cmd.append(f"--concurrency={cull_concurrency}")

    if get_config("cull.users"):
        cull_cmd.append("--cull-users")
        jupyterhub_idle_culler_role["scopes"].append("admin:users")

    if not get_config("cull.adminUsers"):
        cull_cmd.append("--cull-admin-users=false")

    if get_config("cull.removeNamedServers"):
        cull_cmd.append("--remove-named-servers")

    cull_max_age = get_config("cull.maxAge")
    if cull_max_age:
        cull_cmd.append(f"--max-age={cull_max_age}")

    c.JupyterHub.services.append(
        {
            "name": "jupyterhub-idle-culler",
            "command": cull_cmd,
        }
    )
    c.JupyterHub.load_roles.append(jupyterhub_idle_culler_role)
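
For reference, plugging the posted values into that logic gives roughly the following idle-culler invocation. This is a sketch assuming cull.enabled is true (the z2jh default) and base_url is "/"; note that maxAge: 0 is falsy, so no --max-age flag is added:

# Sketch of the command the snippet above assembles from the posted values.
values = {
    "cull.timeout": 72000,
    "cull.every": 600,
    "cull.concurrency": 10,
    "cull.maxAge": 0,          # falsy, so no --max-age flag is appended
    "cull.adminUsers": False,  # adds --cull-admin-users=false
    "cull.removeNamedServers": True,
    "cull.users": False,       # not set in the posted values, so no --cull-users
}

cull_cmd = ["python3", "-m", "jupyterhub_idle_culler", "--url=http://localhost:8081/hub/api"]
if values["cull.timeout"]:
    cull_cmd.append(f"--timeout={values['cull.timeout']}")
if values["cull.every"]:
    cull_cmd.append(f"--cull-every={values['cull.every']}")
if values["cull.concurrency"]:
    cull_cmd.append(f"--concurrency={values['cull.concurrency']}")
if values["cull.users"]:
    cull_cmd.append("--cull-users")
if not values["cull.adminUsers"]:
    cull_cmd.append("--cull-admin-users=false")
if values["cull.removeNamedServers"]:
    cull_cmd.append("--remove-named-servers")
if values["cull.maxAge"]:
    cull_cmd.append(f"--max-age={values['cull.maxAge']}")

print(" ".join(cull_cmd))
# python3 -m jupyterhub_idle_culler --url=http://localhost:8081/hub/api
#   --timeout=72000 --cull-every=600 --concurrency=10
#   --cull-admin-users=false --remove-named-servers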

This should not happen. This is an “orphaned pod” situation, where JupyterHub believes a user’s server has stopped when it hasn’t. Since you have kubespawner 4.2, upgrading z2jh and kubespawner may fix it. We have had some issues related to orphaned pods recently.

The key bit of logs to find is when JupyterHub is deciding to stop the server, or deciding that the server has stopped. Do you still have the logs of jupyterhub startup before [D 2024-02-15 12:43:14.183 JupyterHub proxy:880]?

There might be something like "<username> appears to have stopped while the Hub was down". That would be the bug, or a similar one, that has been fixed in the latest releases of kubespawner/z2jh.
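
In case it’s useful, here is a minimal sketch for scanning a saved copy of the hub log for those lines. The log file path and username are hypothetical placeholders; dump the hub pod log to a file first (for example with kubectl logs):

username = "username"  # hypothetical placeholder for the affected user
patterns = (
    f"Removed client jupyterhub-user-{username}",       # oauth client removed
    "appears to have stopped while the Hub was down",   # hub decided the server died
)

# hub.log is a hypothetical dump of the hub pod's logs
with open("hub.log") as f:
    for line in f:
        if any(p in line for p in patterns):
            print(line.rstrip())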

Thanks @minrk, we have upgraded all of the packages:

Package                          Version
-------------------------------- ------------
aiohttp                          3.9.1
aiosignal                        1.3.1
alembic                          1.13.1
async-generator                  1.10
async-timeout                    4.0.3
attrs                            23.2.0
azure-core                       1.29.7
azure-identity                   1.15.0
bcrypt                           4.1.2
cachetools                       5.3.2
certifi                          2023.11.17
certipy                          0.1.3
cffi                             1.16.0
charset-normalizer               3.3.2
click                            8.1.7
cloudpickle                      3.0.0
cryptography                     42.0.1
cx-Oracle                        8.3.0
dask                             2024.1.0
dask-gateway                     2022.11.0
distributed                      2024.1.0
docstring-parser                 0.15
escapism                         1.0.1
frozenlist                       1.4.1
fsspec                           2023.12.2
future                           0.18.3
google-auth                      2.27.0
greenlet                         3.0.3
idna                             3.6
importlib-metadata               7.0.1
Jinja2                           3.1.3
JPype1                           1.5.0
jsonschema                       4.21.1
jsonschema-specifications        2023.12.1
jupyter-telemetry                0.1.0
jupyterhub                       4.0.0
jupyterhub-firstuseauthenticator 1.0.0
jupyterhub-hmacauthenticator     1.0
jupyterhub-idle-culler           1.2.1
jupyterhub-kubespawner           6.2.0
jupyterhub-ldapauthenticator     1.3.2
jupyterhub-ltiauthenticator      1.6.2
jupyterhub-nativeauthenticator   1.2.0
jupyterhub-tmpauthenticator      1.0.0
kazoo                            2.9.0
kubernetes                       29.0.0
kubernetes_asyncio               29.0.0
ldap3                            2.9.1
locket                           1.0.0
lxml                             5.1.0
Mako                             1.3.1
MarkupSafe                       2.1.4
msal                             1.26.0
msal-extensions                  1.1.0
msgpack                          1.0.7
multidict                        6.0.4
mwoauth                          0.4.0
NamedAtomicLock                  1.1.3
nullauthenticator                1.0.0
numpy                            1.26.3
oauthenticator                   15.1.0
oauthlib                         3.2.2
onetimepass                      1.0.1
packaging                        23.2
pamela                           1.1.0
pandas                           2.2.0
partd                            1.4.1
pip                              23.3.2
portalocker                      2.8.2
prometheus-client                0.19.0
psutil                           5.9.8
psycopg2-binary                  2.9.9
py-spy                           0.3.14
pyasn1                           0.5.1
pyasn1-modules                   0.3.0
pycparser                        2.21
pycryptodome                     3.20.0
pycurl                           7.45.2
PyJWT                            2.8.0
PyMySQL                          1.1.0
pyOpenSSL                        24.0.0
pyparsing                        3.1.1
pyrsistent                       0.20.0
python-dateutil                  2.8.2
python-json-logger               2.0.7
python-slugify                   8.0.1
pytz                             2023.3.post1
PyYAML                           6.0.1
referencing                      0.32.1
requests                         2.31.0
requests-oauthlib                1.3.1
rpds-py                          0.17.1
rsa                              4.9
ruamel.yaml                      0.18.5
ruamel.yaml.clib                 0.2.8
scipy                            1.12.0
setuptools                       58.1.0
six                              1.16.0
sortedcontainers                 2.4.0
SQLAlchemy                       2.0.25
sqlalchemy-cockroachdb           2.0.2
statsd                           4.0.1
tblib                            3.0.0
text-unidecode                   1.3
toolz                            0.12.1
tornado                          6.4
traitlets                        5.14.1
typing_extensions                4.9.0
tzdata                           2023.4
urllib3                          2.1.0
websocket-client                 1.7.0
yarl                             1.9.4
zict                             3.0.0
zipp                             3.17.0

Will keep monitoring and post here when I have more updates :slight_smile:

PS: I wonder whether the orphaned-pods issue described in that post is exactly the same as ours. It looks like it is, but we are still on z2jh version 2 and the post talks about a bug introduced in version 3.

If you’ve never touched z2jh 3.0.0-alpha.1 or later, I’m fairly certain it’s not that bug; it was quite unique.

Yes, you are correct. Unfortunately, it was an oversight at our end: two hubs were sharing the same Postgres DB :confused: so one hub cleaned up the other’s user pods, and this error was the result. Thanks for your help! :slight_smile: @consideRatio @minrk
