I'm trying to create a flow where I can see which kernels are taking up the most CPU/RAM/resources on my server. With the classic Jupyter Notebook, I would do this through /api/sessions and then use some awk/ps/bash magic to link those sessions/kernels to their processes.
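For context, that flow looks roughly like the sketch below: fetch the server's sessions, then match each kernel id against `ps aux` output, since kernel processes reference their connection file (`kernel-<id>.json`) on the command line. The URL and token here are placeholders for my setup:

```python
import json
import subprocess
import urllib.request

# Hypothetical values -- substitute your own server URL and token.
NOTEBOOK_URL = "http://localhost:8888"
API_TOKEN = "my-token"

def get_sessions(base_url: str, token: str) -> list:
    """Fetch the notebook server's sessions via its REST API."""
    req = urllib.request.Request(
        f"{base_url}/api/sessions",
        headers={"Authorization": f"token {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def match_kernels_to_pids(sessions: list, ps_lines: list) -> dict:
    """Map each session's kernel id to the pid of the process whose
    command line references its connection file (kernel-<id>.json)."""
    mapping = {}
    for sess in sessions:
        kid = sess["kernel"]["id"]
        for line in ps_lines:
            if f"kernel-{kid}.json" in line:
                # pid is the second column of `ps aux` output
                mapping[kid] = int(line.split(None, 2)[1])
                break
    return mapping
```

Feeding it `subprocess.check_output(["ps", "aux"], text=True).splitlines()` gives the kernel-id → pid map, which I then join with per-process CPU/RAM stats.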
But with JupyterLab (under JupyterHub), I have a service configured as described in https://jupyterhub.readthedocs.io/en/stable/reference/rest.html, with this as my config:
c.JupyterHub.services = [
    {
        "name": "service-admin",
        "api_token": "my-token",
    },
]
c.JupyterHub.load_roles = [
    {
        "name": "service-role",
        "scopes": [
            # specify the permissions the token should have
            "admin:users",
            "admin:servers",
            "proxy",
        ],
        "services": [
            # assign the service the above permissions
            "service-admin",
        ],
    },
]
This works fine for the endpoints documented in the JupyterHub REST API reference, but I want to access each lab server's own API so I can grab everyone's sessions/kernels.
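To be concrete, this is the kind of call that currently works for me against the Hub itself (the Hub API URL here is a placeholder for my deployment):

```python
import urllib.request

# Hypothetical values -- adjust the Hub API URL and token to your deployment.
HUB_API = "http://127.0.0.1:8081/hub/api"
API_TOKEN = "my-token"

def hub_request(path: str) -> urllib.request.Request:
    """Build an authenticated request against the JupyterHub REST API."""
    return urllib.request.Request(
        f"{HUB_API}{path}",
        headers={"Authorization": f"token {API_TOKEN}"},
    )

# e.g. listing all users (needs the admin:users scope granted above):
#   with urllib.request.urlopen(hub_request("/users")) as resp:
#       users = json.load(resp)
```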
There's a blurb in the API docs saying:

The same API token can also authorize access to the Jupyter Notebook REST API provided by notebook servers managed by JupyterHub if it has the necessary
access:users:servers
scope

but that link is broken, and any number of Authorization headers and guesses at the proxied URLs result in 404 pages on my end. The closest I've gotten to any kind of result is going to /user/{my user}/api/status
and getting this:
{
"message": "Forbidden",
"reason": null
}
instead of a 404. What am I missing? Is there a flow where I can access each lab's /api/sessions?
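For reference, this is roughly the request I'm attempting (the Hub's public URL is a placeholder for mine) -- it's what currently returns the 403 above:

```python
import urllib.request

# Hypothetical public Hub URL and the service token from the config above.
HUB_HOST = "http://127.0.0.1:8000"
API_TOKEN = "my-token"

def user_server_url(username: str, path: str) -> str:
    """Single-user servers are proxied under /user/{name}/ by the Hub."""
    return f"{HUB_HOST}/user/{username}/{path.lstrip('/')}"

def sessions_request(username: str) -> urllib.request.Request:
    """Authenticated request for one user's /api/sessions endpoint."""
    return urllib.request.Request(
        user_server_url(username, "/api/sessions"),
        headers={"Authorization": f"token {API_TOKEN}"},
    )
```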