It forced me to change some parameters to match the new schema, specifically from:

```yaml
rbac:
  enabled: true
```

to

```yaml
rbac:
  create: true
```
Now when people try to start their server via our URL, it looks like it's starting, but the progress bar doesn't move and they are never auto-redirected to their lab environment. When you refresh the page it sends you to /user/<user>/tree instead of /user/<user>/lab as it used to.
When you manually change the URL to /user/<user>/lab it works as intended.
Could you please help us fix our current values.yaml so it works the same way it did before the upgrade? Many thanks!
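For the /tree-instead-of-/lab landing page specifically, one thing worth checking is the spawner's default URL. A minimal values.yaml sketch, assuming the standard zero-to-jupyterhub schema (this maps to `c.Spawner.default_url`):

```yaml
# Send users to JupyterLab instead of the classic /tree UI by default.
singleuser:
  defaultUrl: "/lab"
```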
Do you get the same error when you launch the jupyter/datascience-notebook image, or only with your custom one? What version of the datascience-notebook image are you using? `latest` can be an old version if it was pulled a while ago, so in general it's best to use a more specific tag.
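For example, pinning the single-user image in values.yaml might look like the sketch below; the tag shown is a placeholder, so pick a real dated tag from the image's Docker Hub page:

```yaml
singleuser:
  image:
    name: jupyter/datascience-notebook
    tag: "2022-10-09"  # placeholder: use an actual dated tag instead of latest
```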
Can you turn on debug/verbose logging, and show us the logs for the hub and the launched pod?
Hi manics, you were absolutely right, thanks! Our custom image was based on an old Docker image; there was even a message about it in the logs: `jupyterhub version 3.0.0 != jupyterhub-singleuser version 1.4.2`.
It almost behaves the way it should, except for one thing: starting a server from our URL doesn't take you to /user/<user>/lab automatically. It keeps showing the progress bar until you refresh the page, which then correctly takes you to /user/<user>/lab instead of the previously wrong path /user/<user>/tree:
Your server is starting up.
You will be redirected automatically when it's ready for you.
Also, when I try to stop my server from the menu I get this error message:
API request failed (403): Action is not authorized with current scopes; requires any of [delete:servers]
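If regular users genuinely lack `delete:servers` on their own server (rather than this being a side effect of the stale single-user image), one option is to extend the default `user` role via the chart's `hub.loadRoles` passthrough to JupyterHub's `load_roles`. A sketch, assuming a zero-to-jupyterhub 2.x chart; note that the built-in `self` scope should normally already cover stopping your own server:

```yaml
hub:
  loadRoles:
    user:
      name: user               # redefines the built-in "user" role
      scopes:
        - self                 # default scope: access to your own resources
        - delete:servers!user  # explicit grant; !user expands to the token owner
```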
Here are some logs; at 21:40 the server was started and at 21:42 the page was refreshed:
[I 2022-10-13 21:40:00.895 JupyterHub log:186] 302 GET / -> /hub/ (@::ffff:11.11.11.254) 0.89ms
[I 2022-10-13 21:40:01.004 JupyterHub log:186] 302 GET /user/something/api/contents?content=1&1665697211837 -> /hub/user/something/api/contents?content=1&1665697211837 (@::ffff:11.11.11.254) 0.83ms
[I 2022-10-13 21:40:01.118 JupyterHub log:186] 302 GET /hub/ -> /hub/spawn (something@::ffff:11.11.11.138) 2.92ms
[W 2022-10-13 21:40:01.227 JupyterHub base:1409] Failing suspected API request to not-running server: /hub/user/something/api/contents
[W 2022-10-13 21:40:01.228 JupyterHub log:186] 424 GET /hub/user/something/api/contents?content=1&1665697211837 (something@::ffff:11.11.11.138) 3.85ms
[I 2022-10-13 21:40:01.354 JupyterHub log:186] 200 GET /hub/spawn (something@::ffff:11.11.11.254) 16.19ms
[I 2022-10-13 21:40:04.443 JupyterHub provider:651] Creating oauth client jupyterhub-user-something
[I 2022-10-13 21:40:04.473 JupyterHub spawner:2509] Attempting to create pvc claim-something, with timeout 3
[I 2022-10-13 21:40:04.476 JupyterHub log:186] 302 POST /hub/spawn -> /hub/spawn-pending/something (something@::ffff:11.11.11.254) 71.83ms
[I 2022-10-13 21:40:04.505 JupyterHub spawner:2525] PVC claim-something already exists, so did not create new pvc.
[I 2022-10-13 21:40:04.512 JupyterHub spawner:2469] Attempting to create pod jupyter-something, with timeout 3
[I 2022-10-13 21:40:04.697 JupyterHub pages:394] something is pending spawn
[I 2022-10-13 21:40:04.698 JupyterHub log:186] 200 GET /hub/spawn-pending/something (something@::ffff:11.11.11.138) 3.75ms
[W 2022-10-13 21:40:04.993 JupyterHub base:89] Blocking Cross Origin API request. Referer: https://jupyter.something-dev.something.com/hub/spawn-pending/something, Host: jupyter.something-dev.something.com, Host URL: http://jupyter.something-dev.something.com/hub/
[W 2022-10-13 21:40:04.993 JupyterHub scopes:804] Not authorizing access to /hub/api/users/something/server/progress. Requires any of [read:servers], not derived from scopes []
[W 2022-10-13 21:40:04.994 JupyterHub web:1796] 403 GET /hub/api/users/something/server/progress (::ffff:11.11.11.138): Action is not authorized with current scopes; requires any of [read:servers]
[W 2022-10-13 21:40:04.994 JupyterHub log:186] 403 GET /hub/api/users/something/server/progress (@::ffff:11.11.11.138) 3.17ms
[I 2022-10-13 21:40:15.662 JupyterHub log:186] 200 GET /hub/api (@11.11.11.67) 0.73ms
[I 2022-10-13 21:40:15.682 JupyterHub log:186] 200 POST /hub/api/users/something/activity (something@11.11.11.67) 14.65ms
[I 2022-10-13 21:40:19.663 JupyterHub base:963] User something took 15.257 seconds to start
[I 2022-10-13 21:40:19.663 JupyterHub proxy:333] Adding user something to proxy /user/something/ => http://11.11.11.67:8888
[I 2022-10-13 21:42:01.893 JupyterHub log:186] 302 GET /hub/spawn-pending/something -> /user/something/ (something@::ffff:11.11.11.254) 13.32ms
[W 2022-10-13 21:42:02.569 JupyterHub auth:298] Service Server at /user/something/ requested scopes access:servers!server=something/,read:users:groups!user,read:users:name!user,access:servers!user=something for user something, granting only access:servers!server=something.
[I 2022-10-13 21:42:02.579 JupyterHub log:186] 302 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-something&redirect_uri=%2Fuser%2Fsomething%2Foauth_callback&response_type=code&state=[secret] -> /user/something/oauth_callback?code=[secret]&state=[secret] (something.something
[I 2022-10-13 21:42:02.911 JupyterHub log:186] 200 POST /hub/api/oauth2/token (something@11.11.11.67) 37.46ms
[I 2022-10-13 21:42:02.925 JupyterHub log:186] 200 GET /hub/api/user (something@11.11.11.67) 11.60ms
notebook [I 2022-10-13 21:42:02.115 SingleUserLabApp log:186] 302 GET /user/something/ -> /user/something/lab? (@::ffff:11.11.11.254) 0.69ms
notebook [I 2022-10-13 21:42:02.339 SingleUserLabApp log:186] 302 GET /user/something/lab -> /hub/api/oauth2/authorize?client_id=jupyterhub-user-something&redirect_uri=%2Fuser%2Fsomething%2Foauth_callback&response_type=code&state=[secret] (@::ffff:11.11.11.138) 3.08ms
notebook [I 2022-10-13 21:42:02.926 SingleUserLabApp auth:1227] Logged-in user {'admin': True, 'kind': 'user', 'name': 'something', 'groups': [], 'session_id': 'f46cf3aa404847ce8a41de409cc8225b', 'scopes': ['access:servers!server=something/', 'read:users:groups!user=something', 'rea
notebook [I 2022-10-13 21:42:02.926 SingleUserLabApp log:186] 302 GET /user/something/oauth_callback?code=[secret]&state=[secret] -> /user/something/lab (@::ffff:11.11.11.138) 56.05ms
notebook [I 2022-10-13 21:42:03.159 SingleUserLabApp log:186] 200 GET /user/something/lab (something@::ffff:11.11.11.254) 13.57ms
notebook [I 2022-10-13 21:42:04.138 SingleUserLabApp log:186] 200 GET /user/something/api/kernelspecs?1665697323882 (something@::ffff:11.11.11.254) 94.89ms
notebook [I 2022-10-13 21:42:04.199 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings?1665697323885 (something@::ffff:11.11.11.138) 59.09ms
notebook [I 2022-10-13 21:42:04.200 SingleUserLabApp log:186] 200 GET /user/something/api/kernels?1665697323887 (something@::ffff:11.11.11.254) 56.95ms
notebook [I 2022-10-13 21:42:04.201 SingleUserLabApp log:186] 200 GET /user/something/api/sessions?1665697323887 (something@::ffff:11.11.11.254) 57.00ms
Stream closed EOF for jupyterhub/jupyter-something (block-cloud-metadata)
notebook [I 2022-10-13 21:42:04.201 SingleUserLabApp log:186] 200 GET /user/something/api/terminals?1665697323887 (something@::ffff:11.11.11.138) 56.11ms
notebook [I 2022-10-13 21:42:04.360 SingleUserLabApp log:186] 200 GET /user/something/api/kernelspecs?1665697324197 (something@::ffff:11.11.11.254) 2.08ms
notebook [I 2022-10-13 21:42:05.705 SingleUserLabApp log:186] 200 GET /user/something/lab/api/translations/en?1665697325525 (something@::ffff:11.11.11.138) 16.03ms
notebook [I 2022-10-13 21:42:05.941 SingleUserLabApp log:186] 200 GET /user/something/lab/api/listings/@jupyterlab/extensionmanager-extension/listings.json?1665697325780 (something@::ffff:11.11.11.138) 1.10ms
notebook [I 2022-10-13 21:42:05.944 SingleUserLabApp log:186] 200 GET /user/something/api/contents?content=1&1665697325782 (something@::ffff:11.11.11.254) 3.10ms
notebook [I 2022-10-13 21:42:05.969 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/cell-toolbar-extension:plugin?1665697325800 (something@::ffff:11.11.11.138) 7.32ms
notebook [I 2022-10-13 21:42:05.992 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/docmanager-extension:plugin?1665697325801 (something@::ffff:11.11.11.254) 22.19ms
notebook [I 2022-10-13 21:42:05.995 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/shortcuts-extension:shortcuts?1665697325800 (something@::ffff:11.11.11.254) 21.05ms
notebook [I 2022-10-13 21:42:06.000 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/mainmenu-extension:plugin?1665697325801 (something@::ffff:11.11.11.138) 22.30ms
notebook [I 2022-10-13 21:42:06.003 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/htmlviewer-extension:plugin?1665697325801 (something@::ffff:11.11.11.138) 21.62ms
notebook [I 2022-10-13 21:42:06.006 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/notebook-extension:panel?1665697325801 (something@::ffff:11.11.11.254) 20.70ms
notebook [I 2022-10-13 21:42:06.194 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/filebrowser-extension:widget?1665697326019 (something@::ffff:11.11.11.138) 12.15ms
notebook [I 2022-10-13 21:42:06.290 SingleUserLabApp log:186] 200 GET /user/something/lab/api/translations/?1665697326104 (something@::ffff:11.11.11.254) 25.41ms
notebook [I 2022-10-13 21:42:06.528 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/shortcuts-extension:shortcuts?1665697326156 (something@::ffff:11.11.11.254) 77.29ms
notebook [I 2022-10-13 21:42:06.535 SingleUserLabApp log:186] 200 GET /user/something/lab/api/workspaces/default?1665697326155 (something@::ffff:11.11.11.138) 80.05ms
notebook [I 2022-10-13 21:42:06.551 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/docmanager-extension:plugin?1665697326184 (something@::ffff:11.11.11.254) 94.74ms
notebook [I 2022-10-13 21:42:06.556 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/fileeditor-extension:plugin?1665697326184 (something@::ffff:11.11.11.254) 94.40ms
notebook [I 2022-10-13 21:42:06.571 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/csvviewer-extension:tsv?1665697326184 (something@::ffff:11.11.11.138) 87.99ms
notebook [I 2022-10-13 21:42:06.584 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/csvviewer-extension:csv?1665697326184 (something@::ffff:11.11.11.138) 96.85ms
notebook [I 2022-10-13 21:42:06.593 SingleUserLabApp log:186] 200 GET /user/something/lab/api/settings/@jupyterlab/application-extension:context-menu?1665697326185 (something@::ffff:11.11.11.254) 101.90ms
notebook [I 2022-10-13 21:42:06.636 LabApp] Build is up to date
notebook [I 2022-10-13 21:42:06.637 SingleUserLabApp log:186] 200 GET /user/something/lab/api/build?1665697325772 (something@::ffff:11.11.11.254) 704.06ms
I1013 21:39:50.012656 1 httplog.go:129] "HTTP" verb="GET" URI="/healthz" latency="52.061µs" userAgent="kube-probe/1.23+" audit-ID="" srcIP="11.11.11.254:32972" resp=200
I1013 21:40:00.045201 1 httplog.go:129] "HTTP" verb="GET" URI="/healthz" latency="81.066µs" userAgent="kube-probe/1.23+" audit-ID="" srcIP="11.11.11.254:53872" resp=200
I1013 21:40:00.061304 1 httplog.go:129] "HTTP" verb="GET" URI="/healthz" latency="78.096µs" userAgent="kube-probe/1.23+" audit-ID="" srcIP="11.11.11.254:53882" resp=200
I1013 21:40:04.544866 1 eventhandlers.go:118] "Add event for unscheduled pod" pod="jupyterhub/jupyter-something"
I1013 21:40:04.544924 1 scheduling_queue.go:933] "About to try and schedule pod" pod="jupyterhub/jupyter-something"
I1013 21:40:04.544935 1 scheduler.go:443] "Attempting to schedule pod" pod="jupyterhub/jupyter-something"
I1013 21:40:04.545099 1 binder.go:817] "All bound volumes for pod match with node" pod="jupyterhub/jupyter-something" node="ip-11-11-11-138.ec2.internal"
I1013 21:40:04.545201 1 binder.go:374] "AssumePodVolumes" pod="jupyterhub/jupyter-something" node="ip-11-11-11-138.ec2.internal"
I1013 21:40:04.545221 1 binder.go:382] "AssumePodVolumes: all PVCs bound and nothing to do" pod="jupyterhub/jupyter-something" node="ip-11-11-11-138.ec2.internal"
I1013 21:40:04.545301 1 default_binder.go:52] "Attempting to bind pod to node" pod="jupyterhub/jupyter-something" node="ip-11-11-11-138.ec2.internal"
I1013 21:40:04.555478 1 scheduler.go:621] "Successfully bound pod to node" pod="jupyterhub/jupyter-something" node="ip-11-11-11-138.ec2.internal" evaluatedNodes=2 feasibleNodes=1
I1013 21:40:04.555500 1 eventhandlers.go:161] "Delete event for unscheduled pod" pod="jupyterhub/jupyter-something"
I1013 21:40:04.555547 1 eventhandlers.go:186] "Add event for scheduled pod" pod="jupyterhub/jupyter-something"
I1013 21:40:10.012968 1 httplog.go:129] "HTTP" verb="GET" URI="/healthz" latency="82.879µs" userAgent="kube-probe/1.23+" audit-ID="" srcIP="11.11.11.254:52800" resp=200
I1013 21:40:10.013733 1 httplog.go:129] "HTTP" verb="GET" URI="/healthz" latency="62.875µs" userAgent="kube-probe/1.23+" audit-ID="" srcIP="11.11.11.254:52808" resp=200
I1013 21:40:20.011815 1 httplog.go:129] "HTTP" verb="GET" URI="/healthz" latency="72.114µs" userAgent="kube-probe/1.23+" audit-ID="" srcIP="11.11.11.254:59534" resp=200
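One line in the hub log above stands out: the `Blocking Cross Origin API request` warning, where the Referer is `https://...` but the hub computes its own Host URL as `http://...`. That scheme mismatch is enough to 403 the progress API (the `read:servers` denials right after it), which would explain a progress bar that never moves. If TLS terminates at an external load balancer, the chart's documented TLS-offload mode may be relevant; a sketch, assuming the load balancer still forwards `X-Forwarded-Proto`:

```yaml
proxy:
  https:
    enabled: true
    type: offload  # TLS terminates at an external load balancer, not in the chart
```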
Not yet, still just refreshing. It'll need some RBAC setup as well, because right now users are not able to terminate their servers. Our proxy/load balancer works everywhere else except here. Did you manage to find a solution?
Hi @Ondatra, have you made any progress? I'm experiencing the same issues.
In my case I think I can trace the 403 error when terminating servers back to the deprecation of the PodSecurityPolicy (PSP) `policy/v1beta1` API.
This policy and some dependencies related to it are used in the Helm chart for the aws-node-termination-handler.
See the aws/eks-charts issue "`policy/v1beta1` has been deprecated since kubectl version 1.25" (#856) on GitHub for more details.
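For context, this is the API kind that stops being served in Kubernetes 1.25; any chart still rendering a manifest of this shape will fail. A minimal, hypothetical example of the affected resource:

```yaml
apiVersion: policy/v1beta1   # removed in Kubernetes 1.25
kind: PodSecurityPolicy
metadata:
  name: example-psp          # hypothetical name
spec:
  privileged: false
  seLinux:
    rule: RunAsAny
  runAsUser:
    rule: RunAsAny
  supplementalGroups:
    rule: RunAsAny
  fsGroup:
    rule: RunAsAny
```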
I’m also experiencing the same problem with the loading bar.
Hey @ebab (and all others on this thread), has anyone found a solution to this issue of seeing the "Your server is starting up" message and not being auto-redirected into the notebook?
If we wait and refresh, it works its way along into an eventual notebook, but it takes about 30-60 seconds.
I'm running this in a vSphere with Tanzu config, so it may be a localised issue, but I've never seen this behaviour when running from a development Mac or a colleague's Windows machine.
Needless to say, the pod and PV/PVC all come up quickly and with healthy status.