Culling JupyterLab servers and lingering OAuth cookies cause errors when a user returns days later

Hi there,

We have a really large deployment that we are getting ready to make publicly available shortly; however, we have a problem with JupyterHub holding onto OAuth session cookies after the user closes their browser.

If we run Jupyter Notebook instead of Lab, we can tell our users "make sure you stop your server and log out", but in Lab there is no log-out option, and telling users to go to Help > Launch Classic Notebook makes for a poor UX.

We have tried to implement culling of user servers and, just in case it helps, expiring cookies, but neither seems to be doing what we expect. Culling does kill some of the servers, but the user's OAuth cookie is retained. We have also set c.JupyterHub.cookie_max_age_days to expire cookies; however, this setting is expressed in days, so every so often we run into issues with cached credentials anyway.
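
For reference, this is roughly what we have configured (a minimal sketch, not our production config; the service name, one-hour timeout, and one-day cookie age are illustrative, and it assumes the jupyterhub-idle-culler package):

```python
# jupyterhub_config.py -- minimal sketch; values are illustrative
import sys

# Cull single-user servers that have been idle for an hour.
c.JupyterHub.services = [
    {
        "name": "idle-culler",
        "command": [
            sys.executable,
            "-m",
            "jupyterhub_idle_culler",
            "--timeout=3600",
        ],
    }
]

# On JupyterHub >= 2.0 the culler service needs these scopes.
c.JupyterHub.load_roles = [
    {
        "name": "idle-culler-role",
        "scopes": [
            "list:users",
            "read:users:activity",
            "read:servers",
            "delete:servers",
        ],
        "services": ["idle-culler"],
    }
]

# Expire the hub login cookie after one day instead of the default 14.
c.JupyterHub.cookie_max_age_days = 1
```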

It could very well be that we are missing how these settings are supposed to work, which is why I am reaching out.

What exactly is the problem caused by users still being logged in? Can you describe what’s occurring from the user’s perspective and how it differs from what you want/expect? A user being logged in a bit longer than you might expect shouldn’t be a problem, as far as I can tell.

The error we see is this.

It clears once you go to the Control Panel in classic mode, stop your server, and log out.
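
For what it's worth, the closest setting we have found to automating that stop-and-log-out step is shutdown_on_logout (a sketch, assuming JupyterHub 1.0 or later; it only helps when users actually log out, which is part of our problem):

```python
# jupyterhub_config.py -- a sketch, not our production config.
# Stop a user's single-user server automatically when they log out,
# removing the manual "stop server" step in the Control Panel
# (available in JupyterHub >= 1.0).
c.JupyterHub.shutdown_on_logout = True
```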

Thank you for helping me out!

One more question, regarding the pre-spawn hook in JupyterHub: does the pre-spawn hook get called at every login, regardless of whether the user already has a notebook server running?
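
For context, this is the hook we mean (a minimal sketch; the body is just illustrative logging):

```python
# jupyterhub_config.py -- minimal sketch; the hook body is illustrative
def pre_spawn_hook(spawner):
    # Runs just before a single-user server is spawned. If the user's
    # server is already running, no spawn happens, so (as far as we can
    # tell) the hook would not fire again until the next spawn -- this
    # is exactly the behaviour we want to confirm.
    spawner.log.info("About to spawn server for %s", spawner.user.name)

c.Spawner.pre_spawn_hook = pre_spawn_hook
```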