Jupyter Kernel HTTP Endpoint: What information is needed?

For some reason I can’t start a kernel through the API

resp = await self.session.post(
    self.notebook_url / 'api/kernels', 
    headers={'X-XSRFToken': self.xsrf_token})

The full original code is in https://github.com/yuvipanda/hubtraf/blob/80d33a9cdd31c68c318dce18cc24a30dc2769f49/hubtraf/user.py#L138 .

Instead of a working kernel I receive a

{"status": 405, "message": "Method Not Allowed"}

The funny thing is that the code worked in the past, so it must be related to some update. I last ran it in August 2019, and Docker automatically updates the JupyterHub I am using.

The documentation at https://github.com/jupyter/jupyter/wiki/Jupyter-Notebook-Server-API#kernel-api creates the impression that POSTing is fine. The handler source also seems to allow it: github-DOT-com/jupyter/notebook/blob/master/notebook/services/kernels/handlers.py#L36 (sorry, new users are limited to two links).

Does anybody know how to solve this? I want to keep creating kernels through API calls.

Thank you so much!

The odds are that the 405 error is really a 404: it's not actually hitting the /api/kernels endpoint, it's hitting another endpoint that doesn't support POST requests.

Check the value of the URL actually being requested and the base URL of the notebook server you are talking to. My guess is that self.notebook_url is not what it should be.
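One concrete way a URL can silently go wrong: relative joining drops the last path segment when the base lacks a trailing slash. A minimal stdlib sketch with hypothetical URLs (hubtraf itself builds the URL with yarl's `/` operator, which behaves differently, so printing the final URL the client actually requests is the surest check):

```python
from urllib.parse import urljoin

# Hypothetical base URLs; the real value comes from self.notebook_url.
# Without a trailing slash, relative joining silently drops the last
# path segment -- the request would then hit the wrong endpoint.
bad = urljoin('http://hub.example.com/user/alice', 'api/kernels')
good = urljoin('http://hub.example.com/user/alice/', 'api/kernels')

print(bad)   # http://hub.example.com/user/api/kernels
print(good)  # http://hub.example.com/user/alice/api/kernels
```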

-Min


Thank you very much for your suggestion, @minrk . I performed the same steps in the script (with many debug messages) and in the browser. The result stays the same: if I open the link http:///user//api/kernels, it works perfectly in the browser. If I have not yet opened a Jupyter Notebook, it returns an empty list. If I have started a kernel, it shows something like this:

    [
        {
           "id": "cfcba670-9773-44c9-b8a2-38e820f86236", 
           "name": "python3", 
           "last_activity": "2020-03-02T15:41:07.581421Z", 
           "execution_state": "idle", 
           "connections": 1
        }
    ]
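For reference, that listing is plain JSON, so a script can inspect it the same way the browser view does. A small sketch using the sample response above:

```python
import json

# Sample response body from GET /api/kernels, as shown above.
payload = '''[
    {"id": "cfcba670-9773-44c9-b8a2-38e820f86236",
     "name": "python3",
     "last_activity": "2020-03-02T15:41:07.581421Z",
     "execution_state": "idle",
     "connections": 1}
]'''

kernels = json.loads(payload)
print(len(kernels), kernels[0]['execution_state'])  # 1 idle
```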

I enriched the lines from https://github.com/yuvipanda/hubtraf/blob/80d33a9cdd31c68c318dce18cc24a30dc2769f49/hubtraf/user.py#L137 as follows:

        try:
            api_endpoint = self.notebook_url / 'api/kernels'
            self.log.msg(f'try api endpoint {api_endpoint}')
            resp = await self.session.post(api_endpoint, headers={'X-XSRFToken': self.xsrf_token})
        except Exception as e:
            self.log.msg('Kernel: Start failed {}'.format(str(e)), action='kernel-start', phase='failed', duration=time.monotonic() - start_time)
            raise OperationError()
        if resp.status != 201:
            self.log.msg(f'Kernel: Start failed with response code {resp.status}', action='kernel-start', phase='failed', extra=str(resp), duration=time.monotonic() - start_time)
            raise OperationError()

This resulted in the following output:

2020-03-02 16:39.22 Server: Started (Jupyter Notebook) action=server-start attempt=2 duration=0.26599999999962165 phase=complete username=<USERNAME>
2020-03-02 16:39.22 Kernel: Starting               action=kernel-start phase=start username=<USERNAME>
2020-03-02 16:39.22 try api endpoint <JupyterHub-URL>/user/<USERNAME>/api/kernels username=<USERNAME>
2020-03-02 16:39.22 Kernel: Start failed with response code 405 action=kernel-start duration=0.0 extra=<ClientResponse(<JupyterHub-URL>/user/<USERNAME>/api/kernels) [405 Method Not Allowed]>

I copied the very same link and it worked smoothly in the browser. I checked the JupyterHub log: it recorded the 405 for the script's request, yet served the same information to the browser without complaint.

My conclusion is that the problem is not the endpoint itself, but that browsers somehow “talk” differently than aiohttp does in this example.
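One difference that is easy to overlook here: typing the URL into the browser's address bar sends a GET (which lists kernels), while the script sends a POST (which creates one), so the same URL can legitimately be answered differently. A self-contained sketch with a toy local server (not a real Jupyter server) showing one URL returning 200 for GET and 405 for POST:

```python
import http.server
import json
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    # GET /api/kernels: list kernels (what a browser address bar does).
    def do_GET(self):
        body = json.dumps([]).encode()
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    # POST on a handler that only supports GET: 405 Method Not Allowed.
    def do_POST(self):
        self.send_response(405)
        self.send_header('Content-Length', '0')
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = http.server.HTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f'http://127.0.0.1:{server.server_port}/api/kernels'

# Same URL, different method, different outcome:
get_status = urllib.request.urlopen(url).status
try:
    urllib.request.urlopen(
        urllib.request.Request(url, data=b'', method='POST'))
    post_status = 201
except urllib.error.HTTPError as e:
    post_status = e.code
server.shutdown()
print(get_status, post_status)  # 200 405
```

So a browser check alone cannot rule out a routing problem: it only proves the GET path works.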

I played around with an HTTP request builder in my browser and could provoke a 403 status code when my XSRF cookie was missing or had an invalid value, but it was never a 405. I printed the token value used in the code, and it looks very similar to what I have in my browser.
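That observation matches how Tornado's XSRF protection behaves: a missing or mismatched token is rejected with 403 before the request ever reaches a method handler, whereas a 405 means the handler that was reached simply does not implement POST. A rough, simplified model of the check (not Tornado's actual code):

```python
def check_xsrf(cookie_token, header_token):
    """Simplified model of Tornado's XSRF check: the token sent via the
    X-XSRFToken header must match the _xsrf cookie. Returns an HTTP
    status code on failure, None on success."""
    if not cookie_token or not header_token:
        return 403  # "'_xsrf' argument missing"
    if header_token != cookie_token:
        return 403  # "XSRF cookie does not match"
    return None

print(check_xsrf('abc123', None))      # 403
print(check_xsrf('abc123', 'wrong'))   # 403
print(check_xsrf('abc123', 'abc123'))  # None
```

In other words, an XSRF problem would surface as 403, so the 405 points at routing, not authentication.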

Moreover, I used tcpdump to check whether the library really issues a POST request, and it does.

As far as I understand the API, that should be valid. Or have I missed a detail?

Based on this:

the code worked in the past, it must be related to some update

My hunch is that it’s the changes to the spawn progress pages, and to how the Hub handles requests for user URLs that aren’t yet running, that hubtraf isn’t accounting for (I have not confirmed this). If that’s the case and the ensure_server step is returning prematurely, then the request would be handled by the Hub instead of the single-user server, which would look exactly like this. Can you share more of the lead-up in the logs, including the start of the server request and the logs of the Hub itself? Some component should be logging that 405 request. If it’s the Hub, then ensure_server likely needs updating to make sure it correctly verifies that the server is running and proxied-to.
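If that theory holds, one way the check could be tightened is to treat the server as ready only when a probe of the user URL comes back 200 *and* was not answered by a Hub page (such as a spawn-pending page). A hypothetical sketch of such a predicate — names and paths are illustrative, not hubtraf's actual code:

```python
def server_ready(status, final_path, username):
    """Decide whether the single-user server is really up and proxied-to.

    status:     HTTP status of a probe request to the user URL
    final_path: URL path of the response after following redirects
    username:   the user whose server we are waiting for

    A 200 alone is not enough: while the server is still spawning, the
    Hub can answer on the user's behalf (e.g. with a spawn-pending page
    under /hub/...), and a later POST would then hit the Hub instead of
    the single-user server.
    """
    return status == 200 and final_path.startswith(f'/user/{username}/')

print(server_ready(200, '/user/alice/api', 'alice'))           # True
print(server_ready(200, '/hub/spawn-pending/alice', 'alice'))  # False
print(server_ready(503, '/user/alice/api', 'alice'))           # False
```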

This is what I get through docker logs. By the way, I added an additional waiting time of 10 seconds between breaking out of the while loop and the return of the asynchronous method ensure_server:

The JupyterHub:

The JupyterNotebookApp:

Are there any further logs which could be of help?