Using async functions for an nbconvert exporter and bundler at the same time

I have some Python code that converts a notebook to some other format. That code is one async function because I want to use a library that exposes async functions.

I then declared an entrypoint in my package to tell nbconvert about my new exporter. In the from_notebook_node() method of my exporter class I call my async conversion function like this:

with tempfile.NamedTemporaryFile() as f:
    b = asyncio.get_event_loop().run_until_complete(
        asyncly_convert_notebook(...)  # arguments elided
    )
return (b, resources)

This works. I can run jupyter-nbconvert --to=myformat notebook.ipynb and get a converted notebook.

To make my new exporter available in the “Download as…” menu of the notebook UI I then add export_from_notebook = "MyFormat" to my exporter class. The exporter shows up, but if you try to use it, things break.

I’ve lost the exact error, but I think it was RuntimeError: Cannot run the event loop while another loop is running. The problem is that the notebook server is a tornado application, so the asyncio event loop is already running; I can’t create a new one and tell it to run_until_complete. This all makes sense.
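A minimal, tornado-free sketch of the failure mode (function names here are made up for illustration):

```python
import asyncio

async def convert():
    # Stand-in for the real async conversion function.
    await asyncio.sleep(0)
    return b"converted"

def sync_wrapper():
    # Roughly what a synchronous from_notebook_node tries to do.
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(convert())

async def server_handler():
    # Inside a running event loop (as under tornado), the nested
    # run_until_complete raises RuntimeError.
    try:
        sync_wrapper()
    except RuntimeError as e:
        print("RuntimeError:", e)

asyncio.run(server_handler())
```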

I have three questions:

  1. in general, is there a way to schedule an asyncly_convert_notebook coroutine on a running (tornado) event loop and wait for the result from within a sync function (in this case from_notebook_node)?
  2. what do people do who have async functions that they need to call while converting a notebook and want their extension to be available as a bundler extension and in nbconvert?
  3. is there an entrypoint I can declare in my package to get my bundler extension enabled automagically? Right now I need to run jupyter bundlerextension enable ... to register a bundler extension. Is there something like the nbconvert.exporters entrypoint, which lets me declare a new nbconvert exporter from my package, but for bundler extensions?

I have no idea how to do (1), but this is more of a Python asyncio aficionado topic than a Jupyter one.

For (2) I’ve now gone with not declaring export_from_notebook on my exporter class and instead adding an explicit _jupyter_bundlerextension_paths() to my module.

I am crossing my freshly washed fingers that someone has an idea :slight_smile:


For those interested in (1) and obscure Python things: here is (I think) a minimal example of what needs to be solved. The example involves no tornado or jupyter code, so it is much simplified.

Have you tried using nest_asyncio?


I didn’t even know it existed :slight_smile: I might give it a go next time. I had found asgiref’s async_to_sync, but after looking at how much code it involved I decided not to use it.

I have found that the following snippet works. I like it because, as a normal human, you can understand what is happening, even if you don’t (yet) know why you need the thread pool.

import asyncio
import concurrent.futures

pool = concurrent.futures.ThreadPoolExecutor()

async def asyncs():
    await asyncio.sleep(1)
    return 6

def times_syncs(x):
    # Run the coroutine in a fresh event loop on a pool thread and block
    # until it completes. This works even if the calling thread already
    # has a running event loop.
    six = pool.submit(asyncio.run, asyncs()).result()
    return x * six

async def async_outer():
    return times_syncs(7)

def main():
    x = asyncio.get_event_loop().run_until_complete(async_outer())
    print("What is six times seven?", x)

if __name__ == "__main__":
    main()
This seems to work well, and for my use case (starting a headless Chrome) the cost of an extra thread is negligible.


Instead of getting the loop explicitly, you can also use the asyncio.run function. As we can see in this example, async/await adds quite a bit of complexity, I guess especially when nesting existing projects.
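A sketch of that simplification, reusing the coroutine from the snippet above:

```python
import asyncio

async def asyncs():
    await asyncio.sleep(0.01)
    return 6

# asyncio.run (Python 3.7+) creates a fresh event loop, runs the
# coroutine to completion, and closes the loop again, replacing the
# explicit get_event_loop().run_until_complete dance.
print("What is six?", asyncio.run(asyncs()))
```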

Yes, both approaches can work. We use nest_asyncio in nbclient too, for instance when we want to execute a notebook from another notebook.
This is because in nbclient all methods are async-first and then have wrappers that run them in the event loop to provide a blocking API, so we were also confronted with the “event loop already running” issue.
But launching a thread with its own event loop and running until complete works too. That’s basically what you do with your ThreadPoolExecutor, but you could also start a plain thread and join it.
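A sketch of that plain-thread variant (the function names are made up for illustration):

```python
import asyncio
import threading

async def fetch_answer():
    # Stand-in for the real async conversion coroutine.
    await asyncio.sleep(0.01)
    return 42

def run_in_thread(coro):
    """Run a coroutine to completion on a fresh thread with its own loop."""
    result = {}

    def worker():
        # The new thread has no running loop, so asyncio.run is safe here.
        result["value"] = asyncio.run(coro)

    t = threading.Thread(target=worker)
    t.start()
    t.join()  # block the calling (sync) code until the coroutine is done
    return result["value"]

async def outer():
    # Even though a loop is already running here, the coroutine executes
    # on the other thread's loop, so this does not raise RuntimeError.
    return run_in_thread(fetch_answer())

print(asyncio.run(outer()))  # → 42
```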