Lazy loading of jupyter labextensions

Hello community,

I’m currently using JupyterLab version 3.6.3 and I have a question regarding the lazy loading of extensions. I would like to know if lazy loading is supported in this version. Specifically, are the JavaScript files associated with Jupyter labextensions loaded only when an extension is in use, or are all of them re-loaded on a JupyterLab page reload?

I also observed that lazy loading does not work when installing the extension via the jupyter labextension install command (as a source extension), but does work when installing it via pip install. Is this the intended behaviour, or is my observation incorrect?

Can you please answer these questions?

Or, to keep it short: in which cases will lazy loading of jupyter labextensions happen?

Also, to experiment with this, I installed the open source jupyterlab-drawio labextension: GitHub - QuantStack/jupyterlab-drawio: A standalone embedding of the FOSS drawio / mxgraph package into jupyterlab

Thank you in advance for your assistance!

The key docs to read are the Overview of Extensions.

  • old-style “source” extensions had to be published on npmjs.org, or your organization’s npmjs.org-like repo
  • new-style “prebuilt” extensions should be packaged as wheels on PyPI, or equivalent
    • it’s good manners to also publish on npmjs.org, if you’d like people to be able to import some part of your code

At any rate, new-style prebuilt extensions should only use the build-time command before/while generating a .tar.gz/.whl:

jupyter labextension build

After a user installs the extension and reloads the browser, the .js file referenced in the extension’s package.json will be discovered and loaded, along with all of its explicit imports:

"jupyterlab": {
    "extension": "lib/plugin.js"
}

Inside an extension, it is indeed good manners to load only the absolute minimum needed, and to defer heavy-weight dependencies not covered by core behind an async boundary (it’s fine to import react, anything from @lumino, and most @jupyterlab packages eagerly, since those are covered by core). Keep plugin.ts as small as possible, but still well-typed.

Without knowing more about the extension (e.g. mime renderers and jupyter widgets are different), examples of such boundaries are:

  • within a command
  • in an app.started.then callback

// plugin.ts
import type { JupyterFrontEnd, JupyterFrontEndPlugin } from '@jupyterlab/application';
// type-only import: erased at compile time, so it does not pull the package into this chunk
import type { ExpensiveThing } from '@heavy/third-party';

async function doTheThing(args: any): Promise<void> {
  // the dynamic import() is the async boundary: the bundler splits ./wrapper
  // (and its heavy dependency) into a separate chunk, fetched on first use
  const { doSomethingExpensive } = await import('./wrapper');
  const thing: ExpensiveThing = await doSomethingExpensive(args);
  console.log('did the thing', thing);
}

const plugin: JupyterFrontEndPlugin<void> = {
  id: 'my-plugin:plugin',
  autoStart: true,
  activate: async (app: JupyterFrontEnd) => {
    // any cheap-but-blocking setup can still happen here

    // keep activation light: register the command, but don't load heavy code yet
    app.commands.addCommand('my-plugin:command', {
      label: 'Do the Thing',
      execute: doTheThing
    });

    // another boundary: defer non-essential work until the app has started
    app.started.then(() => {
      // e.g. warm caches, poll a server endpoint, etc.
    });
  }
};

export default plugin;

Where the complexity is mostly hidden inside a wrapper:

// wrapper.ts
// this module, and @heavy/third-party with it, is only fetched when
// import('./wrapper') in plugin.ts actually runs
import { ExpensiveThing } from '@heavy/third-party';

export async function doSomethingExpensive(args: any): Promise<ExpensiveThing> {
  const thing = new ExpensiveThing(args);
  await thing.doSomethingSlowly();
  return thing;
}

This gets more complicated once a plugin provides something, as often such a pattern will have a “ThingManager” that needs to be fully instantiated.
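
For what it’s worth, here is a minimal sketch of one way to keep even a providing plugin lazy; the IThingManager token, the ThingManager class, and the ./manager module are hypothetical names, not from any real extension. The plugin returns a thin facade synchronously, and the heavy implementation is only imported the first time the facade is actually used:

// tokens.ts (hypothetical)
import { Token } from '@lumino/coreutils';

export interface IThingManager {
  doThing(args: unknown): Promise<void>;
}

export const IThingManager = new Token<IThingManager>('my-plugin:IThingManager');

// provider.ts (hypothetical)
import type { JupyterFrontEnd, JupyterFrontEndPlugin } from '@jupyterlab/application';
import { IThingManager } from './tokens';

const provider: JupyterFrontEndPlugin<IThingManager> = {
  id: 'my-plugin:provider',
  autoStart: true,
  provides: IThingManager,
  activate: (app: JupyterFrontEnd): IThingManager => {
    // memoize the lazy import so the heavy module only loads once
    let managerPromise: Promise<IThingManager> | null = null;
    return {
      async doThing(args: unknown): Promise<void> {
        if (!managerPromise) {
          // the hypothetical ThingManager (and its dependencies) load here, on first use
          managerPromise = import('./manager').then(m => new m.ThingManager());
        }
        const manager = await managerPromise;
        await manager.doThing(args);
      }
    };
  }
};

export default provider;

A consumer that lists IThingManager in its requires still gets a usable object at activation time; only the first call to doThing pays the cost of loading the real implementation.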
